We are having issues with a stored proc call in our application. When I run the proc through a query window in Mgmt Studio, it comes back in 0.3 seconds. However, when the application runs the proc (with the same parameters), it takes 30 seconds to complete. When I run a trace with the proc call through Mgmt Studio, it shows about 4,000 reads, but through the app, it shows 4 million reads. What is happening?? (The app uses Hibernate 3.2 / Java 1.5 / JDBC.)
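The usual culprit for fast-in-SSMS, slow-from-JDBC is a SET option mismatch (JDBC connections typically run with ARITHABORT OFF while Management Studio uses ON), which gives the app its own, differently sniffed plan. A minimal sketch for comparing the two cached plans, assuming you are on SQL 2005 or later so the DMVs exist (the proc name in the filter is hypothetical):

-- Different 'set_options' bitmasks for the same text = two separate plans.
SELECT attr.value AS set_options,
       st.text,
       qs.execution_count,
       qs.total_logical_reads
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
CROSS APPLY sys.dm_exec_plan_attributes(qs.plan_handle) attr
WHERE st.text LIKE '%MyProc%'          -- hypothetical proc name
  AND attr.attribute = 'set_options'

If two rows come back with different set_options values, the app and SSMS are compiling separate plans from different sniffed parameters, and adding OPTION (RECOMPILE) to the problem statement is a cheap experiment.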
I am able to run a query which runs fast in QA but slow in the application. It takes about 16 ms in QA but 1,000 ms in the application. What I wanted to know is: why would the query take a long time in the application when it runs fast on SQL Server? How should we try debugging it? Ajay
I have a very complex stored procedure called by a job that is scheduled to run every night. Its execution sometimes takes 1 or 2 hours and sometimes 7 hours or more.
So, if it has been running for more than 4 hours I stop the job and run the procedure from a query window, and it never takes more than 2 hours.
Can anyone help me identify the problem? I want to run it from the job and not have to worry about it.
Some more information:
- It is SQL 2000 Enterprise with SP4 in a cluster (it happens the same way on any node).
- The SQL Server and SQL Agent services run under a domain account that has full administrative access.
- When I connect to a query window I also use a Windows account.
- There are no locks or processes blocking or being blocked while the job is running.
- Using Task Manager, the processor activity is OK, no more than 30% on any processor.
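One thing worth ruling out: SQL Agent sessions connect with different SET options than a Query Analyzer window (commonly ARITHABORT), so the job can compile its own, much worse plan for the same procedure. A quick sketch of the comparison on SQL 2000 (the proc name is hypothetical):

-- Run this in a Query Analyzer window, then as a T-SQL job step whose
-- output is written to a file, and compare the two option lists.
DBCC USEROPTIONS

-- If they differ, forcing the option at the top of the job step is a
-- cheap experiment before anything more invasive:
SET ARITHABORT ON
EXEC dbo.MyNightlyProc    -- hypothetical proc name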
Some users complain that their computers run slow around noon; later it is fine. Can anybody tell me what the problem is? We have antivirus software installed, and I am also running database backups on the server. Many thanks.
When I execute a stored procedure it generally takes about half a second to run, but sometimes it takes 20 to 30 seconds. I am the only one using the server, so I know it is not due to other traffic. I have looked at Profiler and nothing looks out of the ordinary. Another observation is that the slow ones are always near each other: I will have about 10 fast executions, then 3 slow ones, and then back to fast ones. Has anyone seen anything like this before?
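Clusters of slow executions with a single user often line up with file autogrow (or a checkpoint) rather than the proc itself: the proc writes, the file fills, and the next few calls stall while the file grows. A quick sketch against the SQL 2000-era system table to see whether the files grow in small increments:

-- 'growth' is in 8 KB pages unless the percent-growth status bit is set.
SELECT name, size, growth, status
FROM sysfiles

-- A tiny fixed increment on a busy database means frequent, stalling
-- autogrow events; pre-sizing the files or using a larger increment avoids them.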
Hello, I'm a newbie. I programmed a stored procedure, and when it runs in Query Analyzer I get results very fast, but when I add it to a job, the job runs very, very slowly (I add records and delete records in the SP). Why?
I have SQL Server and OLAP Server running on an NT Server box, and a PC running NT Workstation which has OLAP Manager installed on it. I can register the OLAP Server just fine in OLAP Manager, but it takes about 30 seconds to a minute for the PC to actually connect to the server.
We've also installed SQL Server and OLAP Server on a Win 2000 machine, and the OLAP connection is faster.
Could this be related to a network routing issue, or an OLAP or SQL Server install issue?
I am using SSIS packages for data transfer. When I run a package on a virtual server it takes more time than when run on a PC. After analysing, I found that the package, when run on the virtual server, spends around 50 seconds or so in startup. Could anyone give me a somewhat detailed description of why it runs slow?
Hi. I've got a report with 4 different sections, with the datasets coming from some tables that are populated via a stored procedure. I'd love it if the first thing this report did was run that stored procedure, so the data would then be available for the actual reporting piece. Is that possible? And if so, how do I make it work?
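One common approach is to point the report's first dataset at a wrapper procedure that refreshes the tables and then returns the first section's rows, so the population happens before anything renders. A rough sketch, with all object names invented:

CREATE PROCEDURE dbo.rpt_Section1
AS
BEGIN
    SET NOCOUNT ON
    -- Populate the reporting tables first...
    EXEC dbo.PopulateReportTables
    -- ...then return the rows the first section actually displays.
    SELECT *
    FROM dbo.ReportSection1
END

The other three datasets can then read the freshly populated tables directly, since datasets in a report are processed after the first one's query has run.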
I have a drop-and-create procedure script in one SQL session and execute the procedure in another session. When I compile the procedure in the first session, it completes successfully.
When I move to the other session and try to execute the procedure, it gives me an error saying:
"Invalid column name 'ColumnName'."
When I execute the procedure in the same session in which it was compiled, it runs successfully. Once it has run there, I go back to my other session and it now runs there as well.
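For what it's worth, this is how T-SQL's deferred name resolution behaves: CREATE PROCEDURE only checks column names against tables that already exist at creation time, so a proc can compile cleanly and still throw "Invalid column name" at execution. A small repro sketch (all names invented); if the two sessions resolve an unqualified table name to different tables (for example, different default schemas or owners), that would also fit the session-to-session difference:

-- Compiles fine: dbo.t does not exist yet, so resolution is deferred.
CREATE PROCEDURE dbo.p AS SELECT b FROM dbo.t
GO
CREATE TABLE dbo.t (a INT)   -- created later, without column b
GO
EXEC dbo.p   -- only now: Invalid column name 'b'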
When I launch Outlook, it takes forever for the program to finally open. With any inbound email, it stops processing whatever is underway at the time, and frequently there is a 2-3 second lag between keyboard input and what appears on the screen. SQL Server is usually consuming upwards of 1 GB of memory... help. Mike
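By default SQL Server grabs as much memory as it can and releases it only grudgingly, which starves desktop apps like Outlook on the same box. Capping it is a one-time change; a sketch, where the 512 MB figure is just an example to tune for your machine:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
-- Cap SQL Server's memory use at 512 MB (pick a value that leaves
-- headroom for Outlook and the OS).
EXEC sp_configure 'max server memory', 512
RECONFIGURE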
Server PC: Win SBS 2003 with 2.6 GHz processor and 1 GB RAM; SQL Server 2000 v2000.8.00.76 (SP3); MS Office 2003; MS Jet MS04-014 (latest post SP8); MDAC v2.8 RTM; ADO 2.1; vb6.exe / ADO 2.0.
I think this is a SQL Server/ADO problem, as I have 2 applications with the same problem. My Access database uses a timer-based function to insert records into SQL Server using ADO and stored procedures. Access also uses DAO and Jet/ODBC to linked tables on SQL Server for many other tasks/forms. All is well when Access is first run, but after a few hours or so the Access app grinds to a halt. Upon checking Task Manager, the memory usage is up to 160 MB and the handle count up to 86,000! (CPU process % is low.) After the "Access fail", if I stop/start Access only, performance is not restored; I have to stop/start SQL Server. It would seem that all connections from this PC to SQL Server are badly affected; it is not tied to the client application that had the problem. As I could not work out where the problem was, I took the Access functionality into a VB6 app, using ADO 2.0, thinking this should simplify matters with Jet and ODBC out of the way. I now have the same problem, with the number of handles increasing with every new timer-based function.
* code snippet example *
If Not OpenConnection Then 'we have not been able to open a connection to SQL Server
    Call procLog("Connection failed to SQL server")
    Exit Function
End If
'gVar.cnnSQL is my public ADODB.Connection
Set cmdSQL = New ADODB.Command
With cmdSQL
    .ActiveConnection = gVar.cnnSQL
    .CommandText = "MyDB.dbo.insert_tblMyData"
    .CommandType = adCmdStoredProc
    .Execute RecordsAffected:=lngRecs, _
        Parameters:=Array(lngID, dtDate, intCategory, strNationality, strNotes, strName)
End With
* code snippet *
** After the "Access fail", if I look at one of my clients running the same Access app on another PC, it seems normally responsive when using one of my bound forms to browse the data from the same SQL Server. **
Any ideas anyone?
Hello All, I have two procedures that are run one after the other. When I run proc1 it runs for about 15 minutes. proc2 is dependent on proc1; when I run proc2 it runs for 45 minutes. If I run both procs simultaneously through .NET code, it takes more than 1 hour. Can anyone tell me where the problem would be?
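When two dependent procs run side by side and take longer than the sum of their solo times, mutual blocking is the first thing to check while they are both running. A quick sketch against SQL 2000's sysprocesses:

-- Any non-zero 'blocked' value is the spid that is holding things up.
SELECT spid, blocked, waittype, waittime, lastwaittype, cmd
FROM master..sysprocesses
WHERE blocked <> 0

If proc2 shows up blocked by proc1's spid, running them simultaneously buys nothing, and serializing them (or narrowing what proc1 locks) is the fix.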
Dear All, finally I completed my project. Thanks to all of you who helped me do it. Now I have the biggest problem. In my application the data grid is filled with data from a SQL Server table which has a large number of records. When I run my queries in SQL Server Management Studio they run very fast, but filling the data grid takes lots of time. How can I reduce this time and increase the performance of my application? Thanks, Janaka
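The grid is usually slow because it pulls every row across the wire, not because the query is slow on the server; paging the query so the grid only fetches the rows on screen typically fixes it. A sketch using SQL 2005's ROW_NUMBER, with table and column names invented:

-- Fetch rows 1-50 only; the grid asks for the next page on demand.
SELECT *
FROM (
    SELECT ROW_NUMBER() OVER (ORDER BY OrderDate DESC) AS rn, *
    FROM dbo.Orders
) AS numbered
WHERE rn BETWEEN 1 AND 50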
I have a stored proc which creates a couple of temp tables, then bounces them against a production table and updates a column in the production table.
This stored proc takes about 10 minutes to run and updates about 20,000 rows. If I execute each statement separately in an ISQL window, it all runs in under 2 minutes.
Any ideas on why this is happening are GREATLY appreciated.
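A proc that creates temp tables and then queries them often gets its plans compiled before the temp tables hold any rows, which is exactly the case that behaves differently when the statements are pasted one at a time into an ISQL window. Two cheap experiments, sketched with a hypothetical proc name:

-- 1) Force a fresh compile for one run, after the temp tables exist:
EXEC dbo.UpdateProduction WITH RECOMPILE

-- 2) Or rebuild the proc so every execution recompiles:
-- CREATE PROCEDURE dbo.UpdateProduction WITH RECOMPILE AS ...

If the WITH RECOMPILE run matches the 2-minute ISQL time, stale plans against empty temp tables were the problem.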
We just moved a database from a shared SQL 2000 server to SQL Express on a VPS server. Everything seems to work great, with really no performance loss at all, except for one stored proc that is giving us problems. Its function is to update a history table and then update the production table. On the old server it would take less than a second to run; now it takes anywhere from 45 seconds to 1 minute, or it times out. This database is used by a classic ASP web app. ANY help at all on this would be appreciated, as I am pulling my hair out trying to figure out what's wrong here. Here is the proc.
------------------------------------------ code ----------------------------------------------
set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
SET NOCOUNT ON
go
We are using MSSQL Server 2000 on Windows 2000 Advanced Server, with PB 7.0 for the client/server front-end tools.
Until a few days ago our application worked fine; right now it gets slow or stops responding after a few (4/5) transactions. We don't know what it is. If anyone has had the same experience, anything you can share would be helpful for us.
We have SQL 2000 Advanced Server: 4 processors with hyperthreading, 4 GB of memory, and it is a high-transaction OLTP server. We are also running transactional replication on that server.
We recently bought Double-Take and implemented it on the server, using file difference with the block checksum option, for disaster recovery purposes. The queue and the log file folder are on a local drive that nothing uses except Double-Take.
We are replicating from source to target through the WAN (DS3). We are replicating approx. 200 GB of data, but just the differences. Now the CPU usage is normal and memory utilization is normal, but the network is a major problem, as the applications connecting to the server time out and run very slowly.
We have it set up just as Tech Support recommended.
I would appreciate it if someone could give us some recommendations for running Double-Take without these problems.
Recently our database was migrated from SQL 2000 to SQL 2005 on a new server (machine) with Windows 2003 (previously Windows 2000). If the database is retained on the same machine but under a named instance of 2005, the application (WebSphere 5.1) behaves normally, whereas if I point the application at the new server it runs slow for some of the queries, but not all.
This change will have to be implemented in production very soon. Any advice will be of great benefit.
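After a 2000-to-2005 migration, the standard first aid for "some queries got slow" is refreshing the statistics and usage counters on the migrated database, since what carried over from 2000 does not serve the 2005 optimizer well. A sketch, with the database name invented:

USE MyMigratedDB    -- hypothetical name
GO
DBCC UPDATEUSAGE (0)      -- fix row/page counts carried over from 2000
GO
EXEC sp_updatestats       -- rebuild statistics under the 2005 engine
GO
-- Also worth checking: the database may still be in 80 compatibility mode.
-- EXEC sp_dbcmptlevel 'MyMigratedDB', 90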
I've got a football (soccer for the Yanks!) predictions league website that is driven by an Access database. It basically calculates points scored for a user getting certain predictions correct. This is the URL:
http://www.pool-predictions.co.uk/home/index.asp
There are two sections of the site, however, that have almost ground to a halt now that more users have registered through the season. The players section and league table section have loaded progressively more slowly throughout the year and are now taking almost 2 minutes to load.
All the calculations are performed in the Access database I've written, and there are Access SQL queries to get the data out.
My question is: how can I speed the bloody thing up?! Someone has also suggested that I use stored procedures and SQL Server to speed things up. I've never used SQL Server before, so I am a bit scared about using it (I'm only a hobbyist), and I don't even know what an SP is or does. How easy will it be upgrading the whole thing to SQL Server, and will it be worth the hassle, bearing in mind I expect my userbase to keep growing? Do SPs help speed things up significantly? Would appreciate some advice!
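For what it's worth, a stored procedure is just a named, precompiled query that lives on the server, so the calculation happens next to the data instead of inside Access on the web server. A toy sketch of the kind of thing the league table might use, with table and column names invented:

CREATE PROCEDURE dbo.GetLeagueTable
AS
    SELECT UserName, SUM(Points) AS TotalPoints
    FROM dbo.Predictions
    GROUP BY UserName
    ORDER BY TotalPoints DESC

The ASP page then calls the proc by name rather than shipping a long query string, and SQL Server caches the compiled plan between calls.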
I am attempting to write a Windows service that watches a database for uploaded files to import. When a new file is found, the corresponding SSIS package is run from the file system with variables passed through. I started development as a Windows app and copied the functionality to a service.
The app runs fine. The service does not; I get a "Failure" each time a package is executed. Everything is identical behind the scenes, with the obvious exception that the OnStart and OnStop handlers are buttons in the app. I added a script task at the beginning of one of the SSIS packages to notify me that it is running at all. It doesn't even hit that initial task.
Again, the app will run all packages just fine. The data is imported and the results return as "Success."
The following is the code executing the package. Any help is appreciated. I've been banging my head on this one for a few days now. (Is there a tag to format a code sample?)
Dim pkgLocation As String
Dim pkg As New Package
Dim app As New Application
Dim pkgResults As DTSExecResult
I'm developing a desktop application using CAB and DevExpress 6.3 controls. I use the ReportViewer control to render .rdlc reports in local mode. I loaded the data source from a dataset and the template definition from a string, and called the Refresh method of ReportViewer.LocalReport; so far, all is OK. But once I call the RefreshReport method of the ReportViewer, the whole application slows down, even after I close the report viewer form. Operations like switching between menu items, opening a new form and resizing the main form become two or three times slower, and CPU usage stays at 100% while doing those operations... I've no clue how refreshing a report affects all that UI drawing. However, if I break in the debugger and jump over the RefreshReport method, an empty report form shows and everything stays OK. And yet all the reports do render and show correctly in the end...
It seems to have nothing to do with the templates or the data source: the templates are very simple, and even if I render the reports with empty data tables, it still slows down my application. I tried creating and running the form in a new thread, and tried upgrading to ReportViewer 9.0, but both seem useless.
Could any body help me on this problem? Thanks in advance.
Hello All, I'm looking for a solution to timeouts that occur when I'm executing a stored procedure from my web application. Most of the SPs will run from 3 to 15 minutes, and, unfortunately, modifying/optimizing them isn't an option at the moment. I tried setting the CommandTimeout to 0 with no luck, unless I didn't use it properly. Here's my code:

try
{
    string dbConn = ConfigurationManager.ConnectionStrings["ConStringNTMTLDEV"].ToString();
    OleDbConnection connection = new OleDbConnection(dbConn);

    lbl_SearchResult.Text = dbConn;

    //OleDbDataAdapter adapter = new OleDbDataAdapter();
    OleDbCommand cmd = new OleDbCommand("SP_CallHistoryLookUp", connection);
    cmd.CommandType = CommandType.StoredProcedure;

    cmd.Parameters.Add(new OleDbParameter("@phoneNumber", "1234567890"));
    cmd.Parameters.Add(new OleDbParameter("@email", "123@123.com"));
    cmd.Parameters.Add(new OleDbParameter("@WebUser", "123"));
    connection.Open();
    cmd.CommandTimeout = 0;
    cmd.ExecuteNonQuery();
    cmd.Dispose();
    connection.Close();
}
catch (OleDbException ex)
{
    lbl_SearchResult.Text += "<br/> Something went wrong </br>";
    lbl_SearchResult.Text += ex.Message.ToString();
}

Is it possible to launch a stored procedure and close the connection without waiting for a result? Would the stored procedure still run on the SQL server? I'm using MSSQL 7. Would you have any examples that would solve this problem? Thank you for your help. R.
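One way to return to the page immediately on SQL 7 is not to run the long proc from the web request at all: wrap it in a SQL Agent job and have the page just start the job, which is asynchronous by design. A sketch, assuming the job has been created beforehand and with a hypothetical job name:

-- Inside the page, replace the 15-minute EXEC with:
EXEC msdb.dbo.sp_start_job @job_name = 'CallHistoryLookUp job'
-- sp_start_job returns as soon as Agent accepts the request,
-- so the connection can close while the proc keeps running.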
I have a VB6 SP6 MDAC 2.8 application talking to SQL Server 2000. Once I've installed this application on my local machine, I have been able to move the .exe file to a file server, and it runs just fine from there for all my local users. This is very handy for updating the application without having to reinstall it on each user's machine each time. They just use a shortcut pointing to the file server .exe file.
My problem has become that we have some users at remote locations who VPN into our network and also want to use this application. When it's installed locally on their machines, or even on file servers at their locations, everything works fine. However, when they run the .exe off of my file server, the SQL Server connection times out on the initial connection open after the default 30 seconds, every time. Run from their own desktops or file servers, it connects within 2 seconds every time. This is on both Windows 2000 and XP Pro machines. Even when I tell the shortcut to use their local drive for the working directory, the same problem happens.
This goes against everything I've seen with server-based files for 15 years. Once a file is loaded in the computer for execution, why does it matter where it came from? What "baggage" can an .exe file carry with it that will cause it not to execute when hosted on one server, but run just fine from the desktop or another, closer server? The only known difference is that the connection speed is much better when I run it locally here to that file server. Help.
I have a stored procedure (which I will tag on at the end for those interested) which takes at least 15 minutes to run when executed, but completes in 1 minute when the T-SQL statement is run in Query Analyser. Why is this?
I suspect that it may be connected to table indexing, but why then is this bypassed when QA is used?
Any advice appreciated.
Derek
IF EXISTS (SELECT * FROM sysobjects
           WHERE id = object_id(N'dbo.sp_ValidateAIGL')
             AND OBJECTPROPERTY(id, N'IsProcedure') = 1)
    DROP PROCEDURE dbo.sp_ValidateAIGL
GO

-- *** Global Non Program Specific Data Errors ***
-- CHECK - that there are records in the DEB_IGL_PAYROLL_OUTPUT file... none and the routine failed...
IF NOT EXISTS (SELECT * FROM tbl_OUT_Payroll WHERE IGLProgramID = @IGLProgramID)
    INSERT INTO #IGLErrors
    SELECT NULL, 100, 'No records were processed by the IGL run!'

SELECT * FROM #IGLErrors

-- CHECK - search for any records where the employee's EXPENSE_CODE is NULL
INSERT INTO #IGLErrors
SELECT DISTINCT NULLIF(EmpNo, ''), 2,
    'Employee "' + COALESCE(NULLIF(RTRIM(EmpNo), ''), '<Missing Employee>') +
    '" (Organisation Unit - ' + COALESCE(RTRIM(OrgUnitCode), '<No Organisation Unit>') +
    ') does not have a EXPENSE_CODE Code.'
FROM tbl_OUT_Payroll
WHERE NULLIF(ExpenseCode, '') IS NULL
  AND IGLProgramID = @IGLProgramID

SELECT * FROM #IGLErrors

-- CHECK - check that the BALANCE of DEBITs matches the balance of CREDITs
IF (SELECT SUM(Cash) FROM tbl_OUT_Payroll WHERE IsCredit = 1 AND IGLProgramID = @IGLProgramID)
   <> (SELECT SUM(Cash) FROM tbl_OUT_Payroll WHERE IsCredit = 0 AND IGLProgramID = @IGLProgramID)
    INSERT INTO #IGLErrors
    SELECT NULL, 3, 'The total cash value for DEBIT elements does not match the total cash for CREDIT elements.'

SELECT * FROM #IGLErrors

-- *** Program 1 and 2 errors ***
IF @IGLProgramID IN (1, 2)
BEGIN
    -- CHECK - search for any records where the employee's COST_CENTRE is NULL
    INSERT INTO #IGLErrors
    SELECT DISTINCT NULLIF(EmpNo, ''), 1,
        'Employee "' + NULLIF(RTRIM(EmpNo), '') + '" (Organisation Unit = ' + RTRIM(OrgUnitCode) +
        ') does not have a COST_CENTRE Code.'
    FROM tbl_OUT_Payroll
    WHERE NULLIF(CostCenter, '') IS NULL
      AND IGLProgramID = @IGLProgramID

    SELECT * FROM #IGLErrors

    -- Check for EMPLOYEEs that were not transferred to the PAYROLL output (usually caused by missing
    -- ORG_UNITs or not picked up in the DEB_VIEW_APPOINTEE view...)
    INSERT INTO #IGLErrors
    SELECT DISTINCT EMP_NO, 11,
        'Employee "' + RTRIM(EMP_NO) + '" was excluded from the summary. Check their Organisation Unit codes!'
    FROM PSELive.dbo.COSTING_OUTPUT
    WHERE NOT EMP_NO IN (SELECT DISTINCT EmpNo FROM tbl_OUT_Payroll WHERE IGLProgramID = @IGLProgramID)
      AND PERIOD_NO = @PeriodNo
      AND TAX_YEAR = @TaxYear

    SELECT * FROM #IGLErrors

    -- Check that there are no ELEMENTS in the COSTING_OUTPUT table that don't exist in the tbl_IGLElements table
    INSERT INTO #IGLErrors
    SELECT DISTINCT ELEMENT, 12,
        'Element "' + RTRIM(ELEMENT) + '" does not exist in the IGL Interface Elements table!'
    FROM PSELive.dbo.COSTING_OUTPUT
    WHERE ELEMENT NOT IN (SELECT DISTINCT Element FROM tbl_IGLElements)
      AND PERIOD_NO = @PeriodNo

    SELECT * FROM #IGLErrors
END

-- *** Add an error to indicate the number of errors ***
IF EXISTS (SELECT * FROM #IGLErrors)
    INSERT INTO #IGLErrors
    SELECT 0, 0, 'Warning, there are ' + CAST(COUNT(*) AS VARCHAR(5)) + ' recorded errors!'
    FROM #IGLErrors

-- Transfer the records to the ErrorsLog table ready for the user to view...
DELETE FROM tbl_SYSErrorsLog

INSERT INTO tbl_SYSErrorsLog (IGLProgramID, OutputLogID, KeyField, ErrorID, Description)
SELECT @ProgramLogID, @IGLPeriodID, KeyField, ErrorID, Description
FROM #IGLErrors
ORDER BY ErrorID

DROP TABLE #IGLErrors

SELECT * FROM tbl_SYSErrorsLog ORDER BY ErrorID

--SET NOCOUNT OFF
GO

GRANT EXECUTE ON dbo.sp_ValidateAIGL TO Public
GO
Hi All, quick question: I have always heard it is best practice to check whether a proc exists, drop it if so, and then create it. I just wanted to know why that's a best practice. I am trying to put that theory in place at my work, but they are asking for a good reason before actually implementing it. All I could think of was that when you're creating a proc you won't get an error if the procedure already exists, but doesn't it also have to do with compilation and perhaps execution? Does anyone have a good argument for doing stored procs this way? All feedback is appreciated. TIA, ~CK
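A minimal sketch of the pattern in question, with a hypothetical object name. On SQL 2000 the core motivation is simply that CREATE PROCEDURE fails if the proc already exists; dropping and re-creating also discards the old cached plan, at the cost of having to re-grant permissions:

IF EXISTS (SELECT * FROM sysobjects
           WHERE id = OBJECT_ID(N'dbo.usp_Demo')
             AND OBJECTPROPERTY(id, N'IsProcedure') = 1)
    DROP PROCEDURE dbo.usp_Demo
GO
CREATE PROCEDURE dbo.usp_Demo
AS
    SELECT GETDATE()
GO
-- Note: ALTER PROCEDURE avoids the drop entirely and keeps existing permissions,
-- which is often the better argument to bring to the team.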
I have an ASP that has been working fine for several months, but it suddenly broke. I wonder if Windows Update has installed some security patch that is causing it. The problem is that I am calling a stored procedure via an ASP (classic, not .NET), but nothing happens. The procedure doesn't work, and I don't get any error messages. I've tried dropping and re-creating the user and permissions, to no avail. If it were a permissions problem, there would be an error message. I traced the calls in Profiler, and it has no complaints. The database is getting the stored proc call.
I finally got it to work again, but this is not a viable solution for our production environment:
1. Response.Write the SQL call to the stored procedure from the ASP and copy the text to the clipboard.
2. Log in to Query Analyzer using the same user as used by the ASP.
3. Paste and run the SQL call to the stored proc in Query Analyzer.
After I have done this, it not only works in Query Analyzer, but the ASP works too. It continues to work, even after I reboot the machine. This is truly bizarre and has us stumped. My hunch is that Windows Update installed something that has created this issue, but I have not been able to track it down.
I have SQL Server 2005 Standard Edition SP1 installed on Windows 2003 Standard Edition, configured for transactional replication (single publisher, no clustered environment). Replication has been working fine for the past two months; now a Distrib.exe application error is coming up.
Due to this my job is failing (Distributor to Subscriber). I am attaching the file. Thanks, Sandeep
Is it possible to retrieve the resultset of a stored procedure from another procedure in SQL Server 2000? Basically I am calling proc2 from inside proc1. proc2 returns 2 resultsets, and I want to process these two resultsets from within proc1.
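The standard tool for this on SQL 2000 is INSERT ... EXEC into a temp table. One caveat: it appends every result set the proc returns, so all of them must match the temp table's columns; if proc2's two result sets have different shapes, you would typically split proc2 or have it write to temp tables itself. A sketch, with a hypothetical column layout:

-- Minimal sketch inside proc1, assuming proc2's result sets share one shape.
CREATE TABLE #Results (Col1 INT, Col2 VARCHAR(50))   -- hypothetical shape
INSERT INTO #Results
EXEC dbo.proc2

-- ...process the captured rows here...
SELECT * FROM #Results

DROP TABLE #Results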
I am working with a large application and am trying to track down a bug. I believe an error that occurs in the stored procedure is bubbling back up to the application and is causing the application not to run. Don't ask why, but we do not have some of the source code that was used to build the application, so I am not able to trace into the code. So basically I want to examine the stored procedure. If I run the stored procedure through Query Analyzer, I get the following error message:
Msg 2758, Level 16, State 1, Procedure GetPortalSettings, Line 74
RAISERROR could not locate entry for error 60002 in sysmessages.
(1 row(s) affected)
(1 row(s) affected)
I don't know if that error message is sufficient to stop the application from running? Does anyone know? If the RAISERROR occurs midway through the stored procedure, does the stored procedure terminate execution? Also, is there a way to trace into a stored procedure through Query Analyzer?
As a side note, below is a small portion of my stored proc where the error is being raised:

SELECT @PortalPermissionValue = ISNULL(MAX(PermissionValue), 0)
FROM Permission, PermissionType, #Groups
WHERE Permission.ResourceId = @PortalId
  AND Permission.PartyId = #Groups.PartyId
  AND Permission.PermissionTypeId = PermissionType.PermissionTypeId

IF @PortalPermissionValue = 0
BEGIN
    RAISERROR (60002, 16, 1)
    RETURN -3
END
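Msg 2758 just means the custom message number was never registered on this server; RAISERROR (60002, ...) requires an entry in sysmessages first. A sketch of registering it, where the message text is an assumption since the original wording is unknown:

-- Register the custom message once per server (user messages start at 50001).
EXEC sp_addmessage
    @msgnum   = 60002,
    @severity = 16,
    @msgtext  = 'Access denied: no portal permissions for this user.',  -- assumed text
    @replace  = 'replace'   -- overwrite if 60002 already exists

With the entry in place, RAISERROR (60002, 16, 1) returns the real message to the application instead of Msg 2758, and a severity-16 error does not by itself terminate the proc (the RETURN -3 does that here).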
Currently I have two SSIS jobs on my machine. The problem I'm having is that only one of the jobs executes successfully; the other one fails with an incorrect user login. Both jobs use the same configuration database, and all the packages in both jobs have the protection level set to "DontSaveSensitive". Both jobs have been deployed in the exact same manner, yet only one succeeds and the other fails.
I have a VBS script to try to prove that I can perform VBS scripting in either a job step or a DTS package. The script is:
dim rs, sql, adoconn, adocommand, dataconnstring
const adOpenForwardOnly = 0   ' ADO named constants are not defined in standalone VBScript

set adoconn = CreateObject("ADODB.Connection")
set rs = CreateObject("ADODB.Recordset")
set adocommand = CreateObject("ADODB.Command")
adoconn.ConnectionString = "Provider=SQLOLEDB;Server=myserver;Database=pubs;Uid=myuser;Pwd=mypass;"

adoconn.Open
sql = "select * from import"
rs.Open sql, adoconn, adOpenForwardOnly
while rs.EOF = false
    sql = "insert into zz default values"
    adocommand.ActiveConnection = adoconn
    adocommand.CommandText = sql
    adocommand.Execute
    rs.MoveNext
wend

rs.Close
adoconn.Close
set adoconn = nothing
However, when I run this from Windows Explorer it works fine, but when I try to run it as an ActiveX script, I get the error: ActiveX scripting: Function not found.
As a CmdExec step in a job, I use the line c:\inetpub\wwwroot\vbs\vbstest1.vbs. It failed with the error: The process could not be created for step 1 of job 0x119BDBD264AD9B4597A9302786F0E250 (reason: %1 is not a valid Win32 application). The step failed.
What is wrong with the VBS script? Or do I need to invoke it a different way?
I have jobs (DTS packages) for several different tasks that I would like to run sequentially, rather than trying to estimate how long each will take and schedule them individually.
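Rather than juggling schedules, the usual trick is either to merge the packages into one job with multiple steps (each step's "on success" action set to run the next step), or to chain the jobs by making each job's final step start the next one. A sketch of the latter, with hypothetical job names:

-- Final T-SQL step of Job A:
EXEC msdb.dbo.sp_start_job @job_name = 'Job B'
-- Job B's last step starts Job C, and so on. Because this runs as the
-- last step, each job begins only after the previous one has finished,
-- with no guessed time gaps between them.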