Hi. I'm new to SQL Server programming. I'm using it at my new job for the first time and catching on quickly. I've run into a problem where I run a request via interactive query and it times out. It's really annoying because it seems to happen a lot. Is this just a memory issue with my machine (it has 2.8 GB of RAM), or is there a way around it?
I have been hunting for a way to trace SQL requests going to/from the SQL Express server (from VS 2005 using ASP.NET), i.e., I want to see the raw SQL commands that ASP.NET is generating.
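If Profiler isn't available to you with the Express edition, one workaround worth considering is a server-side trace built with the sp_trace_* procedures. This is only a rough sketch; the output path is an assumption and should be changed to suit your server:

-- Minimal server-side trace writing SQL:BatchStarting text to a file.
DECLARE @traceid INT, @maxsize BIGINT, @on BIT;
SET @maxsize = 5;            -- max trace file size in MB
SET @on = 1;

EXEC sp_trace_create @traceid OUTPUT, 0, N'C:\Traces\aspnet_sql', @maxsize;

-- Event 13 = SQL:BatchStarting; capture the statement text, application name and SPID.
EXEC sp_trace_setevent @traceid, 13, 1,  @on;   -- TextData
EXEC sp_trace_setevent @traceid, 13, 10, @on;   -- ApplicationName
EXEC sp_trace_setevent @traceid, 13, 12, @on;   -- SPID

EXEC sp_trace_setstatus @traceid, 1;            -- start the trace
-- Later, read the captured commands back with:
-- SELECT * FROM fn_trace_gettable(N'C:\Traces\aspnet_sql.trc', 1);

Remember to stop and close the trace (sp_trace_setstatus with status 0, then 2) when you are done, or the file keeps growing.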
I'm monitoring one of our servers, and on the whole it is performing well. However, I'm puzzled by the number of Lock Requests/sec that Perfmon is recording. We frequently see values exceeding 50,000 and the current peak is 533,616 (the average, as I type this, is 35,102). There are only 40 users on the system.

sp_lock shows nothing like this number of locks; it shows something of the order of 30 locks maximum for each execution, which seems far more reasonable.

SQL Profiler seems to back up the Perfmon values; it records hundreds of "Lock Acquired" events per second during these peak periods. For example:

NO. OF LOCKS   OBJECT          START TIME
============   =============   =======================
         678   User_T          2005-04-15 09:03:22.863
         931   User_T          2005-04-15 09:03:22.877
         924   User_T          2005-04-15 09:03:22.893
          16   EnquiryUser_T   2005-04-15 09:03:22.893
         961   User_T          2005-04-15 09:03:22.910
         820   User_T          2005-04-15 09:03:22.923
           4   NULL            2005-04-15 09:03:22.923
         828   User_T          2005-04-15 09:03:22.940
           4   NULL            2005-04-15 09:03:22.940
         734   User_T          2005-04-15 09:03:22.957

Can anyone think why such a comparatively small system should generate these numbers of locks? Why does sp_lock NOT show the same level of locking?
One production server DB is getting slower... (SQL2K SP4 Enterprise)
1. dbcc checkdb(abc) reports 0 errors
2. no blocking
3. nothing abnormal in the SQL log or the system event log
4. the only oddity found was one crypt32 auto-update failure last night.
The Lock Requests/sec counter is kind of high, from 5,400 to 15,400 on a 0.01 scale; the max is from 57,611 to 155,411.
Could I increase the number of locks assigned to SQL Server, and what is the best way to do it?
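In case it helps, the lock setting is exposed through sp_configure. By default it is 0, which means SQL Server allocates lock memory dynamically, and leaving it that way is usually recommended. A quick sketch for checking or (rarely) overriding it:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'locks';        -- current value; 0 = dynamic lock allocation

-- To impose a fixed upper limit instead (seldom advisable):
-- EXEC sp_configure 'locks', 20000;
-- RECONFIGURE;

High Lock Requests/sec by itself is not necessarily a problem; it is lock waits and blocking that hurt, so raising the limit is unlikely to be the fix.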
Hello, your help is very much appreciated. I have an Intranet environment that contains tens of apps that use a specific database. I want to create an Intranet test environment (which will use a test database) by basically copying the existing Intranet environment to a new location.
* I cannot change the connection strings at the application level, since each app stores it differently (some store it in Web.Config, some hard-code it, etc.).
Then, have a layer that sits between the Intranet apps and SQL Server (or inside SQL Server itself) that will intercept the connection requests. Once a connection is intercepted, the code will check whether the request comes from the Intranet test environment, and if so, change the connection string on the fly to point to the test database.
The question: is there a way in SQL Server 2005 to intercept connections and then, based on the application that requested the connection, modify the connection on the fly, so that if the calling app comes from test_Intranet the connection is redirected to the test database, and otherwise left as it is (live database)?
I have an application with some problems: the application hangs, and the SQL Server log shows
SQL Server has encountered 1 occurrence(s) of IO requests taking longer than 15 seconds to complete on file [d:\Program Files\Microsoft SQL Server\MSSQL\data\tempdb.mdf] in database [tempdb] (2). The OS file handle is 0x000003DC. The offset of the latest long IO is: 0x00000016930000. The application log shows
ERR: [Microsoft][ODBC SQL Server Driver][Shared Memory]ConnectionWrite (WrapperWrite()). NUMBER: -2147467259 SOURCE: Microsoft OLE DB Provider for ODBC Drivers
I have a scenario where we want something like a message bus. We can have multiple clients sending in events and would like a way to
i) persist those events
ii) send out notifications to the subscribers (these would be some C# services)
What is the best way to do this? I know that we can build something from scratch using WCF publish-subscribe, but I was interested in knowing whether I can leverage SQL Service Broker to do the work.
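Service Broker can cover both points: SEND persists the message in a queue inside the database (so events survive restarts), and your C# services can RECEIVE from the queue to be notified. A minimal sketch of the plumbing, where all the object names are placeholders I made up:

-- Service Broker must be enabled on the database first:
-- ALTER DATABASE MyEventDb SET ENABLE_BROKER;

CREATE MESSAGE TYPE [//Events/EventMessage] VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT [//Events/EventContract] ([//Events/EventMessage] SENT BY INITIATOR);
CREATE QUEUE dbo.EventQueue;
CREATE SERVICE [//Events/EventService] ON QUEUE dbo.EventQueue ([//Events/EventContract]);
GO

-- A client publishes an event (this is the step that persists it):
DECLARE @h UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE [//Events/EventService]
    TO SERVICE '//Events/EventService'
    ON CONTRACT [//Events/EventContract]
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @h
    MESSAGE TYPE [//Events/EventMessage] (N'<event id="1" type="OrderCreated" />');
GO

-- A subscriber (e.g. a C# Windows service using plain ADO.NET) drains the queue:
RECEIVE TOP (1) conversation_handle, CAST(message_body AS XML) AS payload
FROM dbo.EventQueue;

In practice the subscriber would loop on WAITFOR (RECEIVE ...) so it blocks until a message arrives instead of polling; fan-out to multiple subscribers means one queue/service per subscriber (or an activation procedure that forwards messages).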
One of my clients wants to shift his reporting to SSRS. He wants to submit his report requests to SSRS, have the reports run on the reporting server side (not on the client system), and have the output exported to a folder on the server.
I have been somewhat successful in submitting the report requests using subscriptions, but I have no idea how to retrieve the requests from SSRS along with their execution status.
I have to display all submitted report requests with their execution status (success or fail) in a data grid, and the user selects one report to see the output file. Where can I get this information?
And how can I create a subscription that runs immediately after submission, without any delay?
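For the execution-status part of the question, one place that information ends up is the ReportServer catalog database. The schema is undocumented and can change between versions, so treat this as a sketch rather than a supported API:

SELECT  c.Name        AS ReportName,
        s.LastStatus,                 -- success/failure text for the last run
        s.LastRunTime
FROM    ReportServer.dbo.Subscriptions AS s
JOIN    ReportServer.dbo.[Catalog]     AS c ON c.ItemID = s.Report_OID
ORDER BY s.LastRunTime DESC;

The supported route is the Reporting Services web service (methods such as ListSubscriptions), but a query like the one above is often the quickest way to populate a status grid.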
Hello, I have a bit of a problem. Some of my pages make way too many database requests, some pages up to 70 requests, and the loading time can be all the way up to 12 seconds because of them. So I am trying to optimize it a bit. What I want to know is how to send several IDs to the database and collect the results for all of them at once. I have a data-access method like this:

public DataTable User_GetUsernameTypeTwo(string strUsername)
{
    OpenDBCon();
    mCom.Parameters.Clear();
    mCom.CommandText = "Profile_GetUsernameTypeTwo";
    mCom.Parameters.Add(new SqlParameter("@UserName", SqlDbType.NVarChar, 20));
    mCom.Parameters["@UserName"].Value = strUsername;
    myAdap = new SqlDataAdapter();
    myTable = new DataTable();
    myAdap.SelectCommand = mCom;
    myAdap.Fill(myTable);
    CloseDBCon();
    return myTable;
}

To this I send a username, and in return I get info about the user, which is then presented on the page. If I have a GridView on the page with 20 posts, it needs to make 20 requests to this method to collect the info for all the users. Then I have another method just like it that collects the photos of all the users. My first thought was to merge these two together, which works perfectly and reduces the database calls by around 40%.

Then I thought: perhaps I can get the info for all the users in one request, to reduce the load on the database even more. My problem is: how can I rewrite the procedure below to accept, say, 50 usernames (a username can have at most 20 characters) and return the info for all of them? Since I use a stored procedure I can't easily write WHERE (username = xxx) OR (username = yyy) OR (username = ccc); I would have to add 50 parameters, assign each username to a separate parameter, and then check whether each parameter is null or contains a username. So I need another way to send the usernames, so that SQL Server can add them to the WHERE clause, perhaps using IN?

This is the SQL Server stored procedure (it works perfectly when I request info about one user). I am using FULL OUTER JOIN because ALL the fields I request can be NULL:

ALTER PROCEDURE [dbo].[Profile_GetUserpopupInfo]
    -- Add the parameters for the stored procedure here
    @UserName nvarchar(20)
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;
    SELECT settings_username.color, settings_username.pic1, settings_username.pic2,
           profile_profilephoto.imageurl, profile_profilephoto.alttext
    FROM settings_username
    FULL OUTER JOIN profile_profilephoto
        ON profile_profilephoto.username = settings_username.username
        AND (profile_profilephoto.approved = 1)
    WHERE (settings_username.username = @UserName)
END

Anyone got any idea how to send a bunch of usernames to this stored proc? It must be efficient, since there are a lot of users. Patrick
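One approach that works on SQL Server 2005 is to pack all the usernames into a single XML parameter and shred it inside the procedure, so a whole GridView page costs one round trip. This is only a sketch: the procedure name is made up, and it uses plain joins rather than your FULL OUTER JOIN for brevity:

CREATE PROCEDURE dbo.Profile_GetUserpopupInfoList
    @UserNames XML            -- e.g. N'<u n="alice"/><u n="bob"/>'
AS
BEGIN
    SET NOCOUNT ON;

    -- Shred the XML parameter into a set of usernames once...
    WITH Names AS
    (
        SELECT u.x.value('@n', 'nvarchar(20)') AS username
        FROM   @UserNames.nodes('/u') AS u(x)
    )
    -- ...then return one result set for all of them instead of one call per user.
    SELECT s.username, s.color, s.pic1, s.pic2, p.imageurl, p.alttext
    FROM   Names n
    JOIN   settings_username s
           ON s.username = n.username
    LEFT OUTER JOIN profile_profilephoto p
           ON p.username = s.username
          AND p.approved = 1;
END

On the C# side you would build the parameter as a string of <u n="..."/> elements and pass it with SqlDbType.Xml. A comma-separated list plus a split function would work just as well if you prefer to avoid XML; the key point is one call per page instead of one call per user.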
We are experiencing a situation where the SRS (2000 SP2) report server will no longer render reports. In the log file, there are many instances of
w3wp!runningjobs!434!3/23/2007-10:12:57:: i INFO: Adding: 8 running jobs to the database
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:13:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsExpired; found expired request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsClientConnected; found orphaned request
w3wp!runningjobs!434!3/23/2007-10:14:57:: i INFO: RunningJobContext.IsExpired; found expired request
What could be causing this? The reports are making queries through an OLE DB provider. There are no scheduled jobs, and the load doesn't seem that heavy.
I have an application that calculates a bunch of numbers and then inserts them into a table (or updates the record in the table if it already exists). In my test environment it issues 100 insert or update requests per second to the server, and it could run for several hours. After the first several hundred requests, SQL Server bogs down (processor at 90-100%) and the application slows down while it waits on SQL to update the database.

What would be the best way to optimize this app? Would it help to loop through all my insert/update requests and then send them to the server as one big batch of statements (per 1,000 requests or something)? Is there a better way of doing this? Thanks!
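Batching usually does help, mostly because it cuts round trips and commits. One rough shape for it (all object names below are stand-ins, not from your schema) is to send a few hundred statements per round trip inside one transaction rather than one statement per call:

-- Stand-in table for illustration only.
CREATE TABLE dbo.Results (Id INT PRIMARY KEY, Value FLOAT);
GO

-- One batch sent from the client, covering many calculated rows at once.
BEGIN TRANSACTION;

UPDATE dbo.Results SET Value = 12.5 WHERE Id = 1;
IF @@ROWCOUNT = 0
    INSERT INTO dbo.Results (Id, Value) VALUES (1, 12.5);

UPDATE dbo.Results SET Value = 7.25 WHERE Id = 2;
IF @@ROWCOUNT = 0
    INSERT INTO dbo.Results (Id, Value) VALUES (2, 7.25);

-- ...repeat the update-then-insert pair for the rest of the batch (say 500-1000 rows)...

COMMIT TRANSACTION;

One commit per batch instead of one per row usually reduces both CPU and log overhead; just keep batches small enough that a failure doesn't roll back hours of work.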
I have been working as a Sybase DBA for 5+ years, and I would very much like to add MS SQL Server to my resume. Given the common roots of the two RDBMSs, it seems that the learning curve would not be as steep as if I were going to learn Oracle or DB2. Does anyone out there know of any books that are geared toward learning MS SQL Server from a Sybase DBA's perspective?
For bulk load requests in SQL Server, are there any specific Profiler events? Like the ones we have for RPC (RPC:Starting) and for batch requests (SQL:BatchStarting).
Are bulk load requests that are being monitored through Profiler captured as SQL:Batch... events at the back end?
Are there any new features added in 2012 or 2014 to identify a bulk request submitted through the bcp.exe utility or any other SqlBulkCopy program?
Working on this issue... an SSIS package is trying to load data from DB2 to SQL Server...
"IBM OLE DB Provider for DB2" Hresult: 0x8004D01C Description: "SQL1224N The database manager is not able to accept new requests, has terminated all requests in progress, or has terminated the specified request because of an error or a forced interrupt. SQLSTATE=55032"
SSIS Error Code DTS_E_PRIMEOUTPUTFAILED - The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC0202009. The component returned a failure code when the pipeline engine called PrimeOutput().
The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Sometimes I need to take a DB into single-user mode, which can be done using the following:
ALTER DATABASE xyz SET SINGLE_USER WITH NO_WAIT
This command has other options, but each one has some problem, e.g. it fails if a single query window is open against the database anywhere, etc.
Q1) So the need is: I want to honor running transactions from SSMS or any application, and then I want the DB to go into single-user mode so that I can perform my task.
Q2) Sometimes I need to restart the DB because it becomes slow. I am trying to find the reason; primarily it seems to be memory. So I want to restart my DB while honoring all in-flight requests.
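For Q1, the closest built-in option I know of is WITH ROLLBACK AFTER, which gives in-flight transactions a grace period before they are rolled back and the switch happens; there is no option that waits indefinitely, so pick a window that fits your workload. A sketch:

-- Give open transactions 60 seconds to finish; anything still running
-- after that is rolled back and the database goes to single-user mode.
ALTER DATABASE xyz SET SINGLE_USER WITH ROLLBACK AFTER 60 SECONDS;

-- ...perform the maintenance task...

ALTER DATABASE xyz SET MULTI_USER;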
Hi, I have a query which executes in 1 second and returns 14 rows whenever I execute it through Management Studio, but when I execute it from a web page I get a timeout message, and the stack trace points at the line which calls the stored procedure. The query is quite complex and there are quite a few joins on tables and views, including one to a lookup table; whenever I take out the join onto the lookup table, the web page runs fine without timing out. I join the lookup table (Parts) using a LEFT OUTER JOIN similar to the one below. The syntax of the join is fine, but I can't find why it would cause the query to time out when executed from a web page while it is fine when I execute it from Management Studio.

select * from tables/view LEFT OUTER JOIN Parts on Parts.PartNumber = tables/View.PartNumber

I've tried recompiling the Parts table, but it didn't make any difference. Any help would be much appreciated. Cheers, Frankie
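One thing worth ruling out: ADO.NET/ODBC connections run with SET ARITHABORT OFF while Management Studio defaults to ON, so the web page can end up with a different (and badly sniffed) cached plan for the same statement. Two quick experiments, where the object names simply echo the placeholders in the post:

-- 1) Reproduce the web page's environment in Management Studio and re-run the query:
SET ARITHABORT OFF;

-- 2) Force a fresh plan for the slow statement (SQL Server 2005 and later):
SELECT *
FROM   SomeView AS v
LEFT OUTER JOIN Parts AS p ON p.PartNumber = v.PartNumber
OPTION (RECOMPILE);

If the query is slow in Management Studio once ARITHABORT is OFF, you have reproduced the problem and can tune the plan; if OPTION (RECOMPILE) fixes it from the web page, parameter sniffing on the cached plan is the likely cause.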
I use SQL Server to provide data to ASP web pages and have recently started to get ODBC timeouts throughout the day.
The environment is as follows: a server with dual PII processors and 512 MB RAM, running SQL Server 7 and IIS.
I have a number of ASP-based web sites hosted on this box, but only one of them seems to be affected by the timeout problem. I have checked the resources on the server (NT Task Manager - Memory & Processor) and everything seems fine; in fact the resources are hardly being touched! Within a few minutes the problem disappears completely without me doing anything.
Am I missing something here? Should I look elsewhere other than SQL, maybe IIS? Any suggestions / pointers would be very much appreciated.
We are using ASPs and tables successfully, but when we click a link to an exe on a page it is slow, to the point of timing out. Very slow. Any help would be appreciated. Email me tperry@kpmg.com
What could be the reason for my view to time out? I thought it was because of the number of records, but I guessed wrong. The view is grabbing data from a UDF I have created.
I am trying to use statistics to get the time it takes to run a SQL function. When I use SET STATISTICS TIME ON it returns multiple results (one for each insert statement in my loop). Is there any way to get results for the ENTIRE function? Here is the loop that I am timing (it simply populates a calendar table):
SET NOCOUNT ON
DECLARE @Counter INT
DECLARE @ActualDate DATETIME
DECLARE @FirstDate DATETIME
SET @Counter = 1
SET @FirstDate = '1/1/1900'
SET @ActualDate = @FirstDate
WHILE @Counter < 43830
BEGIN
    INSERT INTO Calendar (ActualDate) VALUES (@ActualDate)
    SET @ActualDate = DATEADD(day, @Counter, @FirstDate)
    SET @Counter = @Counter + 1
END
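One way to get a single figure for the whole loop is to bracket it with your own timing instead of relying on SET STATISTICS TIME; a small sketch:

-- Note the clock before and after the loop and report one elapsed figure.
DECLARE @StartTime DATETIME
SET @StartTime = GETDATE()

-- ...run the WHILE loop from above here...

SELECT DATEDIFF(ms, @StartTime, GETDATE()) AS ElapsedMilliseconds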
Can anyone tell me how I can time out a transaction? For example, I have a transaction that pulls in data from a remote source. If the extraction time exceeds some limit, say 1 hour, I need to roll back the transaction.
I cannot use SET LOCK_TIMEOUT because it can only be used for timing out a waiting process. Here I want to time out the executing process.
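There is no built-in server-side statement timeout for the session doing the work (the client's command timeout is usually where this is set), so one workaround is a watchdog run from a separate session or an Agent job that kills the long runner; killing the session rolls back its open transaction. A rough sketch, assuming the extraction runs under a dedicated login, here called extract_user:

DECLARE @spid INT, @cmd NVARCHAR(100)

-- Find the extraction session once its current batch has run for over an hour.
SELECT @spid = spid
FROM   master..sysprocesses
WHERE  loginame = 'extract_user'
  AND  DATEDIFF(minute, last_batch, GETDATE()) > 60

IF @spid IS NOT NULL
BEGIN
    SET @cmd = 'KILL ' + CAST(@spid AS NVARCHAR(10))
    EXEC (@cmd)   -- rollback of the open transaction happens automatically
END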
Hi all, I have a '97 Honda CR-V with about 100,000 km on it. I checked a document on the Honda web site and it said that 60,000 miles is the point to change the timing belt. Compared with other Honda cars, that is pretty low. Is it correct? If yes, do you know the reason? Thanks for your help, jj
I wrote a query using Query Analyzer and tested it before turning it into a stored procedure. It worked fine and the execution time was acceptable (25 secs, since there was a lot of data to analyze).
When I executed the recently created stored procedure, the execution time turned out to be three or four times higher (1 min, 38 secs).
It was the exact same code, I was logged into the same database server, and the parameters were the same in both cases. So my question is as follows:
Why is it that executing a script and executing a stored procedure with the exact same script differ so much in timing?
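A frequent culprit is parameter sniffing: the stored procedure's plan is compiled for the first parameter values it sees and then reused, while the ad-hoc script gets a plan for the actual values every time. Copying the parameters into local variables (or creating the procedure WITH RECOMPILE) often brings the two back in line. The procedure, table, and columns below are placeholders, not your real objects:

CREATE PROCEDURE dbo.MyReport      -- placeholder name
    @FromDate DATETIME,
    @ToDate   DATETIME
AS
BEGIN
    SET NOCOUNT ON;

    -- Local copies hide the sniffed values, so the optimizer estimates
    -- from average statistics instead of the first call's parameters.
    DECLARE @From DATETIME, @To DATETIME;
    SET @From = @FromDate;
    SET @To   = @ToDate;

    SELECT COUNT(*) AS RowsFound
    FROM   dbo.Orders                      -- placeholder table
    WHERE  OrderDate BETWEEN @From AND @To;
END

Differences in session settings (ANSI_NULLS, QUOTED_IDENTIFIER, etc.) between Query Analyzer and the procedure's creation context can also produce different plans, so that is worth checking too.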
I see the following error in the SQL Server log of one of our SQL Servers running SQL 2000 with SP4: SQL Server has encountered 1964 occurrence(s) of IO requests taking longer than 15 seconds to complete on file [h:\tempdb.mdf] in database [tempdb] (2). The OS file handle is 0x00000534. The offset of the latest long IO is: 0x0000002b09e000
Any idea as to what might be causing this error? I appreciate any comments.
I am creating an index on a table with 35 million records, but I get the error: 'TT_ObjPerformance' table - Unable to create index 'IX_TT_ObjPerformance_CACode'. Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. How can I get the index created? Thanks, SQL Server newbie
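That timeout most likely comes from the client tool (the SSMS table designer applies its own timeout), not from the engine; running the statement yourself in a query window has no such limit and will simply take as long as it takes. A sketch, with the column name inferred from the index name (it may differ in your table):

CREATE INDEX IX_TT_ObjPerformance_CACode
    ON dbo.TT_ObjPerformance (CACode);

On a 35-million-row table, run it during a quiet period since the build takes locks; if you happen to be on SQL Server 2005+ Enterprise edition, adding WITH (ONLINE = ON) is an option.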
I have a site that is experiencing the issue described in this doc. I'm unsure whether the fix is going to blow anything up, as I am unfamiliar with what they are recommending.
http://support.microsoft.com/kb/931279
exec sp_configure 'affinity mask', 0x00000003
GO
reconfigure
GO