I have a .NET application that calls a stored procedure. When it does, the execution runs on and never ends (I have to kill the Windows process). When I call the sp from within Management Studio, it also never finishes and I have to cancel the query. But when I call it again immediately afterwards, it takes 45 seconds to complete.
Now, the sp has several parts and I have made it print a message at the end of each part so that I can see where it stops. Strangely enough, it completes all parts except the last one, which has the form INSERT INTO myLocalTable SELECT * FROM MyRemoteTable. But if I execute the SELECT independently, I discover that it returns no rows! Also, many of the @@ROWCOUNT values printed after the other parts show zero rows involved, or just a few. I am not using cursors; each part is an UPDATE statement or an INSERT.
TestMachine1 runs SQL 2005 SP2 and has myRemoteServer (a SQL 2000 server) as a linked server. The stored procedure on TestMachine1 inserts rows into a table on myRemoteServer and brings back some rows.
I have a delete query on a linked server that never ends. I can do a SELECT WHERE ID = x and it returns lickety-split, but when I do a DELETE WHERE ID = x it never comes back. Any clues???
Can I roll back a query (INSERT/UPDATE) executed on one page if a query (INSERT/UPDATE) on another page fails in ASP.NET? (I am using SQL Server 2000 as the back end.) Scenario: on WebPage1 I have an INSERT into the master table, and on Page2 I have an INSERT to store data in a sub table. I need to roll back the INSERT for the sub table if the INSERT into the master table on WebPage1 fails. (The query on WebPage2 executes first, and only then the query on WebPage1.) Can I use System.Transactions to solve this? Thanks in advance.
It seems that the parameter @Sucursales is always the same number. When the query loops in the WHILE statement, does it set @Sucursales to the result of the SELECT in the subquery below? Or does it just keep looping from the WHILE statement?
DECLARE @Sucursales INT
SET @Sucursales = (SELECT COUNT(*) FROM CatSucursales)
BEGIN
WHILE @Sucursales > 0
INSERT INTO Compras(Fecha,NumOC,CveProveedor,Importe,CveDepartamento)
SELECT
Fecha,
NumOC,
CveProveedor,
CONVERT(money,SUM(Cantidad * Costo)) AS Importe,
CD.CveDepartamento
FROM
ComprasDetalle CD
INNER JOIN
( SELECT
CveProveedor,
Nombre,
CveDepartamento
FROM
CatProveedores CP
INNER JOIN CatDepartamentos CD ON
CP.CveSucursal = CD.CveSucursal
WHERE
CP.CveSucursal = @Sucursales
)VirCP ON
CD.Proveedor = VirCP.Nombre
AND CD.CveDepartamento = VirCP.CveDepartamento
WHERE
Fecha = DATEADD(dd, DATEDIFF(dd, 0, GETDATE())+0, -1)
GROUP BY Fecha, NumOC, CveProveedor, Nombre, CD.CveDepartamento
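For what it's worth: as written, nothing in the loop body ever assigns to @Sucursales, so the WHILE keeps repeating the same INSERT with the same value forever; the SELECT in the subquery only reads the variable, it never sets it. A minimal sketch of what may be intended (assuming the sucursal keys run from 1 to COUNT(*)), with the body wrapped in BEGIN...END and the counter decremented:

DECLARE @Sucursales INT
SET @Sucursales = (SELECT COUNT(*) FROM CatSucursales)

WHILE @Sucursales > 0
BEGIN
    -- the original INSERT INTO Compras ... SELECT goes here,
    -- still filtered on CP.CveSucursal = @Sucursales

    SET @Sucursales = @Sucursales - 1  -- step to the next sucursal; without this the loop never ends
END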
Hi, we've got a problem with some particular jobs: they look as if they ran correctly, but they actually didn't make it all the way through their duties. The job calls a sequence of DTS packages, among them one with an ActiveX control that modifies another DTS package before launching it, plus some other tasks: that is where the error happens. Launched as a DTS package we get the error; from the job we don't. Any idea how we could get the correct job information? Thank you, Daniele
I have a strange situation with one stored procedure on MS SQL Server 2000 SP3: it's an old procedure that has worked well for several years, and it still works at other locations (even on MS SQL Server 7). But now the procedure sometimes just stops working over ODBC and OLE DB connections, while from Query Analyzer it always works fine. Over ODBC it looks as if the client never receives the notification that the procedure has completed.
I put an insert at the end of the procedure (for debugging): the procedure executes the insert (it's the last statement in the procedure) and I can select the inserted values from the table, but the client just doesn't get an answer from the server. I looked at all the processes on the SQL Server and none of them is blocked; there are no locks preventing the procedure from executing.
I found out that recreating the procedure makes it work again (for all clients), at least for a few days or even weeks.
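A pattern like this (fixed by recreating the procedure, never reproducible from Query Analyzer) is often blamed on a stale or skewed cached plan. A minimal sketch of forcing a fresh plan without recreating the procedure, assuming plan caching really is the culprit (the procedure name is a placeholder):

-- mark the procedure for recompilation; the next call builds a new plan
EXEC sp_recompile 'dbo.MyOldProc'

If that reliably clears the problem, recreating the procedure WITH RECOMPILE trades plan reuse for a fresh plan on every call.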
Hi all, I'm using SQL Server 2000 and sometimes I need to run my queries in Query Analyzer before using them in my application, just to test them. But most of the time, when I run a query in Query Analyzer a second time, it returns the results much more quickly than the first time. Apparently it caches the query! For various reasons I don't want this, so how can I prevent Query Analyzer from doing so? Thanks in advance. Regards.
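It isn't Query Analyzer that caches; the server keeps data pages in the buffer pool and plans in the procedure cache, which is why the second run is faster. A sketch for getting repeatable cold-cache timings on a test server (these flush caches server-wide, so avoid them on a production box):

CHECKPOINT                 -- write dirty pages to disk first
DBCC DROPCLEANBUFFERS      -- empty the data cache (buffer pool)
DBCC FREEPROCCACHE         -- empty the plan cache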
I have a question about how Query Notification works and the objects and data created when a new subscription is created.
Here is what we are doing:
- When the application starts, we call the SqlDependency start method:
SqlDependency.Start( <connection string> )
- For each query we want to monitor, we run the following:
SqlConnection oConnection = new SqlConnection( <connection string> );
oConnection.Open();
SqlCommand oCommand = new SqlCommand( <sql statement>, oConnection);
SqlDependency oDependency = new SqlDependency(oCommand);
oDependency.OnChange += new OnChangeEventHandler( <handler delegate> );
SqlDataReader objReader = oCommand.ExecuteReader();
objReader.Close();
- When the application shuts down, we call the SqlDependency stop method:
SqlDependency.Stop( <connection string> )
We have observed that when we create a new subscription, the following things happen:
1. A row is added to sys.DM_QN_Subscriptions.
2. A query_notification_% internal table is created.
3. A SQLQueryNotificationStoredProcedure% stored procedure is created.
4. A SQLQueryNotificationService% service queue is created.
5. A queue_messages% internal table is created.
When the application ends, the stored procedure, service queue, and queue_messages internal table are dropped. The row in sys.DM_QN_Subscriptions and the query_notification_% internal table persist.
When the application is restarted, a new row is added to sys.DM_QN_Subscriptions, but the existing query_notification_% internal table is used.
Is this normal behavior, or is there something we should be doing to clean up?
What I am wondering is whether there is a threshold at which these persistent items become a problem: locking, performance, or something else?
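If orphaned subscriptions are what worries you, a hedged cleanup sketch is to list them through the DMV and kill the leftovers by id (SQL 2005 syntax; requires sysadmin, and the id shown is a placeholder):

-- list active and orphaned query notification subscriptions
SELECT id, database_id, created, timeout FROM sys.dm_qn_subscriptions

-- drop one by id, or every subscription on the instance
KILL QUERY NOTIFICATION SUBSCRIPTION 12345
-- KILL QUERY NOTIFICATION SUBSCRIPTION ALL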
I now support a SQL Server "master" wrapper package that runs several other packages, and it simply never ends. When I run it in Visual Studio all the tasks complete, but the status stays "running". I'm trying to get this package to run via TWS/OPC and it doesn't end there either.
Does anyone know what the problem is? Has anyone heard of this?
We have a database that's using merge replication between two servers, and we need to insert a lot (about 1GB) of data into it.
The servers, however, are separated by a 192k WAN connection, so it's impractical to rely on the merge replication to send the data across to the subscriber.
Is there a way to insert the data at both ends? I can get the data out there on a DVD or a laptop easily enough. Can I load the data into both copies of the database and tell the merge agent that it's not to be replicated?
I have a problem concerning the execution status of a data-driven subscription. The situation is that I'm creating a datamart and rendering several thousand PDF-based reports after that. I've found out how to start the subscription using stored procedures with DTS.
Now I want to take further actions once the report building is done. But how do I find that out?
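One hedged option is to poll the report server catalog after kicking off the subscription. The Subscriptions table in the ReportServer database is internal and undocumented, so treat this as a sketch that may break between versions (the subscription GUID is a placeholder):

SELECT LastStatus, LastRunTime
FROM ReportServer.dbo.Subscriptions
WHERE SubscriptionID = 'YOUR-SUBSCRIPTION-GUID'
-- poll until LastStatus reports a processed/delivered count instead of 'Pending'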
I've created a simple package that contains only one task, an Execute SQL task. When I run just this single task from Business Intelligence Development Studio it runs successfully. But when I run the whole package (also from Business Intelligence Development Studio), the package fails.
The data source I access is ODBC. I'm sure the real reason for the error is the bad ODBC driver of the data source, but this can't be changed. So I need to know what is different between running only one task of a package and running the whole package. If I knew that, I might be able to adjust some setting and make it work.
So in a Script Task in one of my packages I have a connection manager, an OLE DB connection configured through a dtsConfig file.
This is the code:
Dim ConnectionString As String = Dts.Connections("db_stage").ConnectionString
Dim sqlConnection As SqlConnection = New SqlConnection(ConnectionString)
I get a "login failed for user..." error. But if I hardcode the connection string, including the password, it works.
1) Why does the ConnectionString from the connection manager omit the password? 2) Since this is an OLE DB connection, is there any way to set the Data Source Designer to omit the "Provider=ABCDED.1" section?
SELECT *, ISNULL(dbo.fn_1(a), '') f_a, ISNULL(dbo.fn_1(b), '') f_b
FROM tablea WITH (NOLOCK)
WHERE ID = 12345 AND ID < 10000000 AND dbo.fn_1(7) = 'asdfasdf' AND Active = 'Y'
Does it affect performance that I am using a UDF here? Also, let me know the order in which the conditions will be applied.
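Yes, it can hurt: a scalar UDF in the WHERE clause is evaluated row by row and blocks index use on that predicate, and SQL Server guarantees no particular evaluation order for the conditions (the optimizer is free to reorder them). One hedged rewrite, assuming dbo.fn_1(7) is deterministic for the duration of the query, is to hoist the constant call into a variable (the varchar type is an assumption; match fn_1's actual return type):

DECLARE @fn7 varchar(100)      -- assumed return type of dbo.fn_1
SET @fn7 = dbo.fn_1(7)         -- evaluate the constant call once, not per row

SELECT *, ISNULL(dbo.fn_1(a), '') AS f_a, ISNULL(dbo.fn_1(b), '') AS f_b
FROM tablea WITH (NOLOCK)
WHERE ID = 12345 AND ID < 10000000
  AND @fn7 = 'asdfasdf'
  AND Active = 'Y'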
I want to know whether there is any way to find out which query the engine is going to execute prior to its execution. I think if this is possible it will affect performance, but it would be very helpful for me.
Setting up a test AlwaysOn Availability Group for one database.
However, whenever I restore the database to the replica server and join it to the AG, it ends up with my user account as the owner of the database.
Obviously I do not want a user account as the database owner, but since the replica is read-only I cannot modify it directly. If I were able to fail the AG over to the replica I could change the owner then, but I cannot due to business requirements. This AG is essentially to serve as a replacement for log shipping.
I tried doing the backups and restores using EXECUTE AS login = 'sa', and yet it still shows up as my user account.
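Since database ownership lives inside the database itself (the SID mapped to dbo), a hedged workaround is to change the owner once on the primary and let the change reach the secondary through the AG's log stream, with no failover needed (the database name is a placeholder):

-- run on the primary replica; ownership is part of the database and replicates
ALTER AUTHORIZATION ON DATABASE::[MyAgDatabase] TO [sa]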
I have to send an updated employee list from the employee master table to a particular email ID on the last day of every month, and whenever an employee is added, deleted, or edited. I also need to send it as an Excel file.
I tried the following, but an "Invalid object name 'dbo.tbl_EmployeeMaster'" error comes up while inserting a new employee.
USE [eXact]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER TRIGGER [dbo].[trg_Email]
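The trigger body is cut off above, so purely as a hypothetical skeleton: an "Invalid object name" error from a trigger usually means it references the table under the wrong database or schema context, so fully qualifying every reference is the first thing to try. The table, mail profile, and recipient below are all assumptions:

USE [eXact]
GO
ALTER TRIGGER [dbo].[trg_Email]
ON [dbo].[tbl_EmployeeMaster]        -- qualify the table with its schema
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON
    -- profile and recipient are hypothetical; the query result is attached as a file
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'MailProfile',
        @recipients = 'hr@example.com',
        @subject = 'Updated employee list',
        @query = 'SELECT * FROM [eXact].[dbo].[tbl_EmployeeMaster]',
        @attach_query_result_as_file = 1,
        @query_attachment_filename = 'EmployeeList.xls'
END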
I have a SQL query that searches for fields in a rather big view. If I execute the query in SQL Server Enterprise Manager, the results are displayed in less than 6 seconds. However, if I execute it from ASP.NET, it takes very long (more than 2 minutes).
The query is a simple one like "SELECT * FROM myview WHERE name LIKE '%Microsoft%'". And the code I use to execute it in asp.net is
Dim dsRtn As DataSet
Dim objConnection As OleDbConnection
Try
    objConnection = GetOleDbConnection()
    objConnection.Open()
    Dim objDataAdapter As New OleDbDataAdapter(strSearch, objConnection)
    Dim objDataSet As New DataSet()
    objDataAdapter.Fill(objDataSet, strTableName)
    dsRtn = objDataSet
Catch ex As Exception
    dsRtn = Nothing
Finally
    If objConnection.State = ConnectionState.Open Then
        objConnection.Close()
    End If
End Try
Where strSearch is the sql search string.
I don't have any problem using such code for other queries.
Could somebody suggest the cause of the problem and how to solve it? Thanks!
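A classic cause of "fast in Enterprise Manager, slow from ADO/ASP.NET" is that the two connections run with different SET options, most often ARITHABORT, so each gets its own cached plan and the application's plan can be a bad one. A hedged diagnostic sketch: mimic the app's likely setting in Query Analyzer and compare timings:

-- OLE DB/ODBC clients typically connect with ARITHABORT OFF,
-- while the graphical tools default to ON; reproduce the app's environment
SET ARITHABORT OFF
SELECT * FROM myview WHERE name LIKE '%Microsoft%'

Note also that a leading-wildcard LIKE '%...%' cannot seek an index, so the view is scanned in both environments regardless.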
I have a query that joins eight different tables. As I join each additional table, execution gets slower. Even on my local server the query takes nearly 2 to 3 minutes to execute. How can I increase the speed of execution of my query?
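Without the query text only generic advice applies, so a hedged first step is to measure before changing anything; these session settings show where the time and I/O go:

SET STATISTICS IO ON     -- logical/physical reads per table
SET STATISTICS TIME ON   -- parse/compile and execution times
-- run the eight-table query here and look for the table dominating the reads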
I wanted to know whether an execution plan display is available in SQL 6.5 as in SQL 7.0 and SQL 2000, i.e. where, if we enable 'Show Execution Plan' and execute a query, it draws a map and shows the vital statistics. If it is available in SQL 6.5, then I am missing that tool.
How can I have it installed on my SQL 6.5 server?
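As far as I recall, 6.5 never had the graphical plan; the closest it offers is the text showplan plus per-session I/O statistics. A sketch from memory of the 6.5-era syntax (SHOWPLAN_TEXT only arrived in 7.0), so treat it as an assumption to verify:

SET SHOWPLAN ON       -- 6.5 text plan, printed when the query runs
SET STATISTICS IO ON  -- page reads per table
GO
SELECT name FROM sysobjects WHERE type = 'U'   -- any query to inspect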
I have a procedure (used to create a report) that worked fine on SQL 7.0 Service Pack 3.
The problem is that we are upgrading to SQL 2000, and this procedure now takes 1 minute and 30 seconds to execute vs. 10 seconds previously.
Everything is the same between the SQL 7 and SQL 2000 servers: database size, indexes, hardware, etc.
I looked at the query execution plan, and it seems to do a sort which takes the majority of the resources on SQL 2000, even though no sort statement is issued in the procedure itself.
Any help would be appreciated. I am mostly curious to find out why this is the case when all the variables are the same between the two servers, yet SQL 2000 performance is much worse than SQL 7.0. It should be the other way around!
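One commonly recommended hedged step after moving a database from 7.0 to 2000 is to rebuild what the new optimizer relies on, since plans and statistics do not carry over cleanly; the object names below are placeholders:

EXEC sp_updatestats                                  -- refresh statistics database-wide
UPDATE STATISTICS dbo.MyReportTable WITH FULLSCAN    -- or a full scan on the hot tables
EXEC sp_recompile 'dbo.MyReportProc'                 -- force the procedure to build a new plan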
I am using a stored procedure, and in this sp I have a simple SELECT statement. I found that when I execute this sp in Query Analyzer it takes about 8-10 minutes to show the output. The table has thousands of records. I can rebuild the indexes on the table, but apart from that, what else can I do to speed up the query?
I know there is a way to reference indexes explicitly in a SQL query. Is that true? If yes, please show me how to use it, with an example (see the sketch below).
Also, is there any other way to make the query run much faster?
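Index hints do exist, though they are normally a last resort; if the optimizer ignores an index, stale statistics or a non-covering index are the more likely problems. A hedged sketch with hypothetical table and index names:

-- force a specific index (names here are hypothetical)
SELECT col1, col2
FROM dbo.MyBigTable WITH (INDEX(IX_MyBigTable_Col1))
WHERE col1 = 'some value'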
We have SQL Server 2000, and liorder is an Oracle linked server. I'm trying to run the following query...
SELECT DISTINCT
    a.auf_nr AS OrderNo,
    e.ku_name AS Customer,
    d.bestell_dat AS OrdDate,
    d.liefer_dat AS DelvDate,
    CAST(SUM(b.anz) AS FLOAT) Qty,
    CAST(SUM((CAST(c.breite AS FLOAT) / 1000 * CAST(c.hoehe AS FLOAT) / 1000) * b.anz) AS FLOAT) SQM,
    CAST(SUM(a.liefer_offen) - (SUM(a.anz) - SUM(b.anz)) AS FLOAT) AvailDelv,
    CAST(SUM(a.liefer_anz) AS FLOAT) Delvd,
    CAST(SUM(c.sum_brutto * a.anz) AS FLOAT) Value
FROM
    liorder..LIORDER.AUF_STAT a,
    liorder..LIORDER.AUF_LIP_STATUS b,
    liorder..LIORDER.AUF_POS c,
    liorder..LIORDER.AUF_KOPF d,
    liorder..LIORDER.KUST_ADR e
WHERE
    a.auf_nr = b.auf_nr
    AND b.auf_nr = c.auf_nr
    AND c.auf_nr = d.auf_nr
    AND d.kunr = e.ku_nr
    AND a.auf_pos = b.auf_pos
    AND b.auf_pos = c.auf_pos
    AND b.lip_status = 7
    AND c.ver_art != 'V'
    AND a.history = 0
    AND a.rg_stat != 2
    AND e.ku_name IS NOT NULL
    AND e.ku_vk_ek = 0
    AND d.bestell_dat BETWEEN '01/01/2005' AND '12/17/2005'
GROUP BY a.auf_nr, d.liefer_dat, b.lip_status, d.bestell_dat, e.ku_name, d.kopf_tour, d.kopf_firma
HAVING CAST(SUM(a.liefer_offen) - (SUM(a.anz) - SUM(b.anz)) AS FLOAT) > 0
..and it takes around 2 minutes to show the results, even if the date range covers a single day. I tried using an indexed column, but I still get the same slow execution time. I even tried to create a UDF so that the WHERE clause would be resolved remotely on the Oracle DB, but still the same. Is there any way I can do this more efficiently and faster?
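With four-part names, SQL Server may pull the Oracle tables across the link and join them locally, which is usually where the two minutes go; OPENQUERY ships one statement to Oracle so only the final rows cross the wire. A hedged sketch of the shape (the inner text must be valid Oracle SQL; the filter shown is illustrative, and the full joins and WHERE clause belong inside the quotes):

SELECT *
FROM OPENQUERY(liorder,
    'SELECT auf_nr, history
       FROM LIORDER.AUF_STAT
      WHERE history = 0')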
Can anybody help me solve the error below? When the sp is called from the front-end application the error below is thrown, but if I execute the sp in the back end, no error is thrown and a result set is produced.
Error thrown: The query has been canceled because the estimated cost of this query (7) exceeds the configured threshold of 6. Contact the system administrator.
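That message comes from the query governor cost limit, a server option that refuses to start any query whose estimated cost exceeds the configured value (6 here, while the estimate is 7). A hedged sketch of the two knobs, to be used with the administrator's agreement rather than as a silent workaround:

-- per-session override: 0 means no limit for this connection
SET QUERY_GOVERNOR_COST_LIMIT 0

-- or raise/disable the server-wide setting
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'query governor cost limit', 0
RECONFIGURE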
I have a strong feeling that this isn't possible but I thought I might as well ask...
I'm developing a database application (SQL Server 2000 back end) where the client and the server are separated by a slow network connection (satellite). There are parts of the application where I have to query a large result set, so I was wondering if there is a way to determine the percent complete of a query, so I can put a progress bar on the interface and the user can see the data loading instead of a frozen form.
I thought about splitting the query into several smaller ones and updating the bar as each one completes, but I'd rather not do that because, in the application development environment I'm working in, I have to close the result set and reopen it every time I run a query... unless that isn't a big deal, but I'm under the impression it's something to avoid.
Hi there, I'm hoping someone can help me! I'm having a problem with a live database that I'm running on MSDE. It seems to have slowed down quite considerably from the test environment (even when all the data is the same). This is notably different on one particular query that takes 1 sec on the test machine and almost 1 min on the live machine. The total number of user connections on the live machine is normally 4 or so (found out through the Performance Monitor), so I can't see that it's MSDE's performance throttler... Has anybody got any ideas on things I can check for? Many thanks, James
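MSDE 2000's workload governor throttles on concurrent operations (more than eight active at once), not on user connections, so four connections don't rule it out. If I remember right, MSDE exposes a DBCC command that reports how often the governor has kicked in; treat the exact name as an assumption to verify:

DBCC CONCURRENCYVIOLATION   -- reports (and can reset) workload-governor throttling counts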