ADO Inserts Running Slow
Oct 13, 1999
Hi list,
I'm a long time lurker on this list and really enjoy the discussions, although I rarely get a chance to participate.
Here is my situation: we are importing chunks of data (500 records at a time) from a C++ interface. The records have to be transformed before being inserted into the target table, which I am doing with a stored proc, and that part works fine. The records are in memory in C++, and the programmer is looping through them, building inserts into a temp table through ADO (which my proc picks up). The server business object uses the Connection.Execute method, which inserts one record at a time. That part of the process is taking over 15 seconds for 500 records, which is the bulk of the total time.
My question is: using ADO, is there a better way to insert these records into the temp table? I see mention of a Recordset interface, but my programmers are new to ADO, and since I am the DBA and have never used ADO, I am not sure what to tell them.
Any insight would be greatly appreciated.
shawn
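One option worth exploring (a minimal sketch; #ImportStage and its columns are made-up names): instead of 500 separate Connection.Execute calls, the C++ layer can concatenate the rows into one batch string and execute it once per chunk, cutting the round trips from 500 to 1. On SQL Server 7.0/2000 a multi-row insert can be written with UNION ALL:
CREATE TABLE #ImportStage (Col1 varchar(50), Col2 varchar(50), Col3 int)
INSERT INTO #ImportStage (Col1, Col2, Col3)
SELECT 'row1-a', 'row1-b', 1
UNION ALL SELECT 'row2-a', 'row2-b', 2
UNION ALL SELECT 'row3-a', 'row3-b', 3
-- ...continue for the rest of the 500-row chunk, then send the whole string in one Execute call
The ADO-native alternative is a client-side Recordset opened with adLockBatchOptimistic and flushed with UpdateBatch, but the single-batch string above requires no new ADO concepts for the programmers.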
Dec 22, 2004
Hi,
I am new to the Windows world. We use Informatica on UNIX for our ETL process. We have a requirement to load approximately 200,000 rows into an MS SQL Server table. The table is not that big, and it is a heap (no indexes). Inserts are running at 69 rows per minute. We are using the DataDirect Connect 4.10 SQL Server ODBC driver.
SQL Profiler tells us that it is doing row-by-row processing using the sp_execute procedure.
Is there a way we can speed up the ODBC process?
-Thanks in advance
srv
SQL Server Version:
Microsoft SQL Server 2000 - 8.00.818 (Intel X86) May 31 2003 16:08:15 Copyright (c) 1988-2003 Microsoft Corporation Standard Edition on Windows NT 5.0 (Build 2195: Service Pack 4)
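69 rows a minute points to one network round trip per row. If the extract can be landed as a flat file on (or near) the SQL Server box, BULK INSERT or bcp will load 200,000 rows in seconds rather than hours. A minimal sketch, with a made-up table name and file path:
BULK INSERT dbo.StagingTable
FROM 'D:\loads\informatica_extract.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n',
    TABLOCK,            -- table lock helps bulk-load performance into a heap
    BATCHSIZE = 50000
)
It is also worth checking whether the DataDirect driver exposes a bulk or parameter-array option; the row-by-row sp_execute pattern in Profiler suggests it is currently sending one statement per row.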
Apr 5, 2007
Hello all. I've got a problem with really slow INSERTs on one (and only one) of the tables in a database. For example, using SQL Management Studio, it takes 4 minutes and 48 seconds to insert 25 rows. There are only about 8 columns in the table and only about 1500 records. All the other tables in the database are very fast for inserts.
Another odd thing uniquely associated with INSERTs on this table: prior to inserting the 25 new rows of data, SQL Management Studio tells me that it inserted 463 rows of data which I know did not happen. Here's the INSERT statement:
INSERT INTO FieldOps(StudySiteID
, QA_StructureID
, Notes
, PersonID)
SELECT DISTINCT StudySiteKey
, QA_StructureKey
, SampleComments1
, '25'
FROM ScriptOutput_Nitrate
WHERE (ScriptOutput_Nitrate.StudySiteKey IS NOT NULL)
and SQL Management Studio (eventually) says:
(463 row(s) affected)
(463 row(s) affected)
(25 row(s) affected)
The table has an index on the primary key (INT data type with auto increment). I tried running the following code to fix things but it made no difference:
USE [master]
GO
ALTER DATABASE [FieldData] SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
use FieldData
GO
DBCC CHECKTABLE ('FieldOps', REPAIR_REBUILD) With ALL_ERRORMSGS
GO
USE [master]
GO
ALTER DATABASE [FieldData] SET MULTI_USER WITH ROLLBACK IMMEDIATE
GO
I'm guessing that the problem might be related to the index (??). I don't know... Does anyone here have a suggestion as to what I should do to fix this problem?
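The extra "(463 row(s) affected)" messages are exactly what a trigger on FieldOps would produce, and a trigger would also explain why inserts on this one table crawl while every other table is fast. A quick check (the trigger name in the second line is hypothetical):
EXEC sp_helptrigger 'FieldOps'
-- if anything comes back, look at what it actually does:
EXEC sp_helptext 'trg_FieldOps_AuditInsert'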
Jul 23, 2005
I'm inserting 2 million+ records from a C# routine which starts out very fast and gradually slows down. Each insert is through a stored procedure with no transactions involved. If I stop and restart the process it immediately speeds up and then gradually slows down again. But closing and re-opening the connection every 10,000 records didn't help. Stopping and restarting the process is obviously clearing up some resource on SQL Server (or DTC??), but what? How can I clean up that resource manually?
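A couple of things worth watching while the load runs, since a gradual slowdown that resets on restart often tracks log growth or autogrow events rather than anything the C# code itself holds onto (the table name below is a placeholder):
DBCC SQLPERF(LOGSPACE)              -- is the log filling up and autogrowing?
EXEC sp_spaceused 'dbo.TargetTable' -- is the table/index size growing faster than expected?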
Nov 29, 2000
We are inserting into a table, which includes an identity primary key column. When the table gets really large (i.e. 1.5 million records), the performance of the inserts reduce.
I noticed that when we insert into the table an exclusive lock on the table is obtained. Do inserts into tables with identities always lock the table?
Given the table size is unavoidable, does anyone have a suggestion to improve the performance?
Thanks,
Matt
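Inserts do not normally take a table-level exclusive lock just because the table has an identity column, so it is worth confirming what kind of lock the inserting connection is actually holding while a slow insert runs (run from another connection; 55 is a placeholder spid):
EXEC sp_lock 55
-- Type = TAB with Mode = X on every insert usually points to lock escalation
-- or a missing/unsuitable clustered index rather than the identity column itself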
Jul 26, 2002
The query below is being executed on a SQL 2000 box with 4 CPUs and 2 GB RAM.
testXX.DB_GRP.dbo.group1 -----> is on a SQL 7 box with a single CPU and 512 MB RAM.
The result set is about 30,000 rows.
The whole insert process takes about 5 minutes.
Is there a way to optimise the query and bring down the execution time?
insert into testXX.DB_GRP.dbo.group1
select num, group_num,group_desc from group2
where id = 20
---------------
If we just run the
select num, group_num,group_desc from group2
where id = 20
it takes 10 seconds to execute this select statement, so I was wondering why it takes 5 minutes to do the insert across the network through the linked-server query.
Any help would be appreciated.
Thanks,
MK
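One thing to try, since an INSERT into a remote table through a linked server is often processed a row at a time by the OLE DB provider: run the statement on the destination box (testXX) and pull the rows across in one set instead. A sketch, where "Srv2000" is a hypothetical linked server on testXX pointing back at the SQL 2000 box and "SrcDB" is the database holding group2:
-- executed on testXX
INSERT INTO DB_GRP.dbo.group1 (num, group_num, group_desc)
SELECT num, group_num, group_desc
FROM Srv2000.SrcDB.dbo.group2
WHERE id = 20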
Aug 10, 2007
I'm relatively new to compact framework and SQL Server Compact so bear with me if I've got an obvious thing I've forgotten.
I've written my own database helper layer. My idea is to generate SQL INSERT statements dynamically based on the contents of each object. Performance, however, is horrible: I try to do 1800 inserts and it takes about 50 seconds on the device (Release build, outside of the IDE).
I pass in a list of objects to be inserted which derive from a ModelBase class. ModelBase includes some ORM information (which table the object goes into, and which fields are mapped to which properties). I generate one SqlCeCommand object and one SQL string (with new parameters for each insert), and I am using SqlCeResultSet.
What can I do to make this run faster? Thank you
Code Snippet
public bool Insert(List<ModelBase> recs)
{
    SqlCeConnection con = new SqlCeConnection(connectionString);
    SqlCeTransaction tran = null;
    try
    {
        con.Open();
        // wrap the whole batch in one transaction
        tran = con.BeginTransaction();
    }
    catch
    {
        return false;
    }

    SqlCeCommand cmd = new SqlCeCommand();
    cmd.Connection = con;
    cmd.Transaction = tran;

    // Build the parameterized INSERT statement once, from the first record's field map.
    String sqlString = "INSERT INTO [" + recs[0].tableName + "] (";
    String sqlString2 = ") VALUES (";
    PropertyInfo[] props = recs[0].GetType().GetProperties();
    for (int x = 0; x < props.Length; x++)
    {
        PropertyInfo pi = props[x];
        if (recs[0].fieldMap.ContainsKey(pi.Name))
        {
            sqlString += "[" + recs[0].fieldMap[pi.Name] + "], ";
            sqlString2 += "@" + pi.Name + ", ";
        }
    }
    // trim the trailing ", " from each half and stitch them together
    sqlString = sqlString.Substring(0, sqlString.LastIndexOf(", "));
    sqlString2 = sqlString2.Substring(0, sqlString2.LastIndexOf(", "));
    sqlString += sqlString2 + ")";
    cmd.CommandText = sqlString;
    cmd.CommandType = CommandType.Text;

    // Re-bind the parameter values for each record and execute.
    foreach (ModelBase rec in recs)
    {
        cmd.Parameters.Clear();
        for (int x = 0; x < props.Length; x++)
        {
            PropertyInfo pi = props[x];
            if (rec.fieldMap.ContainsKey(pi.Name))
            {
                if (pi.GetValue(rec, null) == null)
                    cmd.Parameters.AddWithValue("@" + pi.Name, DBNull.Value);
                else
                    cmd.Parameters.AddWithValue("@" + pi.Name, pi.GetValue(rec, null));
            }
        }
        try
        {
            cmd.ExecuteResultSet(ResultSetOptions.None);
        }
        catch (Exception e)
        {
            tran.Rollback();           // discard anything inserted so far
            cmd.Connection.Close();
            cmd.Connection.Dispose();
            Program.log.Error(e.Message, e);
            return false;
        }
    }

    tran.Commit();                     // commit the batch before closing the connection
    con.Close();
    return true;
}
May 20, 2008
Our company records live sales into an SQL Server database. The same tables that store the sale information are also used by a reporting interface to query sales figures, but occasionally a SELECT generated by the reports is so slow that it causes the INSERT operations to time out and fail.
We've tried increasing the CommandTimeout property in .NET of the application performing the inserts, but despite it being set to 90 seconds, it's still not enough to prevent the occasional sale from failing to record which is a big no-no for us. We've run the Database Tuning Advisor on the stored procedure that generates the SELECT for the reports, and it is now fully optimized but still this isn't enough. The tables are quite massive (millions of rows) and the SELECT requires JOINs to other tables, some of which are in a separate database on the same server.
Is there a solution to this problem, aside from increasing the CommandTimeout property to the point where no timeout errors occur? My concern is that doing this could increase the number of concurrent connections and we'd hit another limit, so it's not really solving the problem. Is there a way to configure SQL Server to always favour INSERTs over SELECTs? The reporting users won't really care if the reports are slower but it's critical to get these INSERTs up to 100% reliability.
I'm not a DBA (just a developer) so I don't have an intricate knowledge of databases and this problem is a bit beyond my level of expertise. Obviously we don't want to re-design our entire system, but we'll do whatever necessary to ensure we aren't failing to record sales.
One idea I had, which may be awful, I don't know, is to submit these sales to MSMQ and write some software that reads from the queue and inserts the records from there instead. We could then deal with the timeout issue by just re-submitting the sale until it is accepted, then removing it from the queue.
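If the server is SQL Server 2005 or later (the post does not say), row versioning lets the report SELECTs read a consistent snapshot without taking the shared locks that block the sale INSERTs. A minimal sketch; "SalesDB" is a placeholder, and the command needs a brief moment of exclusive access to the database, so test it off-hours first:
ALTER DATABASE SalesDB SET READ_COMMITTED_SNAPSHOT ON
A cruder stopgap is adding WITH (NOLOCK) hints to the report queries, at the cost of occasional dirty reads in the reports.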
Oct 23, 2007
Hi all,
This managed application was written to run on a Symbol 3090 Windows CE 5.0 scanning device. We are using the Symbol-provided classes to access the scanning interface and a SQL Compact database on the device to collect the scanned data, then using merge replication to synchronize the scanned data when the device is docked. The problem we have experienced seems to be related to performance when inserting and updating records in the database.
We have tested inserting/updating 1000 randomly generated records into the database. At first, the time to commit a record increases while the database is being flushed (the flush interval in the connection string is 10 seconds by default), and then, as the database grows, the time to commit every single record keeps increasing, which causes the application to slow down as items are scanned into the database. The device program memory, however, remains constant while items are scanned. From our tests, the time to execute either an update or an insert command on a 2 MB SQL Mobile database (up to 10,000 records, depending on the size of the columns) is nearly 2 to 2.5 seconds. Below is the only code I am executing:
If Not sqlObj.UpdateItem(1061022, itemNo, 1) Then
sqlObj.InsertResultSet(1061022, itemNo, itemObj.Style, itemObj.Color, itemObj.Size, itemObj.Description, 0, 1)
End If
For reference, I am using a prepared update command and the ResultSet Insert method to perform the update and insert commands against the database.
Any help on this issue is highly appreciated.
Thanks
Ravi.
Apr 13, 1999
One of the developers here wants to run a PB script consisting of 11 inserts to 11 respective tables at the same time. This is on a development box.
Currently, we are running on MSSQL 6.5, SP4. We have very general parameters set, nothing special; max async io is set to the default. The inserts run serially now. What I need to do is run them in parallel.
My questions are:
1) How do I do this -- run in parallel?
2) What parameters do I need to set?
3)What else would I need to do?
Any information you can provide will be greatly appreciated. Thanks.
David Spaisman
Aug 15, 2006
Hi,
Before stepping into the ADO.NET code to perform an insert or update, the insert/update has already taken place, just from starting the debugger. I use VS 2005 with SQL Server 2000. This did not happen with VS 2003 and SQL Server 2000. Has anyone else encountered this?
Aug 8, 2000
I have a DTS job set up to transfer 550,000 records from a DBF file into SQL Server. If I just let it run, there is a 9-10 minute delay, then it starts. If I try to schedule it as a job, it fails completely. I looked up ways to get it to execute quicker, mainly going to the advanced tab of the transform arrow and making the inserts 1000 at a time, locking the table, and turning constraints off. Any advice on how to speed it up, or on why the job is failing?
Apr 27, 2002
Hi All
Last week, we upgraded to SQL 2000 from 6.5. Everything went fine, and we re-compiled all the procedures. When I try to run a procedure, it runs for a long time, more than 10 hours (in SQL 6.5 it ran in 50 minutes). Do I need to set any procedure cache?
Also, the server CPU usage is constantly high, more than 80%. It was fine until last week.
Any suggestions?
Thanks in advance
Jaya
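After a 6.5 to 2000 upgrade, stale statistics are the usual suspect for a procedure that suddenly takes hours: the optimizer is costing plans from 6.5-era information. A minimal sketch worth running before touching any procedure cache settings (the table name in the second and third statements is a placeholder):
EXEC sp_updatestats
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN
DBCC DBREINDEX ('dbo.BigTable')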
Aug 30, 2006
I have written an SP, but it gradually runs slower and slower. I have to run this SP 500 times.
Do you know what could be causing that?
Jul 1, 2007
When I turn on my PC, it takes about 10 minutes from power-on until I can actually use it.
May 2, 2001
Our server is running slow. There are no locks, and the server has been rebooted, but the problem is still there. This has been going on for some time now. I intend to restart the server. Does anybody have a quick solution? Please help. Thanks for your assistance!!
May 17, 2001
Please, what should I look out for on 6.5 if I want to troubleshoot slow running?
Thanks.
Jun 6, 2001
Hi,
When I first start my SQL Server, it runs fast. After 10 to 15 days, SQL Server performance becomes very slow. After I restart the SQL service, it goes back to normal.
Thanks
Mohan
Apr 14, 2004
I have linked three SQL Servers together and have written a stored proc with a cursor that joins three tables from one of the linked servers. When I pull the SQL out of the cursor definition and run it in a query window, it runs fine, but when I run the stored proc, which simply steps through the same select result set, it is too slow for words. It also throws a warning about serializable isolation levels. Is there any way I can fix this?
David
Mar 2, 2004
Hi,
Our main production server has started running slow. It is a dual Xeon box with plenty of RAM, so hardware is not an issue.
Basically, a service connects to the database and executes a few stored procs. The only way I can get the system up to speed again is to recompile one of the SPs, but that is only a temporary fix.
Has anyone had a similar thing?
Can anyone give me help with performance tuning in SQL Server 2000?
Thanks
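If a recompile fixes it temporarily, the procedure is most likely getting a plan compiled for unrepresentative parameter values (parameter sniffing). Two SQL 2000-era options, with a placeholder procedure name:
-- force a fresh plan the next time it runs
EXEC sp_recompile 'dbo.TheSlowProc'
-- or recompile on every execution if the parameter values vary wildly:
-- ALTER PROCEDURE dbo.TheSlowProc ... WITH RECOMPILE
Another common workaround is copying the parameters into local variables inside the procedure so the optimizer uses average statistics instead of the sniffed values.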
May 13, 2004
Hi guys.
I have a DTS package on SQL 2000 which transfers data from an AS400 to SQL 2000 using an ODBC Client Access 5.1 driver (the DSN was configured by a sysadmin on the AS400, so it is configured properly).
When I execute the package manually (by right-clicking and choosing "Execute Package"), the package runs fine and returns the data in no time (approximately 30,000 rows in 15 seconds).
The problem starts when a job executes the same package!
When I start the job, the DTS package runs very, very slowly!
Sometimes it takes hours to return a few rows, and it seems to be stuck.
The SQL Agent runs as an NT account with administrator rights and has full access to the AS400, so the problem is not the agent.
By monitoring the AS400, I have noticed that the job/DTS retrieves the first fetch quickly and then sits in a waiting status.
I have tried everything and can't seem to get this problem fixed.
Does anyone know what the problem could be?
I need help quickly!
Thank You
Gil
Jan 30, 2008
Hello All,
I have two procedures that are run one after the other.
When I run proc1, it runs for about 15 minutes.
proc2 is dependent on proc1; when I run proc2, it runs for 45 minutes.
If I run both procs simultaneously through .NET code, it takes more than an hour. Can anyone tell me where the problem might be?
Thanks in Advance.
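When two procedures are individually fast but slow down badly when run together, blocking between them is the first thing to rule out. While both are running, from another connection:
EXEC sp_who2
-- check the BlkBy column: if one proc's spid appears there for the other,
-- the extra time is wait time, not extra work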
Oct 23, 2007
Hi, when the following function is used within the WHERE clause of a select statement, it runs very slowly. Does anyone know why?
ALTER FUNCTION [dbo].[fn_GetLastDayOfLVPeriod] ( @pInputDate DATETIME )
RETURNS DATETIME
BEGIN
DECLARE @vLVDPeriod varchar(4)
DECLARE @vLVDYear varchar(4)
DECLARE @vOutputDate DATETIME
SELECT @vLVDPeriod = LVDPeriod
FROM dbo.tblMIDates
WHERE CONVERT(varchar(8), Date, 112) = CONVERT(varchar(8), @pInputDate, 112)
SELECT @vLVDyear = LVDYear
FROM dbo.tblMIDates
WHERE CONVERT(varchar(8), Date, 112) = CONVERT(varchar(8), @pInputDate, 112)
SELECT @vOutputDate = MIN(Date)
FROM dbo.tblMIDates
WHERE LVDPeriod = @vLVDPeriod
AND LVDYear = @vLVDyear
RETURN @vOutputDate
END
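A scalar UDF in a WHERE clause runs once per row, and this one also scans tblMIDates three times per call because CONVERT() on the Date column defeats any index on it. A sketch of a leaner version that does a single lookup with a sargable date-range predicate (same logic; it assumes the goal is simply to match the calendar day of @pInputDate):
ALTER FUNCTION [dbo].[fn_GetLastDayOfLVPeriod] ( @pInputDate DATETIME )
RETURNS DATETIME
AS
BEGIN
    DECLARE @vDay DATETIME, @vLVDPeriod varchar(4), @vLVDYear varchar(4), @vOutputDate DATETIME
    SET @vDay = CONVERT(varchar(8), @pInputDate, 112)   -- strip the time portion
    SELECT @vLVDPeriod = LVDPeriod, @vLVDYear = LVDYear
    FROM dbo.tblMIDates
    WHERE Date >= @vDay AND Date < DATEADD(day, 1, @vDay)
    SELECT @vOutputDate = MIN(Date)
    FROM dbo.tblMIDates
    WHERE LVDPeriod = @vLVDPeriod AND LVDYear = @vLVDYear
    RETURN @vOutputDate
END
If the calling query filters on this function, the bigger win is usually joining tblMIDates into that query directly so the work is done once per set rather than once per row.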
Aug 24, 2007
hi
If a query is running slow, what steps can we follow to optimize it?
This question was asked of me in an interview.
Thanks
Dec 29, 2001
Hi,
I have SQL Server 6.5 with the SP5a update running on Windows NT 4.0 SP6, with 4 GB RAM and 4 processors.
Suddenly the SQL 6.5 jobs running on the production server started running very, very slowly. A job that is supposed to run in 30 minutes is taking around 2 hours, though it completes successfully.
(I suspect the Norton AntiVirus automatic LiveUpdate may be the reason, but not the second vulnerability mentioned in the Microsoft bulletin last week.)
I checked SQL Server, ran Performance Monitor, and checked the page files, disk space, databases, memory, and tempdb. Everything seems to be normal.
I rebooted the server and checked whether any other process was making it slow, but no luck.
Please help me out with this issue, as this is production and the clients' CRM applications use this database server.
Thanks in advance,
Anu
Nov 9, 2004
All,
I have been tapped to help fix a reporting tool. We have a SQL Server database / Crystal Reports (10) setup. I haven't had a chance to look at the tables in the DB yet, but I was told that aggregate tables were used. In my past experience with Crystal Reports, we used database views to feed Crystal Reports (in Oracle). I was thinking that I could somehow use views instead of tables, then re-index the base tables and compute statistics (if there's such a thing with SQL Server). I was also going to look into bottlenecks and locking (table locking as opposed to row or page locking for the lookup tables, to reduce overhead on the main tables), but I'm not sure if it'll make a difference since this is just a demo server with no major traffic hitting it yet.
The question is: does anyone have experience with Crystal Reports running slow against SQL Server? What should I look out for?
'Wale
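SQL Server equivalents exist for both ideas: UPDATE STATISTICS / sp_updatestats covers "compute statistics", and views work the same way as in the Oracle setup, so Crystal can point at a view that hides the joins against the aggregate tables. A minimal sketch with made-up table and column names:
CREATE VIEW dbo.vw_SalesByRegion
AS
SELECT r.RegionName, SUM(f.SaleAmount) AS TotalSales
FROM dbo.FactSales AS f
INNER JOIN dbo.DimRegion AS r ON r.RegionID = f.RegionID
GROUP BY r.RegionName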
Apr 17, 2008
Hi all,
I'm having what you might call an optimisation issue - but I'm also not sure if my approach to this problem is right. I've spent the whole day trying various methods but none seem to be performing as admirably as I'd hoped.
Basically, here's the scenario:
1. Log files are being inserted into a SQL table via Log Parser 2.2.
2. I have a lookup table of ip address ranges to countries and other geographic data.
3. Once the log row has been inserted from Log Parser, I want to update the same log table with data from the lookup table.
The main problem seems to be the updating (the initial insert from Log Parser is lightning quick).
I initially wrote an AFTER INSERT trigger on the log table that converted the user's IP address into IP-number format using a simple algorithm, then fed the IP number into a quick select query which retrieved the geolocation data. Then, in the same trigger, there was an update statement which basically said:
update dbo.Logs
set [Country] = @Country
where [IpNumber] = @IpNumber and [Country] IS NULL
Therein lies the problem. The update statement slows everything down to almost a standstill and also seems to obtain a lock on the table.
Critical factors:
1. The log table must remain readable.
2. The query must execute in seconds -- not over half an hour :)
I've experimented with various combinations of indexing, so far to no avail.
Any suggestions would be very much appreciated.
Regards
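Since the per-row trigger update is what grinds things to a standstill, one alternative is to drop the trigger and run a set-based update on a schedule (for example every few minutes from a SQL Agent job), which tags every not-yet-resolved row in one pass. A sketch; the lookup table and its column names are assumptions:
UPDATE L
SET    Country = G.CountryName
FROM   dbo.Logs AS L
INNER JOIN dbo.GeoLookup AS G
       ON L.IpNumber BETWEEN G.StartIpNumber AND G.EndIpNumber
WHERE  L.Country IS NULL
Indexes on GeoLookup(StartIpNumber, EndIpNumber) and on Logs(Country, IpNumber) keep both the range probe and the "not yet tagged" filter cheap, and batching the update (SET ROWCOUNT or TOP) keeps the lock footprint small so the log table stays readable.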
May 3, 2007
Hi all,
I am having a problem: SQL Server is running very slow, but only on some days. For example, my stored procedure usually runs in less than 2 minutes, but some days it takes 13 minutes. I don't understand the problem, and all the stored procedures behave the same way. Sorry, I am not a DBA, basically a developer. Every morning we take a DB backup, and indexes are already applied. The DB size is 10,461.06 MB, RAM is 4 GB, and CPU usage is less than 50%. This is a cinema database, so a lot of users are accessing it at the same time (web, IVR, cinema ticket counters, etc.). We are using SQL reports. Because the stored procedure runs slow, we cannot view the reports. Please advise.
Please help me. If you need any more information, please ask.
Thanks in advance.
Jan 27, 2006
Hi friends
After working in Visual Studio (on my report model project) for a few minutes, it runs too slow. I mean, clicking on entities and attributes takes ages to finish. I opened Task Manager and I see "devenv.exe" is taking more than 800,000 K!!
I am using SQL Server 2005 Standard Edition.
Have you seen a similar problem?
Thanks for your help.
Jan 9, 2008
We have a stored procedure which runs fine on SQL Server 2000 from Query Analyzer. However, when we try to execute the same stored procedure from ADO.NET in an executable, the execution hangs or takes extremely long. Does anyone have any ideas or suggestions about how this could happen and how to fix it? Thanks.
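"Fast in Query Analyzer, slow from ADO.NET" on SQL 2000 is very often a SET options mismatch: Query Analyzer connects with ARITHABORT ON, SqlClient does not, so the two get separate cached plans and the ADO.NET one can be compiled for unlucky parameter values. One way to test the theory is to reproduce the app's settings in Query Analyzer (the procedure and parameter names below are placeholders):
SET ARITHABORT OFF   -- mimic the ADO.NET connection
EXEC dbo.TheHungProc @SomeParam = 42
If it is now slow in Query Analyzer too, the fix is the usual parameter sniffing toolkit (sp_recompile, WITH RECOMPILE, or local-variable copies) rather than anything in the ADO.NET code.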
Apr 6, 2007
I support a website (ASP.NET 2.0) where recently users have been unable to insert data due to timeout issues. The functionality executes a query that inserts a single row into a SQL Server 2005 table. I tried running this query from the back end, and it took 4:48 to insert a single row! Interestingly, after that initial agony, any similar inserts I tried took no time whatsoever. I have checked the execution plan, but unfortunately I don't really know what I'm looking for with inserts, as most of my experience with execution plans is with select statements. Any resources anyone could point me to for troubleshooting this would be much appreciated.
Thanks,
-Dave
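A 4:48 insert followed by instant ones looks more like a one-off wait (blocking by another session, or a data/log file autogrow) than a bad plan, since the plan for a single-row insert is trivial. Next time it hangs, a quick check from another connection while the insert is still in flight:
SELECT spid, blocked, lastwaittype, waitresource
FROM   master.dbo.sysprocesses
WHERE  blocked <> 0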
Aug 27, 2007
I have a SELECT TOP query that returns the top x records from a table which has the indexing service enabled on it, such as this:
SELECT TOP(15) * FROM [TableName] ORDER BY [ColumnName]
There are not that many records kept in the table at the moment (at most 100 rows); however, it will grow later.
The issue stems from the ORDER BY [ColumnName] part of the syntax, which changes the TOP selection order and makes the query run very slow, as I have also analysed in the SQL Server Query Analyzer.
Anyhow, I need the ORDER BY clause to show the data based on different columns, either ascending or descending.
There might be workarounds to achieve the same goal that I am not aware of.
Any thoughts are appreciated.
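For TOP ... ORDER BY, the optimizer can avoid the sort entirely if there is an index on the ordering column, so one index per frequently used sort column is the usual fix (the names below are the same placeholders used in the query above):
CREATE NONCLUSTERED INDEX IX_TableName_ColumnName
ON [TableName] ([ColumnName] ASC)
An ascending index can also be read backwards for the DESC case; with only ~100 rows today the sort cost is tiny either way, so this mostly matters as the table grows.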
Oct 5, 2006
I am running a stored procedure which selects from a table and returns approximately 800,000 records. When called from any client machine, it takes a long time (90 seconds) to return the result, waiting on ASYNC_NETWORK_IO while it pushes the result to the client. If the select statement uses the TOP operator to return fewer records, it executes faster. When called from the server itself, the stored proc returns all records in 13 seconds. On another machine with identical hardware and configuration, this problem does not occur. Can anyone help with how to improve the ASYNC_NETWORK_IO issue?
Environment
SQL-2005 SP1 64 bit Standard on Active/Passive cluster
Windows -2003 Ent.
Thanks
-Ashis
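ASYNC_NETWORK_IO means SQL Server has the rows ready and is waiting for the client (or the network path to it) to consume them, so the usual suspects are the client reading the result set row by row, NIC/driver settings, or a slow link, rather than the query itself. On 2005 the accumulated wait can be compared between the good and bad machines:
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM   sys.dm_os_wait_stats
WHERE  wait_type = 'ASYNC_NETWORK_IO'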