Mass SQL Inserts - Performance :-(
Aug 8, 2006
Hi,
I have a user table with a single integer column. No indexes, no identities, nothing. I have to insert 600,000 rows via a client app. In tests, using BCP/Bulk Insert/DTS it all runs OK (sub 3 seconds). However, the app only manages about 5,000 rows a second, which is considerably slower. I can mimic this slow performance within DTS by removing the 'Fast Load' & 'Batch' options.
Question: why would the SQL insert run slower on one server and as fast as BCP/Bulk Insert/DTS on another? What can I check on the 'slow' running server? Could there be a file version anomaly?
Version = SQL-2000 SP4
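For reference, a minimal BULK INSERT sketch that mirrors what the DTS 'Fast Load' and 'Batch' options give you (the file path, table name, and delimiters here are assumptions, not taken from the post):

BULK INSERT dbo.UserTable
FROM 'C:\data\users.txt'
WITH (TABLOCK, BATCHSIZE = 50000, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
-- TABLOCK plus batching allows the minimally logged load path;
-- a plain row-by-row INSERT from the client cannot use it.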
Any help much appreciated !!!
View 2 Replies
Sep 18, 2007
I have a SQL Server 2005 database where covering indexes had to be used to improve performance for the heavy retrieval load; however, inserts into the tables are now, of course, very slow. Is there any way to improve the performance of the inserts without taking away the indexes?
Would changing locking or partitioning the index help the inserts?
Other databases use a concept of "freespace" to set up in the beginning - making pre-existing space for inserts - is there anything like this in SQL Server 2005?
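On the "freespace" question, the closest SQL Server 2005 analogue is the index fill factor, which leaves empty space on each page for future inserts. A minimal sketch, with a made-up index and table name:

-- Rebuild a covering index leaving 20% free space on leaf pages (and intermediate pages via PAD_INDEX)
ALTER INDEX IX_Orders_Covering ON dbo.Orders
REBUILD WITH (FILLFACTOR = 80, PAD_INDEX = ON);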
Thanks for any help, Mary
View 4 Replies
Feb 21, 2007
We recently implemented merge replication between two SQL Server 2005 instances on the same network, and since introducing it, performance has degraded considerably on the subscriber end.
1) One thing that should be mentioned is that the flow of changes is unidirectional, from publisher towards subscriber (there is only one publisher, which is also the distributor, and one subscriber).
2) Updates are more frequent than inserts; just one article, say "Article1", receives up to 2,000 updates per day, and I am seeing dbo.MSmerge_upd_sp_Article1_GUID take more CPU time. What should be done?
On the subscriber, database response time is slowing down and I am seeing a large number of lock timeouts on the application end.
Can anyone also suggest server-level settings for avoiding lock timeouts?
Looking for any experienced solutions/suggestions.
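One commonly suggested option for reducing reader/writer blocking on a SQL Server 2005 subscriber is row versioning; whether it is appropriate depends on tempdb capacity and the workload, so treat this as a hedged sketch (SubscriberDb is a placeholder name):

-- Readers no longer block behind the merge update procedures (uses the tempdb version store)
ALTER DATABASE SubscriberDb SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;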
Thanks in advance.
View 3 Replies
Oct 8, 2007
We have a SQLServer 2005 Enterprise merge replication publication with SQL Mobile 3.0 subscribers (Windows Mobile 5.0 and 6.0). We do not use pre-computed partitions due to trigger performance issues with an SSIS/ETL application that supplies data to the merge database. We do use the "Optimize" (=true) option, though we have tried this both ways with no significant differences. We use filters and joins for each worker ID (as HOST_ID) from the subscriptions.
The sync times become increasingly worse after we run the snapshot and bring the publication online. I have tried rerunning the snapshots; this helps little, as it often behaves as if the subscription were set to reinitialize and forces a big sync (reload of all data) to the subscriber. We have tried much of the obvious (e.g., flattening filters and joins, adding indexes, etc.).
When users are synchronizing, we watch replication monitor and notice that a lot of time is spent processing "enumerating inserts and updates for article [any article]", especially processing the many generations and batches. This is true for any follow-up syncs after the 1st big sync (initializing the subscription).
I read several posts regarding the batches and generations of changes, and decided to try increasing the "DownloadGenerationsPerBatch" parameter. I tried adding this parameter to the snapshot agent job, and the job fails each time with a vague message, even with the default value of 100. How do you change this parameter for SQL Server 2005 Enterprise?
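DownloadGenerationsPerBatch is a Merge Agent profile parameter rather than a Snapshot Agent parameter, which would explain the snapshot job failing. A hedged sketch of creating a custom merge agent profile at the Distributor follows (the profile name is a placeholder, and whether the profile is honoured by the web-synchronization path used by SQL Mobile subscribers is something to verify):

-- Run at the Distributor: create a custom Merge Agent profile (@agent_type = 4 is the merge agent)
DECLARE @pid INT;
EXEC sp_add_agent_profile @profile_id = @pid OUTPUT,
     @profile_name = 'LargeBatchMerge',
     @agent_type = 4,
     @description = 'Merge profile with larger generation batches';
EXEC sp_add_agent_parameter @profile_id = @pid,
     @parameter_name = 'DownloadGenerationsPerBatch',
     @parameter_value = N'500';
-- Then assign the new profile to the merge agents (e.g., in Replication Monitor).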
Any suggestions?
Thanks in advance,
Matt
View 5 Replies
Apr 25, 2008
I have a Hits table that tracks the hits on the id of a Link table:
Hits:
int linkId (foreign Key to Link)
datetime dateCreated
varchar(50) ip
We recently had to merge Links from different systems that are implemented similarly. As a result, all the linkIds are now wrong in the Hits table because the ids all changed. I managed to track down all the old ids and their associated new ids and have it in a table that I call joined_links
joined_links:
int oldId
int newId
So, how do I do a mass update of these linkIds in the Hits table in SQL? I know I could do it in .NET, but I'd rather not write an app that runs thousands of update statements. There's gotta be a way to do it with something like this:
UPDATE Hits h SET h.linkId = (SELECT newId FROM joined_links WHERE oldId=h.linkId)
but obviously I don't have visibility of that linkId in the subselect... A Loop maybe?
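SQL Server's UPDATE ... FROM join syntax handles this without a loop; a sketch based on the two tables described above:

UPDATE h
SET h.linkId = j.newId
FROM Hits AS h
INNER JOIN joined_links AS j
    ON j.oldId = h.linkId;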
View 6 Replies
Aug 13, 2004
I have a series of DTS packages.
Each package has 20 queries.
Each query has a server name.
Is there a way to change the server name without editing each query in each DTS package?
I'd like to copy the template DTS package, then perform the modification.
Thanks
View 2 Replies
May 8, 2008
This is a run-of-the-mill application that moves orders from one table to another. There are 2 tables, Ordersummary & HstOrders.
Ordersummary has the following columns...
Identifier
FollowupId
OrderNumber
OrderReference
OrderReferenceOrigin
......
......
HstOrders has the following columns...
Identifier
OrderNumber
OrderReference
OrderType
......
......
The above two tables are bound by Identifier. After each month end, Orders are moved from Ordersummary to HstOrders.
Now my task is to update all rows in Ordersummary with the order details as seen in HstOrders for OrderType = 'CREDIT'. OrderReferenceOrigin (in Ordersummary) should be updated with the value of OrderReference (from HstOrders).
I have to update each row one at a time, and I need to write a cursor for the mass update where Ordersummary.Identifier = HstOrders.Identifier.
Can someone please help with writing this update statement, as I have never written a cursor?
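A cursor should not actually be needed here; a set-based sketch using the columns listed above (assuming OrderType lives in HstOrders, as shown):

UPDATE os
SET os.OrderReferenceOrigin = h.OrderReference
FROM Ordersummary AS os
INNER JOIN HstOrders AS h
    ON h.Identifier = os.Identifier
WHERE h.OrderType = 'CREDIT';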
View 1 Replies
Oct 1, 2001
Hi friends,
Any ideas about building a mass mailing system using SQL Server? Please get back to me.
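On SQL Server 2000 the usual building block for this is SQL Mail's xp_sendmail, assuming a MAPI profile has been configured for the SQL Server service; a minimal sketch with placeholder values:

EXEC master.dbo.xp_sendmail
     @recipients = 'someone@example.com',
     @subject = 'Test message',
     @message = 'Hello from SQL Server.';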
thanx and regards
Chinmay
View 2 Replies
Apr 17, 2006
SQL Server 2000 on Win2k
I'm fairly new to SQL Server and I'm just wondering if it's possible to update statistics for all indexes somehow? I'm looking at the UPDATE STATISTICS command and it doesn't seem to be possible.
The situation we have is a reporting DB that basically has all its tables truncated and rebuilt every night by some DTS jobs that import from another data source, transform the data, and build some denormalized tables etc.
Some of the large insert operations go from taking 8 minutes to sometimes taking several hours, and updating the stats seems to fix the problem for a while. So I'd like to make sure the optimizer has all the latest stats for all tables.
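For what it's worth, sp_updatestats walks every user table in the current database, and UPDATE STATISTICS can be pointed at the big rebuilt tables individually; a short sketch (the table name is illustrative):

-- Refresh statistics on every table in the current database
EXEC sp_updatestats;
-- Or target a specific large table after the nightly load
UPDATE STATISTICS dbo.DenormalizedSales WITH FULLSCAN;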
Any other advice would be appreciated.
Cheers.
View 2 Replies
Oct 11, 2007
Does anyone know what the best way to do mass updates in SQL server is? I am currently using the methodology suggested in this article
http://www.tek-tips.com/faqs.cfm?fid=3141
But the article assumes that once I update a field it is going to have a value that is NOT NULL, so I can loop through and update the rows that have a NOT NULL value. But my updated rows do contain NULL values; in this case, what is the best way to go about this?
***************************************
Here is my code. I want to avoid using Upd_flag because
after the following code runs I need to reset that flag
before I run my next query
***************************************
--Set rowcount to 50000 to limit the number of rows updated per batch
Set rowcount 50000
--Declare variable for row count
Declare @rc int
Set @rc=50000
While @rc=50000
Begin
Begin Transaction
--Use tablockx and holdlock to obtain and hold
--an immediate exclusive table lock. This usually
--speeds the update because only one lock is needed.
update t_PGBA_DTL With (tablockx, holdlock)
SET t_PGBA_DTL.procedur = A.[Proc code],
t_PGBA_DTL.Upd_flag = 1
FROM t_PGBA_DTL
INNER JOIN CPT_HCPCS_I9_PROC_CODES A
ON t_PGBA_DTL.PROC_CD
= A.[Proc code]
WHERE t_PGBA_DTL.Upd_flag = 0
--Get number of rows updated
--Process will continue until less than 50000
Select @rc=@@rowcount
--Commit the transaction
Commit
End
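One hedged alternative that avoids the Upd_flag column entirely is to batch by a key range instead of by a flag, so no reset or NULL check is needed between runs. The sketch below assumes t_PGBA_DTL has an integer key column, called DTL_ID here purely for illustration:

DECLARE @minId INT, @maxId INT, @batchSize INT
SELECT @minId = MIN(DTL_ID), @maxId = MAX(DTL_ID) FROM t_PGBA_DTL
SET @batchSize = 50000
WHILE @minId <= @maxId
BEGIN
    UPDATE t
    SET t.procedur = A.[Proc code]
    FROM t_PGBA_DTL AS t
    INNER JOIN CPT_HCPCS_I9_PROC_CODES AS A
        ON t.PROC_CD = A.[Proc code]
    WHERE t.DTL_ID BETWEEN @minId AND @minId + @batchSize - 1
    --advance to the next key range; NULL or non-NULL results don't matter here
    SET @minId = @minId + @batchSize
END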
View 4 Replies
Oct 4, 2007
I have the following problem. I need to insert 100,000 records (50 KB each) in one operation from a VB program into a SQL Server 2005 database. All of these records will be ready for inserting at the same time.
How do I make this insert one big transaction instead of 100,000 small ones?
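From the T-SQL side the pattern is simply an explicit transaction around all of the statements sent on the same connection; a minimal sketch with a placeholder table:

BEGIN TRANSACTION;
INSERT INTO dbo.TargetTable (Col1, Col2) VALUES (1, 'first row');
INSERT INTO dbo.TargetTable (Col1, Col2) VALUES (2, 'second row');
-- ... all remaining rows go here, on the same connection ...
COMMIT TRANSACTION;

From VB the equivalent is one SqlConnection, one call to BeginTransaction, the same transaction object attached to every command, and a single Commit at the end.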
View 2 Replies
Nov 15, 2007
The company I work for changed names, and all email addresses within the company have changed. It was OK for a while, but they are no longer going to forward email sent to the old addresses on to the new ones. There are tons of subscriptions and tons of email addresses that need to be changed to the new names.
If I could find the table that holds the TO: part of the subscription, I could just run an update on that field and it would be solved... however, I cannot find that field...
So,
Without going into every subscription in Report Manager, how can I change the email addresses? Any suggestions?
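In SQL Server 2005 Reporting Services the delivery settings, including the TO address, are stored in the ExtensionSettings column of the Subscriptions table in the ReportServer database. A heavily hedged sketch of a mass find-and-replace (back up the ReportServer database first, verify the table and column names on your instance, and the domain names below are placeholders):

UPDATE ReportServer.dbo.Subscriptions
SET ExtensionSettings = REPLACE(CAST(ExtensionSettings AS NVARCHAR(MAX)),
                                '@oldcompany.com', '@newcompany.com')
WHERE CAST(ExtensionSettings AS NVARCHAR(MAX)) LIKE '%@oldcompany.com%';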
Thanks in advance
View 1 Replies
May 16, 2008
Hi There
Most of the time my solutions consist of 1 or 2 packages and config files work well.
Now I have a solution with about 50 packages that I have to move to QA.
However, a config file contains the package ID, so even though the packages use the same data source connection, I would have to create and change 50 config files.
Data sources are kept in the package XML, so if I copy all the packages to QA and then change the global data source connection, I still have to open each package manually and save it again.
Surely there must be an easy way to move all 50 packages and have the connection point to QA. But config files and global data sources don't do the trick; what am I missing here?
Thanx
View 5 Replies
Feb 25, 2004
I am making a program that needs to import many records from a spreadsheet on a local computer, through ASP.NET, into SQL Server.
Is there a simple command to do this, or is there information on how to do this?
Please give all the information that you can. Thank you.
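One common server-side approach is an OPENROWSET query against the workbook; a sketch under the assumption that the Jet provider is available on the SQL Server box (the path, sheet name, and column names below are made up):

INSERT INTO dbo.ImportedRecords (Col1, Col2)
SELECT Col1, Col2
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\data\records.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]');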
View 9 Replies
May 30, 2002
I want to back up my hourly transaction log backups directly to a mass storage unit as opposed to the local server. However, when trying to set this up it only gives me the option of backing up to local drives, even though I have a drive mapping to the mass storage unit. I'm sure there is a simple way around this. Would appreciate any advice. Many thanks.
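The backup dialog only shows local drives because mapped drives belong to a user session, not to the SQL Server service; a UNC path works, assuming the service account has write permission to the share (server, share, and database names below are placeholders):

BACKUP LOG MyDatabase
TO DISK = '\\storageunit\backups\MyDatabase_log.trn';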
View 2 Replies
Jun 28, 2001
Hi,
I was wondering if anyone knows of any way, including third-party tools, to replicate a design change on a table across many different databases. I have written an ASP script that allows me to copy multiple tables to multiple databases in one go, but I need something that will allow me to replicate the design of a table by comparing source and destination tables. So far I have scripted most of it, but I have no idea which system table holds the identity information, and it seems there must be an easier way!
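On the identity question specifically, COLUMNPROPERTY can report it without digging through the system tables directly; a small sketch:

SELECT TABLE_NAME, COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMNPROPERTY(OBJECT_ID(TABLE_SCHEMA + '.' + TABLE_NAME),
                     COLUMN_NAME, 'IsIdentity') = 1;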
Any help would be appreciated,
Seoras
View 1 Replies
Mar 4, 2005
If I have a trigger on a field in a table and I update one record, the trigger fires properly. If I do an update to that same field on all records in the table, the trigger does not fire. Is the error in the trigger, or do I need to change my update statement?
View 2 Replies
Sep 17, 2013
I have an existing SQL table that I want to import 300,000 rows into. I have copied the table headers into Excel and added all the rows, but I constantly get multiple errors no matter which way I try to import it.
So far I have tried the import wizard in SQL Server Management Studio, BULK INSERT, and a SQL script. What is the easiest way?
If I try this
select *
into Coupon FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=D:\coupon.xls;HDR=YES',
'SELECT * FROM [coupon$]')
I get
OLE DB provider 'Microsoft.Jet.OLEDB.4.0' cannot be used for distributed queries because the provider is configured to run in single-threaded apartment mode.
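One workaround often suggested for that single-threaded apartment error is to register the workbook as a linked server instead of using an ad hoc OPENROWSET; a hedged sketch against the same file (the linked server name is arbitrary):

EXEC sp_addlinkedserver @server = 'CouponXls', @srvproduct = 'Jet 4.0',
     @provider = 'Microsoft.Jet.OLEDB.4.0',
     @datasrc = 'D:\coupon.xls', @provstr = 'Excel 8.0;HDR=YES';
EXEC sp_addlinkedsrvlogin @rmtsrvname = 'CouponXls', @useself = 'false';

SELECT *
INTO Coupon
FROM CouponXls...[coupon$];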
View 9 Replies
Dec 13, 2007
Hi,
I've been having problems with my tempdb filling up and causing all databases on the server to stop functioning properly. I've been removing a lot of data lately (millions of rows), and I think this is the reason why my tempdb log is under an unusual load.
What's the best way to make sure the tempdb doesn't fill up and cause me major problems? I had temporarily turned off backups while I was having a new HD put in. Am I right in thinking that when a DB is backed up, the tempdb log is reduced in size? Should maintaining a daily backup solution help keep things under control?
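As a side note, backing up a user database does not shrink or clear the tempdb log; tempdb uses simple recovery and manages its own log, and it is rebuilt at each service restart. A quick way to watch log usage across databases, including tempdb:

DBCC SQLPERF(LOGSPACE);  -- reports log size and percent used for every database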
Thanks very much for any tips!
mike123
View 4 Replies
Jun 29, 2007
Hi,
I need to update a field in about 20 records in a table. The table has an update trigger (which updates the [lastedited] field whenever a record is updated). As a result I'm getting an error, "Subquery returned more than 1 value.", and the update fails. Is there a way in the stored procedure to handle this issue?
Thanks for your help.
Paul
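The usual cause of that error is a trigger written for a single row (e.g. a WHERE id = (SELECT id FROM inserted) pattern), which breaks when one statement touches 20 rows. A hedged sketch of a set-based version of such a trigger, with assumed table and key names:

CREATE TRIGGER trg_MyTable_LastEdited ON dbo.MyTable
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- join to the inserted pseudo-table so any number of updated rows is handled
    UPDATE t
    SET t.lastedited = GETDATE()
    FROM dbo.MyTable AS t
    INNER JOIN inserted AS i ON i.id = t.id;
END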
View 2 Replies
Apr 25, 2007
Hi,
I've deleted about 3-4 million rows from one of my tables as the data was old and no longer needed. The problem is that queries are now running extra slow. I am in the process of running Tara's isp_ALTER_INDEX; however, it's taking quite a long time and, as expected, seems to be slowing things down even further while it runs. (It's been running 4 hours already; I have stopped it and will rerun it at a lower-traffic period for the DB server.)
Just wondering if I have the right approach here or if anyone else has any suggestions.
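If a full rebuild is too disruptive, a lighter-weight option after a mass delete is an online defragment plus a statistics refresh; a sketch with placeholder database, table, and index names:

-- Online and interruptible: work already done is not lost if you stop it
DBCC INDEXDEFRAG ('MyDatabase', 'dbo.BigTable', 'IX_BigTable_Main');
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;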
Thanks for your help!
View 14 Replies
Jun 27, 2007
Hello,
I need to alter fields in all my tables of a given database, and I would like to do this via a T-SQL script.
For example, I want to change all fields called SESSION_ID to char(6). The field is usually varchar(10), but the data is always 6 characters in length. I have several fixed-length fields that I want to move from varchar to char.
I believe I can find all the tables where the field exists using
select * from INFORMATION_SCHEMA.columns where column_name = 'SESSION_Id'
but I don't know how to turn this into an ALTER TABLE ... ALTER COLUMN that can be automated.
TIA
Rob
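A common trick is to have INFORMATION_SCHEMA generate the ALTER statements and then run the output; a sketch (review the generated script before executing it, and note that the NULL/NOT NULL handling shown is an assumption about what you want to preserve):

SELECT 'ALTER TABLE [' + TABLE_SCHEMA + '].[' + TABLE_NAME + ']'
     + ' ALTER COLUMN [' + COLUMN_NAME + '] char(6) '
     + CASE IS_NULLABLE WHEN 'YES' THEN 'NULL' ELSE 'NOT NULL' END
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = 'SESSION_ID';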
View 2 Replies
Feb 26, 2008
Say you have an existing populated SQL 2005 database, with 700+ tables, and you want to just change the order of the columns inside every table. Short of manually building conversion scripts, anyone know an automated way to do this? I was thinking thru ways to do them all in one shot, and have tools like Erwin and DbGhost that could be used also. Basically moving some standard audit columns from the end of the tables to just after the PK columns.
Thanks, Bruce
View 8 Replies
Nov 15, 2006
Hola!
I'm currently building a site that uses an external database to store all the product details, and an internal database that will act as a cache so that we don't have to keep hitting the external database to retrieve the products every time a customer requests a list.
What I need to do is retrieve all these products from External and insert them into Internal if they don't exist - if they do already exist then I have to update Internal with new prices, number in stock etc.
I was wondering if there was a way to insert / update these products en masse without looping through and building a new insert / update query for every product - there could be thousands at a time!
Does anyone have any ideas or could you point me in the right direction? I'm thinking that because I need to check if the products exist in a different data store than the original source, I don't have a choice but to loop through them all.
Cheers,
G.
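SQL Server versions before 2008 have no MERGE statement, so the usual set-based pattern is an UPDATE join followed by an INSERT of the missing rows, assuming the external rows can first be bulk-loaded into a staging table on the internal server. A sketch with made-up table and column names:

-- refresh products that already exist in the internal cache
UPDATE i
SET i.Price = e.Price,
    i.NumberInStock = e.NumberInStock
FROM InternalProducts AS i
INNER JOIN ExternalProductsStaging AS e ON e.ProductCode = i.ProductCode;

-- add products that are not cached yet
INSERT INTO InternalProducts (ProductCode, Price, NumberInStock)
SELECT e.ProductCode, e.Price, e.NumberInStock
FROM ExternalProductsStaging AS e
WHERE NOT EXISTS (SELECT 1 FROM InternalProducts AS i
                  WHERE i.ProductCode = e.ProductCode);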
View 2 Replies
Oct 5, 2007
Hi,
I'm working with MRS and I've got a table with a lot of entries. For each value in the table I'm trying to get the text colour to be set to 'red' when the value of the cell is less than 0. Otherwise remain black.
I can do this by setting the colour property cell by cell. But I have a lot of cells in the table. Is there a way to set the statement to apply to ALL cells in the table?
Basically I'm asking if there is a way to set the property in bulk instead of going through tediously cell by cell.
Any help would be much appreciated. Thanks!
View 4 Replies
Sep 12, 2004
1. Use the SQL Server Agent service to handle the schedule
2. Use a .NET Windows service with timers to call SqlClientConnection
Of the two approaches above, which would be faster and give better performance?
View 2 Replies
Mar 3, 2006
Just curious,
I have four 72 GB Drive on a RAID 5
Disk Specs
IO/Second = 130 per disk
Speed RPM = 15 K
When I did a load test of inserting data into a table
with four Columns
Col1 INT
Col2 VARCHAR(32)
Col3 VARCHAR(4000)
Col4 DATETIME
I could insert around 1044 inserts per second, whereas I thought I could do a maximum of 520 inserts (130 * 4), because each disk can only take 130 inserts, and multiplied by 4 disks that gives a theoretical limit of 520.
Also, how does Query Analyzer connect to the database server? Does it use ODBC?
Thx
Venu
View 1 Replies
Mar 21, 2006
I am doing a simple IO Test with the below script ...
Just wanted to keep things simple and to check how many Inserts I can do on a given SQL Server.
I am running the below script from QA for 1 minute and then dividing the number of rows inserted by 60.
Will doing this give me approximate results?
Actually the data (.MDF) files are sitting on a single drive whose manufacturer specs say it will handle 130 IOs per second per disk. With the below script I am getting around 147 inserts per second.
But my boss says that he is getting 2000 inserts per second on his laptop from a ... Am I missing something?
DECLARE @lnRowCnt INT
SELECT @lnRowCnt = 100000
SET NOCOUNT ON
WHILE @lnRowCnt > 0
BEGIN
INSERT INTO CTMessages..Iotest
SELECT @lnRowCnt , 'VENU' , REPLICATE ( 'V' , 4000 ) , 1000000
SELECT @lnRowCnt = @lnRowCnt - 1
END
Thx
Venu
View 3 Replies
Jun 23, 2006
Hello Everyone,
I have a very complex performance issue with our production database. Here's the scenario. We have a production web server and a development web server. Both are running SQL Server 2000.
I encountered various performance issues with the production server with a particular query. It would take approximately 22 seconds to return 100 rows, that's about 0.22 seconds per row. Note: I ran the query in single user mode. So I tested the query on the development server by taking a backup (.dmp) of the database and moving it onto the dev server. I ran the same query and found that it ran in less than a second.
I took a look at the query execution plan and found that they were exactly the same in both cases. Then I took a look at the various indexes, and again I found no differences in the table indices.
If both databases are identical, I'm assuming that the issue is related to some external hardware issue like disk space, memory, etc., or it could be OS software related issues, like service packs, SQL Server configurations, etc.
Here's what I've done to rule out some obvious hardware issues on the prod server:
1. Moved all extraneous files to a secondary hard drive to free up space on the primary hard drive. There is 55 GB of free space on the disk.
2. Applied SQL Server SP4 service packs
3. Defragmented the primary hard drive
4. Applied all Windows Server 2003 updates
Here are the prod server's system specs:
2x Intel Xeon 2.67GHz
Total Physical Memory 2GB, Available Physical Memory 815MB
Windows Server 2003 SE /w SP1
Here are the dev server's system specs:
2x Intel Xeon 2.80GHz
2GB DDR2-SDRAM
Windows Server 2003 SE /w SP1
I'm not sure what else to do; the query performance is an order of magnitude different and I can't explain it. To me it is a hardware or operating system related issue.
Any ideas would help me greatly!
Thanks,
Brian T
View 2 Replies
Apr 11, 2007
I'm using a SQLDataSource and trying to do two inserts into two different tables with one InsertCommand, but it's not working. Here's the code I'm trying to use. Do you see anything wrong with the syntax? I keep getting an error that says error near ',' but I can't figure out why. Thanks
InsertCommand="INSERT INTO [OurProjects] ([Title], [Description], [Location], [Anchors], [Size], [Developer], [DesignBuilder], [Architect], [ImageName], [MapName], [ProjectTypeAbbrev], [Deleted]) VALUES (@Title, @Description, @Location, @Anchors, @Size, @Developer, @DesignBuilder, @Architect, @ImageName, @MapName, @ProjectTypeAbbrev, @Deleted),
INSERT INTO [OurProjectsImages] ([OurProjectsID], [ImageMonthName], [SwfName]) VALUES (@OurProjectsID, @ImageMonthName, @SwfName)"
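For comparison, a hedged corrected version: the two statements need a semicolon rather than a comma between them, and if OurProjectsID should be the identity value generated by the first insert, SCOPE_IDENTITY() is one way to capture it (that last part is an assumption about the schema):

INSERT INTO [OurProjects] ([Title], [Description], [Location], [Anchors], [Size], [Developer],
    [DesignBuilder], [Architect], [ImageName], [MapName], [ProjectTypeAbbrev], [Deleted])
VALUES (@Title, @Description, @Location, @Anchors, @Size, @Developer,
    @DesignBuilder, @Architect, @ImageName, @MapName, @ProjectTypeAbbrev, @Deleted);

INSERT INTO [OurProjectsImages] ([OurProjectsID], [ImageMonthName], [SwfName])
VALUES (SCOPE_IDENTITY(), @ImageMonthName, @SwfName);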
View 2 Replies
May 27, 2008
I need to add a check in the database to ensure that a user can only enter up to 20 entries in a period of 10 minutes. Basically, to guard against people using scripts to add data to the database (instead of using a CAPTCHA on the front end), we want to restrict a user to entering at most 20 transactions in 10 minutes. How do I handle this in SQL Server 2005?
What I figure I'll do is, right after I do an INSERT into the table, select the last 20 entries in that same table, then calculate the total time it took to add those 20 transactions and set the right flag.
1) How do I select the last 20 entries in a table?
2) How do I calculate the total time that elapsed between adding the first of those 20 records and the last?
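A sketch of both pieces, assuming a table called dbo.Entries with UserId and EntryDate columns (all names are illustrative):

DECLARE @UserId INT;
SET @UserId = 42;  -- placeholder

-- 1) the last 20 entries for a user
SELECT TOP 20 EntryDate
FROM dbo.Entries
WHERE UserId = @UserId
ORDER BY EntryDate DESC;

-- 2) seconds elapsed between the oldest and newest of those 20 rows
SELECT DATEDIFF(SECOND, MIN(t.EntryDate), MAX(t.EntryDate)) AS ElapsedSeconds
FROM (SELECT TOP 20 EntryDate
      FROM dbo.Entries
      WHERE UserId = @UserId
      ORDER BY EntryDate DESC) AS t;

-- Or skip the flag entirely: reject the insert when 20+ rows already fall in the last 10 minutes
IF (SELECT COUNT(*) FROM dbo.Entries
    WHERE UserId = @UserId
      AND EntryDate > DATEADD(MINUTE, -10, GETDATE())) >= 20
    RAISERROR('Limit of 20 entries per 10 minutes reached.', 16, 1);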
Thanks in Advance
View 6 Replies
Sep 21, 2004
G'day,
I have a table with a primary key that is a bigint and is set to auto-increment (or identity, or whatever MS calls it). Is there any way I can get the ID number that will be assigned to the next insert before I insert it? I want to use that ID number within another field of the inserted row.
I hope that makes sense.
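Rather than predicting the value (which is unreliable under concurrent inserts), the usual pattern is to insert first, read SCOPE_IDENTITY(), and then fill in the dependent field; a sketch with placeholder table and column names:

DECLARE @newId BIGINT;

INSERT INTO dbo.MyTable (SomeColumn) VALUES ('example');
SET @newId = SCOPE_IDENTITY();

-- use the generated id inside another column of the same row
UPDATE dbo.MyTable
SET OtherColumn = 'REF-' + CAST(@newId AS VARCHAR(20))
WHERE Id = @newId;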
Thanks for any help.
Robbo
View 3 Replies
Jul 14, 2005
Hi, I'm trying to create a form where new names can be added to a database. The webform looks like this:

<body MS_POSITIONING="GridLayout">
  <form id="Form1" method="post" runat="server">
    Name: <asp:TextBox ID="newName" runat="server" />
    <INPUT id="NewUserBtn" type="button" value="Create New User" name="NewUserBtn" runat="server" onServerClick="NewBtn_Click">
  </form>

And the code behind looks like this:

Public Sub NewBtn_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles NewUserBtn.ServerClick
    Dim DS As DataSet
    Dim MyConnection As SqlConnection
    Dim MyCommand As SqlDataAdapter

    MyConnection = New SqlConnection("server=databaseserver;database=db;uid=uid;pwd=pwd")
    MyCommand = New SqlDataAdapter("insert into certifications (name) values ('" & newName.Text & "'); select * from certifications", MyConnection)

    DS = New DataSet
    MyCommand.Fill(DS, "Titles")

    Response.Redirect("WebForm1.aspx", True)
End Sub

When I try to insert one name it works. When I try to insert a second name, it overwrites the old one. Why is that?
Thanks.
James
View 3 Replies