Slow INSERTs On A Table

Apr 5, 2007

Hello all. I've got a problem with really slow INSERTs on one (and only one) of the tables in a database. For example, using SQL Management Studio, it takes 4 minutes and 48 seconds to insert 25 rows. There are only about 8 columns in the table and only about 1500 records. All the other tables in the database are very fast for inserts.

Another odd thing uniquely associated with INSERTs on this table: prior to inserting the 25 new rows of data, SQL Management Studio tells me that it inserted 463 rows of data which I know did not happen. Here's the INSERT statement:

INSERT INTO FieldOps(StudySiteID
, QA_StructureID
, Notes
, PersonID)
SELECT DISTINCT StudySiteKey
, QA_StructureKey
, SampleComments1
, '25'
FROM ScriptOutput_Nitrate
WHERE (ScriptOutput_Nitrate.StudySiteKey IS NOT NULL)

and SQL Management Studio (eventually) says:
(463 row(s) affected)
(463 row(s) affected)

(25 row(s) affected)

The table has an index on the primary key (INT data type with auto increment). I tried running the following code to fix things but it made no difference:

USE [master]
GO
ALTER DATABASE [FieldData] SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO

use FieldData
GO
DBCC CHECKTABLE ('FieldOps', REPAIR_REBUILD) With ALL_ERRORMSGS
GO

USE [master]
GO
ALTER DATABASE [FieldData] SET MULTI_USER WITH ROLLBACK IMMEDIATE
GO

I'm guessing that the problem might be related to the index, but I don't know. Does anyone here have a suggestion as to what I should do to fix this problem?
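One thing worth checking, given the extra "(463 row(s) affected)" messages: that kind of output is typical of a trigger firing for each INSERT, which would also explain why inserts are slow on this one table only. A hedged diagnostic sketch (sp_helptrigger works on SQL 2000; sys.triggers needs SQL Server 2005 or later):

-- List any triggers defined on the FieldOps table.
EXEC sp_helptrigger 'FieldOps'

-- On SQL Server 2005 and later, the catalog view shows the same information.
SELECT t.name, t.is_disabled
FROM sys.triggers AS t
WHERE t.parent_id = OBJECT_ID('dbo.FieldOps')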

View 9 Replies



ADO Inserts Running Slow

Oct 13, 1999

Hi list,
I'm a long time lurker on this list and really enjoy the discussions, although I rarely get a chance to participate.

Here is my situation: We are importing chunks of data (500 records at a time) from a C++ interface. The records have to be transformed before inserting into the target table which I am doing using a stored proc which is working fine. The records are in memory in C++ and the programmer is looping through the records building inserts into a temp table through ADO (which my proc picks up). The server business object is using the connection.execute method which is inserting one record at a time. That part of the process is taking over 15 seconds for 500 records which is the bulk of the total time.

My question is: Using ADO is there a better way to insert these records into the temp table? I see mention of a recordset interface but my programmers are new to ADO and since I am the DBA and have never used ADO, I am not sure what to tell them.

Any insight would be greatly appreciated.

shawn
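A hedged sketch of the batching idea in T-SQL (the temp table and column names are made up): rather than 500 separate connection.Execute calls, the C++ side can build one command string that inserts many rows per round trip, for example by combining rows with UNION ALL, and send it as a single batch.

-- One round trip inserts several rows at once (SQL Server 7/2000 syntax).
INSERT INTO #staging (col1, col2, col3)
SELECT 'a1', 'b1', 1 UNION ALL
SELECT 'a2', 'b2', 2 UNION ALL
SELECT 'a3', 'b3', 3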

View 2 Replies View Related

ODBC Inserts Are Very Slow

Dec 22, 2004

Hi,

I am new to the Windows world. We use Informatica on UNIX for the ETL process. We have a requirement to load approx. 200,000 rows into a MS SQL Server table. The table is not that big and it is a heap table (no indexes). Inserts are running at about 69 rows per minute. We are using the DataDirect Connect 4.10 SQL Server ODBC driver.

SQL Profiler tells us that it is doing row-by-row processing using the sp_execute procedure.

Is there a way we can speed up the ODBC process?

-Thanks in advance
srv

SQL Server Version:
Microsoft SQL Server 2000 - 8.00.818 (Intel X86) May 31 2003 16:08:15 Copyright (c) 1988-2003 Microsoft Corporation Standard Edition on Windows NT 5.0 (Build 2195: Service Pack 4)
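For volumes like 200,000 rows, the usual alternative to row-by-row ODBC inserts is a bulk load. A minimal sketch, assuming the extract can first be landed in a flat file (the file path, delimiters and table name are placeholders):

-- Load the whole extract in one bulk operation instead of 200,000 single-row inserts.
BULK INSERT dbo.TargetTable
FROM 'C:\loads\extract.dat'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', TABLOCK)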

View 3 Replies View Related

Inserts Gradually Slow Down

Jul 23, 2005

I'm inserting 2 million+ records from a C# routine which starts out very fast and gradually slows down. Each insert is through a stored procedure with no transactions involved.

If I stop and restart the process it immediately speeds up and then gradually slows down again. But closing and re-opening the connection every 10,000 records didn't help.

Stopping and restarting the process is obviously clearing up some resource on SQL Server (or DTC??), but what? How can I clean up that resource manually?
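Two hedged checks that fit the symptom (something gets cleaned up when the client process restarts); the database name is a placeholder:

DBCC OPENTRAN ('MyDatabase')   -- reports the oldest active transaction, if one has been left open
DBCC SQLPERF (LOGSPACE)        -- shows log file size and percent used for each database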

View 9 Replies View Related

Slow Inserts Into Large Tables

Nov 29, 2000

We are inserting into a table which includes an identity primary key column. When the table gets really large (i.e. 1.5 million records), the performance of the inserts degrades.

I noticed that when we insert into the table an exclusive lock on the table is obtained. Do inserts into tables with identities always lock the table?

Given the table size is unavoidable, does anyone have a suggestion to improve the performance?

Thanks,
Matt

View 6 Replies View Related

Distributed Query Doing Inserts - Very Slow

Jul 26, 2002

The query below is being executed on a SQL 2000 box with 4 CPUs and 2 GB RAM.
testXX.DB_GRP.dbo.group1 is on a SQL 7 box with a single CPU and 512 MB RAM.
The result set is about 30,000 rows.
The whole insert process takes about 5 minutes.
Is there a way to optimise the query and bring down the execution time?


insert into testXX.DB_GRP.dbo.group1
select num, group_num,group_desc from group2
where id = 20

---------------
If we just run the
select num, group_num,group_desc from group2
where id = 20

the SELECT statement alone takes 10 seconds to execute, so I was wondering why it takes 5 minutes to do the insert across the network through the linked-server query.

Any help would be appreciated.

Thanks,

MK
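One workaround often suggested for this pattern (hedged, since it reverses the direction of the linked-server call): run the INSERT on the destination SQL 7 box and pull the rows with OPENQUERY, so the remote side streams the result set across once instead of the insert being driven row by row over the linked server. The linked-server name SQL2K and the source database name are placeholders:

-- Run on the SQL 7 server (destination side).
INSERT INTO DB_GRP.dbo.group1
SELECT num, group_num, group_desc
FROM OPENQUERY(SQL2K, 'SELECT num, group_num, group_desc FROM sourceDB.dbo.group2 WHERE id = 20')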

View 3 Replies View Related

Slow DB Inserts. Is Reflection Really This Expensive?

Aug 10, 2007

I'm relatively new to compact framework and SQL Server Compact so bear with me if I've got an obvious thing I've forgotten.

I've written my own database helper layer. My idea is to generate SQL INSERT statements dynamically based on the contents of each object. Performance, however, is horrible. I try to do 1800 inserts and it takes about 50 seconds on the device (Release build, outside of the IDE).

I pass in a list of objects to be inserted which derive from a ModelBase class. ModelBase includes some ORM information (what table the object goes into, and which fields are mapped to which properties). I generate one SqlCeCommand object and one SQL string (new parameters for each insert), and am using SqlCeResultSet.

What can I do to make this run faster? Thank you





Code Snippet
public bool Insert(List<ModelBase> recs)
{
    SqlCeConnection con = new SqlCeConnection(connectionString);
    SqlCeTransaction txn = null;

    try
    {
        con.Open();
        // Keep the transaction object so it can be committed (or rolled back) below;
        // discarding it means the inserted rows are never committed.
        txn = con.BeginTransaction();
    }
    catch
    {
        return false;
    }

    SqlCeCommand cmd = new SqlCeCommand();
    cmd.Connection = con;
    cmd.Transaction = txn;

    // Build one parameterised INSERT statement from the first record's ORM field map.
    String sqlString = "INSERT INTO [" + recs[0].tableName + "] (";
    String sqlString2 = ") VALUES (";
    PropertyInfo[] props = recs[0].GetType().GetProperties();
    for (int x = 0; x < props.Length; x++)
    {
        PropertyInfo pi = props[x];
        if (recs[0].fieldMap.ContainsKey(pi.Name))
        {
            sqlString += "[" + recs[0].fieldMap[pi.Name] + "], ";
            sqlString2 += "@" + pi.Name + ", ";
        }
    }
    sqlString = sqlString.Substring(0, sqlString.LastIndexOf(", "));
    sqlString2 = sqlString2.Substring(0, sqlString2.LastIndexOf(", "));
    sqlString += sqlString2 + ")";
    cmd.CommandText = sqlString;
    cmd.CommandType = CommandType.Text;

    // Reuse the same command for every record; only the parameter values change.
    foreach (ModelBase rec in recs)
    {
        cmd.Parameters.Clear();
        for (int x = 0; x < props.Length; x++)
        {
            PropertyInfo pi = props[x];
            if (rec.fieldMap.ContainsKey(pi.Name))
            {
                if (pi.GetValue(rec, null) == null)
                    cmd.Parameters.AddWithValue("@" + pi.Name, DBNull.Value);
                else
                    cmd.Parameters.AddWithValue("@" + pi.Name, pi.GetValue(rec, null));
            }
        }

        try
        {
            cmd.ExecuteResultSet(ResultSetOptions.None);
        }
        catch (Exception e)
        {
            txn.Rollback();
            cmd.Connection.Close();
            cmd.Connection.Dispose();
            Program.log.Error(e.Message, e);
            return false;
        }
    }

    txn.Commit();
    con.Close();
    return true;
}

View 7 Replies View Related

Slow SELECT Blocking Critical INSERTs

May 20, 2008

Our company records live sales into an SQL Server database. The same tables that store the sale information are also used by a reporting interface to query sales figures, but occasionally a SELECT generated by the reports is so slow that it causes the INSERT operations to time out and fail.

We've tried increasing the CommandTimeout property in .NET of the application performing the inserts, but despite it being set to 90 seconds, it's still not enough to prevent the occasional sale from failing to record which is a big no-no for us. We've run the Database Tuning Advisor on the stored procedure that generates the SELECT for the reports, and it is now fully optimized but still this isn't enough. The tables are quite massive (millions of rows) and the SELECT requires JOINs to other tables, some of which are in a separate database on the same server.

Is there a solution to this problem, aside from increasing the CommandTimeout property to the point where no timeout errors occur? My concern is that doing this could increase the number of concurrent connections and we'd hit another limit, so it's not really solving the problem. Is there a way to configure SQL Server to always favour INSERTs over SELECTs? The reporting users won't really care if the reports are slower but it's critical to get these INSERTs up to 100% reliability.

I'm not a DBA (just a developer) so I don't have an intricate knowledge of databases and this problem is a bit beyond my level of expertise. Obviously we don't want to re-design our entire system, but we'll do whatever necessary to ensure we aren't failing to record sales.

One idea I had, which may be awful I don't know, is to submit these sales to an MSMQ and write some software that will read from the queue and insert the records from there instead. We could then deal with the timeout issue by just re-submitting the sale until it is accepted, then removing it from the queue.
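If the server is SQL Server 2005 or later, row versioning is another option worth testing (a hedged sketch; the database name SalesDB is a placeholder). With READ_COMMITTED_SNAPSHOT on, the report SELECTs read the last committed version of each row instead of taking shared locks, so they neither block nor are blocked by the INSERTs:

ALTER DATABASE SalesDB SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE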

View 4 Replies View Related

Slow Inserts/updates As The Database Size Grows

Oct 23, 2007

Hi all,

This managed application was written to run on a Symbol 3090 Win CE 5.0 scanning device. We are using the Symbol-provided classes to access the scanning interface and a SQL Compact database on the device to collect the scanned data, and then using merge replication to synchronize the scanned data when the device is docked. The problem we have experienced seems to be related to performance when inserting and updating records in the database.

We have tested with 1000 randomly generated records, inserting/updating them into the database. At first the time to commit a record increases while the database is flushing to memory (the flush interval in the connection string is 10 seconds by default), and then, as the database size grows, the time to commit every single record keeps increasing, which causes the application to perform slowly as items are scanned into the database. However, the device program memory remains consistent as items are scanned. From our tests, the time to execute either an update or insert command on a 2 MB SQL Mobile database (up to 10,000 records, depending on the size of the columns) is nearly 2 to 2.5 seconds. Below is the only code I am executing:


If Not sqlObj.UpdateItem(1061022, itemNo, 1) Then

sqlObj.InsertResultSet(1061022, itemNo, itemObj.Style, itemObj.Color, itemObj.Size, itemObj.Description, 0, 1)

End If

As a note, I am using a prepared update command and the ResultSet.Insert method to perform the update and insert commands against the database.

Any help on this issue is highly appreciated.

Thanks
Ravi.



View 1 Replies View Related

How Do I Script INSERTS From An ASP Membership Table Onto Live Server Table

Aug 7, 2007

Hi, I would have used the aspnet membership tool to auto-create all the ASP.NET membership tables. However, the hosting company doesn't allow remote connections, which meant I had to create the tables by hand, scripting the CREATE statements using Management Studio.

However, I noticed one of the tables, aspnet_SchemaVersions, has data in it even without any users, and this is what causes an error when trying to log onto my site. The fix is to make sure the table has the 4-5 rows of data in it (which are missing from the live server).

It's just a few rows of data, but I want to script the INSERTs for each row so I don't have to type them in using myLittleAdmin (the host's web version of Management Studio). Can anyone point me in the right direction?

View 3 Replies View Related

Is Having A Trigger That Inserts A Row In Table 'A', When A Row In Same Table Is Inserted By ADO.Net Code, A Good Idea?

Oct 13, 2006

I want to insert a row for a Global user in Table 'A' whenever ADO.Net code inserts a Local user row into the same table. I recommended using a trigger to implement this functionality, but the DBA was against it, saying that stored procedures should be used, since triggers are unreliable and slow down the system by placing unnecessary locks on the table. Is this true, or is the DBA saying something wrong? My thinking is that Microsoft would never have included triggers if they were unreliable, and that the DBA just wants to offload the extra DBA task of maintaining triggers to the programmer so that a stored procedure gets called and he has less of a headache on his hands. Thanks

View 2 Replies View Related

T-SQL With Inserts In Various Table In One Transaction

Mar 29, 2008

Dear All,
I developed an application years ago where I have to insert records into multiple tables from an ASP.NET page during the registration process with one button click. I have separate INSERT statements, and then I have to select the identity column value and insert it into another table as a foreign key, and so on.
I want to run all these SQL inserts in one statement or stored procedure, because currently, if one of the INSERT statements fails, I have no way to roll back the previous inserts and start over again.

Any advice or sample T-SQL where I can insert into Table A, then read the identity from Table A and insert into Table B, in one single transaction, so that if anything fails I can roll back all the inserts and updates in the table(s)?

Thanks,
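A minimal sketch of the pattern (table and column names are invented; TRY...CATCH assumes SQL Server 2005 or later): both inserts sit in one transaction, the new identity is captured with SCOPE_IDENTITY(), and any error rolls everything back.

DECLARE @NewID int

BEGIN TRY
    BEGIN TRANSACTION

    INSERT INTO TableA (Name) VALUES ('example')
    SET @NewID = SCOPE_IDENTITY()          -- identity value just created in TableA

    INSERT INTO TableB (TableA_ID, Detail) VALUES (@NewID, 'example detail')

    COMMIT TRANSACTION
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION   -- undo everything if any statement fails
END CATCH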

View 13 Replies View Related

Cursors - Looping Through A Table And Do Inserts From It

Dec 5, 2006

I've been looking online and cannot find any help  / resources with this so I brought it here :D
I'm looking for help in creating a cursor (this will be inside a SP) that will loop through the records of a "table" (temporary or retrieved), and for each row that is looped through I can use its values to do inserts against a few other tables.
 Any resources / help would be great! I work best by example.
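A minimal cursor sketch (the source and target table names and columns are placeholders):

DECLARE @ID int, @Value varchar(50)

DECLARE src_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT ID, SomeValue FROM #SourceRows

OPEN src_cursor
FETCH NEXT FROM src_cursor INTO @ID, @Value

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Use the current row's values for one or more inserts.
    INSERT INTO TargetTableA (SourceID, Value) VALUES (@ID, @Value)
    INSERT INTO TargetTableB (SourceID) VALUES (@ID)

    FETCH NEXT FROM src_cursor INTO @ID, @Value
END

CLOSE src_cursor
DEALLOCATE src_cursor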

View 12 Replies View Related

Export Data From Table As SQL Inserts

Jul 21, 2005

Hi,

I have data in an old database I would like to capture for my new
system, but I don't have the original insert scripts. Is there a
tool (in SQL Server 2000 or third-party) that will help me export the
data as SQL INSERTs?

Thanks

jr.
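One query-based approach (a hedged sketch; OldTable, NewTable and the columns are placeholders) is to have a SELECT build the INSERT statements as strings and then save the result set to a .sql file:

SELECT 'INSERT INTO NewTable (ID, Name) VALUES ('
       + CAST(ID AS varchar(10)) + ', '''
       + REPLACE(Name, '''', '''''') + ''')'
FROM OldTable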

View 1 Replies View Related

Selecting From Table With Lots Of Inserts

Mar 19, 2008

Hi,

I am working on an application to analyse down time on a production line system. The system has about 40 rows inserted per minute. The inserts are coming from about 10 different stations.

I need to analyse the downtime between each insert from each station. The plan is to copy the data to another database on a different server so as not to affect the live system that is being updated by the production line.

However, the initial requirement was to do this at night while the production line was down, but now they want the data to be updated every 3 hours, which means performing this huge query while the production line is bombarding the DB with inserts.

I am wondering what is the best way of doing this. Is there any way I can limit the amount of processor time this procedure will take?

Any advice appreciated,

Thanks,
Sean

View 2 Replies View Related

DB Engine :: Concurrent Inserts Into Same Table

Jun 26, 2015

I have a table into which the inserts will be done by multiple users at the same time;

The table has a non-clustered primary key on a unique ID; each insert will have a unique ID, which is a sequence-generated number, so they are always different for different transactions.

Is it possible to configure SQL Server so that these inserts can happen at the same time, i.e. so that during an insert by one user the table is not locked and other inserts can take place concurrently?

View 6 Replies View Related

How To: Create A Table That Only Allows Inserts And Deletes.

Feb 26, 2008


I need to create a table that only allows records to be inserted or deleted. Once a record has been created it can only be deleted, never updated. Is there any way to configure a table in this manner?

Table Definition

USE [DB_AUTOMATED_PACKAGING_SYSTEM]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[TBL_PCL_LENS_DATA](
[SerialNumber] [varchar](50) NOT NULL,
[ProcessedDate] [datetime] NOT NULL,
[Filename] [varchar](50) NOT NULL,
[CartonLabelImage] [image] NOT NULL,
[ExpirationDateLabelImage] [image] NOT NULL,
[LabelSetLabelImage] [image] NOT NULL,
[ReplyCardLabelImage] [image] NOT NULL,
[TextFile] [ntext] NOT NULL,
CONSTRAINT [PK_TBL_PCL_LENS_DATA] PRIMARY KEY CLUSTERED
(
[SerialNumber] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

GO
SET ANSI_PADDING OFF
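Two hedged ways to make the table insert/delete-only; the role name is a placeholder, and the trigger simply refuses any UPDATE:

-- 1) Deny UPDATE permission to the role the application connects under.
DENY UPDATE ON [dbo].[TBL_PCL_LENS_DATA] TO [AppRole]
GO

-- 2) An INSTEAD OF UPDATE trigger that blocks the change regardless of permissions.
CREATE TRIGGER [dbo].[TR_PCL_LENS_DATA_NoUpdate]
ON [dbo].[TBL_PCL_LENS_DATA]
INSTEAD OF UPDATE
AS
BEGIN
    RAISERROR ('Rows in TBL_PCL_LENS_DATA may only be inserted or deleted.', 16, 1)
END
GO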

View 5 Replies View Related

Master And Child Table Inserts At A Time.

May 15, 2007

I have a situation here. Please advise me on this.

I have a master table and a child table. They have a PK and FK relationship. The master table has an identity column with auto increment set to true. This maps to a FK in the child table. My questions are:

Can I have a single form to insert a new record into the master and child table at the same time? This has to be accomplished without stored procedures. Can it be done? Is it possible to do this with a single insert query? If yes, can it be done with a SQL data source, dataset or TableAdapters? Please point me towards an appropriate link for doing so.





Thank you.

I am currently using SQL Server 2005 and VS 2005.

View 1 Replies View Related

Creating Stored Procedure That Inserts Value In Table

Feb 13, 2008

I am using vwde2008 and have created a database and table. I want to create a stored procedure that will insert values. I am a bit lost on how to write this.
this is my table info
table = BusinessInfo
Columns = BusinessID, BusinessName, BusinessAddress
How can I create a stored procedure?
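A minimal sketch based on the columns listed above (the data types are assumptions):

CREATE PROCEDURE dbo.InsertBusinessInfo
    @BusinessID      int,
    @BusinessName    varchar(100),
    @BusinessAddress varchar(200)
AS
BEGIN
    INSERT INTO dbo.BusinessInfo (BusinessID, BusinessName, BusinessAddress)
    VALUES (@BusinessID, @BusinessName, @BusinessAddress)
END
GO

-- Example call:
EXEC dbo.InsertBusinessInfo 1, 'Acme Ltd', '1 Main Street'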

View 1 Replies View Related

Inserts And Updates To A Table That Contains A Unique Key Constraint

Nov 5, 2007

I am looking for pros and cons for the following scenarios:

When a table contains a unique key constraint, is it viable to always do an insert, immediately check the @@ERROR value, and, if @@ERROR indicates a duplicate key exception, perform an update statement?

Another possible solution would be to always check if the key exists and then do the insert / update based upon that result. This method will always require two steps.
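Hedged sketches of both patterns (MyTable, KeyCol and ValCol are placeholders):

DECLARE @Key int, @Val varchar(50)
SET @Key = 1
SET @Val = 'example'

-- 1) Insert first and fall back to an update on a duplicate-key error
--    (2627 = PRIMARY KEY/UNIQUE constraint violation, 2601 = unique index violation).
INSERT INTO MyTable (KeyCol, ValCol) VALUES (@Key, @Val)
IF @@ERROR IN (2627, 2601)
    UPDATE MyTable SET ValCol = @Val WHERE KeyCol = @Key

-- 2) Check first, then insert or update (two steps, normally wrapped in a transaction).
IF EXISTS (SELECT 1 FROM MyTable WHERE KeyCol = @Key)
    UPDATE MyTable SET ValCol = @Val WHERE KeyCol = @Key
ELSE
    INSERT INTO MyTable (KeyCol, ValCol) VALUES (@Key, @Val)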

View 4 Replies View Related

Help: How To Detect Inserts, Updates, Deleted On A Table From Within C++ Application?

Jul 20, 2005

Hopefully someone can at least point me in the right direction for more research (e.g. correct terminology). My only previous experience was just dumping data into a database using ODBC, and that was some years ago, so it is now mostly forgotten.

I need to write an NT service/application (in C/C++) that will be getting data sent to it via SQL Server 2000. The data will arrive in my SQL Server (read-only access) via replication of tables from another remote SQL Server.

My application needs to know when new rows are inserted or updated so it can read this data (it needs to be quick/timely, so hopefully no polling) and then interface with other remote proprietary systems.

T.I.A.

PS: If you can recommend appropriate books on SQL Server 2000, that would also be useful.

View 2 Replies View Related

Counting The Inserts And Updates On A Table In A Sql Server Database

Jul 20, 2005

Hello,

Can someone point me to getting the total number of inserts and updates on a table over a period of time? I just want to measure the insert and update activity on the tables.

Thanks.
- Vish

View 3 Replies View Related

WHILE Statement To Loop Through A Table And Get The IDENT_CURRENT Values As It Inserts

Aug 14, 2007



Hi

I have an SSIS package that imports data into a staging table from an Excel sheet (this works fine). From the staging table I want it to insert the values into my members table, take the unique identity ID that gets created, and insert the other values into other tables for that member that was just created.

In the staging table I have all the values for a single member, but the structure of the database needs the values inserted into separate tables. There is no conditions ID in my members table, so the member first has to be created, and from there I need to use the newly created member's MemberID to insert the conditions into a separate table.

I have created some sample data that can be used. I think I have an idea of how to do it, but I'm not totally sure if it will work that way; I have, however, included it in the sample data.





Code Snippet
DECLARE @ImportedStagingData TABLE
(
ID INT IDENTITY(1,1),
Name VARCHAR(50),
Surname VARCHAR(50),
Email VARCHAR(50),
[Chronic Heart Failure] INT,
[Colon Cancer] INT
)
INSERT INTO @ImportedStagingData VALUES ('Carel', 'Greaves', 'CarelG@Email.com', 1,0)
INSERT INTO @ImportedStagingData VALUES ('Jamie', 'Jameson', 'JamieJ@Email.com', 1,1)
INSERT INTO @ImportedStagingData VALUES ('Sarah', 'Bolls', 'SarahB@Email.com', 0,1)
INSERT INTO @ImportedStagingData VALUES ('Bells', 'Scotch', 'BellsS@Email.com', 1,1)
INSERT INTO @ImportedStagingData VALUES ('Stroh', 'Rum', 'StrohR@Email.com', 0,0)
DECLARE @Conditions TABLE
(
ID INT IDENTITY(1,1),
Condition VARCHAR(50)
)
INSERT INTO @Conditions VALUES ('Chronic Heart Failure')
INSERT INTO @Conditions VALUES ('Colon Cancer')
DECLARE @Members TABLE
(
MemberID INT IDENTITY(1,1),
Name VARCHAR(50),
Surname VARCHAR(50),
Email VARCHAR(50)
)
DECLARE @memConditions TABLE
(
MemberID INT,
ConditionID INT
)
SELECT * FROM @ImportedStagingData
SELECT * FROM @Conditions
SELECT * FROM @Members
SELECT * FROM @memConditions
/* --- This is the part that i am battling with ---
DECLARE @CurrentValue INT
DECLARE @numValues INT
SET @numValues = (SELECT COUNT(ID) FROM @ImportedStagingData)
WHILE @numValues <> 0
BEGIN
INSERT INTO @Members
SELECT Name, surname, email
FROM @ImportedStagingData
GO
SET @CurrentValue = (SELECT IDENT_CURRENT('@ImportedStagingData'))
INSERT INTO @memConditions (MemberID), (ConditionID)
VALUES (@CurrentValue, --ConditionValue from @ImportedStagingData, all the values that have a 1)

@numValues = @numValues - 1
END
END
*/






All help will be greatly appreciated.

Kind Regards
Carel Greaves
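A hedged sketch of the loop, reusing the sample table variables above. IDENT_CURRENT takes a real table name, so it cannot see a table variable; SCOPE_IDENTITY() after each insert works instead. The loop creates one member at a time and then inserts a row for every condition flagged with a 1:

DECLARE @StageID INT, @NewMemberID INT

SELECT @StageID = MIN(ID) FROM @ImportedStagingData

WHILE @StageID IS NOT NULL
BEGIN
    INSERT INTO @Members (Name, Surname, Email)
    SELECT Name, Surname, Email
    FROM @ImportedStagingData
    WHERE ID = @StageID

    SET @NewMemberID = SCOPE_IDENTITY()   -- MemberID of the member just created

    INSERT INTO @memConditions (MemberID, ConditionID)
    SELECT @NewMemberID, c.ID
    FROM @ImportedStagingData s
    JOIN @Conditions c
      ON (c.Condition = 'Chronic Heart Failure' AND s.[Chronic Heart Failure] = 1)
      OR (c.Condition = 'Colon Cancer' AND s.[Colon Cancer] = 1)
    WHERE s.ID = @StageID

    SELECT @StageID = MIN(ID) FROM @ImportedStagingData WHERE ID > @StageID
END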

View 5 Replies View Related

Transact SQL :: Multiple Inserts Different Columns And Tables Into One Table

Oct 12, 2015

I have a problem with my SQL statement. I am trying to insert different columns from different tables into one new table. Unfortunately my statement doesn't do this.

If object_ID(N'Bezeichnungen') is not NULL
   Drop table Bezeichnungen;
GO
create table Bezeichnungen
(
 Artikelnummer nvarchar(18),
 Artikelbezeichnung nvarchar(80),
 Artikelgruppe nvarchar(13),
 
[code]...

View 19 Replies View Related

SQL Server 2008 :: Add A Trigger That Inserts Original Data From 1 Table To Another

Apr 10, 2015

I am trying to create a trigger on a table. Let's call it table ABC. Table looks like this:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[ABC](
[id] [uniqueidentifier] NOT NULL,

[Code] ....

When someone updates a row in table ABC, I want to insert the original values into table ABCD, with the current date and time (getdate()) going into the updateDate field, as defined below:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[ABCD](
[id] [uniqueidentifier] NOT NULL,

[Code] .....

The trigger I've currently written looks like this:

/****** Object: Trigger [dbo].[ABC_trigger] Script Date: 4/10/2015 1:32:33 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TRIGGER [dbo].[ABC_trigger] ON [dbo].[ABC]

[Code] ...

This trigger works, but it inserts all of the rows every time. My question is how can I get the trigger to just insert the last row updated?

I can't sort by uniqueidentifier in descending as those can be random.
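A hedged sketch of how the trigger body could read from the deleted pseudo-table instead of the base table: deleted holds only the pre-update values of the rows touched by the triggering UPDATE, so no sorting or "last row" logic is needed. Only the columns visible above are used; the remaining audited columns (elided as [Code] in the definitions) would be listed the same way:

CREATE TRIGGER [dbo].[ABC_trigger] ON [dbo].[ABC]
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO [dbo].[ABCD] (id, updateDate)
    SELECT d.id, GETDATE()
    FROM deleted AS d;
END
GO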

View 9 Replies View Related

SQL Server 2014 :: Automating Random Inserts Into A Memory Optimized Table

Jan 28, 2015

I have this table

CREATE TABLE [Sales].[Test_inmem]
(
[c1] [int] NOT NULL,
[c2] [nvarchar](20) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[ModifiedDate] [datetime2](7) NOT NULL CONSTRAINT [IMDF_Test_ModifiedDate] DEFAULT (sysdatetime()),

[Code] ....

I have to generate 1,000,000 random records in it. I tried various ways to insert the records but, not being a developer, could not do it. I would like c1 to be a serial number, c2 can be anything, and c3 should be the timestamp.
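A hedged set-based sketch (it assumes the columns shown above are the only ones without defaults): a cross join of sys.all_objects supplies more than enough rows, ROW_NUMBER() provides the serial number for c1, c2 gets an arbitrary string, and ModifiedDate falls back to its sysdatetime() default.

;WITH N AS
(
    SELECT TOP (1000000)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects AS a
    CROSS JOIN sys.all_objects AS b
)
INSERT INTO [Sales].[Test_inmem] (c1, c2)
SELECT n, N'row ' + CAST(n AS nvarchar(10))
FROM N;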

View 3 Replies View Related

TRIGGER: Help With 2 IF/THEN Statements Driving Multiple Inserts Into B_items Table...

Jul 30, 2007

Assuming I should be using values from the inserted pseudo-table to ensure the correct record...
I need help coding the IF...THEN INSERT statements in the following AFTER trigger:

Create TRIGGER trg_insertItemRows

ON dbo.a_form
AFTER INSERT

AS
SET NOCOUNT ON
-- Checkbox Driven:
IF a_form.missingCheckbox = -1 THEN
Insert into b_items (form_ID, parent_ID, ItemTitle)
Values (Select Distinct i.form_ID,i.parent_ID from inserted i)', '+ 'User checked Missing Data')

-- Textbox Driven:
IF a_form.incorrectTxtbox <> 'na' THEN
Insert into b_items (form_ID, parent_ID, ItemTitle)
Values (Select Distinct i.form_ID,i.parent_ID from inserted i)', '+ Correction: Replace '+ incorrectTxtbox + ' with '+replaceWithTxtbox)


Sample code below:

-- Source table the Trigger acts on
Create Table a_form (
form_ID int Not Null,
parent_ID int,
missingCheckbox bit,
missingNote varchar(100),
incorrectTxtbox varchar(50),
replaceWithTxtbox varchar(50)
)

--Target table Trigger inserts into
Create Table b_items (
items_ID int Not Null,
form_ID int Not Null,
parent_ID int,
ItemTitle varchar(150)
)
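A hedged sketch of the trigger: T-SQL has no IF ... THEN inside a statement, so each condition becomes a WHERE clause against the inserted pseudo-table (which also handles multi-row inserts). It assumes b_items.items_ID is populated elsewhere, e.g. by an identity; note that comparing a bit column to -1 would never match (the bit is promoted to int), so the checkbox test uses 1:

CREATE TRIGGER trg_insertItemRows
ON dbo.a_form
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Checkbox driven:
    INSERT INTO b_items (form_ID, parent_ID, ItemTitle)
    SELECT i.form_ID, i.parent_ID, 'User checked Missing Data'
    FROM inserted AS i
    WHERE i.missingCheckbox = 1;

    -- Textbox driven:
    INSERT INTO b_items (form_ID, parent_ID, ItemTitle)
    SELECT i.form_ID, i.parent_ID,
           'Correction: Replace ' + i.incorrectTxtbox + ' with ' + i.replaceWithTxtbox
    FROM inserted AS i
    WHERE i.incorrectTxtbox <> 'na';
END
GO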

View 5 Replies View Related

SQL Server 2014 :: Stored Procedure That Inserts And Updates A Table With Excel Data?

May 27, 2014

I need a script that inserts the data from an Excel sheet into a table. If a row already exists it should be left alone, unless it has been edited in the Excel sheet, and so on. This process has to go through a stored procedure... but how?
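A hedged sketch, assuming the Excel sheet has first been loaded into a staging table (dbo.ExcelStaging) by SSIS or OPENROWSET, that ID is the business key, and that Col1/Col2 stand in for the real columns. MERGE updates only rows whose values differ from the sheet and inserts the rest:

CREATE PROCEDURE dbo.UpsertFromExcel
AS
BEGIN
    SET NOCOUNT ON;

    MERGE dbo.TargetTable AS t
    USING dbo.ExcelStaging AS s
        ON t.ID = s.ID
    WHEN MATCHED AND (t.Col1 <> s.Col1 OR t.Col2 <> s.Col2) THEN
        UPDATE SET t.Col1 = s.Col1, t.Col2 = s.Col2
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (ID, Col1, Col2) VALUES (s.ID, s.Col1, s.Col2);
END
GO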

View 6 Replies View Related

SQL Server 2012 :: Why Indexes On Table Slow Down DML Operation On Table

Mar 7, 2014

Why do indexes on a table slow down DML operations on that table? What is the exact reason?

View 5 Replies View Related

Extremely Slow Table

Sep 16, 2005

Hi,

I have a table defined as

CREATE TABLE [SH_Data] (
[ID] [int] IDENTITY (1, 1) NOT NULL ,
[Date] [datetime] NULL ,
[Time] [datetime] NULL ,
[TroubleshootId] [int] NOT NULL ,
[ReasonID] [int] NULL ,
[reason_desc] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[maj_reason_id] [int] NULL ,
[maj_reason_desc] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[ActionID] [int] NULL ,
[action_desc] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[WinningCaseTitle] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[Duration] [int] NULL ,
[dm_version] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[ConnectMethod] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[dm_motive] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[HnWhichWlan] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[RouterUsedToConnect] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[OS] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[WinXpSp2Installed] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[Connection] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[Login] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[EnteredBy] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
[Acct_Num] [int] NULL ,
[Site] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CS_AS NULL ,
CONSTRAINT [PK_SH_Data] PRIMARY KEY CLUSTERED
([TroubleshootId]) ON [PRIMARY]
) ON [PRIMARY]
GO

The table contains 5.6 million rows and has non-clustered indexes on Date, ReasonID, maj_Reason and Connection. Compared to other tables on the same server this one is extremely slow. A simple query such as:

SELECT
SD.reason_desc,
SD.Duration,
SD.maj_reason_desc,
SD.[Connection],
SD.aolEnteredBy
FROM dbo.[Sherlock Data] SD
Where SD.[Date] > Dateadd(Month,-2,Getdate())

takes over 2 minutes to run! I realise the table contains several large columns which make the table quite large, but unfortunately this cannot be changed for the moment.

How can I assess what is causing the length of query time? And what could I possibly do to speed this table up? The database itself is running on a dedicated server which has some other databases, none of which have this performance issue.

Anyone have any ideas?

View 5 Replies View Related

ETL Sort Is Very Slow On A Big Table

Feb 11, 2008

Hi,

We have developed an ETL. For development we used small test files (10,000 rows) to test whether it works correctly. This runs in less than a minute.

In Test we are using a file which contains all rows (7 million). We ran the test twice; the first time we stopped the process after a week, and the second time after a weekend.

We are able to trace the problem to the point where it has to sort the tables.

The process is pretty simple. We use two connectors to connect directly to the tables.
Then we have two blocks to sort the data, and one block to merge the data.

Should we switch to letting SQL do the sorting? Since it is a staging table it has no index on that column. A SELECT on the tables with an ORDER BY takes 3 minutes to return all those rows.

Any ideas?

Also, is there a page with best practices for ETL?

Constantijn

View 4 Replies View Related

Can Logging Be Turned Off On Inserts To A Specific Temp Table From A Specific Sp?

Oct 10, 2007

I want to ship 500,000 aged transactions each night to an archive table and delete them from their source table in one or more logical units of work (LUW). Each row is approx 60 bytes and there is only one non clustered index on the source table presently.

I'm trying to weigh the pros and cons of 3 alternatives. One of them would basically insert the non-aged rows into tempdb, ship the aged records, truncate the table and then insert the tempdb records back into their source all in the same LUW.

For this alternative, I'd at least like to turn off logging when the records get inserted into tempdb, as I don't see any value in logging that part of the activity. Is this possible?

View 4 Replies View Related

Slow SELECT On Single Table

Aug 4, 2000

SELECT * on a 4000 row table is taking more than 12 seconds.
Other larger tables are not nearly as slow.
I've DBCC dbreindex'd, and dbcc showcontig shows density at 100%.

How can I figure out why this is happening?
What are some remedies?

Thanks for your help.

View 1 Replies View Related






