I have a DataTable in memory and I want to write C# code to dump the data into a SQL database. Is there a faster way of dumping millions of rows into a SQL table besides running INSERT INTO row by row?
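One route worth sketching, under assumptions (SQL Server 2008 or later, placeholder type/table/column names), is a table-valued parameter: an in-memory DataTable can be passed to it directly from C# as a SqlDbType.Structured parameter, and SqlBulkCopy is the other common option for dumping a DataTable in bulk.

CREATE TYPE dbo.MyRowType AS TABLE
(
    Id     INT,
    Name   NVARCHAR(100),
    Amount DECIMAL(18, 2)
);
GO

CREATE PROCEDURE dbo.BulkInsertMyRows
    @Rows dbo.MyRowType READONLY   -- the DataTable arrives here as one set
AS
BEGIN
    SET NOCOUNT ON;

    -- One set-based insert instead of millions of row-by-row INSERTs.
    INSERT INTO dbo.MyTable (Id, Name, Amount)
    SELECT Id, Name, Amount
    FROM @Rows;
END
GO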
I have an application that generates a lot of rows, from 1 million to 2 million, and I want to insert these records into MS SQL Server in a fast way.
I am currently looping through the records while they are loaded in a DataSet, building command text that generates an INSERT query for each row, and running it against SQL Server.
But it takes a long time to finish. Is there a way to bulk insert this data?
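A minimal server-side sketch of the bulk-load route, assuming the generated rows can first be staged to a delimited file the server can read (the path, table name, and delimiters are placeholders); from .NET code, SqlBulkCopy achieves the same thing without the intermediate file.

BULK INSERT dbo.TargetTable
FROM 'C:\staging\rows.csv'          -- hypothetical path visible to the SQL Server service account
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    TABLOCK,                        -- bulk-update lock, enables minimal logging
    BATCHSIZE       = 50000         -- commit in batches rather than one huge transaction
);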
Question A: I need to truncate a table; it has 21 million rows and a size of 14 GB.
1- How do I find out whether this table is referenced by a FOREIGN KEY? 2- Does it participate in an indexed view? 3- Is it being published using transactional replication or merge replication?
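A hedged sketch of the three checks against the catalog views (assumes SQL Server 2008+ for the dependency view; dbo.BigTable is a placeholder name):

-- 1) Foreign keys that reference the table
SELECT fk.name AS referencing_fk,
       OBJECT_NAME(fk.parent_object_id) AS referencing_table
FROM sys.foreign_keys AS fk
WHERE fk.referenced_object_id = OBJECT_ID('dbo.BigTable');

-- 2) Schema-bound views that reference the table and carry an index (indexed views)
SELECT DISTINCT v.name AS view_name
FROM sys.views AS v
JOIN sys.sql_expression_dependencies AS d ON d.referencing_id = v.object_id
JOIN sys.indexes AS i ON i.object_id = v.object_id
WHERE d.referenced_id = OBJECT_ID('dbo.BigTable');

-- 3) Replication flags on the table
SELECT name, is_replicated, is_merge_published
FROM sys.tables
WHERE object_id = OBJECT_ID('dbo.BigTable');

TRUNCATE TABLE is refused if any of these checks comes back non-empty or flagged, in which case the usual fallback is a batched DELETE or removing the dependency first.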
I have deleted nearly 30 million rows from a table. However, when I use the sp_spaceused command to check the space occupied by the table, I don't see any difference in its data size. In fact, the reported data size increased by a few MB after the deletion, but not by much.
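The figures sp_spaceused reports are often stale after a large delete, and the freed space can still be tied up in partially empty pages until the indexes are rebuilt (ghost-record cleanup also lags a little). A hedged sketch, with the table name as a placeholder:

EXEC sp_spaceused @objname = N'dbo.MyTable', @updateusage = N'TRUE';  -- refresh the usage counters first

ALTER INDEX ALL ON dbo.MyTable REBUILD;  -- compact the remaining rows onto fewer pages

EXEC sp_spaceused N'dbo.MyTable';        -- re-check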
) AS
BEGIN
    SET NOCOUNT ON

    --Exception Handling Variable Declaration
    DECLARE @ErrorMessage NVARCHAR(200),
            @ErrorNumber INT,
            @ErrorSeverity INT,
            @ErrorState INT,
            @ErrorProcedure NVARCHAR(50),
            @ErrorLine INT,
            @ErrorDesc NVARCHAR(100)

    DECLARE @XMLPayment INT

    BEGIN TRY
        IF @XMLParams IS NOT NULL
        BEGIN --BEGIN IF
            SET @ErrorDesc = 'Error Occured While Inserting into TIX_PAYMENT_SCHEDULE FROM XML'

            INSERT INTO TIX_PAYMENT_SCHEDULE
            (
                OwedAmountId, ProposalId, BrandId, DueDate, OverdueDate,
                CreatedDateTime, LastUpdatedDateTime, ExpectedAmount, ActualAmountReceived,
                ScheduleBatchJournalId, RuleId, TransactionStatusId, ActionId, IsLate,
                IsPaymentReceived, IsValidSchedule,
                --Added by DC : 119
                IsCatchupBalanced, CatchupBalanceIdentifier, HasModified
            )
            SELECT
                Main.ELEMENT.value('(OwedAmountId)[1]', 'int') AS OwedAmountId,
                Main.ELEMENT.value('(ProposalId)[1]', 'int') AS ProposalId,
                Main.ELEMENT.value('(BrandId)[1]', 'int') AS BrandId,
                CONVERT(datetime, Main.ELEMENT.value('(DueDate)[1]', 'varchar(100)')) AS DueDate,
                CONVERT(datetime, Main.ELEMENT.value('(OverdueDate)[1]', 'varchar(100)')) AS OverdueDate,
                @ToDate AS CreatedDateTime,
                @ToDate AS LastUpdatedDateTime,
                CONVERT(decimal(18,2), Main.ELEMENT.value('(ExpectedAmount)[1]', 'varchar(100)')) AS ExpectedAmount,
                CONVERT(decimal(18,2), Main.ELEMENT.value('(ActualAmountReceived)[1]', 'varchar(100)')) AS ActualAmountReceived,
                Main.ELEMENT.value('(ScheduleBatchJournalId)[1]', 'bigint') AS ScheduleBatchJournalId,
                Main.ELEMENT.value('(RuleId)[1]', 'int') AS RuleId,
                Main.ELEMENT.value('(TransactionStatusId)[1]', 'int') AS TransactionStatusId,
                Main.ELEMENT.value('(ActionId)[1]', 'int') AS ActionId,
                Main.ELEMENT.value('(IsLate)[1]', 'char(1)') AS IsLate,
                Main.ELEMENT.value('(IsPaymentReceived)[1]', 'char(1)') AS IsPaymentReceived,
                Main.ELEMENT.value('(IsValidSchedule)[1]', 'char(1)') AS IsValidSchedule,
                --Added by DC for 119
                Main.ELEMENT.value('(IsCatchupBalanced)[1]', 'char(1)') AS IsCatchupBalanced,
                Main.ELEMENT.value('(CatchupBalanceIdentifier)[1]', 'nvarchar(1000)') AS CatchupBalanceIdentifier,
                @HasModified
            FROM @XMLParams.nodes('(/ROOT/DATA)') AS Main(ELEMENT)
        END --END IF
    END TRY --Main END TRY
    BEGIN CATCH --Main BEGIN CATCH
        SELECT @ErrorMessage   = @ErrorDesc + CHAR(13) + Error_Message(),
               @ErrorSeverity  = Error_Severity(),
               @ErrorState     = Error_State(),
               @ErrorNumber    = Error_Number(),
               @ErrorProcedure = Error_Procedure(),
               @ErrorLine      = Error_Line()

        RAISERROR(@ErrorMessage, @ErrorSeverity, @ErrorState, @ErrorNumber, @ErrorProcedure, @ErrorLine)
    END CATCH --Main END CATCH
END --Main END

BEGIN TRY --Exception Handling
    SET @ErrorDesc = 'Error Occured while fetching records from TIX_PAYMENT_SCHEDULE'

    SELECT PaymentScheduleId, OwedAmountId, ProposalId, DueDate, OverdueDate, ExpectedAmount,
           TransactionStatusId, IsPaymentReceived, IsLate, ActionId, ActualAmountReceived,
           IsValidSchedule, BrandId, CaseScheduleId, ReasonId, Comments, NoOfDays, ActionDate,
           IsCatchupBalanced, CatchupBalanceIdentifier, HasModified
    FROM TIX_PAYMENT_SCHEDULE WITH (NOLOCK)
    WHERE DUEDATE <= @ToDate AND IsValidSchedule = @IsValidSchedule

    SELECT DISTINCT OwedAmountId, ProposalId, BrandId
    FROM TIX_PAYMENT_SCHEDULE WITH (NOLOCK)
    WHERE DUEDATE <= @ToDate AND IsValidSchedule = @IsValidSchedule
    ORDER BY OwedAmountId, ProposalId, BrandId ASC

    SELECT DISTINCT ProposalId
    FROM TIX_PAYMENT_SCHEDULE WITH (NOLOCK)
    WHERE DUEDATE <= @ToDate AND IsValidSchedule = @IsValidSchedule
    ORDER BY ProposalId ASC
END TRY
BEGIN CATCH
    SELECT @ErrorMessage   = @ErrorDesc + CHAR(13) + Error_Message(),
           @ErrorNumber    = Error_Number(),
           @ErrorState     = Error_State(),
           @ErrorProcedure = Error_Procedure(),
           @ErrorLine      = Error_Line(),
           @ErrorSeverity  = Error_Severity()
I have a SQL script to insert data into a table as below:
INSERT into [SRV1INS2].BB.dbo.Agents2 select * from [SRV2INS14].DD.dbo.Agents
I just want to set a trigger on the Agents2 table that deletes all rows in the table before carrying out any insert operation using the above statement. I had the table trigger below on [SRV1INS2].BB.dbo.Agents2, but it did not do what I intended.
USE [BB]
GO
/****** Object: Trigger    Script Date: 24/07/2015 3:41:38 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
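One hedged approach is an INSTEAD OF INSERT trigger: it sees the incoming rows once per statement, so it can empty the table and then apply them, whereas an AFTER trigger that deletes everything would also remove the rows just inserted. A minimal sketch (column list omitted for brevity; assumes the four-part-name INSERT behaves like a local insert on SRV1INS2):

CREATE TRIGGER trg_Agents2_ReplaceOnInsert
ON dbo.Agents2
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON;

    DELETE FROM dbo.Agents2;      -- clear the existing rows first

    INSERT INTO dbo.Agents2       -- then apply the rows from the triggering INSERT
    SELECT * FROM inserted;
END
GO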
I have created a table named Table, with name as varchar and id as int. I started inserting rows like this: insert into Table values ('arun', 20). Yes, I have inserted a row in the table. Now I have got the values "arun's", 50. With insert into Table values('arun's',20) SQL Server gives me an error instead of inserting the row. How would you solve this problem?
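The standard fix is to double the embedded single quote inside the literal; from client code, a parameterised command avoids the manual escaping entirely. A minimal sketch using the statement from the post:

INSERT INTO [Table] VALUES ('arun''s', 20);   -- '' inside a literal stores a single '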
Hi everyone - I have an ETL package which loads about 10 million rows from SQL 2005 staging tables into new, empty tables (no indexes or constraints) in another SQL 2005 DB, to be SWITCHED into the main partitioned data tables.
Both databases reside on the same SQL Server instance - it is a dev server, so the disks aren't super fast/SAN speeds, but it has plenty of RAM/CPU and SCSI disks.
The insert takes about 45 minutes - can I get this working any faster, or is this typical for 10 million rows? I've messed about with the data flow a few times but I can't seem to get any significant improvements.
Any tips anyone??
I perform several lookups on dimensions - these are not cached.
I do query the source table concurrently with different WHERE clauses & run two pipelines processing the data into 2 destination tables.
Would it be better to query the base table once & use a conditional split instead of the two separate queries??
I also multicast from each pipeline and use a UNION ALL to log some of the rows from each pipeline to another destination table.
Hope this makes sense? Any ideas or tips on how I can speed up this kind of transform would be appreciated.
I'm using OLE DB connections.
Hope this makes some kind of sense! Thanks for any advice!
I run the following statement and it will not update beyond 7 million-plus rows, and I have about 38 million to complete. I keep checking the updated row counts, and after half a day they are still the same, so I know something is wrong, because it was rolling through with no problem when I initiated it. I need to complete this ASAP, so it's adding to my frustration. The 'Acct_Num_CH' field is an encrypted field (FYI).
SET ROWCOUNT 10000
UPDATE [dbo].[CC_Info_T]
SET [Acct_Num_CH] = 'ayIWt6C8sgimC6t61EJ9d8BB3+bfIZ8v'
WHERE [Acct_Num_CH] IS NOT NULL

WHILE @@ROWCOUNT > 0
BEGIN
    SET ROWCOUNT 10000
    UPDATE [dbo].[CC_Info_T]
    SET [Acct_Num_CH] = 'ayIWt6C8sgimC6t61EJ9d8BB3+bfIZ8v'
    WHERE [Acct_Num_CH] IS NOT NULL
END
SET ROWCOUNT 0
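One likely cause: rows already set to the new constant still satisfy [Acct_Num_CH] IS NOT NULL, so each batch can keep re-touching rows that are already done and @@ROWCOUNT never reaches zero. A hedged sketch that excludes already-converted rows so every batch makes progress (assumes SQL Server 2005+ for UPDATE TOP):

DECLARE @rows INT;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    UPDATE TOP (10000) [dbo].[CC_Info_T]
    SET [Acct_Num_CH] = 'ayIWt6C8sgimC6t61EJ9d8BB3+bfIZ8v'
    WHERE [Acct_Num_CH] IS NOT NULL
      AND [Acct_Num_CH] <> 'ayIWt6C8sgimC6t61EJ9d8BB3+bfIZ8v';   -- skip rows already converted

    SET @rows = @@ROWCOUNT;
END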
Hello. Currently we are in the process of implementing a SQL Server database where a couple of tables will have millions of rows (about 98 million, and growing) and a web site that will retrieve and sort the data (read only). How will an ASP.NET GridView and SqlDataReader behave in a situation like that? Will the response be very slow? Is there any alternative? Is there any example on the net? Assume the tables are well tuned and well indexed. Thank you in advance.
Hello. Currently we are in the process of implementing a SQL Server database where a couple of tables will have millions of rows (about 98 million, and growing) and an ASP.NET site that will retrieve and sort the data.
What would be the best practice for setting up the database in a situation like this one? Do we need a clustered server? Does indexing need to be done in a special way? Thanks in advance.
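Whatever the hardware answer, the biggest single win on tables that size is usually paging on the server rather than binding millions of rows to the grid. A hedged sketch (SQL Server 2005+; table, sort column, and page size are placeholders):

DECLARE @PageNumber INT, @PageSize INT;
SET @PageNumber = 1;
SET @PageSize = 50;

WITH Numbered AS
(
    SELECT t.*,
           ROW_NUMBER() OVER (ORDER BY t.SomeSortColumn) AS rn
    FROM dbo.BigTable AS t
)
SELECT *
FROM Numbered
WHERE rn BETWEEN (@PageNumber - 1) * @PageSize + 1
             AND @PageNumber * @PageSize;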
I wrote an INSERT trigger on my table LeaveRegister (1,000 rows) that inserts rows into an audit table, but when I insert a single row into LeaveRegister, 1,000 + 1 rows are inserted into the audit table every time.
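That symptom usually means the trigger selects from the base table instead of the inserted pseudo-table, so every existing row is copied along with the new one. A hedged sketch (the audit table and column names are assumptions):

CREATE TRIGGER trg_LeaveRegister_Audit
ON dbo.LeaveRegister
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Copy only the rows affected by this INSERT, not the whole table.
    INSERT INTO dbo.LeaveRegister_Audit (EmployeeId, LeaveDate, AuditedAt)
    SELECT i.EmployeeId, i.LeaveDate, GETDATE()
    FROM inserted AS i;
END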
I have one view which is based on a couple of tables. Here is the definition of the view. What options can I use to optimize the view for better performance? This is one of the views causing issues on the database.
CREATE VIEW [dbo].[V_Reqs] WITH SCHEMABINDING AS
SELECT purchase.Req.RequisitionID, purchase.Req.StatusCode AS Expr2, purchase.Req.CollectionDateTime,
       purchase.Req.ReportDateTime, purchase.Req.ReceivedDateTime, purchase.Req.PatientName,
       purchase.Req.AddressOne, purchase.Req.AddressTwo, purchase.Req.City, purchase.Req.PostalCode,
       purchase.Req.PhoneNumber,
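Since the view is already created WITH SCHEMABINDING, one option (hedged: only if the view meets the indexed-view requirements and RequisitionID is unique per row) is to materialise it with a unique clustered index, so queries read the stored result instead of re-joining the base tables:

CREATE UNIQUE CLUSTERED INDEX IX_V_Reqs_RequisitionID
ON dbo.V_Reqs (RequisitionID);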
I'm working on a database for a financial client, and part of what I need to do is calculate a value from two separate rows in the same table and insert the result into the same table as a new row. I have a way of doing so, but I consider it extremely inelegant and I'm hoping there's a better way. A description of the existing database schema (which I have control over) will help in explaining the problem:
id   metric_id   metric_type_id   metric_name
1    80          2                Fiscal Enterprise Value Historic Year 1
2    81          2                Fiscal Enterprise Value Current Fiscal Year
3    82          2                Fiscal Enterprise Value Forward Fiscal Year 1
4    83          2                Fiscal Enterprise Value Forward Fiscal Year 2
5    101         3                Calendar Enterprise Value Historic Year 1
6    102         3                Calendar Enterprise Value Current Fiscal Year
5    103         3                Calendar Enterprise Value Forward Year 1
6    104         3                Calendar Enterprise Value Forward Year 2
Table Name: metric_type_details
id   metric_type_id   metric_type_name
1    1                Raw
2    2                Fiscal
3    3                Calendar
4    4                Calculated
The problem scenario is the following: because a certain number of the securities have a fiscal year end that is different from the calendar year end, in addition to storing fiscal data (such as fiscal enterprise value and fiscal earnings etc.) for each security I also need to store calendarised data. What this means is that if the security with security_id = 3 has a fiscal year end of October, then using the rows with ids = 1, 2, 3 and 4 from the metrics_ladder table I need to calculate metrics with metric_id = 83, 84, 85 and 86 (as described in the metric_details table) and insert the following 4 new records into metrics_ladder:
Metric with metric_id = 101 (Calendar Enterprise value Historic Year 1) will be calculated by taking 10/12 of the value for metric_id 80 plus 2/12 of the value for metric_id 81.
Similarly, metric_id 102 will be equal to 10/12 of the value for metric_id 81 plus 2/12 of the value for metric_id 82,
metric_id 103 will be equal to 10/12 of the value for metric_id 82 plus 2/12 of the value for metric_id 83 and finally
metric_id 104 will be NULL (determined by business requirements as there is no data for forward year 3 to use).
As I could think of no better way of doing this (hence the reason for this thread), I am currently achieving it by pivoting the relevant data from metrics_ladder so that the required data for each security is in one row, storing the result in a new column, and then unpivoting again to store the result in the metrics_ladder table. So the above data in metrics_ladder becomes:
-- Dummy year variable to make it easier to use MONTH() function
-- to convert 3 letter month to number. i.e. JAN -> 1, DEC -> 12 etc...
DECLARE @DUMMY_YEAR VARCHAR(4)
SET @DUMMY_YEAR = 1900;

WITH temp (security_id, metric_id, value) AS
(
    SELECT ml.security_id, ml.metric_id, ml.value
    FROM metrics_ladder ml
    WHERE ml.metric_id IN (80,81,82,83,84,85,86,87,88,etc...)
    -- only consider securities with fiscal year end not equal to december
    AND ml.security_id IN (SELECT security_id FROM company_details WHERE fiscal_year_end <> 'dec')
)
INSERT INTO @calendar_averages
SELECT temppivot.security_id
    -- Net Income
    ,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[80])
        + ((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[81]) AS [101]
    ,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[81])
        + ((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[82]) AS [102]
    ,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[82])
        + ((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[83]) AS [103]
    ,NULL AS [104]
    -- Share Holders Equity
    ,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[84])
        + ((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[85]) AS [105]
    ,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[85])
        + ((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[86]) AS [106]
    ,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[86])
        + ((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[87]) AS [107]
    ,NULL AS [108]
    -- Capex
    -- Sales
    -- Accounts payable etc...
    ..
    ..
FROM temp
PIVOT
(
    SUM(value)
    FOR metric_id IN ([80],[81],[82],[83],[84],[85],[86],[87],[88],etc...)
) AS temppivot
INNER JOIN company_details cd ON temppivot.security_id = cd.security_id
********* END SQL *********
The result then needs to be unpivoted and stored in metrics_ladder.
And FINALLY, the question! Is there a more elegant way of achieving this? I have complete control over the database schema, so if creating mapping tables or anything along those lines would help, that is possible. Also, is SQL not really suited for such operations, and would it therefore be better done in C#/VB.NET?
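A sketch of one set-based alternative, under assumptions: a small mapping table pairs each calendar metric with the two fiscal metrics it is derived from, and a single self-join then replaces the pivot/unpivot round trip (table and column names beyond those in the post are invented; the month-weighting expression reuses the post's own @DUMMY_YEAR trick):

CREATE TABLE metric_calendar_map
(
    calendar_metric_id INT,    -- e.g. 101
    fiscal_metric_id_1 INT,    -- e.g. 80, weighted by fiscal-year-end month / 12
    fiscal_metric_id_2 INT     -- e.g. 81, weighted by the remaining months / 12
);

DECLARE @DUMMY_YEAR VARCHAR(4)
SET @DUMMY_YEAR = '1900'

INSERT INTO metrics_ladder (security_id, metric_id, value)
SELECT  m1.security_id,
        map.calendar_metric_id,
        (CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)) / 12) * m1.value
      + ((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))) / 12) * m2.value
FROM    metric_calendar_map AS map
JOIN    metrics_ladder AS m1 ON m1.metric_id = map.fiscal_metric_id_1
JOIN    metrics_ladder AS m2 ON m2.metric_id = map.fiscal_metric_id_2
                            AND m2.security_id = m1.security_id
JOIN    company_details AS cd ON cd.security_id = m1.security_id
WHERE   cd.fiscal_year_end <> 'dec';

Metrics defined as NULL (like 104) are not produced by the inner join and could be inserted separately or handled with an outer join; the mapping table also documents the derivation, which keeps the calculation out of a wide pivot.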
I am trying to insert some new rows into an existing SQL table. The table name is Agt_table, and I want to add some data for some new agents into existing columns: Agent name, agent code, phone number, fax number
Example - I want to add the following record to my existing table Agt_table:
Agent name: ABC Company
Agent code: 012345
Phone #: 555-555-5555
Fax #: 555-555-5555
Is it possible to insert multiple rows into a table using one INSERT statement? If yes, how can I do that? I tried doing this using the substitution method.
Using the substitution method, this is how I proceeded.
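Two common single-statement forms, sketched with assumed column names for Agt_table (the second row of values is made up purely for illustration):

-- SQL Server 2008 and later: row constructors
INSERT INTO Agt_table (AgentName, AgentCode, PhoneNumber, FaxNumber)
VALUES ('ABC Company', '012345', '555-555-5555', '555-555-5555'),
       ('XYZ Company', '067890', '555-555-1234', '555-555-1234');

-- Earlier versions: INSERT ... SELECT with UNION ALL
INSERT INTO Agt_table (AgentName, AgentCode, PhoneNumber, FaxNumber)
SELECT 'ABC Company', '012345', '555-555-5555', '555-555-5555'
UNION ALL
SELECT 'XYZ Company', '067890', '555-555-1234', '555-555-1234';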
I have a query along the same lines. In my stored procedure I am fetching data from one table containing 5 to 6 million rows. I made use of an index in my database, but I still can't optimise the execution time of that stored procedure. Please help me out with this problem.
I am using the SQL below to create a table. The problem is that after I run it, I run another script to populate the table (see below). The population script works if I run it as INSERT INTO ... SELECT TOP 99.9999999 PERCENT ..., but if I put TOP 100 PERCENT, or use no TOP limiter at all, I get the following error: Msg 8624, Internal SQL Server error.
It is weird because once something has been inserted into the table, I can run the population script without any problems. Any idea as to why this is happening?
Thanks,
Dave
TABLE GENERATION SCRIPT
if exists (select * from dbo.sysobjects
           where id = object_id(N'[dbo].[tbCelebroAds]')
             and OBJECTPROPERTY(id, N'IsUserTable') = 1)
    drop table [dbo].[tbCelebroAds]
GO

CREATE TABLE [dbo].[tbCelebroAds] (
    [AdID] [int] IDENTITY (1, 1) NOT NULL ,
    [BranchCode] [int] NOT NULL ,
    [PropertyID] [nvarchar] (20) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    [AdScheduleCode] [int] NOT NULL ,
    [RecAtCentral] [int] NULL ,
    [AdRan] [int] NOT NULL ,
    [AdStatusID] [int] NOT NULL ,
    [PatID] [int] NULL ,
    [RunDate] [datetime] NOT NULL ,
    [CreationCost] [float] NOT NULL ,
    [BillCost] [float] NOT NULL ,
    [AddedDate] [smalldatetime] NOT NULL
) ON [PRIMARY]
TABLE POPULATION SCRIPT
INSERT INTO tbCelebroAds (BranchCode, PropertyID, AdScheduleCode, PatID, AdStatusID,
                          AdRan, RunDate, CreationCost, BillCost, AddedDate)
SELECT [TOP 100 PERCENT or no percent limiter doesn't work, TOP 99.9999999 PERCENT does]
       BranchCode, PropertyID, AdScheduleCode, cp.PatID, 6, 0,
       CAST(PubDate AS DATETIME),
       CAST(ISNULL(pat.Cost, 0) AS DECIMAL(10,2)),
       0,
       GETDATE()
FROM tbCelebroView cv
LEFT JOIN tbCelebroPubs cp
       ON cv.PublicationName = cp.PublicationName
      AND cv.AdSectionName  = cp.AdSectionName
LEFT JOIN tbPubToAdType pat
       ON cp.PatID = pat.PatID
WHERE CAST(BranchCode AS nvarchar(20)) + CAST(PropertyID AS varchar(20)) + CAST(AdScheduleCode AS nvarchar(20))
      NOT IN (SELECT CAST(BranchCode AS nvarchar(20)) + CAST(PropertyID AS varchar(20)) + CAST(AdScheduleCode AS nvarchar(20))
              FROM tbCelebroAds)
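Msg 8624 is an internal optimizer error, so the usual workaround is to change the query shape rather than the data. One hedged rewrite to try expresses the duplicate check as NOT EXISTS on the individual columns instead of the concatenated NOT IN (assuming the unqualified columns come from tbCelebroView):

INSERT INTO tbCelebroAds (BranchCode, PropertyID, AdScheduleCode, PatID, AdStatusID,
                          AdRan, RunDate, CreationCost, BillCost, AddedDate)
SELECT cv.BranchCode, cv.PropertyID, cv.AdScheduleCode, cp.PatID, 6, 0,
       CAST(cv.PubDate AS DATETIME),
       CAST(ISNULL(pat.Cost, 0) AS DECIMAL(10, 2)),
       0,
       GETDATE()
FROM tbCelebroView cv
LEFT JOIN tbCelebroPubs cp
       ON cv.PublicationName = cp.PublicationName
      AND cv.AdSectionName  = cp.AdSectionName
LEFT JOIN tbPubToAdType pat
       ON cp.PatID = pat.PatID
WHERE NOT EXISTS (SELECT 1
                  FROM tbCelebroAds a
                  WHERE a.BranchCode     = cv.BranchCode
                    AND a.PropertyID     = cv.PropertyID
                    AND a.AdScheduleCode = cv.AdScheduleCode);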
I am using SQL Server 2012 Standard Edition. I am trying to delete rows from a couple of tables (GetPersonValue has 250 million rows and I am trying to delete 50 million; GetPerson has 35 million rows and I am trying to delete 20 million). These tables are in transactional replication. The plan is to delete data older than 400 days.
I tried moving the data from the last 400 days to new tables, and it took about 11 hours. If I delete data in chunks of 500,000, it takes a long time to rebuild the indexes (delete plus rebuild indexes: 13 hours). Since I am using Standard Edition, partitioning won't work.
Find the DDL below:
GO
CREATE TABLE [dbo].[GetPerson](
    [GetPersonId] [uniqueidentifier] NOT NULL,
    [LinedActivityPersonId] [uniqueidentifier] NOT NULL,
    [CTName] [nvarchar](100) NULL,
    [SNum] [nvarchar](50) NULL,
    [PHPrimary] [nvarchar](50) NULL,
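A hedged sketch of a batched purge (assumes a date column, here called CreatedDate, drives the 400-day cutoff; it is not visible in the truncated DDL). Small batches keep the log and replication latency manageable, and the batch size is something to tune:

DECLARE @BatchSize INT, @Rows INT;
SET @BatchSize = 50000;
SET @Rows = 1;

WHILE @Rows > 0
BEGIN
    DELETE TOP (@BatchSize) FROM dbo.GetPersonValue
    WHERE CreatedDate < DATEADD(DAY, -400, GETDATE());

    SET @Rows = @@ROWCOUNT;

    -- In FULL recovery (likely, given replication), take log backups between
    -- batches; a short WAITFOR DELAY also gives the log reader time to catch up.
END

The same loop can then be pointed at GetPerson; rebuilding or reorganising the indexes once at the end is usually cheaper than doing it per chunk.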
Hello all, my first post here... hope it goes well. I'm currently working on a stored procedure where I translated some reporting language into T-SQL.

The logic: I have a group of tables containing important values for calculation. I run various sum calculations on various fields in order to retrieve cost calculations, etc.

1) There is a select statement which gathers all the "records" which need calculations. ex: select distinct Office from Offices where OfficeDesignation = 'WE' or OfficeDesignation = 'BE...etc. As a result I get a list of, let's say, 5 offices which need to be calculated!

2) A calculation select statement is then run on a loop for each of the returned 5 offices (@OfficeName cursor used here!) found above. An example can be like this (* note that @WriteOff is a variable storing the result):

select @WriteOff = sum(linecost * (-1))
From Invtrans, Inventory
Where ( transtype in ('blah', 'blah', 'blah') )
and ( storeloc = @OfficeName )
and ( Invtrans.linecost <= 0 )
and ( Inventory.location = Invtrans.storeloc )
and ( Inventory.itemnum = Invtrans.itemnum )
...etc

This sample statement returns a value and is passed to the variable @WriteOff (for each of the 5 offices mentioned in step 1). This is done around 9 times for each loop! (9 calculations)

3) At the end of each loop (or each office), we do an insert statement to a table in the database.
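The per-office cursor can usually be collapsed into one set-based query by grouping on the office column; a hedged sketch of the @WriteOff calculation for all offices at once (column ownership follows the post):

SELECT  it.storeloc AS OfficeName,
        SUM(it.linecost * -1) AS WriteOff
FROM    Invtrans AS it
JOIN    Inventory AS inv
        ON inv.location = it.storeloc
       AND inv.itemnum  = it.itemnum
WHERE   it.transtype IN ('blah', 'blah', 'blah')
  AND   it.linecost <= 0
  AND   it.storeloc IN (SELECT Office FROM Offices
                        WHERE OfficeDesignation IN ('WE', 'BE'))
GROUP BY it.storeloc;

The insert in step 3 then becomes a single INSERT ... SELECT over results like this (one such query per calculated figure, or several SUM(CASE ...) expressions in one pass), instead of 5 offices x 9 calculations run one at a time.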
To anyone that is able to help... What I am trying to do is this. I have two tables (Orders and OrderDetails), and my question is on the order details. I would like to set up a stored procedure that essentially inserts the mail order into the Orders table, and then inserts multiple order details within the same transaction. I also need to do this via SQL 2000. Right now I have "x" amount of variables for all columns in my Orders table and all columns in my OrderDetails table, i.e. @OColumn1, @OColumn2, @OColumn3, @ODColumn1, @ODColumn2, etc... I would like to create a stored procedure to insert into Orders, and have that call another stored procedure to insert all the order details associated with that order. The only way I can think of doing it is for the program to pass me a string of data per column for order details, and parse the string via T-SQL. I would like to get away from the string format and go with something else. If possible, I would like the application to submit a single value per variable multiple times, but if I do it that way it will be running the entire SP again and again. Any suggestions on the best way to solve this would be greatly appreciated. If anyone can come up with a better way, feel free. My only requirement is that it be done in SQL. Thank you
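On SQL Server 2000, one hedged option is to pass all the detail rows in a single XML parameter and shred it with OPENXML, so one procedure call inserts the order and every detail row in the same transaction (element and column names are assumptions):

CREATE PROCEDURE dbo.InsertOrderWithDetails
    @CustomerId INT,
    @OrderDate  DATETIME,
    @DetailsXml NTEXT    -- e.g. '<Details><Detail ProductId="1" Qty="2"/><Detail ProductId="7" Qty="1"/></Details>'
AS
BEGIN
    DECLARE @OrderId INT, @hDoc INT

    BEGIN TRAN

    INSERT INTO Orders (CustomerId, OrderDate)
    VALUES (@CustomerId, @OrderDate)

    SET @OrderId = SCOPE_IDENTITY()

    EXEC sp_xml_preparedocument @hDoc OUTPUT, @DetailsXml

    INSERT INTO OrderDetails (OrderId, ProductId, Qty)
    SELECT @OrderId, ProductId, Qty
    FROM OPENXML(@hDoc, '/Details/Detail', 1)
         WITH (ProductId INT, Qty INT)

    EXEC sp_xml_removedocument @hDoc

    COMMIT TRAN
END

Error handling (rollback on failure) is omitted for brevity; the point is that the application builds one XML string instead of one parameter set per detail row.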
Below is a simplified table & dataset to illustrate a problem I'm experiencing with a more complex one.
Code Block
create table #test( recno smallint PRIMARY KEY, value decimal (18,2))
insert into #test values (1, 3.57)
insert into #test values (2, 5.32)
insert into #test values (3, 6.29)
insert into #test values (4, 9.25)
insert into #test values (5, 0.84)
Method 1: I tried inserting rows from #test into a temp table (#table) as follows
Code Block
declare @n as nvarchar(3)
set @n = 1
while @n <= (select count(recno) from #test)
begin
    exec ('insert into ##table select *, originalrecno = (select recno from #test where recno = ' + @n + ') from #test')
    set @n = @n + 1
end

However, this yields an error message:
Code Block
Server: Msg 208, Level 16, State 1, Line 2
Invalid object name '##table'.

Note - you can comment out the insert into ##table line above to view the results that I'm trying to put into ##table.
Method 2: next I tried explicitly creating ##table & rerunning the loop containing the insert
declare @n as nvarchar(3)
set @n = 1
while @n <= (select count(recno) from #test)
begin
    exec ('insert into ##table select *, originalrecno = (select recno from #test where recno = ' + @n + ') from #test')
    set @n = @n + 1
end

This worked - it inserted the data from the select statements in the loop into ##table.
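INSERT INTO requires the target table to exist already, which is why Method 2 (creating ##table first) works and Method 1 fails with Msg 208. A hedged aside: the loop and the dynamic SQL can be replaced with one set-based statement, since a self cross join produces every #test row once per recno:

SELECT t.*,
       originalrecno = d.recno
INTO   ##table            -- SELECT ... INTO creates it; drop ##table first if it already exists
FROM   #test AS t
CROSS JOIN #test AS d;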
A question, please. I have a local MS SQL package which exports data from a SQL table to an Excel file. My question is: how can I erase all the records in my Excel file before I export the new data from the SQL table?
What I want is to delete the rows in the destination file before inserting the new records.
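One hedged pattern for Excel destinations (sheet and column names are assumptions): run a couple of Execute SQL tasks against the Excel connection before the data pump, dropping the sheet-as-table and recreating it, which clears the old rows while keeping the header layout:

-- Task 1, run against the Excel connection
DROP TABLE [Sheet1$]

-- Task 2, run against the Excel connection
CREATE TABLE [Sheet1$] (
    [CustomerId] Long,
    [CustomerName] LongText
)
-- (Type names follow the Jet/Excel dialect, not T-SQL.)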
I want to delete 30-40 million rows from a transactional table. What's the fastest way to delete these rows? Just deleting 300,000 rows takes 30 minutes. Also, I don't want to truncate the table.
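When the rows being removed are a large share of the table, a hedged alternative is to keep the survivors instead of deleting the victims: copy the rows to keep into a new table, swap the names, then drop the old one. The table name and the keep-condition below are placeholders:

SELECT *
INTO   dbo.BigTable_keep
FROM   dbo.BigTable
WHERE  CreatedDate >= '20150101';    -- condition identifying the rows to KEEP

BEGIN TRAN;
    EXEC sp_rename 'dbo.BigTable', 'BigTable_old';
    EXEC sp_rename 'dbo.BigTable_keep', 'BigTable';
COMMIT;

-- Recreate indexes, constraints, and permissions on the new table, verify, then drop BigTable_old.

Otherwise, deleting in a loop of moderately sized batches (DELETE TOP (n) with a terminating @@ROWCOUNT check) is the usual way to avoid one enormous, log-heavy transaction.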
I am trying to insert data into two different tables. I will insert into Table2 based on an id I get from a SELECT statement against Table1.

Insert Table1(Title,Description,Link,Whatever) Values(@title,@description,@link,@Whatever)
Select WhateverID from Table1 Where Description = @Description
Insert into Table2(CategoryID,WhateverID) Values(@CategoryID,@WhateverID)

This statement is not working. What should I do? Should I use a stored procedure? I am writing in C#. Can someone please help!
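A hedged sketch: capture the new row's id with SCOPE_IDENTITY() instead of re-selecting by Description (which breaks if descriptions repeat). It assumes WhateverID is an IDENTITY column, and works as a batch or wrapped in a stored procedure called from C#:

DECLARE @WhateverID INT;

INSERT INTO Table1 (Title, Description, Link, Whatever)
VALUES (@title, @description, @link, @Whatever);

SET @WhateverID = SCOPE_IDENTITY();

INSERT INTO Table2 (CategoryID, WhateverID)
VALUES (@CategoryID, @WhateverID);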
I have an item table (#Codes) which contains itemIDs, some of which are always used in a subsequent data table (#Data) that holds information about a case+scenario. These item rows have a standarditem value = 1. Rows in the item table which may be included in the subsequent data table (#Data), but not for every single case+scenario, are considered nonstandard rows and therefore have a standarditem value = 0. I have been trying to write some code that automatically inserts the rows from #Codes which have standarditem = 1 into #Data for each of the caseID+scenarioID combinations in #Cases, but have not figured it out yet. See "desired result" below to see exactly the output I'm trying to achieve.
Also, is there a way to insert the standarditem = 1 rows from #Codes into #Data when new caseID+scenarioID rows are added to #Cases? Ultimately, I'd like to either create a button for the user to click in the Access interface that inserts these rows when clicked, OR have the rows appear automatically when the user enters a new caseID+scenarioID in #Cases. That way the rows will already have the caseID, scenarioID and itemID fields populated, and all the user will have to do is enter the item value and, if needed, manually add standarditem = 0 rows and their values.
Code Block
create table #Codes (itemID nvarchar(2), standarditem int)
insert into #Codes values (1, 1)
insert into #Codes values (2, 0)
insert into #Codes values (3, 1)
insert into #Codes values (4, 1)
insert into #Codes values (5, 0)

create table #Cases (caseID nvarchar(5), scenarioID nvarchar(15), createdate datetime)
insert into #Cases values (823, 1, '20071210')
insert into #Cases values (823, 2, '20071211')
insert into #Cases values (824, 1, '20071213')
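A hedged sketch of the fill-in step: cross join #Cases with the standard rows of #Codes and insert only the combinations #Data does not already have (the #Data column list is assumed, with the value column left for the user to complete). Re-running the same statement after new rows are added to #Cases only adds the missing combinations, so it could sit behind the Access button or inside a trigger on the cases table:

INSERT INTO #Data (caseID, scenarioID, itemID)
SELECT  cs.caseID, cs.scenarioID, cd.itemID
FROM    #Cases AS cs
CROSS JOIN #Codes AS cd
WHERE   cd.standarditem = 1
  AND   NOT EXISTS (SELECT 1
                    FROM #Data AS d
                    WHERE d.caseID     = cs.caseID
                      AND d.scenarioID = cs.scenarioID
                      AND d.itemID     = cd.itemID);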
Hi, I would like to know if there are any links or sample code to learn how to insert multiple rows with one SQL statement. Also, can the inserted values' source be from a table in another database or from a dataset? I am actually trying to insert about 117 rows of data.

Table 1
=======
UID       Primary Key
TeamCode  a code value representing different teams
Week      will equal to 2
Points    nullable value

e.g.

Table 1
=======
UID  TeamCode  Week  Points
1    A1        1     100
2    A2        1     99

Trying to insert into Table 1:

Table 1
=======
UID  TeamCode  Week  Points
1    A1        1     100
2    A2        1     99
3    A1        2     null
4    A2        2     null
etc...

As you can see, UID is the primary key, TeamCode may repeat according to the week value, Week is a constant, and Points will be null. How can I do that with a single INSERT command? Thank you for your help. :)
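A hedged sketch: the week-2 rows can be generated from the existing week-1 rows with one INSERT ... SELECT (assumes UID is an IDENTITY column and the table is named Table1; a three-part name such as OtherDb.dbo.Teams would let the SELECT read from a table in another database instead):

INSERT INTO Table1 (TeamCode, Week, Points)
SELECT TeamCode, 2, NULL
FROM   Table1
WHERE  Week = 1;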
Does anybody have a sample SQL script that will select table A and compare it to table B? If a row exists in both table A and table B, it will update the columns in table B with the columns from table A. If the row does not exist in table B, it will insert a row into table B using the row from table A. Is this possible?
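Yes - the classic two-step version is sketched below (assumes a shared key column named ID and data columns Col1/Col2; on SQL Server 2008+ a single MERGE statement can express the same logic):

UPDATE b
SET    b.Col1 = a.Col1,
       b.Col2 = a.Col2
FROM   TableB AS b
JOIN   TableA AS a ON a.ID = b.ID;

INSERT INTO TableB (ID, Col1, Col2)
SELECT a.ID, a.Col1, a.Col2
FROM   TableA AS a
WHERE  NOT EXISTS (SELECT 1 FROM TableB AS b WHERE b.ID = a.ID);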
Why does this code tell me that I inserted 2 rows when I really only inserted one? I am using SQL Server 2005 Express. I can open up the table and there is only one record in it.

Dim InsertSQL As String = "INSERT INTO dbCG_Disposition ( BouleID, UserName, CG_PFLocation ) VALUES ( @BouleID, @UserName, @CG_PFLocation )"
Dim Status As Label = lblStatus
Dim ConnectionString As String = WebConfigurationManager.ConnectionStrings("HTALNBulk").ConnectionString
Dim con As New SqlConnection(ConnectionString)
Dim cmd As New SqlCommand(InsertSQL, con)

Dim added As Integer = 0
Try
    con.Open()
    added = cmd.ExecuteNonQuery()
    Status.Text &= added.ToString() & " records inserted into CG Process Flow Inventory, Located in Boule_Storage."
Catch ex As Exception
    Status.Text &= "Error adding to inventory. "
    Status.Text &= ex.Message.ToString()
Finally
    con.Close()
End Try

Anyone have any ideas? Thanks
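A count of 2 from a single-row INSERT often means a trigger on the table is affecting an extra row and its rowcount is being returned as well; ExecuteNonQuery reports the total. A hedged check (table name taken from the post):

SELECT t.name AS trigger_name, t.is_disabled
FROM   sys.triggers AS t
WHERE  t.parent_id = OBJECT_ID('dbo.dbCG_Disposition');

If a trigger turns up, adding SET NOCOUNT ON at the top of its body stops its count from being folded into the value the application sees.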
I need to insert a very large number of rows into a table (in SQL Server 7.0) using ADO. Could you please tell me if there is a way to do a FAST insert, something similar to BCP, or any other way of inserting a large number of rows efficiently?
I am trying to insert 9 rows into a table at the same time. My situation is this...
I have a survey page. There are 9 parts, with each part meant for an individual person. Each part has the same 8 questions.
The questions are answered using one of the answers in a drop down box.
So when the surveyor clicks submit, all 9 parts should be entered into the table.
If this is confusing, I have the form up on the Internet at...
http://www.lavenderlane.ie/wage_survey_test.asp
I can insert one part no problem (when I reduce the form to only 1 part), but I need to insert all 9 parts simultaneously. I reckon it's some sort of for loop, but if you could help me out I would appreciate it!
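A hedged sketch of the SQL side: one INSERT per part, all inside a single transaction so the whole survey saves (or fails) as a unit. Table and column names are assumptions, and the parameter values come from the nine sets of form fields:

BEGIN TRAN;

INSERT INTO SurveyAnswers (PersonNo, Q1, Q2, Q3, Q4, Q5, Q6, Q7, Q8)
VALUES (1, @p1q1, @p1q2, @p1q3, @p1q4, @p1q5, @p1q6, @p1q7, @p1q8);

INSERT INTO SurveyAnswers (PersonNo, Q1, Q2, Q3, Q4, Q5, Q6, Q7, Q8)
VALUES (2, @p2q1, @p2q2, @p2q3, @p2q4, @p2q5, @p2q6, @p2q7, @p2q8);

-- ... repeat for parts 3 through 9 ...

COMMIT TRAN;

In classic ASP, the nine statements can be built in a loop over the request fields and sent as one batch.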