How To Insert Multiple Rows Into A SQL Server 2005 Database At Once

Apr 14, 2008

Hi,

Good morning to all.
My table: User_Group_Map(UserID UNIQUEIDENTIFIER,GroupID UNIQUEIDENTIFIER)

Now, I want to write one stored procedure that can insert rows into the above table, but many rows at once.

That is, the program should allow multiple insertions without the front end having to call the stored procedure multiple times.

Can anyone please help me on this...

Thanks in advance...
Ashok kumar.
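One 2005-compatible way to do this in a single call is to pass all the pairs as one XML parameter and shred it inside the procedure (table-valued parameters only arrived in SQL Server 2008). A minimal sketch, assuming a hypothetical procedure name:

CREATE PROCEDURE dbo.usp_InsertUserGroupMap
    @Mappings XML    -- e.g. N'<m u="00000000-0000-0000-0000-000000000001" g="00000000-0000-0000-0000-000000000002"/>'
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.User_Group_Map (UserID, GroupID)
    SELECT m.value('@u', 'uniqueidentifier'),
           m.value('@g', 'uniqueidentifier')
    FROM @Mappings.nodes('/m') AS T(m);
END

The front end then builds one XML string with as many <m/> elements as needed and calls the procedure once.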

View 3 Replies



SQL Server 2012 :: Insert Rows Based On Number Of Distinct Values In Another Table

May 20, 2014

I have a table with PO#, Days_to_travel, and Days_in_warehouse fields. I take the distinct Days_in_warehouse values in the table and insert them into a temp table. I want a script that will insert every value of the Days_in_warehouse field from the temp table into the Days_in_warehouse_batch column of table 1, by PO#, duplicating the PO records until every PO has a record per distinct value.

Example:

Temp table: (Contains only one field with all distinct values in table 1)

Days_in_warehouse
20
30
40

Table 1 :

PO# Days_to_travel Days_in_warehouse Days_in_warehouse_batch
1 10 20
2 5 30
3 7 40

Updated Table 1:

PO# Days_to_travel Days_in_warehouse Days_in_warehouse_batch
1 10 20 20
1 10 20 30
1 10 20 40
2 5 30 20
2 5 30 30
2 5 30 40
3 7 40 20
3 7 40 30
3 7 40 40

How can I update Table 1 to see desired results?
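One way to get there is to fill the batch column on the existing rows and then cross join Table 1 with the temp table to add the remaining combinations. A sketch, assuming the temp table is named #DaysList, that Days_in_warehouse_batch starts out NULL, and that table 1 is literally named Table1:

-- give every existing row its own value first
UPDATE Table1
SET    Days_in_warehouse_batch = Days_in_warehouse
WHERE  Days_in_warehouse_batch IS NULL;

-- then add one copy of each PO row for every other distinct value
INSERT INTO Table1 ([PO#], Days_to_travel, Days_in_warehouse, Days_in_warehouse_batch)
SELECT t.[PO#], t.Days_to_travel, t.Days_in_warehouse, d.Days_in_warehouse
FROM   Table1 AS t
CROSS JOIN #DaysList AS d
WHERE  d.Days_in_warehouse <> t.Days_in_warehouse_batch;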

View 3 Replies View Related

Using SSIS 2005 To Strip Out Bad Rows In Excel And Then Insert Detailed Rows Into OLE DB Data Source

Apr 6, 2006

Environment:
 
Running this code on my PC via VS 2005
.Net version 2.0.50727 on the server (shown in IIS)
Code is in ASP.NET 2.0 and is a VB.NET Console application
SSIS 2005
 
Problem & Info:
 
I am bringing in an Excel file. I need to first strip out any non-detail rows, such as the breaks you see with totals and so on. In the end I should have only detail rows left before I start moving them into my SQL table. I'm not sure how to strip this information out in SSIS, specifically which component to use and how to configure it, based on my Excel file here: http://www.webfound.net/excelfile.xls

Then, I assume I just use a Flat File Source component or something to take the columns in the Excel file and push each one into the corresponding column in my SQL Server table through an OLE DB destination. I have used a Flat File Source in the past to do this with a comma-delimited txt file, but never with an Excel file.
 
Desired Help:

 
How to perform:

1) stripping out all undesired rows
2) importing each column into the SQL table
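In SSIS the usual pattern for (1) is an Excel Source followed by a Conditional Split that discards rows missing a detail key, with an OLE DB Destination handling (2). As a rough alternative done purely in T-SQL, here is a sketch that stages the sheet with OPENROWSET and filters it; the table dbo.ExcelDetail and the columns [OrderID] and [Amount] are hypothetical, and ad hoc distributed queries must be enabled on the server:

IF OBJECT_ID('tempdb..#staging') IS NOT NULL DROP TABLE #staging

-- pull the whole sheet into a staging table
SELECT *
INTO #staging
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;HDR=YES;Database=C:\excelfile.xls',
                'SELECT * FROM [Sheet1$]')

-- 1) strip total/break rows: anything without a value in the detail key column
DELETE FROM #staging WHERE [OrderID] IS NULL

-- 2) load the surviving detail rows into the destination table
INSERT INTO dbo.ExcelDetail ([OrderID], [Amount])
SELECT [OrderID], [Amount] FROM #staging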

View 1 Replies View Related

SQL SERVER 2005 Database Mirroring For Large Number Of Databases

May 30, 2006

I am trying to enable database mirroring for 100 databases.
It goes error-free until 59 databases (sometimes 60) with the status (principal, synchronized) on the principal. On the 60th or 61st database it gives the status (principal, disconnected). The mirror also starts acting abnormally: connections to the mirror start to time out and it will not enable database mirroring on any more databases. I have SQL Server 2005 Enterprise with SP1 on the servers; a witness is not included yet.

These are my test servers... I have more than 500 databases on my production servers.

Principal and mirror both use port 5022 for endpoint communication.

View 1 Replies View Related

SQL SERVER 2005 DATABASE MIRRORING For Large Number Of Databases

Jun 1, 2006

I am trying to enable database mirroring for 100 databases.
It goes error-free until 59 databases (sometimes 60) with the status (principal, synchronized) on the principal. On the 60th or 61st database it gives the status (principal, disconnected). The mirror also starts acting abnormally: connections to the mirror start to time out and it will not enable database mirroring on any more databases. I have SQL Server 2005 Enterprise with SP1 on the servers; a witness is not included yet.

These are my test servers... I have more than 500 databases on my production servers.

Principal and mirror both use port 5022 for endpoint communication.

All of the databases are critical and all must be included in database mirroring, so after that I tried to implement database mirroring again...
The system has 3 GB of RAM and SQL Server (mirror) is using 85 MB of RAM, but it still gives this error while trying to enable database mirroring for the 37th database:

"There is insufficient system memory to run this query"

WHY?

View 19 Replies View Related

How Do You Delete (x) Number Of Rows From Database

Jan 10, 2008

I am setting up a database which schedules production and tracks inventory of items on a daily basis. The scheduler may put in 100 identical entries (apart from the identity column) of an item with its corresponding quantity. My problem is: if there is a shipment of product (a subtraction of quantity from the database), how can I delete a specified number of rows where the inventory listing is 100,000 pcs? I think the DELETE TOP(r) command will work, but I don't know how to make the row count an actual variable. Maybe there is another way too...
My current, not-working attempt: I look at the product to delete, figure out how many rows to delete, and since it is not always an integer, figure out a quantity to add back in. The addition part works fine but the delete command needs work. Any help is appreciated.
int InvRows = 0;
decimal RealInvRows = 0;
decimal AddQty = 0;
int preAddAmount = 0;

protected void DelInv_Click(object sender, EventArgs e)
{
    Label TotProdSum = (Label)DetailsView2.FindControl("TotProdSum");
    Label RowQty = (Label)DetailsView3.FindControl("RowQty");
    int SubQty = Convert.ToInt32(ShipQty.Text);
    InvRows = SubQty / Convert.ToInt32(RowQty.Text) + 1;
    RealInvRows = SubQty / Convert.ToDecimal(RowQty.Text);
    AddQty = (InvRows - RealInvRows) * Convert.ToInt32(RowQty.Text);
    IntLbl.Text = Convert.ToString(InvRows);
    RealLbl.Text = Convert.ToString(RealInvRows);
    preAddAmount = Convert.ToInt32(AddQty);
    AddAmount.Text = Convert.ToString(preAddAmount);

    for (int r = 0; r <= InvRows; r++)
    {
        forWhile.DeleteCommand = "DELETE TOP (r) FROM Inventory WHERE (Inventory = @Inventory)";
        forWhile.DeleteParameters.Add("Inventory", RowQty.Text);
        forWhile.Delete();
        forWhile.DeleteParameters.Clear();
    }

    forWhile.InsertCommand = "INSERT INTO Inventory(Dte, Product, Inventory) VALUES (@Dte, @Product, @Inventory)";
    forWhile.InsertParameters.Add("Inventory", AddAmount.Text);
    forWhile.InsertParameters.Add("Product", InvProdDDL.Text);
    forWhile.InsertParameters.Add("Dte", Date.Text);
    forWhile.Insert();
    forWhile.InsertParameters.Clear();
}
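For the delete itself, TOP accepts a variable in T-SQL as long as it is parenthesized, so the row count does not need to be spliced into the command string or run inside a loop. A sketch of the statement the page could issue, with sample values standing in for the real parameters:

DECLARE @RowsToDelete INT, @Inventory INT;
SET @RowsToDelete = 5;        -- hypothetical values for illustration
SET @Inventory = 100000;

DELETE TOP (@RowsToDelete)
FROM Inventory
WHERE Inventory = @Inventory;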

View 1 Replies View Related

SQL Server 2005 Charts - Using Database Rows Rather Than Columns

Jun 14, 2007

I'm using Visual Studio 2005 and SQL Server 2005. I am creating a report to put on the Internet using C# and other .NET features. I'm building several charts using database information extracted from Microsoft Excel. I can make charts based on the data from columns using the categories, series, and values on the Data tab of chart properties. However, when I create a query that grabs a record ( a row) from a database, it forces me to use the field values even though each field contains only one value.



Basically, I want to take a row from a database and make it my x-axis data values and another row (record) from a database and make it my y-axis data values.



I'd prefer not to change the format in the database, if possible. Thanks in advance.
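One way to avoid restructuring the tables is to unpivot the row on the SQL side so its fields come back as values the chart can bind to. A rough sketch, assuming a hypothetical table dbo.ChartData with one row per series and columns Val1 to Val3 holding the individual points:

SELECT Series, Point, Value
FROM (SELECT Series, Val1, Val2, Val3
      FROM dbo.ChartData) AS src
UNPIVOT (Value FOR Point IN (Val1, Val2, Val3)) AS u;

-- each source row now comes back as three (Point, Value) rows,
-- which can feed the chart's category and value fields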

View 1 Replies View Related

Insert Number With Comma Into SQL Database 'int' Field

Feb 8, 2006

How can I allow users to input numbers with commas into a database field with an 'int' datatype without getting this error, 'Input string was not in a correct format'?
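If the cleanup has to happen on the database side, one option is to strip the separators before casting. A sketch, assuming the raw text arrives in an nvarchar variable or parameter:

DECLARE @RawValue NVARCHAR(20), @Clean INT;
SET @RawValue = N'1,250,000';
SET @Clean = CAST(REPLACE(@RawValue, ',', '') AS INT);
SELECT @Clean;   -- 1250000

On the ASP.NET side, parsing the input with a number style that allows thousands separators before it ever reaches the int column achieves the same thing.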

View 3 Replies View Related

How To Insert Rows Into Sql Ce From 2005

Jun 11, 2007

Hi!
I have a database in Sql Server 2005 and I want to transfer the data from 2005 to the Sql Server CE database. I've created a number of tables in CE but how do I write sql to transfer data?




Jesus saves. But Gretzky slaps in the rebound.

View 4 Replies View Related

How To Insert A Null DateTime Into A SQL Server 2005 Database

Apr 2, 2008

I'm importing data from an Excel file to a SQL Server database. Some of the data imported represents time as a double type, so I convert the times into DateTime to be inserted into the database. The time values that aren't available in the Excel file are 0, so what I want to do is insert null into the database for all the values that are 0 in the Excel file. How do I do that based on this code I have so far:

protected void ButtonImport_Click(object sender, EventArgs e)
{
    PanelUpload.Visible = false;
    PanelView.Visible = false;
    PanelImport.Visible = true;
    LabelImport.Text = "";

    OleDbCommand objCommand = new OleDbCommand();
    objCommand = ExcelConnection();
    OleDbDataReader reader;
    reader = objCommand.ExecuteReader();

    while (reader.Read())
    {
        DateTime? in_1 = null;
        DateTime? out_1 = null;
        DateTime? in_2 = null;
        DateTime? out_2 = null;

        int emp_id = Convert.ToInt32(reader["emp_id"]);
        DateTime date_entry = Convert.ToDateTime(reader["date_entry"]);

        if (Convert.ToDouble(reader["in_1"]) != 0)
            in_1 = ConvertDoubleToDateTime((double)reader["in_1"]);
        if (Convert.ToDouble(reader["out_1"]) != 0)
            out_1 = ConvertDoubleToDateTime((double)reader["out_1"]);
        if (Convert.ToDouble(reader["in_2"]) != 0)
            in_2 = ConvertDoubleToDateTime((double)reader["in_2"]);
        if (Convert.ToDouble(reader["out_2"]) != 0)
            out_2 = ConvertDoubleToDateTime((double)reader["out_2"]);

        ImportIntoAttendance(emp_id, date_entry, in_1, out_1, in_2, out_2);
    }

    reader.Close();
}

protected void ImportIntoAttendance(int emp_id, DateTime date_entry, DateTime? in_1, DateTime? out_1, DateTime? in_2, DateTime? out_2)
{
    SqlDataSource AttendanceDataSource = new SqlDataSource();
    AttendanceDataSource.ConnectionString = ConfigurationManager.ConnectionStrings["SalariesConnectionString1"].ToString();
    AttendanceDataSource.InsertCommandType = SqlDataSourceCommandType.Text;
    AttendanceDataSource.InsertCommand = "INSERT INTO Attendance (emp_id, date_entry, in_1, out_1, in_2, out_2) " +
        "VALUES ('" + emp_id + "', '" + date_entry + "', '" + in_1 + "', '" + out_1 + "', " +
        "'" + in_2 + "', '" + out_2 + "')";

    int rowsAffected = 0;
    try
    {
        rowsAffected = AttendanceDataSource.Insert();
    }
    catch (Exception ex)
    {
        LabelImport.Text += "<font color=red>" + ex + "</font><br />";
    }
}

private DateTime ConvertDoubleToDateTime(double dbTime)
{
    string[] SplitTime = dbTime.ToString().Split('.');
    string hours = SplitTime[0];
    string minutes = String.Empty;
    string time = String.Empty;

    if (dbTime.ToString().IndexOf('.') != -1)
    {
        if (SplitTime[1].Length >= 1)
        {
            if (SplitTime[1].Length == 1)
                minutes = Convert.ToString(Convert.ToDouble(SplitTime[1]) * 10);
            else if (SplitTime[1].Length > 1)
                minutes = SplitTime[1];
        }
    }
    else
        minutes = "00";

    time = hours + ":" + minutes;
    return Convert.ToDateTime(time);
}
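One low-friction fix on the SQL side is to stop concatenating the values into the INSERT string and pass them as parameters, letting NULLIF turn empty values into NULL. A sketch of the idea, assuming the in/out values arrive as text (empty when the Excel cell was 0) and that the in_1 .. out_2 columns accept NULL:

DECLARE @emp_id INT, @date_entry DATETIME;
DECLARE @in_1 VARCHAR(8), @out_1 VARCHAR(8), @in_2 VARCHAR(8), @out_2 VARCHAR(8);

SET @emp_id = 1;
SET @date_entry = GETDATE();
SET @in_1 = '08:30';
SET @out_1 = '';              -- a missing time arrives as an empty string

INSERT INTO Attendance (emp_id, date_entry, in_1, out_1, in_2, out_2)
VALUES (@emp_id, @date_entry,
        NULLIF(@in_1, ''), NULLIF(@out_1, ''),
        NULLIF(@in_2, ''), NULLIF(@out_2, ''));

With a parameterized command on the C# side, the nullable DateTime values can also be sent as DBNull.Value directly, which avoids the string handling altogether.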

View 3 Replies View Related

Help! How To Insert Multiple Rows Into Database???

Jan 7, 2007

I keep getting the error below, and it only inserts the 1st row into my database table:
The variable name '@CustId' has already been declared. Variable names must be unique within a query batch or stored procedure.
 
Protected Sub Button2_Click(ByVal sender As Object, ByVal e As System.EventArgs)

    Dim drow As GridViewRow

    For Each drow In GridView1.Rows

        Dim textBoxText As String = CType(drow.FindControl("Label2"), Label).Text

        SqlDataSource2.InsertParameters.Add("CustId", TypeCode.String, Profile.UserName)
        SqlDataSource2.InsertParameters.Add("OrderDate", TypeCode.DateTime, DateTime.Now.ToString)
        SqlDataSource2.InsertParameters.Add("Total", TypeCode.Double, TotalUnitPrice)
        SqlDataSource2.InsertParameters.Add("Quantity", TypeCode.Int32, textBoxText)

        SqlDataSource2.Insert()

    Next

    Response.Redirect("checkout.aspx")

End Sub

View 3 Replies View Related

How To Enter Many Rows Into A Table With Many Columns At A Time

Sep 24, 2007

Hi

I want to enter rows into a table that has many columns.

For example: I have one employee table with columns (name, address, salary, etc.).
How can I enter 100 employees' data at a time?

Suppose my data is in a .txt file or in .xls.

(SQL Server 2005)
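For the .txt case, BULK INSERT can load all the rows in one statement. A sketch, assuming a comma-delimited file at C:\employees.txt whose columns line up with the employee table (for .xls, OPENROWSET with the Excel provider or the Import/Export Wizard covers the same ground):

BULK INSERT dbo.Employee
FROM 'C:\employees.txt'
WITH (FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n',
      FIRSTROW = 1);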

View 1 Replies View Related

Insert Multiple Rows From Dataset Into SQL Database

Aug 16, 2006

Hi,
is there any way to insert all the rows from a dataset into a SQL Server table in one go?

Thanks
Anz

View 1 Replies View Related

Deleting Large Number Of Rows In SQL Server 2000

Jan 26, 2004

Hi,

I have some problems with our database which is growing too large, and was hoping someone might have some tips on what I can do!

I have about 100 clients, each logging about 10 000 rows of status logs a day. So after just a few days the db is growing very large.

At present it's manageable, since I don't need to "dig" into the logs more than a few times a day. The system itself is not affected by the size of the log or traffic on the server. But it will increase to about 500 clients in 2004, and 1000-1500 in 2005. So I really need a smarter solution than what I have today to be able to use the log efficiently.

98-99% of these rows are status messages which are more or less garbage during normal operation. But I still need to keep them in case an error occurs, and we need to go back an hour or two (maybe a day) to see what went wrong. After 24-48 hours these 98-99% are of no use. I do however like to keep the remaining 1-2%; they are messages like startup, errors, etc. Ideally they should be logged in two separate tables by the clients, but unfortunately I cannot make the clients change their logging.

This presents problems on multiple levels. Mainly in searching, which often times out, but also with backup and storage space. At the moment I check the system for errors, and every other day I just truncate the log file. It works, but it's not exactly elegant...

The server is a 1100 MHz P3 / 512MB / Windows 2000 Server /
SQL Server 2000. Faster hardware would help, but the problem is more of a "bad design" than "slow hardware" problem.

My log is pretty simple, as follows:

LogId - int - primary key - clustered index
ClientId - int - index asc
LogTypeId - int - index asc
LogValue - nvarchar[2500], no index
LogTimeStamp- datetime - index asc


I have come up with 3 different solutions:

Method 1:
Simply run "Delete from db_log where logtyipeid <> stuff_I_want_to_keep".

This is the simplest and the one I prefer, but it takes too long to complete. Any tips to speed this process up?
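One common way to speed Method 1 up, or at least keep it from blocking everything and bloating the log, is to delete in small batches. A SQL Server 2000 style sketch using SET ROWCOUNT, assuming the log types to keep are known (the IDs below are placeholders):

SET ROWCOUNT 10000

WHILE 1 = 1
BEGIN
    DELETE FROM db_log
    WHERE LogTypeId NOT IN (1, 2, 3)                      -- hypothetical "keep" types
      AND LogTimeStamp < DATEADD(hour, -48, GETDATE())
    IF @@ROWCOUNT = 0 BREAK
END

SET ROWCOUNT 0

Each batch commits separately, which keeps the transaction log and locking footprint of the delete small.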


Method 2:
Create a trigger which runs something like "Delete from db_log where logtypeid <> stuff_I_want_to_keep and date < today_minus_two_days" every hour or so. This will ensure that the db doesn't grow too large. But if I'm away from work a few days we might lose data we wanted to keep.


Method 3:

Copy what I want to keep into another table, and empty the log. Sort of like "Insert into db_log_keep stuff_to_keep; drop db_log; create table db_log; " (or truncate, but that takes a long time too)

But then I would be stuck with two log tables, "48-hour_db_log" and "db_log_keep". I could use a view to "union" them so they would appear as a single table, but that's not ideal either.

However, it seems as if this method is what will work best for my set-up, unless there are other suggestions??

Method 4:

...eagerly awaiting ideas!!! :-)




(Also, whatever tips and/or links to info on maintaining VLDBs are greatly appreciated.)

Thanks in advance for your help! :-)

Nikolai

View 4 Replies View Related

SQL Server 2008 :: Select Alternate Number Of Rows?

Jan 28, 2015

Is there a simple query to select alternate rows from a table?
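One straightforward approach is to number the rows and keep every other one. A sketch, assuming a hypothetical table dbo.MyTable ordered by its ID column:

;WITH numbered AS
(
    SELECT *, ROW_NUMBER() OVER (ORDER BY ID) AS rn
    FROM dbo.MyTable
)
SELECT *
FROM numbered
WHERE rn % 2 = 1;    -- odd-numbered rows; use rn % 2 = 0 for the even ones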

View 9 Replies View Related

SQL Server 2008 :: Update The Document Number Row For 3k Rows?

Aug 4, 2015

I have a table where I would like to update the document number row for 3k rows. The problem I have is that the documents come in sets of two (version 1 and 2) but both have different numbers. Picture it like this below:

DOCNUM: 4445787 Version 1
DOCNUM: 4445790 Version 2

It should be the same docnum (ie 4445787 Version 1, 4445787 Version 2).

The challenge is how can we assign the new docnum for version 1 to be also for version 2 as well. Basically in SQL we need a way to

1. Find a way to distinguish the pair of documents in the target db that are the same even though they have different docnums.

2. Update them so that the docnums match.
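Once something is found for step 1, step 2 is a self-join update. A sketch, assuming a hypothetical table dbo.Documents and a hypothetical shared key column ContractID that ties the version 1 and version 2 rows of the same document together:

UPDATE d2
SET    d2.DOCNUM = d1.DOCNUM
FROM   dbo.Documents AS d1
JOIN   dbo.Documents AS d2
         ON d2.ContractID = d1.ContractID
WHERE  d1.Version = 1
  AND  d2.Version = 2;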

View 7 Replies View Related

SQL Server 2014 :: Huge Number Of Rows In Table Spool

Jul 6, 2015

I have a CTE query against a table with 32K rows that runs fine in 2008R2. I am running it in 2014 Std Ed. against the same data and it runs very slowly. Looking at the execution plan I think I see what's contributing to the slowness.

Note that the "actual number of rows" is some 351M...how is this possible?

the query:

declare @amts table (claim int,allowed decimal(12,2),copay decimal(12,2),deductible decimal(12,2),coins decimal(12,2));
;with unpaid (claimID) as (select claimID from claim where amt+copay + disct+mm + ded=0)
insert @amts
select lineID, sum(rc), sum(copay), sum(deduct),
case when sum(mm)>0 and (sum(mm)<sum(mmamt)) then sum(mm) else 0 end
from claimln
where status is null
and lineID not in (select claimID from unpaid)
group by lineID

it's like there's some massively recursive process going on?
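The spool is usually the anti-join that NOT IN (subquery) produces when the subquery column is nullable: the plan has to keep re-checking for NULLs and the row counts multiply. Rewriting the filter as NOT EXISTS often removes it; a sketch of the same statement with only that change:

declare @amts table (claim int, allowed decimal(12,2), copay decimal(12,2), deductible decimal(12,2), coins decimal(12,2));

;with unpaid (claimID) as
    (select claimID from claim where amt + copay + disct + mm + ded = 0)
insert @amts
select c.lineID, sum(c.rc), sum(c.copay), sum(c.deduct),
       case when sum(c.mm) > 0 and (sum(c.mm) < sum(c.mmamt)) then sum(c.mm) else 0 end
from claimln as c
where c.status is null
  and not exists (select 1 from unpaid as u where u.claimID = c.lineID)
group by c.lineID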

View 5 Replies View Related

SQL Server 2008 :: Way To Check To Find Number Of Rows And Size Of A Table

Apr 29, 2015

How can we monitor all tables in all databases and send notifications to the team? Is there a way to check the number of rows and size of a table last month and find out the growth % now?
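For the current size, sys.dm_db_partition_stats gives row counts and reserved space per table; growth over time needs these results saved to a baseline table on a schedule (an Agent job, for example) so last month's numbers are there to compare against. A sketch of the per-database query:

SELECT  t.name AS TableName,
        SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS RowCnt,
        SUM(ps.reserved_page_count) * 8 / 1024.0 AS ReservedMB
FROM    sys.dm_db_partition_stats AS ps
        JOIN sys.tables AS t ON t.object_id = ps.object_id
GROUP BY t.name
ORDER BY ReservedMB DESC;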

View 4 Replies View Related

SQL Server 2008 :: Insert Data Into Table Variable But Need To Insert 1 Or 2 Rows Depending On Data

Feb 26, 2015

I am writing a query to return some production data. Basically I need to insert either 1 or 2 rows into a table variable based on whether the production part makes 1 or 2 items (the raw data does not allow for this; it comes from a lookup in my database).

I can retrieve all the source data I need easily, but when I come to insert it into the table variable I need to insert 1 record if it's a single part or 2 records if it's a twin part. I know I could use a cursor, but I'm sure there has to be an easier way!

Below is the code i have at the moment

declare @startdate as datetime
declare @enddate as datetime
declare @Line as Integer
DECLARE @count INT

set @startdate = '2015-01-01'
set @enddate = '2015-01-31'

[Code] .....
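One cursor-free pattern is to join the source to a two-row VALUES list and let the join condition decide whether a part gets one row or two. A sketch, assuming a hypothetical source #Parts(PartID, ItemsMade) where ItemsMade is 1 or 2:

DECLARE @Production TABLE (PartID INT, ItemNo INT);

CREATE TABLE #Parts (PartID INT, ItemsMade INT);
INSERT INTO #Parts VALUES (101, 1), (102, 2);        -- sample rows for illustration

INSERT INTO @Production (PartID, ItemNo)
SELECT p.PartID, n.ItemNo
FROM   #Parts AS p
JOIN   (VALUES (1), (2)) AS n (ItemNo)
         ON n.ItemNo <= p.ItemsMade;

SELECT * FROM @Production;    -- part 101 gets one row, part 102 gets two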

View 1 Replies View Related

SQL Server 2008 :: Deleting Large Number Of Rows With Foreign Key And Mirroring Setup

Feb 19, 2015

We have a database that is enabled for mirroring. We need to delete old records, around 500k rows from one table, but it has foreign key relationships. How should these kinds of deletes be done on production servers?
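The usual approach is to delete in small batches, children before parents, so each batch is a short transaction the mirror can keep up with. A sketch, assuming hypothetical tables dbo.Parent and dbo.Child where Child.ParentID references Parent.ID and rows older than the cutoff are to go:

DECLARE @Cutoff DATETIME = DATEADD(YEAR, -2, GETDATE());

WHILE 1 = 1
BEGIN
    DELETE TOP (5000) c
    FROM  dbo.Child AS c
    JOIN  dbo.Parent AS p ON p.ID = c.ParentID
    WHERE p.CreatedDate < @Cutoff;

    IF @@ROWCOUNT = 0 BREAK;
END

WHILE 1 = 1
BEGIN
    DELETE TOP (5000) FROM dbo.Parent WHERE CreatedDate < @Cutoff;
    IF @@ROWCOUNT = 0 BREAK;
END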

View 2 Replies View Related

SQL Server 2008 :: How To Pivot Unknown Number Of Rows To Columns Using Data As Column Headers

Sep 10, 2015

I have a single table that consists of 4 columns: Entity, ParamName, ParamsValue and ParamiValue. This table stores normalized late-fee related parameters for apartments. The Entity field contains a code that identifies the apartment complex. ParamName is a textual field that contains the name of the parameter that the other 2 fields define the value for: ParamsValue and ParamiValue. If the late-fee parameter (as named in ParamName) is something numerical, then the value for that parameter can be found in ParamiValue; else it's in ParamsValue.

I don't know if 'Pivot' is the correct term to use for describing what I am trying to do because I've looked at the Pivot examples and I don't see how that will work for this. Using the Table and data as provided below, how would I construct a query so that I get 1 row per Entity in which the columns are the ParamsValue or ParamiValue for the ParamName listed in the column header (for the query)?

Below is the DDL to create the table and populate it.

USE [DBA_UTIL]
CREATE TABLE [dbo].[PARAMEXAMPLE](
[Entity] [varchar](16) NULL,

[Code]....
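Since the ParamName values are not known ahead of time, the column list has to be built dynamically and fed to PIVOT. A sketch against the table above, assuming ParamsValue is character data and ParamiValue is numeric as described, with COALESCE folding the two value columns into one before pivoting:

DECLARE @cols NVARCHAR(MAX), @sql NVARCHAR(MAX);

SELECT @cols = STUFF((SELECT DISTINCT ',' + QUOTENAME(ParamName)
                      FROM dbo.PARAMEXAMPLE
                      FOR XML PATH('')), 1, 1, '');

SET @sql = N'SELECT Entity, ' + @cols + N'
FROM (SELECT Entity, ParamName,
             COALESCE(ParamsValue, CONVERT(varchar(64), ParamiValue)) AS Val
      FROM dbo.PARAMEXAMPLE) AS src
PIVOT (MAX(Val) FOR ParamName IN (' + @cols + N')) AS p;';

EXEC sys.sp_executesql @sql;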

View 4 Replies View Related

How Can I Automate The Column Titled 'ID Number' In My Database, In VS 2005?

Oct 23, 2007

I have a table with a primary key titled 'ID Number' which needs to be generated automatically; however, every time I add a new record the ID is not added and I have to write it in manually, i.e. 1, 2, 3... Could you please advise how I can set this up? I know you can do this with Microsoft Access, but with VS 2005 + the VB language this option is not available under data type.
*I am using VS 2005 and the VB language
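On the SQL Server side this is an IDENTITY column: once the table is defined this way the database hands out 1, 2, 3... by itself, and the VB code simply never supplies a value for that column. A sketch with a hypothetical table name:

CREATE TABLE dbo.MyRecords
(
    [ID Number]   INT IDENTITY(1, 1) NOT NULL PRIMARY KEY,
    [Description] NVARCHAR(100) NULL
);

-- no value is supplied for [ID Number]; SQL Server generates it
INSERT INTO dbo.MyRecords ([Description]) VALUES (N'first row');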

View 7 Replies View Related

Forwarding Variable Number Of Parameters From VB.2005 To Sql Server 2005 Stored Procedure

Jan 15, 2008

I have a problem forwarding 'N number of parameters' from Visual Studio 2005 (VB) to a SQL Server 2005 stored procedure. I have to save N rows in my stored procedure as one transaction: if all rows are not saved successfully I have to roll back, otherwise I also update another table afterwards. I am unable to work out how to send a variable number of parameters from Visual Studio to SQL Server. My requirement is to use the stored procedure to store all the rows in the base table and related tables, and then update another table based on the updates made. Please help.
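SQL Server 2005 has no table-valued parameters, so one workable pattern is to send all N rows as a single XML parameter and process them inside one transaction. A sketch, with hypothetical object and column names:

CREATE PROCEDURE dbo.usp_SaveRows
    @Rows XML            -- e.g. N'<r c1="A" c2="1"/><r c1="B" c2="2"/>'
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        INSERT INTO dbo.BaseTable (Col1, Col2)
        SELECT r.value('@c1', 'varchar(50)'),
               r.value('@c2', 'int')
        FROM @Rows.nodes('/r') AS T(r);

        -- related updates to the other table would go here

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        DECLARE @msg NVARCHAR(2048);
        SET @msg = ERROR_MESSAGE();
        RAISERROR(@msg, 16, 1);    -- re-raise so the VB caller sees the failure
    END CATCH
END

The VB side then builds one XML string and calls the procedure once, instead of trying to pass a variable number of parameters.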

View 1 Replies View Related

Insert Rows From Other Server

Jul 20, 2005

Hi All

I want to insert rows from a table on one server into a table on another server using an INSERT ... SELECT command. For example:

INSERT INTO Server1.database1.dbo.Tab1
SELECT * FROM Server2.database2.dbo.Tab1
WHERE Col1 = 1

Can a command like this work? If not, could you give me the solution?

Please help me. Thanks in advance.

John Smile
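Yes, exactly that four-part syntax works once the remote server is registered as a linked server. A sketch run on Server1, assuming Server2 still needs to be added (a login mapping via sp_addlinkedsrvlogin may also be required):

EXEC sp_addlinkedserver @server = N'Server2', @srvproduct = N'SQL Server';

-- running locally on Server1, so the target only needs database.schema.table
INSERT INTO database1.dbo.Tab1
SELECT *
FROM Server2.database2.dbo.Tab1
WHERE Col1 = 1;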

View 3 Replies View Related

SQL Server Admin 2014 :: Why Number Of Reads Increases During Insert Test

Jun 2, 2014

I am writing a performance baseline test.

The first test writes 5000000 rows in one table. I realise this is not representative OLTP behaviour, but it helped me to start interpreting performance counters and to test several setups to be discussed with our server, storage and network administrators. This way we have been able to compare the results of different hard disks, LUN vs vmdk, 1GB vs 10GB network, AMD vs Intel, etc. This way I can also compare several SQL setups (recovery model, max memory config, ...)

The screenshot shows the results of 2 runs on the same server : Win2012R2, SQL2014, 16GB RAM.

In test 1 min/max server memory was set to 9215MB/10751MB
In test 2 min/max server memory was set to 13311MB/14847MB

The script assures the number of bytes inserted in the nvarchar columns is always the same.

This explains why the number of pages and the number of MB in the table are the same at the end of the 2 tests (column 5 and 6)

Since ca 13GB has to be written, the results of test 1 show the lead time increasing once more than 10GB has been inserted (columns 8 and 9). In addition you can see at that moment:

- buffer cache hit ratio is decreasing
- page life expectancy becomes "terrible"
- free list stall/sec increases
- lazy writes/sec increases
- readlatency increases (write latency does not)

In test 2 (id 3 in column 1 in the screenshot) those counters are not really influenced (since the 5000000 rows can all be stored in memory).

Now what I do not understand is:

Why the number of pages read (instance level), as well as the number of bytes read and the number of reads (database level), increases so dramatically during run 1.

I expected to see serious impact on write behavior, since SQL Server is forced to start flushing dirty pages once memory is filled. Well, actually you can see here that the number of writes (not the number of bytes written) starts to increase faster in test 1 after 4000000 rows, but there's no real impact on write latency.

Finally I want to notice

- I'm the only user on this machine
- the table has a clustered index on a identity column
- there are no foreign key constraints
- inserts are executed using a loop, not one big transaction
- to monitor progress and behaviour/impact, every 10,000 loops the counters are stored using DMV queries

So I wonder why SQL Server starts to execute so many reads in test 1.

View 4 Replies View Related

OPENROWSET (INSERT) Insert Error: Column Name Or Number Of Supplied Values Does Not Match Table Definition.

Mar 24, 2008

Is there a way to avoid entering column names in the Excel template when I create an Excel file from a dynamic query using OPENROWSET?
I have the following code, but it only works when column names are given ahead of time.
If I remove the column names from the template and just do SELECT * FROM the table and SELECT * FROM Sheet1, then it tells me that the column names do not match.
Server: Msg 213, Level 16, State 5, Line 1 Insert Error: Column name or number of supplied values does not match table definition.
here is my code...
SET @sql1 = 'select * from table1'
SET @sql2 = 'select * from table2'

IF @File_Name = ''
    SELECT @fn = 'C:\Test1.xls'
ELSE
    SELECT @fn = 'C:\' + @File_Name + '.xls'

-- FileCopy command string formation
SELECT @Cmd = 'Copy C:\TestTemplate1.xls ' + @fn

-- FileCopy command execution through shell command
EXEC MASTER..XP_CMDSHELL @cmd, NO_OUTPUT

-- Mentioning the OLEDB provider and Excel destination filename
SET @provider = 'Microsoft.Jet.OLEDB.4.0'
SET @ExcelString = 'Excel 8.0;HDR=yes;Database=' + @fn

EXEC('insert into OPENROWSET(''' + @provider + ''',''' + @ExcelString + ''',''SELECT * FROM [Sheet1$]'') ' + @sql1 + '')
EXEC('insert into OPENROWSET(''' + @provider + ''',''' + @ExcelString + ''',''SELECT * FROM [Sheet2$]'') ' + @sql2 + '')

View 4 Replies View Related

SQL Server 2008 :: INSERT INTO Not Inserting Enough Rows

May 22, 2015

I've got a piece of code that returns 53 records when using just the SELECT section. When I change it to INSERT INTO ..... SELECT, it only inserts 39 records into the receiving table. There are no keys/constraints/indices or anything else on the receiving table (it's just a dumping ground for some data that will be processed later).

The code for creating the table is here:-
USE [CDSExtractInpatients6.2]
GO
/****** Object: Table [dbo].[CDS_Inpatients_CDS_Feeds_Import] Script Date: 22/05/2015 15:54:15 ******/
SET ANSI_NULLS ON
GO

[code]...

I know most of the date fields are being created as varchar here, but this is something I inherited and the SELECT outputs the dates as text. I don't know if it makes any difference, but the server is running SQL 2008.

View 9 Replies View Related

SQL Server 2012 :: Insert Rows In A Table

Jul 28, 2015

In a T-SQL 2012 script, I have the following statement, which only works for a few records since the value of TST.dbo.LockCombination.seq only contains the value 1 in most cases. Basically, for every join listed below there should be 5 records, each with a distinct seq value of 1, 2, 3, 4, and 5. My goal is to work out how to add the missing rows to TST.dbo.LockCombination where there are no rows for seq values 2 to 5, and then run the update statement below. Can you show me the SQL to add the rows for at least one of the missing sequence numbers?

UPDATE LKC
SET LKC.combo = lockCombo2
FROM [LockerPopulation] A
JOIN TST.dbo.School SCH ON A.schoolnumber = SCH.type
JOIN TST.dbo.Locker LKR ON SCH.schoolID = LKR.schoolID AND A.lockerNumber = LKR.number
JOIN TST.dbo.Lock LK ON LKR.lockID = LK.lockID
JOIN TST.dbo.LockCombination LKC ON LK.lockID = LKC.lockID
WHERE LKC.seq = 2
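Before that update can touch seq 2, the missing rows have to exist. One way to generate them is to cross join the existing seq 1 rows with the missing sequence numbers and insert whatever is not already there; a sketch, assuming lockID, seq and combo are the only columns that need values (the seq 1 combo is copied as a placeholder, since the update above replaces it):

INSERT INTO TST.dbo.LockCombination (lockID, seq, combo)
SELECT lkc1.lockID, n.seq, lkc1.combo
FROM TST.dbo.LockCombination AS lkc1
CROSS JOIN (VALUES (2), (3), (4), (5)) AS n (seq)
WHERE lkc1.seq = 1
  AND NOT EXISTS (SELECT 1
                  FROM TST.dbo.LockCombination AS lkc2
                  WHERE lkc2.lockID = lkc1.lockID
                    AND lkc2.seq = n.seq);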

A normal select statement looks like the following:

select * from TST.dbo.Locker LKR
JOIN TST.dbo.Lock LK ON LKR.lockID = LK.lockID
JOIN TST.dbo.LockCombination LKC ON LK.lockID = LKC.lockID
where LKR.number in (000,001,1237)

In case you need the ddl statements for the tables affected here are the ddl statements:

CREATE TABLE [dbo].[Locker](
[lockerID] [int] IDENTITY(1,1) NOT FOR REPLICATION NOT NULL,
[schoolID] [int] NOT NULL,
[number] [varchar](10) NOT NULL,
[serialNumber] [varchar](20) NULL,
[type] [varchar](3) NULL,
[locationID] [int] NULL,

[code]...-

View 1 Replies View Related

SQL Server 2012 :: Insert Rows Into A Table Without Dropping It

Sep 23, 2014

I am finding it difficult to find an example of inserting additional rows into a table without dropping the table I'm inserting into, or of inserting specific values, like this example:

[URL] ....

I have 6 tables whose data I am formatting to conform to the final table as I insert it, but none of these examples shows what I need. I am using SQL 2012.

SELECT
CONVERT(VARCHAR(50),[FName]) + ' ' + CONVERT(VARCHAR(50),[LName]) AS [CustName]
,CAST('ALARMCOM' as nvarchar(8)) as VendorName
,CONVERT(VARCHAR(25),[CUSTOMER_CS_ACCOUNT_NUMBER]) AS [Cust_ID]
,CONVERT(VARCHAR(40),[Charge_Description])as [ChargeType]
,CASE

[Code] ....
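For appending to an existing table nothing needs to be dropped or recreated; INSERT ... SELECT adds the formatted rows onto whatever is already there. A sketch reusing the expressions above, with hypothetical source and target table names:

INSERT INTO dbo.FinalTable (CustName, VendorName, Cust_ID, ChargeType)
SELECT CONVERT(VARCHAR(50), [FName]) + ' ' + CONVERT(VARCHAR(50), [LName]),
       CAST('ALARMCOM' AS NVARCHAR(8)),
       CONVERT(VARCHAR(25), [CUSTOMER_CS_ACCOUNT_NUMBER]),
       CONVERT(VARCHAR(40), [Charge_Description])
FROM   dbo.SourceTable1;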

View 6 Replies View Related

SQL Server 2012 :: Premature Casting On 0 Rows Insert

Feb 28, 2015

I have encountered some weird behaviour. Code that has been working for "eternities" suddenly started to fail. I couldn't recreate it on any other machines.

I have written a sample script to illustrate the issue:

CREATE TABLE #t_parcels (parcel_id INT, current_pos NUMERIC(28, 0), end_pos NUMERIC(28, 0))
CREATE TABLE #t_orders (outorder INT, parcel_id INT)
CREATE TABLE #t_missing_parcels (parcel_id INT, diff INT)

INSERT INTO #t_parcels (parcel_id, current_pos, end_pos)
SELECT1, 100000000000.0, 900000000000.0

[Code] ..

The last insert crashed with: "Arithmetic overflow error converting expression to data type int", even though there are no rows that satisfy the condition!

This is due to the "diff" column having the wrong datatype, BUT the insert had no hits in the database. So how can inserting 0 rows crash with an incorrect datatype?

I even copied the select so it was run before the insert, and in that case the SELECT completed successfully.

When i changed datatype in the table, the error went away, but I'm still curious what led to the error.

View 9 Replies View Related

SQL Server 2008 :: BULK INSERT Inserting No Rows

Aug 7, 2015

I am trying to BULK INSERT csv files using a stored procedure in SQL SERVER 2008R2 SP3. Although the files contain several thousand lines and BULK INSERT returns no errors, no data is actually imported into the table. Every field in the table is a NVARCHAR(50) datatype.

Here is the code for the operation (only the parameters for the insert itself):

set @open = 'bulk insert [DWHStaging].[dbo].[Abverkaufsquote] from '''
set @path = 'G:DataStagingDWHStagingSourceAbverkaufsquote'
set @params = ''' with (firstrow = 2
, datafiletype = ''widechar''
, fieldterminator = '';''
, rowterminator = ''
''
, codepage = ''1252''
, keepnulls);'

The csv file originates from a DB2 database. Using exactly the same code base I can import several other types of CSV files without problem.

The files are stored on the local server as UCS-2 Little Endian. One difference is that the files that do not import do not include a BOM; the other difference is that the failing files are non-Unicode files.

View 4 Replies View Related

SQL Server 2014 :: Insert 500 Million Rows Into In-memory Table

Jul 29, 2014

I am doing performance testing for the In-Memory option in SQL Server 2014. As part of it I want to insert 500 million rows into an in-memory enabled test table I have created.

I need a sample script to insert 500 million records into a table ....
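A set-based loop that inserts the rows in batches is usually enough for this kind of load test. A sketch, assuming a hypothetical memory-optimized table dbo.TestInMem (ID INT NOT NULL PRIMARY KEY NONCLUSTERED, Payload NVARCHAR(100)) already exists and the server has memory sized for it:

DECLARE @batch INT = 0;

WHILE @batch < 500                    -- 500 batches x 1,000,000 rows = 500 million
BEGIN
    INSERT INTO dbo.TestInMem (ID, Payload)
    SELECT @batch * 1000000 + n.rn,
           N'row ' + CAST(@batch * 1000000 + n.rn AS NVARCHAR(20))
    FROM (SELECT TOP (1000000)
                 ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS rn
          FROM sys.all_objects AS a
          CROSS JOIN sys.all_objects AS b
          CROSS JOIN sys.all_objects AS c) AS n;

    SET @batch += 1;
END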

View 9 Replies View Related






