Bulk Insert And Data Aggregation

Jan 30, 2008

I manage a legacy system that dumps its data into a number of different databases (same schema) on a nightly basis using bulk insert. I need to formulate a strategy for efficiently aggregating that data into a single database right after these nightly extractions complete. Here is my current strategy:

1. Duplicate the legacy system's database schema and add an identifier column to specify which database the data was loaded from.

2. Each night, delete all records in each table.

3. Each night, for each database:

3a. Set the default value of each table's identifier column to a value that references the database currently being loaded.

3b. Use the legacy system's flat files and format files to bulk insert into the database.

3c. Clear the default value. (A T-SQL sketch of steps 3a-3c follows below.)
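An editorial sketch of steps 3a-3c for one table and one source database; the table, column, constraint, and path names here are placeholders, not from the original post:

ALTER TABLE dbo.Orders
    ADD CONSTRAINT DF_Orders_SourceDb DEFAULT ('LegacyDB01') FOR SourceDb

BULK INSERT dbo.Orders
    FROM '\\fileserver\extracts\LegacyDB01\Orders.dat'
    WITH (FORMATFILE = '\\fileserver\extracts\Orders.fmt', TABLOCK)

ALTER TABLE dbo.Orders
    DROP CONSTRAINT DF_Orders_SourceDb

Because the format file describes only the legacy columns, the unmapped identifier column picks up the default during the load. Dropping nonclustered indexes before the loads and rebuilding them once afterwards, plus loading with TABLOCK, are the usual performance levers for this pattern.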


What other steps would facilitate performance? Dropping and recreating the indexes? Does anyone foresee faults in this strategy?

Thanks,
Matt




Bulk Insert - Bulk Load Data Conversion Error

Jan 17, 2008

I'm having some issues with bulk insert.

This is the table:

CREATE TABLE [dbo].[tmp_GA_status](

[GA_recno] [int] NOT NULL,

[GA_desc] [varchar](40) NULL

)


This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"


and this is the sql:

bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'

with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')



So yeah, pretty simple. But whatever I do, I get this:

Msg 4864, Level 16, State 1, Line 1

Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).



So what am I doing wrong?
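An editorial note, not from the thread: DATAFILETYPE='widechar' requires the file to be genuinely UTF-16 (as produced by saving as "Unicode" in Windows tools, including a byte-order mark). If the dump is actually ANSI or UTF-8, every byte pair is misread and Msg 4864 is the typical symptom. A hedged sketch of the ANSI variant for comparison:

bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'
with (CODEPAGE='ACP', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='char')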


BULK INSERT ERROR Using Format File - Bulk Load Data Conversion Error

Jun 29, 2015

I'm trying to use BULK INSERT for the first time and I'm getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNChar for the first column. I've checked that the end of each line is CR LF, therefore the "\r\n" terminator on line 7 of the format file should be correct, right?

Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

BULK INSERT tbl_ASX_Data_temp
FROM 'M:\Data\ASXImportTest.txt'
WITH (FORMATFILE = 'M:\Data\ASXSQLFormatImport.Fmt')

[code]...
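An editorial sketch of what the non-XML format file rows might look like for a Unicode (widechar) file feeding an nvarchar(6) first column; the second field's name and width are hypothetical. Note that for SQLNCHAR fields the host-file length is counted in bytes, so nvarchar(6) needs 12:

9.0
2
1 SQLNCHAR 0 12 "\t" 1 ASXCode ""
2 SQLNCHAR 0 200 "\r\n" 2 OtherField ""

If the data file is plain ASCII rather than Unicode, SQLCHAR with byte widths is the right choice even though the destination column is nvarchar, and a truncation error on column 1 usually means a terminator was not found where expected.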


How To Insert Data From A File Into Table Having Two Columns-BULK INSERT

Oct 12, 2007



Hi,
i have a file which consists data as below,

3
123||
456||
789||

I am reading the file using bulk insert and inserting these phone numbers into a table having one column, as below.


BULK INSERT TABLE_NAME FROM 'FILE_PATH'
WITH (KEEPNULLS,FIRSTROW=2,ROWTERMINATOR = '||')

but I want to insert the data into a table having two columns. If I try to insert the data into a table having two columns, it doesn't insert.

Can anyone help me with how to do this?
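An editorial sketch, not from the thread: a non-XML format file can map the file's single data field to one table column and leave the second column to its default/NULL. Since each line ends with "||" followed by a line break, the combined terminator can be declared on that one field (the column name is hypothetical; adjust "\r\n" to the file's actual line ending). Because BULK INSERT counts rows using the format file's terminators, the header line "3" is easiest to strip from the file before the load:

9.0
1
1 SQLCHAR 0 20 "||\r\n" 1 PhoneNumber ""

BULK INSERT TABLE_NAME FROM 'FILE_PATH'
WITH (KEEPNULLS, FORMATFILE = 'C:\phones.fmt')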

Thanks,
-Badri


How Do You Use An Identity Column When Doing A Bulk Insert Using The Bulk Insert Task Editor

Apr 18, 2008



Hello,

I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file into an empty SQL table that has an identity column defined. How do I tell the Bulk Insert task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to keep the identity column in the table too.
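An editorial note, not from the thread: the Bulk Insert task can be pointed at a format file, and a non-XML format file that maps the file's fields to server columns 2..n never touches identity column 1, so the server generates those values during the load. A hedged sketch for a hypothetical table (ID int IDENTITY, Name varchar(100), Age int) and a two-field tab-delimited file:

9.0
2
1 SQLCHAR 0 100 "\t" 2 Name ""
2 SQLCHAR 0 12 "\r\n" 3 Age ""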

Thanks.


How To Insert Data Into Sql Server In Bulk Using ADO.net

Dec 27, 2007

Hi!
I'm building a web application. I need to read data from a text or Excel file, process the data, and then store the resulting records in the database. The number of records is large. I can store the records in the database (SQL Server 2005) one at a time, but I think that's slow. Is there any way to insert the data in bulk?
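An editorial sketch, not from the thread: if the processed results can be written out as a delimited file, a single server-side BULK INSERT is far faster than row-at-a-time inserts (the path and table name are hypothetical):

BULK INSERT dbo.Results
FROM 'C:\temp\results.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK)

ADO.NET 2.0 also ships a bulk-load class, System.Data.SqlClient.SqlBulkCopy, which streams an in-memory DataTable or data reader to the server without an intermediate file.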
 
Thanks!
ccy


Bulk Insert With CSV Data Files

Apr 18, 2001

SQL Server 7.0 doesn't seem to support data files for bulk insert that have quoted text fields.

e.g.
" 1","Farmer","Joe","AAA","Smith John","",20001001,

I've tried using the format file to strip out the quotes. But, this doesn't seem to work.

My format file looks like this:
4.2
9
1 SQLCHAR 0 0 "\"" 0 dummy1
2 SQLCHAR 0 9 "\",\"" 2 EmployeeID
3 SQLCHAR 0 35 "\",\"" 3 LastName
4 SQLCHAR 0 35 "\",\"" 4 FirstName
5 SQLCHAR 0 10 "\",\"" 5 Category
6 SQLCHAR 0 35 "\",\"" 6 Supervisor
7 SQLCHAR 0 5 "\"," 7 OpCode
8 SQLCHAR 0 8 "," 8 HireDate
9 SQLCHAR 0 8 "\r\n" 9 TermDate


Any idea on how I can bulk insert a data file where some of the fields are quoted?


Import Data With Bulk Insert

Jun 25, 2014

I have imported data into my table using the bulk insert command. I was supposed to fill specific columns of my table with that data, so I used a view to put them in the columns I wanted.

The table looks like this now:

id | id_param | val_param
---+----------+----------
 1 | no_tel   | 742062141
 2 | sex      | 1
 3 | age      | 23
 4 | no_tel   | 765234157
 5 | sex      | 1
 6 | age      | 34

When I want to select only the val_param that is = 1 for id_param = 'sex', using this query:

select * from bd_rox where id_param='sex' and val_param='1'

it returns no rows and I don't know why. The wanted result should look like this:

id | id_param | val_param
---+----------+----------
 2 | sex      | 1
 5 | sex      | 1
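An editorial guess, not from the thread: bulk-inserted text values often carry trailing spaces, tabs, or carriage returns from the terminators, which makes an exact equality test fail even though the rows look right. A quick diagnostic:

select * from bd_rox
where rtrim(id_param) = 'sex'
  and rtrim(replace(val_param, char(13), '')) = '1'

RTRIM only strips spaces, hence the REPLACE for a stray carriage return; if this version returns the expected rows, fixing the terminators in the original bulk insert is the real cure.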


Indexes On Bulk Insert Data

Jul 20, 2005

Any help would be appreciated. I am running a script that does the following in succession:

1. Drop the existing database and create a new database
2. Define tables, stored procedures and functions in the database
3. Import data using bulk insert
4. Analyze data using stored procedures

I would like to improve the performance of the analysis in step 4 by creating indexes in step 2.

Question 1: Are indexes updated when data is bulk inserted? I know they are when using normal insert, update, or delete T-SQL, but I am not sure about bulk insert of data.

Question 2: Do I need to update the index statistics in any way, or would they be ready to use in step 4?

Thanks,
CJ
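An editorial note, not from the thread: indexes that exist on the target table are maintained during a bulk insert (which is also why loads into heavily indexed tables run slower), and refreshing statistics after a large load is cheap insurance before the analysis step. A sketch, with a hypothetical table name:

-- after the bulk load, before step 4
UPDATE STATISTICS dbo.ImportedData WITH FULLSCAN

-- or refresh statistics database-wide
EXEC sp_updatestats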


Data Transfer,bcp, Bulk Insert Problem

Sep 8, 2000

I have installed an MSDE engine (SQL Server 7 Desktop) on an NT 4.0 SP6 workstation. I have around 17 MB of data I need to transfer from a SQL Server 7 on an NT 4.0 SP6 server. I can't get anything to work. Bcp complains, something to the effect that there is no server attribute, which is true because it is a workstation; if I remove the -S option it complains that I did not designate the server. This bcp script works just fine from SQL Server 6.5 on servers. DTS puts up a message that I do not have a license to use DTS for the SQL Server Desktop. I tried transferring the database to Access and then using the Upsizing Wizard, but the database is too big. I tried a bulk insert, but I got the vague message 'OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.' What is the best way to do this? Why am I having trouble with bcp and bulk insert?


Loading Datetime Data Using Bcp/bulk Insert

Dec 11, 1998

Short version:
What is the best/fastest way to load large amounts of data from a comma-delimited text file into a SQL Server table, where the text file contains date fields in ccyy/mm/dd format and the SQL Server table defines those fields as datetime data types?

Details:
When I attempt to load files (using either bcp or BULK INSERT) containing datetime data, the load process errors because the datetime fields in my text file are in ccyy/mm/dd format and the default format for SQL Server is mm/dd/yy. I have been unable to change the default format by using the SET DATEFORMAT statement (apparently SET DATEFORMAT will not work for bcp, because bcp runs outside of the SQL Server session???).
The only alternatives that I have come up with are: 1) Change the format of the date fields in the text file from ccyy/mm/dd to mm/dd/ccyy. 2) Create a temporary table that defines the date fields as a char(n) datatype, load the data into the temp table, SET the DATEFORMAT to ccyy/mm/dd, and then copy the temp table into the permanent table (which uses datetime data types).

Both of these alternatives would require additional processing time. Since this is a process that loads large amounts of data on a monthly (soon to be weekly) basis, speed is of the essence.
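An editorial sketch of alternative 2, with hypothetical names; note that CONVERT with style 111 understands ccyy/mm/dd directly, so no session-level DATEFORMAT change is even needed:

CREATE TABLE #stage (TradeDate char(10), Amount varchar(20))

BULK INSERT #stage FROM 'C:\data\monthly.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

INSERT INTO dbo.Permanent (TradeDate, Amount)
SELECT CONVERT(datetime, TradeDate, 111), CONVERT(money, Amount)
FROM #stage

The staging pass does add time, but the conversion is a single set-based INSERT...SELECT, which is usually far cheaper than reformatting the text file itself.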

I would appreciate any suggestions.

Thanks!


Retrieve Truncated Data From BULK INSERT?

Jan 10, 2007

Hi everyone. Does anyone know if you can retrieve truncated data from a BULK INSERT operation?

We have a file that needs to be inserted into a SQL Server database. There is a field that has a maximum of 8000 characters, but sometimes users submit files that have more than that. We need to be able to capture the truncated data. The BULK INSERT operation does not throw an error. The only way I can think of to get the data is to bulk insert into a temporary table with a memo field and then copy it over, but that may really slow down the SP.
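An editorial sketch of that staging idea on SQL Server 2005, with hypothetical names, using varchar(max) so nothing is lost and flagging the overlong rows before the copy:

CREATE TABLE #stage (Notes varchar(max))

BULK INSERT #stage FROM 'C:\data\notes.txt'
WITH (ROWTERMINATOR = '\n')

-- capture what the direct load would have silently truncated
SELECT Notes FROM #stage WHERE LEN(Notes) > 8000

INSERT INTO dbo.Target (Notes)
SELECT LEFT(Notes, 8000) FROM #stage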

Has anyone encountered this situation before? I also have the option of parsing the file in .NET.

Thanks and take care,

Angel


Bulk Insert From Txt File Less Data Than Columns

May 23, 2007

Hi, I'm trying to bulk insert files that look like this:

aaaa,bbb,dddd,
ccc,dfd,tghj,

Each file can have up to 10 data fields per line, and within a particular file every line has the same number of data fields, let's say 3 as above. A second file could have, say, 10, and that is the maximum.

I read the file and insert the data, with a field terminator, into a temp table, from which I insert the data into other tables depending on some parameters inside.

Now problem is:
Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

That is because I'm trying to insert 3 fields of data into a temporary table which is made of 10 columns (it has to be 10 because the next file could have 10 fields of data). If the temp table has the same number of columns as the text file has data fields, then it works.

What is the solution for this problem?
Can I bulk insert NULL into the columns for which I don't have data?

I can also import each line of the text file into one column (with the delimiters inside), but then I don't know how to insert that data into the correct tables, or even into one table while separating the data fields into columns using the field terminator, which is ',' in this case.
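An editorial sketch, not from the thread: a non-XML format file that describes only the fields actually present maps them to the first columns, and the remaining unmapped columns receive NULL. For the 3-field case above (column names hypothetical; note the trailing comma in each line, so the last terminator is ",\r\n"):

9.0
3
1 SQLCHAR 0 8000 "," 1 col01 ""
2 SQLCHAR 0 8000 "," 2 col02 ""
3 SQLCHAR 0 8000 ",\r\n" 3 col03 ""

Since the field count varies per file, the loader can count delimiters in the first line and generate this format file on the fly before calling BULK INSERT with FORMATFILE.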

I'm new to SQL and I would appreciate any help.
Thank you


Transfer Data : DTS, Replication Or Bulk Insert ?

Mar 6, 2006

Hi,

I need to copy a table between 2 different SQL servers.

There are a lot of different solutions:

-DTS
-Bulk Insert
-Replication
...

What are the pros and cons of these solutions?
How do I choose?

Thanks


SSIS Bulk Insert For Data Insertion.

Oct 12, 2007

Hi,

I ran my package and it was successful. I tried running it again, but this time it throws this error:


Dim_T_Account [56575]: Unable to prepare the SSIS bulk insert for data insertion.

Error: 0xC004701A at CallerType, CallerChannel, Dealer, DODealer, HotlineType, Model, Reg'l Signal Code, Account, Contact, DTS.Pipeline: component "Dim_T_Account" (56575) failed the pre-execute phase and returned error code 0xC0202071.

Information: 0x40043008 at CallerType, CallerChannel, Dealer, DODealer, HotlineType, Model, Reg'l Signal Code, Account, Contact, DTS.Pipeline: Post Execute phase is beginning.



Why, suddenly and without changing anything, am I encountering this error? What does it mean that it cannot prepare the SSIS bulk insert? My connection to the server is working OK.

cherriesh


Data Transformation Services (DTS)(Bulk Insert)

May 31, 2007



Hi All,

I'm using a DTS package, a tool to transfer data from a txt file to the database (Bulk Insert).

The Bulk Insert task provides an efficient way to copy large amounts of data into a SQL Server table or view. It seems that the Bulk Insert task supports only OLE DB connections for the destination database, but I want to use SQL Server authentication, and I believe an OLE DB connection requires Windows authentication.



So can the bulk insert be done using SQL Server authentication? If yes, then please help me.


I have given the code snippet below.

Code sample:
Dim oPackage As New DTS.Package2()
Dim oConnection As DTS.Connection
Dim oStep As DTS.Step2
Dim oTask As DTS.Task
Dim oCustomTask As DTS.BulkInsertTask
Try
oConnection = oPackage.Connections.New("SQLOLEDB")
oStep = oPackage.Steps.New
oTask = oPackage.Tasks.New("DTSBulkInsertTask")
oCustomTask = oTask.CustomTask
With oConnection
.Catalog = "pubs"
.DataSource = "(local)"
.ID = 1
' Editorial note: UseTrustedConnection = True forces Windows authentication,
' so the UserID/Password below are ignored. For SQL Server authentication,
' set UseTrustedConnection = False and keep the UserID/Password pair.
.UseTrustedConnection = True
.UserID = "Tony Patton"
.Password = "Builder"
End With
oPackage.Connections.Add(oConnection)
oConnection = Nothing
With oStep
.Name = "GenericPkgStep"
.ExecuteInMainThread = True
End With
With oCustomTask
.Name = "GenericPkgTask"
.DataFile = "c:\dts\authors.txt"
.ConnectionID = 1
.DestinationTableName = "pubs..authors"
.FieldTerminator = "|"
.RowTerminator = vbCrLf
End With
oStep.TaskName = oCustomTask.Name
With oPackage
.Steps.Add(oStep)
.Tasks.Add(oTask)
.FailOnError = True
End With
oPackage.Execute()
Catch ex As Exception
MsgBox("Error: " & CStr(Err.Number) & vbCrLf_
& Err.Description, vbExclamation, oPackage.Name)
Finally
oConnection = Nothing
oCustomTask = Nothing
oTask = Nothing
oStep = Nothing
If Not (oPackage Is Nothing) Then
oPackage.UnInitialize()
End If
End Try


Bulk Insert Into Table With More Columns Than Data Within File

Jun 17, 2007

Hey all

I have a bulk insert situation that would be nice to be able to pull off. I have a flat file with 46 columns that are to go into a table. I want the table to have a 47th column, to be updated later on by a stored proc saying whether the import into the system was successful or not. I have the row terminator set as '"' thinking that would tell SQL to begin on the next row, leaving the importstatus column null, but I still receive an error.

First of all, is this idea possible within this insert statement? Secondly, if so, what would be the syntax to tell the insert statement to skip that particular column? It is the last column listed in the table, so it just needs to start on the next row after it inserts the last bit of data from the flat file.

If this is not possible, is it possible to bulk insert into a temp table?
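An editorial sketch, not from the thread: a non-XML format file that describes the 46 file fields and maps them to columns 1-46 leaves the unmapped 47th column NULL, with no row terminator tricks needed. A scaled-down analog with 3 data fields and a 4th status column (names hypothetical):

9.0
3
1 SQLCHAR 0 100 "," 1 col01 ""
2 SQLCHAR 0 100 "," 2 col02 ""
3 SQLCHAR 0 100 "\r\n" 3 col03 ""

BULK INSERT dbo.ImportTarget FROM 'C:\data\feed.txt'
WITH (FORMATFILE = 'C:\data\feed.fmt')

And yes, BULK INSERT also works against a #temp table, as long as it runs in the same session that created the table.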

Thanks


SQL Server 2014 :: Bulk Insert Data Into Table

Jul 29, 2014

I need to load the following data into a SQL table. This is how the vendor is able to provide it to us.

CRCorp Daily Report,,,,,,
,,,,,,
Facility,Location,Purchase Order #,Vendor,Inventory #,Date Ordered,Extended Cost
09-Mowtown 495 CRST,09-402A Women's Imaging,327937,"BARD PERIPHERAL VASCULAR, INC.",113989,7/25/2014,650
09-Mowtown 495 CRST,09-402A Women's Imaging,327936,"WB MASON CO., INC.",112664,7/25/2014,8.64
01-Mowtown 499 CRST,01-302B Oncology,327947,McKesson General Medical,n/a,7/25/2014,129.02

[Code] ....

I have attempted to bulk insert it into this table with no luck.

CREATE TABLE POMaster
(Facility VARCHAR(75),
Location VARCHAR(75),
PONum INT,
VendorNm INT,
INVENTORYNUM VARCHAR(25),
orderDte DATE,
ExtendedPrice NUMERIC(10,2)
)
GO

It does not like the double quotes. How do I make this format work? Do I need a format file?
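An editorial note, not from the thread: classic BULK INSERT has no concept of optionally quoted CSV fields, and here the quotes wrap only some values (the vendor names containing commas), so no single FIELDTERMINATOR can parse every row. Also note that VendorNm is declared INT while the sample holds vendor names, so it would need to be a varchar anyway. On SQL Server 2014 the usual workaround is to stage each whole line and parse it in T-SQL (names hypothetical):

CREATE TABLE PORawLines (Line varchar(1000))

BULK INSERT PORawLines FROM 'C:\data\CRCorpDaily.csv'
WITH (ROWTERMINATOR = '\n', FIRSTROW = 4)

(SQL Server 2017 later added WITH (FORMAT = 'CSV', FIELDQUOTE = '"'), which handles this layout natively.)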


T-SQL (SS2K8) :: Ignore First Row In Data File In Bulk Insert?

Feb 17, 2015

I use bulk insert to fill a table

I use the code below:

bulk insert dbo.test
from 'c:\test.txt'
with (FIRSTROW = 2, FORMATFILE = 'c:\test.xml')
go

but the data inserted into the table starts from the third row.


SQL Server 2008 :: Bulk Insert Data Into Table

Mar 23, 2015

I want to bulk insert data into a table named scd_event_tab inside a database named rdb.

When I do select * from rdb.dbo.scd_event_tab, I get these columns:

JOB_ID, RUN_ON, PRIORITY, PAYLOAD, TIMEOUT_INTERVAL, STATUS, PICKUP_TIME, SCD_TYPE, SCHEDULE_ID, DB_ADMIN_LOGIN_REQUIRED_YN

I saved the result into a csv file and then truncated the table. Now, I am trying to bulk insert the data into the table. So I used:

bulk insert
rdb.dbo.scd_event_tab from 'C:\users\sluintel.ctr\desktop\eventtab.csv'
with
(
codepage = 'RAW',
datafiletype = 'native',
fieldterminator = '\t',
keepidentity,
keepnulls
);
go

However, I get this error:

Msg 4867, Level 16, State 1, Line 1
Bulk load data conversion error (overflow) for row 1, column 1 (JOB_ID).
Msg 4866, Level 16, State 5, Line 1

The bulk load failed. The column is too long in the data file for row 1, column 3. Verify that the field terminator and row terminator are specified correctly.

Msg 7399, Level 16, State 1, Line 1

The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.

Msg 7330, Level 16, State 2, Line 1

Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
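An editorial note, not from the thread: DATAFILETYPE = 'native' tells BULK INSERT to expect the binary format produced by bcp -n, so pointing it at a text CSV yields exactly this kind of overflow/field-length error on the very first column. For a comma-separated text export, a hedged rewrite:

bulk insert rdb.dbo.scd_event_tab
from 'C:\users\sluintel.ctr\desktop\eventtab.csv'
with
(
codepage = 'RAW',
datafiletype = 'char',
fieldterminator = ',',
rowterminator = '\n',
keepidentity,
keepnulls
);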


Bulk Insert - Flat File Data Into Table

Mar 12, 2015

I am running a set of SQL statements on a SQL Server to insert flat file data into a SQL table. The flat file has already been FTP'ed to the SQL Server. I seem to be getting an error which possibly points to a permissions issue.

The statements:

BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:\jedox_daily\jdcom4401.txt'
WITH
(
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
GO

The error is:
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "c:\jedox_daily\jdcom4401.txt" could not be opened. Operating system error code 3 (failed to retrieve text for this error. Reason: 1815).

If it is a permissions issue, how do I overcome it?
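An editorial note, not from the thread: operating system error code 3 is "the system cannot find the path specified" (a true permissions failure would be error 5, access denied). BULK INSERT opens the file from the SQL Server machine under the service account, so a C: path must exist on the server itself, not on the workstation issuing the statement. When the file lives elsewhere, a UNC path that the service account can read is the usual fix:

BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM '\\fileserver\jedox_daily\jdcom4401.txt'
WITH (FIRSTROW = 2, MAXERRORS = 0, FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')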


Bulk Insert Data Conversion Error (truncation)

Dec 4, 2006

hi

"Bulk insert data conversion error (truncation) for row 1, column 1 (id)."

When you get the error above (or similar) in SQL Server 2000, does it continue inserting the data by truncating it, or does it stop? Looking at the data that I have got, it seems to continue inserting the data but just truncates the column. I have tried it several times and it seems to be consistent.

I have data that has white space after the actual data, e.g. '00093 ', hence I am happy as long as I can be sure that it does always continue, as I will be loading a lot of data using a similar process.

Hence my question: will it load all the data all the time and just truncate it to fit the column size?


Bulk Insert Data In Millions - Lock Issue

Oct 11, 2005

Hi all, hope there is a quick fix for this: I am inserting data from one table to another on the same DB. The insert is pretty simple, as in:

insert into datatable(field1, field2, field3)
select a1, a2, a3 from temptable...

This inserts about 4 million rows in one go, and since I had the 'cannot obtain lock resources' problem, several methods were suggested by some web sites:

1) Split the insert into smaller chunks (I have no idea how I can split an insert to insert only n records at a time).
2) Use WAITFOR - which I did, but it did not fix the error.
3) Use bulk insert (in T-SQL) - I don't know how to do this?

As I see it, I am simply trying to move data from one table to another (of course, lots of data) in SQL Server 2000, and I don't see one simple solution to the locking problem.

Any ideas on how best I can do this will save my day! Thanks all.
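An editorial sketch of option 1 on SQL Server 2000, using SET ROWCOUNT to cap each batch; it assumes a1 is a unique key on temptable and lands in field1, which is an assumption, not something stated in the thread:

SET ROWCOUNT 50000
WHILE 1 = 1
BEGIN
    INSERT INTO datatable (field1, field2, field3)
    SELECT a1, a2, a3
    FROM temptable t
    WHERE NOT EXISTS (SELECT 1 FROM datatable d WHERE d.field1 = t.a1)
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0

Each iteration is its own transaction, so locks are released batch by batch instead of accumulating across all 4 million rows, which is what exhausts the lock manager's memory.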


Can A Data Flow Task Mimic Bulk Insert?

May 31, 2006

Hi Guys,

The way I understand a Data Flow task, it inserts the rows from the source to the destination one by one. Is there a way to make it act like a bulk insert task? We have been experiencing performance issues when inserting a lot of rows from one table to another. If there's no way to actually do it, can the bulk insert task functionality be scripted? Because what I need is a table-to-table insert, and the Bulk Insert task only accepts data files as sources.

Thanks!
Kervy


Bulk Insert - Inserting Only One Column From Data File

Sep 29, 2007



Hi,

I have a data file in the following format:

SubjectId1|class1
SubjectId2|class2
SubjectId3|class3


I just wanted to insert only the SubjectIds into my table 'Subjects', which has the following schema, ignoring the classes.
The row delimiter is "\n" and the column delimiter is '|'.

Table Subjects
{

ID (Autoincrement)
SubjectId varchar(20)
}

Can anyone provide the format file for doing this, or suggest any other way to do it?
Please do note that the file may contain millions of records.
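An editorial sketch of such a non-XML format file: host field 1 maps to server column 2 (SubjectId; column 1 is the autoincrement ID and is simply not mapped), and host field 2 maps to server column 0, which discards the class value. The widths and paths are guesses:

9.0
2
1 SQLCHAR 0 20 "|" 2 SubjectId ""
2 SQLCHAR 0 100 "\n" 0 class ""

BULK INSERT Subjects FROM 'C:\data\subjects.txt'
WITH (FORMATFILE = 'C:\data\subjects.fmt', TABLOCK)

At millions of records, TABLOCK and a sensible BATCHSIZE are worth considering for throughput.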

Thank u
~mohan


Transact SQL :: Bulk Insert Arabic Data From CSV File To DB

Aug 5, 2015

USE TEST 
GO 
/****** BULK INSERT  ******/
BULK
INSERT [Table01]
FROM 'C:\empdata.csv'

[code]....

I am using the above code to insert CSV file data, which includes Arabic data as well. The upload is successful; however, the Arabic field data is loaded as invalid characters, and I am getting the following error: Msg 4864, Level 16, State 1, Line 3... Bulk load data conversion error (type mismatch or invalid character for the specified codepage).
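An editorial note, not from the thread: Arabic text survives a bulk load only if the file encoding, the CODEPAGE option, and the column types all line up. Either save the file as UTF-16 and load it with DATAFILETYPE = 'widechar' into nvarchar columns, or keep an ANSI file and declare its Arabic codepage explicitly. A hedged sketch of the latter:

BULK INSERT [Table01]
FROM 'C:\empdata.csv'
WITH (CODEPAGE = '1256', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

Codepage 1256 is the Windows Arabic codepage; CODEPAGE = 'RAW' performs no conversion at all, which is a common source of the garbage characters.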


Bulk Insert From Native Format Data File.

Dec 5, 2006

With "bcp MyDatabase.dbo.MyTable out C:MyFile.Dat -n -T" command line, I could get an exported data file. And I can also import this file  into MyTable using 'BULK INSERT MyDatabase.dbo.MyTable FROM 'C:MyFile.dat' WITH (DATAFILETYPE='native');' query statement.

Now, I want to make my own data file just like the one made by bcp above. Although I can make a file of 'char' type, a 'native'-type file is needed for performance and other reasons, and a format file should not be used.



Can anyone help?


Skip The First Line Of The Data File - Bulk Insert

Oct 1, 2007

Hi,
I have a data file and the contents of it are as follows

2 -- This is the header, indicating the number of records in my file
1001|s1
1006|s2

The content of the format file is as follows. This is to skip the first column of all the rows and get only the Subs (i.e. s1 and s2):


9.0
2
1 SQLCHAR 0 100 "|" 0 ID ""
2 SQLCHAR 0 100 "\n" 1 Subs ""


Here is my query to get all the Subs from my data file:


SELECT * FROM OPENROWSET( BULK 'datafile.txt',
    FORMATFILE = 'FormatFile.fmt',
    FIRSTROW = 2 ) AS a

But this query returns only s2, where I was expecting s1 and s2. The reason is that the first row, i.e. the header, doesn't follow the format.
Can anyone please let me know how to skip the first line in the data file and get the result as required?

~Mohan


Bulk Insert Using Script And Not Bulk Insert Task

Nov 2, 2007



Does anyone know how to do a bulk insert using just the Script task? I've been searching everywhere but can't seem to find a sample.


Howto Get Avoid Bulk Insert Data Conversion Error?

Aug 7, 2006

Hi, I'm having a problem with bulk insert regarding the text file to be inserted into the database. When the insertion is processing, if my text file has a NULL value, it gives me a bulk insert data conversion error.

For example, my text file c:\mytest.txt contains the data NULL:

123 studentname NULL



Can we let bulk insert detect the NULL value?

I have tried putting "KEEPNULLS", but it doesn't help, because some fields in the table may be of datetime type.

BULK INSERT [mytable] FROM 'c:\mytest.txt' WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', KEEPNULLS)
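An editorial note, not from the thread: KEEPNULLS only affects empty fields; the literal text "NULL" in the file is just a four-character string, which is exactly what fails conversion into a datetime column. A hedged workaround is staging into varchar and converting with NULLIF (column names are guesses based on the sample row):

CREATE TABLE #stage (id varchar(10), studentname varchar(50), enrolled varchar(30))

BULK INSERT #stage FROM 'c:\mytest.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

INSERT INTO [mytable] (id, studentname, enrolled)
SELECT id, studentname, CONVERT(datetime, NULLIF(enrolled, 'NULL'))
FROM #stage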


thank you


BULK INSERT, Setting Static Data Using The Format File

Mar 2, 2004

Hello dbforums,

I am using a BULK INSERT to insert the data from an ASCII file into a SQL table. The table has a ProductInstanceId column that exists in the table but does not exist in the ASCII DICast data. I am setting the ProductInstanceId to a GUID that will be used for metrics. I would like to create the GUID in C++ and then set it somehow during the BULK INSERT of DICastRaw1hr and DICastRaw6hr. I am calling the BULK INSERT from C++/ADO. I do not see how you can set static data in the BULK INSERT for a column that exists in the table but does not exist in the source data ... it seems there should be a way to do this with the format file?

The other way to do this is with a TRIGGER. I have the TRIGGER below. Prior to calling the BULK INSERT using ADO, I will use ADO to ALTER the TRIGGER with the new GUID. When the BULK INSERT runs, the ProductInstanceId will be populated with the new GUID.

ALTER TRIGGER DICastRaw1hrInsertGuid
ON Alphanumericdata.dbo.DICastRaw1hr
FOR INSERT AS UPDATE dbo.DICastRaw1hr SET ProductInstanceId = '4f9a44eb-092b-445b-a224-cc7cdd207092'
WHERE modelrundatetime = (select max(modelrundatetime) from Alphanumericdata.dbo.DICastraw1hr(NOLOCK))

More Questions:

- The trigger is slow. The bulk insert without the trigger runs in about 10 sec ... with the trigger, in about 40 sec. I tried to use the SQL code below in the trigger, but it was only doing the UPDATE on the last row. The trigger must run after the BULK INSERT is complete. Now I am using the select (bad). Any comments ...

ALTER TRIGGER DICastRaw1hrInsertDate
ON Alphanumericdata.dbo.DICastRaw1hr
FOR INSERT
AS
DECLARE @ID as integer
-- Editorial note: 'inserted' can hold many rows; this assignment keeps
-- only one of them, which is why only a single row gets updated.
SELECT @ID = i.recordid from inserted i
UPDATE dbo.DICastRaw1hr SET ProductInstanceId = '4f9a44eb-092b-445b-a224-cc7cdd207092'
WHERE recordid = @ID
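An editorial sketch of a set-based version that updates every row from the triggering statement in one pass (no single-row variable, no MAX(modelrundatetime) scan):

ALTER TRIGGER DICastRaw1hrInsertDate
ON Alphanumericdata.dbo.DICastRaw1hr
FOR INSERT
AS
UPDATE d
SET ProductInstanceId = '4f9a44eb-092b-445b-a224-cc7cdd207092'
FROM dbo.DICastRaw1hr d
JOIN inserted i ON i.recordid = d.recordid

Note also that BULK INSERT only fires triggers when the FIRE_TRIGGERS option is specified, which may be relevant to the timing observations.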

- I understand that I could set the GUID in the default value part of the table definition using the NEWID() function, but I need the GUID to be the same for all the rows that are inserted during the BULK INSERT (all have the same modelrundatetime) ... how would I do this?

Thanks,
Chris


Bulk Insert Fails. Column Is Too Long In The Data File

Jun 27, 2006

Hi,

for testing purposes I'm inserting a flat file into a SQL Server table using BULK INSERT, with the following code:

BULK INSERT rsk_staging
FROM 'c:\temp\bulk\rsk.txt'
  WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n',
    CODEPAGE = 'RAW',
    DATAFILETYPE = 'char',
    BATCHSIZE = 100000,
    ROWS_PER_BATCH = 1925604,
    TABLOCK
  )

I have two versions of "rsk.txt", one with 1.9 million rows and one with the first 2000 rows only. The files have one column only, 115 characters wide, which I'll split into several columns later using SUBSTRING. The one with 2000 rows fires into the database with no problems whatsoever using this exact code; the other one throws the following error:

Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.

How can I resolve this problem?

EDIT: I tried several different row and field terminators, but this exact one works for the small data file, so I assume it should also work for the large one... the large one is, however, copied directly using binary FTP from a Unix filesystem, and the small one was manually copied into a new txt file using UltraEdit.
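An editorial note, not from the thread: that last detail is likely the whole story. When '\n' is specified as the row terminator for a bulk import, SQL Server actually looks for the Windows pair '\r\n'. That matches the UltraEdit-created 2000-row file, but the binary-FTP'd Unix file ends its lines with a bare line feed, so the parser never finds a row boundary and reports the column as too long. On newer versions the line-feed byte can be given in hexadecimal:

BULK INSERT rsk_staging
FROM 'c:\temp\bulk\rsk.txt'
  WITH (
    ROWTERMINATOR = '0x0a',  -- bare LF, as in Unix-originated files
    CODEPAGE = 'RAW',
    DATAFILETYPE = 'char',
    BATCHSIZE = 100000,
    ROWS_PER_BATCH = 1925604,
    TABLOCK
  )

(Transferring the file in ASCII-mode FTP, which rewrites LF to CRLF, or building the statement dynamically with CHAR(10) as the terminator, are the older workarounds.)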


Data Access :: BULK INSERT Yields Inconsistent Results

May 6, 2015

I am getting inconsistent results when BULK INSERTing data from a tab-delimited text file. As part of my testing, I run the same code on the same file again and again, and I get different results every time! I get this on SQL 2005 and SQL 2012 R2. We have an application that imports data from a spreadsheet. The sheet contains section headers with account numbers and detail rows with transactions by date:

AAAA.1234 /* (account number)*/
1/1/2015      $150                 First Transaction
1/3/2015      $24.233              Second Transaction

BBBB.5678
1/1/2015      $350                 Third Transaction
1/3/2015      $24.233              Fourth Transaction

My import program saves this spreadsheet as tab-delimited text, then I use BULK INSERT to bring the data into a generic table full of varchar(255) fields. There are about 90,000 rows in each day's data; after the BULK INSERT, about half of them are removed for various reasons. Next I add a RowID column to the table with the IDENTITY (1,1) property. This gives my raw data unique row numbers.

I then run a routine that converts and copies those records into another holding table that's a copy of the final destination table. That routine parses through the data, assigning the account number in the section header to each detail row. It ends up looking like this:

AAAA.1234     1/1/2015      $150          First Purchase
AAAA.1234     1/3/2015      $24.233     Second Purchase
BBBB.5678     1/1/2015      $350            Third Purchase
BBBB.5678     1/3/2015      $24.233       Fourth Purchase

My technique: I use a cursor to get the starting RowID for each account number. I then use the upper and lower RowIDs to do an INSERT into the final table. The query looks like this:

SELECT RowID, SUBSTRING(RowHeader, 6,4) + '.UBC1' AS AccountNumber
FROM   GenericTable
WHERE RowHeader LIKE '____.____%'

Results look like this:

But every time I run the routine, I get different numbers! My results are not accurate; I get inconsistent results EVERY TIME. Here is my code, with table, field, and account names changed for business confidentiality. This is a high-profile project at my company.

TRUNCATE TABLE GenericImportTable;
ALTER TABLE GenericImportTable DROP COLUMN RowID;
BULK INSERT GenericImportTable FROM '\\SERVER\General\Appname\DataFile.2015.05.04.tab.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 6)

[code]...
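An editorial observation, not from the thread: adding an IDENTITY column to an already-populated table numbers the existing rows in no guaranteed order, so the header-before-detail adjacency that the RowID ranges depend on can shuffle between runs. A hedged alternative is to keep a permanent IDENTITY column on the staging table and let the single BULK INSERT populate the other columns through a format file that leaves it unmapped (columns hypothetical):

CREATE TABLE GenericImportTable (
    RowID int IDENTITY(1,1) NOT NULL,
    RowHeader varchar(255) NULL,
    RowDetail varchar(255) NULL
)

With the rows numbered at insert time by one non-parallel load, the file order is preserved and the cursor ranges become stable.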







