Bulk Insert - Flat File Data Into Table

Mar 12, 2015

I am running a set of SQL statements on a SQL Server to insert flat file data into a SQL table. The flat file has already been FTP'ed to the SQL Server. I seem to be getting an error which possibly points to a permissions issue.

The statements:

BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:jedox_dailyjdcom4401.txt'
WITH
(
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
GO

The error is :
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "c:jedox_dailyjdcom4401.txt" could not be opened. Operating system error code 3(failed to retrieve text for this error. Reason: 1815)

If it is permissions issue, how do I overcome this?
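
A hedged sketch of the same statement, assuming the path separators were simply lost when the post was saved (something like c:\jedox_daily\jdcom4401.txt). Operating system error code 3 means "the system cannot find the path specified", so before looking at permissions it is worth confirming the file really sits at that path on the SQL Server machine itself; if it does, the account that needs read access is the SQL Server service account (or your own login, when delegation is in play), not the account that ran the FTP.

-- Sketch only: path and terminators are assumptions reconstructed from the post.
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:\jedox_daily\jdcom4401.txt'
WITH
(
    FIRSTROW = 2,
    MAXERRORS = 0,
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n'
);
GO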

View 1 Replies



SQL Server 2012 :: Unable To Load Data From Flat File To Table Using Bulk Insert Statement

Apr 3, 2015

I am unable to load data from a flat file to a SQL table using a BULK INSERT SQL statement.

My code:

DECLARE @filePath VARCHAR(200)
DECLARE @sql VARCHAR(8000)
Declare @filename varchar(100)
set @filename='CCNVZ_150401054418'
SET @filePath = 'I:\IncomingFiles\' + @FileName + '.txt'

[Code] .....
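
The elided code presumably builds and runs the statement dynamically, since BULK INSERT does not accept a variable for the file name. A hedged sketch of how that portion might look; the target table and terminators are assumptions, not from the post:

-- Sketch: build and execute the BULK INSERT as dynamic SQL.
DECLARE @filePath VARCHAR(200)
DECLARE @sql VARCHAR(8000)
DECLARE @filename VARCHAR(100)
SET @filename = 'CCNVZ_150401054418'
SET @filePath = 'I:\IncomingFiles\' + @filename + '.txt'

SET @sql = 'BULK INSERT dbo.MyStagingTable '          -- hypothetical target table
         + 'FROM ''' + @filePath + ''' '
         + 'WITH (FIRSTROW = 2, FIELDTERMINATOR = ''|'', ROWTERMINATOR = ''\n'')'
EXEC (@sql)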

View 1 Replies View Related

SQL Server 2008 :: Can Bulk Insert Only Desired Column From A Flat File To Table

Mar 18, 2015

Can we bulk insert only the desired column from a flat file to a table?

I am using SSIS to bulk insert from a file with more than 200 columns. I am trying to find a way to bulk insert them into multiple tables through SSIS.

The one way I can think of is to pre-map the columns from the file to the destination tables and build numerous Bulk Insert tasks to achieve that, but I am not sure if SSIS will let me do that.
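
Outside SSIS, plain BULK INSERT can skip unwanted source columns with a non-XML format file: any field mapped to table column 0 is discarded. A minimal sketch under assumed names (three file fields, the middle one skipped):

-- Hypothetical format file ThreeCols.fmt for a '|'-delimited file:
--   10.0
--   3
--   1  SQLCHAR  0  100  "|"     1  Col1  ""
--   2  SQLCHAR  0  100  "|"     0  Skip  ""
--   3  SQLCHAR  0  100  "\r\n"  2  Col3  ""
BULK INSERT dbo.TargetTable
FROM 'C:\feeds\source.txt'
WITH (FORMATFILE = 'C:\feeds\ThreeCols.fmt', FIRSTROW = 2);

Within SSIS itself, the simpler route is usually a Data Flow with a Flat File Source, where only the needed columns are mapped to each destination.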

View 4 Replies View Related

Bulk Insert Vs. Data Flow Task (different Row Results Using Flat File Source)

Nov 2, 2006

I'm importing a large csv file two different ways - one with Bulk Import Task and the other way with the Data Flow Task (flat file source -> OLE DB destination).

With the Bulk Import Task I'm putting all the csv rows in one column. With the Data Flow Task I'm mapping each csv value to its own column in the SQL table.

I used two different flat file sources and got the following:

Flat file 1: Bulk Import Task = 12,649,499 rows; Data Flow Task = 4,215,817 rows
Flat file 2: Bulk Import Task = 3,403,254 rows; Data Flow Task = 1,134,359 rows

Anyone have any guess as to why this is happening?

View 9 Replies View Related

How To Insert Data From A File Into Table Having Two Columns-BULK INSERT

Oct 12, 2007



Hi,
I have a file which contains data as below:

3
123||
456||
789||

I am reading the file using BULK INSERT and inserting these phone numbers into a table having one column, as below.


BULK INSERT TABLE_NAME FROM 'FILE_PATH'
WITH (KEEPNULLS,FIRSTROW=2,ROWTERMINATOR = '||')

But I want to insert the data into a table having two columns. If I try to insert the data into a table having two columns, it does not insert.

Can anyone help me with how to do this?

Thanks,
-Badri
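
One hedged way to do this, assuming the second table column is an identity (or otherwise not fed from the file): use a format file that maps the single file field to the phone-number column and leaves the other column out. Names and widths below are assumptions:

-- Hypothetical table: dbo.PhoneNumbers (ID INT IDENTITY, Phone VARCHAR(20)).
-- Format file Phone.fmt; "||" plus the line ending acts as the field terminator,
-- assuming Windows (CR LF) line endings:
--   9.0
--   1
--   1  SQLCHAR  0  20  "||\r\n"  2  Phone  ""
BULK INSERT dbo.PhoneNumbers
FROM 'C:\data\phones.txt'
WITH (FORMATFILE = 'C:\data\Phone.fmt', FIRSTROW = 2, KEEPNULLS);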

View 5 Replies View Related

BULK INSERT Flat File With Only One Column

Jan 30, 2004

Hi,

I have a text file with a single column that I need to bulk insert into a table with 2 columns - an ID (with identity turned on) and col2.

My text file looks like:

row1
row2
row3
...
row10

So my bulk insert looks like this:
BULK INSERT test FROM 'd:\testBig.txt'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)

But I get the error:

Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.

However, as you can see from the text file, there is only one column, so I don't have any field terminators.

Any ideas how to make this work?

Thanks.
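
The error usually means the field terminator is never found: with a two-column table (ID identity plus col2) and no format file, BULK INSERT expects a ',' before the row terminator and keeps reading. A hedged sketch of a format file that maps the single field, terminated by the newline, straight to col2 and leaves the identity column alone:

-- Hypothetical format file OneCol.fmt (SQL Server 2000-era version number):
--   8.0
--   1
--   1  SQLCHAR  0  8000  "\r\n"  2  col2  ""
BULK INSERT test
FROM 'd:\testBig.txt'
WITH (DATAFILETYPE = 'char', FORMATFILE = 'd:\OneCol.fmt');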

View 4 Replies View Related

Importing DB2 Flat File With Bulk Insert

Oct 22, 2007

HI,



I am having trouble importing a flat file that was extracted from an AS400 server, into a SQL 2005 DB table using Bulk Insert. This file contains a column (field) that is of a Packed Decimal data type. Data in all other fields is displayed normally when viewing this file in a text editor such as Note Pad or Text Pad, but this one field comes up with unknown encoding: squares, thick vertical lines, basically strange characters and no numeric data.



Does anyone have any experience dealing with files of this sort?



Any insight would be appreciated.



Thanks.

View 14 Replies View Related

Bulk Insert Into Table With More Columns Than Data Within File

Jun 17, 2007

Hey all

I have a bulk insert situation that would be nice to be able to pull off. I have a flat file with 46 columns that are to go into a table. In the table, I want to have a 47th column to be updated later on by means of a stored proc saying whether the import into the system was successful or not. I have the ROWTERMINATOR set as '"' thinking that would tell SQL to begin on the next row, leaving the ImportStatus column null, but I still receive an error.

First of all, is this idea possible within this insert statement? Secondly, if so, what would be the syntax to tell the insert statement to skip that particular column? It is the last column listed in the table, so it just needs to start on the next row after it inserts the last bit of data in the flat file.

If this is not possible, is it possible to bulk insert into a temp table?

Thanks
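
A hedged sketch of one way to do it: BULK INSERT cannot skip a trailing table column on its own, but it can load through a view that exposes only the columns present in the file, leaving the 47th column NULL for the stored proc to update later. The names below are assumptions, and only three data columns stand in for the real 46:

-- Miniature sketch, assumed names.
CREATE VIEW dbo.vImportTargetFile
AS
SELECT Col01, Col02, Col03          -- list only the columns the flat file supplies
FROM dbo.ImportTarget;
GO
BULK INSERT dbo.vImportTargetFile
FROM 'C:\feeds\import.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');

And yes, bulk inserting into a temp table also works, so staging the 46 columns in a #temp table and copying across afterwards is another option.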

View 1 Replies View Related

Copy Raw Data From A Flat File To A Database Table, BUT Do Not Insert Duplicates

Aug 2, 2007

Hi,

I'm new to SQL Server 2005 SSIS. I'm trying to do something very simple, but I cannot figure it out, PLEASE HELP!

I have a flat file, which I read and then insert the data into a database table; that works fine. The problem is that I don't want to insert duplicate records. For example, if I run the package again, it will append to the table. What I need is that if the package runs again, it checks whether the record already exists, based on two columns, date and hour, and does not insert the record.

Thank you,

Aldo
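
One common pattern, sketched here in T-SQL with assumed table and column names: land the file in a staging table first (Data Flow or BULK INSERT), then insert only the rows whose date/hour pair is not already in the destination. In SSIS proper, a Lookup transformation against the destination with the "no match" output going to the OLE DB Destination achieves the same thing.

-- Hedged sketch, assumed names.
INSERT INTO dbo.TargetTable (LoadDate, LoadHour, Value)
SELECT s.LoadDate, s.LoadHour, s.Value
FROM dbo.StagingTable AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.TargetTable AS t
    WHERE t.LoadDate = s.LoadDate
      AND t.LoadHour = s.LoadHour
);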

View 1 Replies View Related

[SQL 2005 Express] How Do I Load Fixed Width Per Row Flat File? Bulk Insert Possible?

May 14, 2007

I can't use DTS nor DTSwizard as I need to put it in a .sql and run it through a command line via .bat file (it's more for the users).

Each row ends with an EOL character and the fields are all fixed width, but I have a little problem here: some rows are empty, with just an EOL character.

How shall I go about it?

many thanks! :D
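
BULK INSERT can handle fixed-width rows with a non-XML format file in which each field gets a fixed length and an empty terminator, and only the last field carries the line ending; that keeps everything in a plain .sql script runnable from a .bat file. A hedged sketch with made-up column names and widths:

-- Hypothetical format file Fixed.fmt:
--   9.0
--   3
--   1  SQLCHAR  0  10  ""      1  CustomerId  ""
--   2  SQLCHAR  0  30  ""      2  CustomerNm  ""
--   3  SQLCHAR  0  8   "\r\n"  3  OrderDate   ""
BULK INSERT dbo.FixedWidthStage
FROM 'C:\data\fixed.txt'
WITH (FORMATFILE = 'C:\data\Fixed.fmt');
-- Rows that are only an EOL do not match the fixed widths and can throw the load off,
-- so one option is to strip them in the .bat file before loading, e.g.:
--   findstr /r /v "^$" fixed_raw.txt > fixed.txt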

View 2 Replies View Related

BULK INSERT ERROR Using Format File - Bulk Load Data Conversion Error

Jun 29, 2015

I'm trying to use Bulk insert for the first time and getting the following error. I think it might have something to do with my Format File; from the error msg there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNChar for the first column. I've checked that the end of each line is CR LF, therefore '\r\n' is correct for line 7, right?

Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

BULK INSERT tbl_ASX_Data_temp
FROM 'M:DataASXImportTest.txt'
WITH (FORMATFILE='M:DataASXSQLFormatImport.Fmt')

[code]...
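
For a plain (non-Unicode) text file, the host data type in a non-XML format file should describe the file, not the table, so SQLCHAR is normally used even when the target column is nvarchar; SQLNCHAR is for UTF-16 (widechar) files. A truncation error on the first column often means a header row or a terminator is being swallowed into the field. A hedged sketch of how the first lines of the format file might look; the delimiter, widths and the other column names are assumptions, and the paths assume the backslashes were lost in the post:

--   9.0
--   3
--   1  SQLCHAR  0  6    ","     1  ASXCode    ""
--   2  SQLCHAR  0  50   ","     2  SomeField  ""
--   3  SQLCHAR  0  50   "\r\n"  3  LastField  ""
BULK INSERT tbl_ASX_Data_temp
FROM 'M:\Data\ASX\ImportTest.txt'
WITH (FORMATFILE = 'M:\Data\ASX\SQLFormatImport.Fmt', FIRSTROW = 2);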

View 5 Replies View Related

SSIS - Data Flow To Flat File - Insert At Start Of File

Oct 24, 2007

Hi all,

In a For Each Loop, I am inserting records into a flat file, which is working fine. But the thing is that as the file grows, it takes longer to locate the EOF (end of file) of the flat file so as to insert the records.

I have around 70-100 lines written to the file at each loop and there are more than 20k records to be looped, which means that at the end I should have around 1,400k - 2,000k lines in the text file.


One solution would be to insert the records at the start of the file itself so that it does not have to look up the EOF each time before writing.

Another would be to generate separate files and then merge them.

Any idea how this can be done?


Beside this I have to zip the file and then SFTP to a given address.

Any suggestion or help would be welcome.


Rdgs

David



View 5 Replies View Related

How Do I Insert Data From A Flat File Or .csv File Into An Existing SQL Database???

Mar 29, 2006

How do I insert data from a flat file or .csv file into an existing SQL database???

Here is what I've come up with thus far, but it doesn't work. Can someone please help? Let me know if there is a better way to do this... Ideally I'd like to write straight to the SQL database and skip the dataset altogether...

strSvr = "vkrerftg"

StrDb = "Test_DB"

'connection String

strCon = "Server=" & strSvr & ";database=" & StrDb & "; integrated security=SSPI;"

Dim dbconn As New SqlConnection(strCon)

Dim da As New SqlDataAdapter()

Dim insertComm As New SqlCommand("INSERT INTO [Test_DB_RMS].[dbo].[AIR_Ouput] ([Event], [Year], [Contract Loss],[Company Loss], " & _

"[IndInsured Loss Prop],[IndInsured Loss WC],[Event Info]) " & _

"VALUES (@Event, @Year, @ConLoss, @CompLoss, @IndLossProp, @IndLossWC, @eventsInfo)", dbconn)

insertComm.Parameters.Add("@Event", SqlDbType.Int, 4, "Event")

insertComm.Parameters.Add("@Year", SqlDbType.Float, 4, "Year")

insertComm.Parameters.Add("@ConLoss", SqlDbType.Float, 4, "Contract Loss")

insertComm.Parameters.Add("@CompLoss", SqlDbType.Float, 4, "Company Loss")

insertComm.Parameters.Add("@IndLossProp", SqlDbType.Float, 4, "IndInsured Loss Prop")

insertComm.Parameters.Add("@IndLossWC", SqlDbType.Float, 4, "IndInsured Loss WC")

insertComm.Parameters.Add("@eventsInfo", SqlDbType.NVarChar, 255, "Event Info")

da.InsertCommand = insertComm

Dim upComm As New SqlCommand("UPDATE [Test_DB_RMS].[dbo].[AIR_Ouput] " & _

"SET [Event] = @Event " & _

",[Year] = @Year " & _

",[Contract Loss] = @ConLoss " & _

",[Company Loss] = @CompLoss " & _

",[IndInsured Loss Prop] = @IndLossProp " & _

",[IndInsured Loss WC] = @IndLossWC " & _

",[Event Info] = @EventInfo", dbconn)

upComm.Parameters.Add("@Event", SqlDbType.Int, 4, "Event")

upComm.Parameters.Add("@Year", SqlDbType.Float, 4, "Year")

upComm.Parameters.Add("@ConLoss", SqlDbType.Float, 4, "Contract Loss")

upComm.Parameters.Add("@CompLoss", SqlDbType.Float, 4, "Company Loss")

upComm.Parameters.Add("@IndLossProp", SqlDbType.Float, 4, "IndInsured Loss Prop")

upComm.Parameters.Add("@IndLossWC", SqlDbType.Float, 4, "IndInsured Loss WC")

upComm.Parameters.Add("@EventsInfo", SqlDbType.NVarChar, 255, "Event Info")

da.UpdateCommand = upComm

da.Update(dsAIR, "TextDB")



************* ANY HELP WOULD BE GREATLY APPRECIATED************

THANKS
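
Since the stated goal is to skip the DataSet altogether, a hedged alternative is to let SQL Server read the file directly with BULK INSERT; the .csv then has to be reachable from the server, and the path, terminators and header row below are assumptions:

BULK INSERT [Test_DB_RMS].[dbo].[AIR_Ouput]
FROM 'C:\imports\air_output.csv'        -- hypothetical path on the SQL Server
WITH
(
    FIRSTROW = 2,                       -- skip a header row, if the file has one
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);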

View 6 Replies View Related

Using BULK INSERT To Load File To Table

Apr 22, 2004

Is it possible to load files (*.bmp, *.jpg, etc.) into a table (field type IMAGE) using BULK INSERT?
Or is it better to do it another way?

Thanks
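
BULK INSERT is row-oriented, so for pushing a whole binary file into a single IMAGE/varbinary column, OPENROWSET with the BULK ... SINGLE_BLOB option is usually the better fit (it requires SQL Server 2005 or later). A hedged sketch with assumed names:

-- Read one file as a single blob and insert it as a row.
INSERT INTO dbo.Pictures (FileName, FileData)
SELECT 'logo.bmp', blob.BulkColumn
FROM OPENROWSET(BULK 'C:\images\logo.bmp', SINGLE_BLOB) AS blob;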

View 5 Replies View Related

Insert From Flat File Where Column 1 Not In Table

Aug 7, 2007

Hi Guys,

I hope this is easy stuff for you.
First of all, I searched the forum but didn't exactly find what I am searching for:

I have a file folder which contains 1..n files of the same type.
The files contain a DateValue at the beginning of each row. I now want to read the first record of each file, extract the DateValue and search my import table for any records with that date. If there are already records with this date, I know I already imported that file and skip it in my For Each File container. If no records were found, I want to copy them from the file to the table.

So I have a flat file source and thought I would just add an OLE DB Command task afterwards which looks like "Select count(*) from Import where Processdate = ?", and then a conditional split on whether the count == 0 or not... but I have problems getting the count value out of the OLE DB Command task, because every time I try to add an output column I get the message: "An Output cannot be added to the output collection", and there is no possibility to map an expression to the result...


I tried to work around the problem using a Lookup task, but that seems to be the wrong way.

thanks for your help
bye
AS

View 3 Replies View Related

Bulk Insert From Txt File Less Data Than Columns

May 23, 2007

Hi, I'm trying to bulk insert files that look like this:

aaaa,bbb,dddd,
ccc,dfd,tghj,

Each file can have up to 10 data fields per line, and all lines within a particular file have the same number of data fields, let's say 3 like above. A second file could have, let's say, 10, and that is the maximum.

I read the file and insert the data with a field terminator into a temp table, from which I insert the data into other tables depending on some parameters inside.

Now problem is:
Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

That is because I'm trying to insert 3 fields of data into a temporary table which is made up of 10 columns (it has to be 10 because the next file could have 10 fields of data). If the temp table has the same number of columns as the text file has data fields, then it works.

What is the solution for this problem?
Can I bulk insert NULL into columns for which I don't have data?

I can also import each line of the text file into one column (with the delimiters inside), but then I don't know how to insert that data into the correct tables, or even into one table while separating the data fields into columns with the field terminator, which is ',' in this case.

I'm new to SQL and I would appreciate any help.
Thank you
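
One hedged option that matches the "can I bulk insert NULL" question: load through a view that exposes only as many columns as the current file has fields; the remaining temp-table columns are simply left NULL. Names are assumptions, and the trailing comma on each sample line is folded into the row terminator:

-- Sketch for a 3-field file going into a 10-column staging table dbo.Stage10.
CREATE VIEW dbo.vStage3
AS
SELECT c1, c2, c3 FROM dbo.Stage10;
GO
BULK INSERT dbo.vStage3
FROM 'C:\feeds\threefields.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = ',\n', KEEPNULLS);

A format file generated per file (listing exactly that file's number of fields) is the other usual route.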

View 3 Replies View Related

T-SQL (SS2K8) :: Ignore First Row In Data File In Bulk Insert?

Feb 17, 2015

I use bulk insert to fill a table

I use the code below:

bulk insert dbo.test
from 'c:\test.txt'
with (FIRSTROW = 2, FORMATFILE = 'c:\test.xml')
go

But the data inserted into the table starts from the third row.

View 2 Replies View Related

Bulk Insert - Inserting Only One Column From Data File

Sep 29, 2007



Hi,

I have a data file in the following format:

SubjectId1|class1
SubjectId2|class2
SubjectId3|class3


I just want to insert only the SubjectIds into my table 'Subjects', which has the following schema, ignoring the classes.
The row delimiter is "\n" and the column delimiter is '|'.

Table Subjects
{

ID (Autoincrement)
SubjectId varchar(20)
}

Can anyone provide the format file for doing this or suggest any way to do this?
Please do note that the file may contain millions of records.

Thank u
~mohan
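
A hedged sketch of such a format file: the first file field goes to table column 2 (SubjectId), the class field is discarded by mapping it to column 0, and the identity ID column is never mentioned. Lengths, paths and the CR LF line-ending assumption are mine, not from the post:

-- Hypothetical Subjects.fmt:
--   9.0
--   2
--   1  SQLCHAR  0  20   "|"     2  SubjectId  ""
--   2  SQLCHAR  0  100  "\r\n"  0  Class      ""
BULK INSERT dbo.Subjects
FROM 'C:\data\subjects.txt'
WITH (FORMATFILE = 'C:\data\Subjects.fmt');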

View 5 Replies View Related

Transact SQL :: Bulk Insert Arabic Data From CSV File To DB

Aug 5, 2015

USE TEST 
GO 
/****** BULK INSERT  ******/
BULK INSERT [Table01]
FROM 'C:empdata.csv'

[code]....

I am using the above code to insert CSV file data, which consists of Arabic data as well. The upload is successful; however, the Arabic field data is uploaded with invalid characters and I get the following error: Msg 4864, Level 16, State 1, Line 3... Bulk load data conversion error (type mismatch or invalid character for the specified codepage).
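
Msg 4864 with Arabic text usually comes down to the file encoding versus what BULK INSERT is told to expect. A hedged sketch: if the .csv can be saved as Unicode (UTF-16), DATAFILETYPE = 'widechar' keeps the Arabic characters intact; if it stays ANSI in the Arabic code page, CODEPAGE = '1256' is the alternative. The path and terminators below are assumptions:

BULK INSERT [Table01]
FROM 'C:\temp\data.csv'                 -- assumed reconstruction of the original path
WITH
(
    DATAFILETYPE = 'widechar',          -- file re-saved as UTF-16 / "Unicode text"
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2
);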

View 15 Replies View Related

Bulk Insert From Native Format Data File.

Dec 5, 2006

With the "bcp MyDatabase.dbo.MyTable out C:\MyFile.Dat -n -T" command line, I could get an exported data file. And I can also import this file into MyTable using the 'BULK INSERT MyDatabase.dbo.MyTable FROM 'C:\MyFile.dat' WITH (DATAFILETYPE='native');' query statement.

Now, I want to make my own data file just like the one made by bcp above. Although I could make a file of 'char' type, a 'native' type file is needed for performance and other reasons. And a format file should not be used.



Can anyone help?

View 5 Replies View Related

Skip The First Line Of The Data File - Bulk Insert

Oct 1, 2007

Hi,
I have a data file and the contents of it are as follows

2 -- This is the header indicating the no of records in my files
1001|s1
1006|s2

The content of the format file is as follows. This is to skip the first column of all the rows and get only Subs (i.e., s1 and s2).


9.0

2
1 SQLCHAR 0 100 "|" 0 ID ""

2 SQLCHAR 0 100 "\r\n" 1 Subs ""


Here is my query to get all the Subs from my data file


SELECT * FROM OPENROWSET( BULK 'datafile.txt',

FORMATFILE = 'FormatFile.fmt',

FIRSTROW = 2 ) AS a

But this query returns only s2, where I was expecting s1 and s2. The reason is that the first row, i.e. the header, doesn't follow the format.
Can anyone please let me know how to skip the first line in the data file and get the result as required?

~Mohan

View 6 Replies View Related

Transact SQL :: How To Do Bulk Insert Into Temp Table From Text File

Sep 15, 2015

How do I do a bulk insert into a temp table from a text file? The text file looks like this:
 
ver_id TYPE
E57AB326-803C-436E-B491-398A255C919A 58
34D2A601-6BBA-46B1-84E1-3A805FDA3812 58
986E140C-62F1-48F1-B428-3571EBF00DA0 58

My statement looks like this:

CREATE TABLE [dbo].[tblTemp]([ver_id]  [nvarchar](255), [TYPE] [smallint]) 
GO
BULK INSERT [dbo].[tblTemp]
FROM 'C:\v2.txt'
I keep receiving errors.

The error I receive is: Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (TYPE).

View 2 Replies View Related

BULK INSERT, Setting Static Data Using The Format File

Mar 2, 2004

Hello dbforums,

We are using a BULK INSERT to insert the data from an ASCII file into a SQL table. The table has a ProductInstanceId column that exists in the table but does not exist in the ASCII DICast data. I am setting the ProductInstanceId to a Guid that will be used for metrics. I would like to create the Guid in C++ and then set it somehow during the BULK INSERT of DICastRaw1hr and DICastRaw6hr. I am calling the BULK INSERT from C++/ADO. I do not see how you can set static data in the BULK INSERT for a column that exists in the table but not in the source data... it seems there should be a way to do this with the format file?

The other way to do this is with a TRIGGER. I have the TRIGGER below. Prior to the calling the BULK INSERT using ADO I will use ADO to ALTER the TRIGGER with the new Guid. When the BULK INSERT runs the ProductInstanceId will be populated with the new Guid.

ALTER TRIGGER DICastRaw1hrInsertGuid
ON Alphanumericdata.dbo.DICastRaw1hr
FOR INSERT AS UPDATE dbo.DICastRaw1hr SET ProductInstanceId = '4f9a44eb-092b-445b-a224-cc7cdd207092'
WHERE modelrundatetime = (select max(modelrundatetime) from Alphanumericdata.dbo.DICastraw1hr(NOLOCK))

More Questions:

- The Trigger is slow. The Bulk Insert without the Trigger runs in about 10 sec ... with the Trigger in about 40 sec. I tried to use the sql code below in the TRigger but it was only doing the UPDATE on the last row. The TRIGGER must run after the BULK INSERT is complete. Now I am using the select (bad). Any comments ...

ALTER TRIGGER DICastRaw1hrInsertDate
ON Alphanumericdata.dbo.DICastRaw1hr
FOR INSERT
AS
DECLARE @ID as integer
SELECT @ID = i.recordid from inserted i
UPDATE dbo.DICastRaw1hr SET ProductInstanceId = '4f9a44eb-092b-445b-a224-cc7cdd207092'
WHERE recordid = @ID

- I understand that I could set the Guid in the Default Value part of the table definition using the NEWID() function. I need the Guid to be the same for all the rows that are inserted during the BULK INSERT (all have the same modelrundatetime) ... how would I do this?

Thanks,
Chris
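
On the "only the last row" problem: "SELECT @ID = i.recordid FROM inserted" captures a single value even when the BULK INSERT (run WITH FIRE_TRIGGERS) inserts many rows, so a set-based trigger that joins to inserted is the usual fix. A hedged sketch, to be run in the Alphanumericdata database:

ALTER TRIGGER dbo.DICastRaw1hrInsertGuid
ON dbo.DICastRaw1hr
FOR INSERT
AS
UPDATE r
SET ProductInstanceId = '4f9a44eb-092b-445b-a224-cc7cdd207092'
FROM dbo.DICastRaw1hr AS r
INNER JOIN inserted AS i ON i.recordid = r.recordid;

On the original question: since the Guid is constant for the whole load, another route is to leave ProductInstanceId out of the format file entirely and put the Guid in a DEFAULT constraint that the C++/ADO code alters before each BULK INSERT; columns omitted from the bulk load should then pick up that default, with no trigger needed.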

View 6 Replies View Related

Bulk Insert Fails. Column Is Too Long In The Data File

Jun 27, 2006

Hi,

for testing purposes I'm inserting a flat file into a SQL Server table using BULK INSERT with the following code:

BULK INSERT rsk_staging
FROM 'c:\temp\bulk\rsk.txt'
  WITH (
    FIELDTERMINATOR = '',
    ROWTERMINATOR = '\n',
    CODEPAGE = 'RAW',
    DATAFILETYPE  = 'char',
    BATCHSIZE = 100000,
    ROWS_PER_BATCH = 1925604,
    TABLOCK
  )

I have two versions of "rsk.txt", one with 1.9 million rows and one with the first 2000 rows only. The files have one column only, with 115 characters, that I'll split into several columns later using SUBSTRING. The one with 2000 rows fires into the database with no problems whatsoever using this exact code; the other one throws the following error:

Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.

How can I resolve this problem?

EDIT: I tried several different row and field terminators, but this exact one works for the small data file, so I assume it should also work for the large one... the large one is, however, copied directly using binary FTP from a Unix filesystem, and the small one was manually copied into a new txt file using UltraEdit.

View 1 Replies View Related

Transact SQL :: Bulk Insert When Data File Is On Network Share

Dec 3, 2015

I am running Microsoft SQL Server 2012 SP on a Windows Server 2008 R2 Standard SP1 box. The SQL Server service is running as a simple windows domain user (nothing special, no admin rights, etc.) I am having some issues with using Bulk Insert when the data file is on a network share when using Windows Authentication. What is known is that the SQL Server service account has access to the network resource, which is shown by logging into SQL Server with a SQL account and doing the Bulk Insert. I also have rights to the files on the share, as shown by the fact that I put the files there. My SQL is in the form of:

Bulk Insert [table name] From '\\[server]\[share]\[filename]' With (FirstRow = 2, FormatFile='FormatFile.xml')

Now, when connecting to SQL Server with Windows Authentication and running the Bulk Insert I get the following error:

Msg 4861, Level 16, State 1, Line 2 Cannot bulk load because the file "\\[server]\[share]\[filename]" could not be opened. Operating system error code 5(Access is denied.).

I found this snip in the BULK INSERT (Transact-SQL) documentation, under Security Account Delegation (Impersonation), which says, in part (emphasis mine):

To resolve this error [4861], use SQL Server Authentication and specify a SQL Server login that uses the security profile of the SQL Server process account, or configure Windows to enable security account delegation. For information about how to enable a user account to be trusted for delegation, see How to Configure the Server to be Trusted for Delegation.

We tried unconstrained delegation and I rebooted the SQL Server, but it still does not work. Later we tried constrained delegation and it still does not work.

I have verified the SPNs:

C:\>setspn ad\svc_sql
Registered ServicePrincipalNames for CN=SVC_SQL,OU=Service Accounts,OU=Users,OU=ad domain,DC=ad,DC=local:
    MSSQLSvc/SQLQA.ad.local:1433
    MSSQLSvc/SQLDev.ad.local:1433
    MSSQLSvc/SQLQA.ad.local
    MSSQLSvc/SQLDev.ad.local
I have verified that my SQL connection is TCP and I am getting/using a Kerberos security token.
C:\>sqlcmd -S tcp:SQLQA.ad.local,1433 -E
1> Select dec.net_transport, dec.auth_scheme From sys.dm_exec_connections As dec Where session_id = @@Spid;
2> go

net_transport auth_scheme
------------- -----------
TCP           KERBEROS

(1 rows affected)
1>

If I move the source file to a local drive (on the SQL server), all works fine, but I must be able to read from a file share?

View 8 Replies View Related

SQL Server 2012 :: Bulk Insert Using UNC Path Of File Table Directory

Jul 22, 2013

Overall goal: Write a Bulk Insert statement using the UNC path of a filetable directory.

Issue: When using the UNC path of the filetable directory in a Bulk Insert Statement, receiving "Operating system error code 50(The request is not supported.)" Looking for confirmation as to whether this is truly not supported.

Environment: SQL Server 2012 Standard. Windows Server 2008 R2 Standard

View 9 Replies View Related

Insert Data From Flat File Into Serveral DB Tables Having Many-to-many Relations

Nov 4, 2007

Hi everyone,

I am new to SSIS and I thought maybe someone would give me tips for solving the problem I am facing.


Overview:
I want to insert data contained in a flat file into several DB tables, which have N-M relations.

For illustration, I will explain the problem using a very simple DB:
1. The database contains the following 3 tables:
EMPLOYEE (EMP_ID, EMP_NAME)
PROJECT (PROJ_ID, PROJ_NAME)
EMP_PROJ (EMP_ID, PROJ_ID) , where EMP_ID and PROJ_ID are foreign keys referencing records in the EMPLOYEE and PROJECT tables respectively.


2. Each entry in the flat file contains the following data:
EMP_ID, EMP_NAME, PROJ_ID, PROJ_NAME


3. In SSIS, I have created a Data Flow Task containing:
- a path from a Flat File Source to an SQL Server Destination (Table: Employee)
- a path from a Flat File Source to an SQL Server Destination (Table: Project)
- a path from a Flat File Source to an SQL Server Destination (Table: Emp_proj)

Note: I used SQL Server Destination, because I need to import a huge amount of data and I read that this component performs better than the OLE DB Destination!


Questions:
1. I would like to eliminate EMP_ID and PROJ_ID from the Flat File Source. Instead, I would like these fields to be generated automatically upon insertion.
a. How can I do this and propagate the generated key among the different paths, which I have explained previously?
b. Can I first generate the two keys somehow and then have the parallel insertions into the different tables use the generated keys?


2. Is my solution correct in the first place? Or is there another, better way of inserting data which belongs to N-M relations?


Thanks in advance,
Samar


View 5 Replies View Related

Create Package For Multiple Table Insert From Flat File Source

Feb 23, 2008



Hello Guys,

I am working in one company and currently I am assigned to a new project for data migration from company X to our company Y using SSIS. I am totally new and I just completed the 5 tutorials which were given on the MSDN website.

Basically the client is going to send us a first flat file with 1 million records, containing Header, Detail and Trailer records.
I want to create a Package in such a way that it dumps all of this first load into 7 to 8 different tables at a time.
We also have to include functionality for validation and error checking.
On a successful load, the error file should only return the Header and Trailer but no Detail records.
If there are any errors, then the error file should contain the Header, the Detail records which could not be loaded, plus the Trailer, which we have to send back to the client.

When the 2nd file comes, we have to check whether each record is a new record or a changed (update) one, depending on a Flag which tells us.


This is basically the high-level idea of the Package I need to create. If you guys have any questions, let me know.

I know you guys are very experienced. If any of you could give me some detailed ideas on it, I would really appreciate it.
I have a very limited timeline for it.

Thanks

Shah

View 4 Replies View Related

Csv File Import: Bulk Insert Data Conversion Error (type Mismatch)

Sep 27, 2004

Hi,

I am trying to import data from a csv file into my table in SQL Server 2000. My table is called temp_table and consists of 3 fields.

column datatype
-------- -----------
program nvarchar(20)
description nvarchar(50)
pId int

pId has been set to primary key with auto_increment.

My csv file has 2 columns of data and it looks like follows:

program, description
"prog1", "this is program1"
"prog2", "this is program2"
"prog3", "this is program3"


Now I use BULK INSERT like this:

"BULK INSERT ord_programs FROM 'C:\datafile.csv' WITH (FIELDTERMINATOR=',', ROWTERMINATOR='', FIRSTROW=2)"

to import data into my table in SQL server and it gives me this error

"Bulk insert data conversion error (type mismatch) for row 2, column 3 (pId)"

I guess I have to use a format file or something, since I don't have anything for the pId field in the csv file, to make it work...

Please help me out guys, and please post a snippet of code if you have one.

Thank You.
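
A likely explanation is that the table has three columns (program, description, pId) while the file supplies only two, so BULK INSERT tries to feed pId from the file. A format file that maps the two csv fields and leaves the identity column out is the usual fix. The sketch below uses assumed widths; note that BULK INSERT does not strip the double quotes around the values, so they either need to be removed from the file or folded into the terminators:

-- Hypothetical programs.fmt (SQL Server 2000 non-XML layout):
--   8.0
--   2
--   1  SQLCHAR  0  20  ","     1  program      ""
--   2  SQLCHAR  0  50  "\r\n"  2  description  ""
BULK INSERT ord_programs
FROM 'C:\datafile.csv'
WITH (FORMATFILE = 'C:\programs.fmt', FIRSTROW = 2);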

View 2 Replies View Related

Transact SQL :: How To Bulk Insert Rows From Text File Into A Wide Table Which Has 1400 Columns

Feb 3, 2010

We can easily load a file into db tables. However, my main concern here is the number of columns in the file. A text file TEXT_1400.txt has 1400 columns. I am unable to load data into my db table using the BCP or BULK INSERT commands, as a maximum of 1024 columns is allowed per table in SQL Server 2008.

We can still go ahead and create a 'Wide Table' (a special table that holds up to 30,000 columns; the maximum size of a wide table row is 8,019 bytes). But when operating on a wide table, the BCP/BULK INSERT commands still fail. After a few hours of scratching my head over BCP and BULK INSERT, I observed that while inserting, the BCP/BULK INSERT commands are unable to identify SPARSE columns and skip them, which disturbs the column mapping and results in data conversion and truncation errors.
 
Is there any proper way to load this kind of file into the db table?

View 6 Replies View Related

SQL Server 2014 :: Bulk Insert Data Into Table

Jul 29, 2014

I need to load the following data into a SQL table. This is how the vendor is able to provide it to us.

CRCorp Daily Report,,,,,,
,,,,,,
Facility,Location,Purchase Order #,Vendor,Inventory #,Date Ordered,Extended Cost
09-Mowtown 495 CRST,09-402A Women's Imaging,327937,"BARD PERIPHERAL VASCULAR, INC.",113989,7/25/2014,650
09-Mowtown 495 CRST,09-402A Women's Imaging,327936,"WB MASON CO., INC.",112664,7/25/2014,8.64
01-Mowtown 499 CRST,01-302B Oncology,327947,McKesson General Medical,n/a,7/25/2014,129.02

[Code] ....

I have attempted to bulk insert it into this table with no luck.

CREATE TABLE POMaster
(Facility VARCHAR(75),
Location VARCHAR(75),
PONum INT,
VendorNm INT,
INVENTORYNUM VARCHAR(25),
orderDte DATE,
ExtendedPrice NUMERIC(10,2)
)
GO

It does not like the double quotes. How to make this format work? Do I need a format file?
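
BULK INSERT in SQL Server 2014 has no notion of a text qualifier, so the quoted vendor names with embedded commas break the parse; the usual options there are cleaning the file up first or loading it with an SSIS Flat File Source that has '"' set as the text qualifier. For reference, SQL Server 2017 and later can parse it natively; a hedged sketch of that syntax (path and FIRSTROW are assumptions):

BULK INSERT POMaster
FROM 'C:\feeds\CRCorpDaily.csv'
WITH
(
    FORMAT = 'CSV',        -- SQL Server 2017+ only
    FIELDQUOTE = '"',
    FIRSTROW = 4,          -- skip the report title, the blank line and the header row
    ROWTERMINATOR = '\n'
);

Separately, VendorNm is declared INT while the sample data holds vendor names, so that column would still fail even with the quoting solved.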

View 2 Replies View Related

SQL Server 2008 :: Bulk Insert Data Into Table

Mar 23, 2015

I want to bulk insert data into a table named scd_event_tab inside a database named rdb.

When I do select * from rdb.dbo.scd_event_tab, I get:

JOB_ID, RUN_ON, PRIORITY, PAYLOAD, TIMEOUT_INTERVAL, STATUS, PICKUP_TIME, SCD_TYPE, SCHEDULE_ID, DB_ADMIN_LOGIN_REQUIRED_YN

I saved the result into a csv file and then truncated the table. Now, I am trying to bulk insert the data into the table. So I used:

bulk insert
rdb.dbo.scd_event_tab from 'C:\users\sluintel.ctr\desktop\eventtab.csv'
with
(
codepage = 'RAW',
datafiletype = 'native',
fieldterminator = ' ',
keepidentity,
keepnulls
);
go

However, I get this error:

Msg 4867, Level 16, State 1, Line 1
Bulk load data conversion error (overflow) for row 1, column 1 (JOB_ID).
Msg 4866, Level 16, State 5, Line 1

The bulk load failed. The column is too long in the data file for row 1, column 3. Verify that the field terminator and row terminator are specified correctly.

Msg 7399, Level 16, State 1, Line 1

The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.

Msg 7330, Level 16, State 2, Line 1

Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
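
A hedged reading of the errors: DATAFILETYPE = 'native' is only for files produced by bcp in native (binary) mode, while a .csv saved from a query result is plain character data, so 'char' with comma terminators is the closer match. A sketch with the path reconstructed on the assumption that the backslashes were stripped in the post, and FIRSTROW = 2 assuming the saved file kept a header row:

BULK INSERT rdb.dbo.scd_event_tab
FROM 'C:\users\sluintel.ctr\desktop\eventtab.csv'
WITH
(
    DATAFILETYPE = 'char',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2,
    KEEPIDENTITY,
    KEEPNULLS
);
GO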

View 9 Replies View Related






