Can We Use Excel File (source) For Bulk Insert Task
Jun 1, 2007
Can an Excel file be used as a source for a Bulk Insert Task in SSIS instead of a delimited flat file?
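For what it's worth, as far as I know the Bulk Insert Task only reads text files, so an Excel workbook can't be plugged in directly; the usual routes are a Data Flow with an Excel Source, or querying the workbook with OPENROWSET from an Execute SQL Task. A rough sketch of the latter (the provider, path, sheet name, and column names are all assumptions to adapt):

-- Requires the 'Ad Hoc Distributed Queries' option to be enabled on the server.
-- Path, sheet name, and column names are placeholders.
INSERT INTO dbo.TargetTable (Col1, Col2)
SELECT Col1, Col2
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Data\Source.xls;HDR=YES',
                'SELECT Col1, Col2 FROM [Sheet1$]');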
I'm importing a large csv file two different ways - one with the Bulk Import Task and the other with a Data Flow Task (flat file source -> OLE DB destination).
With the Bulk Import Task I'm putting all the csv rows into one column. With the Data Flow Task I'm mapping each csv value to its own column in the SQL table.
I used two different flat file sources and got the following:
Flat file 1: Bulk Import Task = 12,649,499 rows; Data Flow Task = 4,215,817 rows
Flat file 2: Bulk Import Task = 3,403,254 rows; Data Flow Task = 1,134,359 rows
Anyone have any guess as to why this is happening?
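Differences like this usually come down to how each component decides where a row ends (the row terminator) and how embedded delimiters are handled. One way to get a third opinion (a sketch only, with made-up table name, path, and terminators) is to load the same file with a plain BULK INSERT that spells out both terminators and compare the resulting count:

-- Sketch: table name, path, and terminators are assumptions to adapt to the real file.
BULK INSERT dbo.StagingTable
FROM 'C:\Data\bigfile.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

SELECT COUNT_BIG(*) AS RowsLoaded FROM dbo.StagingTable;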
Using BCP or BULK INSERT you can specify an Error File (-e and ERRORFILE). However this does not seem to be exposed in SSIS via the Bulk Insert Task.
Does anyone know if I'm missing something and the property is called something else, or if it can be accessed via script?
Cheers,
-Ryan
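If the task really doesn't expose it, one fallback (a hedged sketch; the names, paths, and terminators are made up) is to issue the statement yourself from an Execute SQL Task, where ERRORFILE and MAXERRORS are available:

-- Sketch: run from an Execute SQL Task instead of the Bulk Insert Task.
BULK INSERT dbo.TargetTable
FROM '\\fileserver\share\input.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n',
    ERRORFILE       = 'C:\Logs\input_errors.txt',  -- rejected rows are written here
    MAXERRORS       = 10
);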
hi
With my SSIS package I have to read several flat files. These files are stored in different folders on a Unix machine. In my loop task I first have a script that checks whether the file exists. It does, so I set the path to a variable. In the connection manager for the flat file I have set the source for the file to that variable. The bulk insert task starts, reads the file and everything is cool. But sometimes the package fails with this message:
[Bulk Insert Task] Error: An error occurred with the following error message:
"Cannot bulk load because the file "\Owpu0kas 0 0 5
eceivedw000005.090"
could not be opened. Operating system error code 53(error not found)."
I stop debugging, restart debugging and what happens? It runs ... maybe for the next one, two or three files, and then the error comes again, but never on the same file. I'm going crazy, what can I do?
thanks for your help.
I have an SSIS package doing a bulk insert from a file. Then later on I'm trying to delete that file (in a file delete task), but I'm getting an error:
[File System Task] Error: An error occurred with the following error message: "The process cannot access the file 'xyz' because it is being used by another process."
I'm wondering if there isn't some way to 'tweak' the bulk insert syntax so that it doesn't lock the file?
Hello,
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab delimited text file into an empty SQL table that has an identity column defined. How do I tell the bulk insert task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to have the identity column in the table too.
Thanks.
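One workaround I've seen (a sketch with hypothetical names) is to bulk insert through a view that leaves the identity column out, so the text file's fields line up with the remaining columns and the identity values are generated as usual; a format file that maps the file's fields only to the non-identity columns works too.

-- Hypothetical names; the view simply omits the IDENTITY column.
CREATE VIEW dbo.Customer_Load
AS
SELECT CustomerName, CustomerPhone   -- CustomerID (the IDENTITY column) is not listed
FROM dbo.Customer;

-- The bulk insert then targets the view instead of the base table.
BULK INSERT dbo.Customer_Load
FROM 'C:\Data\customers.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

The same idea should carry over to the Bulk Insert Task by pointing its destination at the view.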
I receive the following error message when I try to use the Bulk Insert Task to load BCP data into a table:
Error: 0xC002F304 at Bulk Insert Task, Bulk Insert Task: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly. Bulk load data conversion error (overflow) for row 1, column 1 (rowno).".
Task failed: Bulk Insert Task
In SSMS I am able to issue the following command and the data loads into a TableName table with no error messages:
BULK INSERT TableName
FROM 'C:DataDbTableName.bcp'
WITH (DATAFILETYPE='widenative');
What configuration is required for the Bulk Insert Task in SSIS to make the data load? BTW, the TableName.bcp file is a bulk copy file created with the bcp widenative data type. The properties of the Bulk Insert Task are the following:
DataFileType: DTSBulkInsert_DataFileType_WideNative
RowTerminator: {CR}{LF}
Any help getting the bcp file to load would be appreciated. Let me know if you require any other information, thanks for all your help.
Paul
Hi, folks:
Does anyone know how to do a bulk insert using just the script task? I've been searching everyehere but can't seem to find a sample.
I have the Excel Connection Manager and Source to read the contents from an Excel file. For some reason a couple of numeric fields from the Excel worksheet are brought over as nulls even though they have values of 300 and 150. I am not sure why this is happening. I looked into the format of the fields and they are set to General in Excel; I tried setting them to numeric and that did not help.
All the other content from the Excel file is coming through except for the 2 numeric fields.
I tried to bring the contents from the excel source to a text file in csv format and for some reason the 2 numeric fields came out as blank.
Any inputs on getting this addressed will be much appreciated.
Thanks,
Manisha
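This looks like the Jet provider's type guessing: it samples the first rows of each column to pick a type, and values that don't fit the guess come back as NULL. One way to test that theory from SSMS (a sketch; the provider, path, and sheet name are assumptions) is to read the sheet with IMEX=1, which treats mixed columns as text:

-- Requires 'Ad Hoc Distributed Queries'; path and sheet name are placeholders.
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Data\Workbook.xls;HDR=YES;IMEX=1',
                'SELECT * FROM [Sheet1$]');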
I am using the Bulk Insert task object in DTS and I am using a format file as well to import data from a text file. I was able to insert data into my SQL table fine when I did not have an identity column defined in my table; however, I added an identity field and now the bulk insert will not work. Anyone had a similar problem?
The error msg box that pops up says:
The statement has been terminated.
string or binary data would be truncated.
I am using an Excel Source to get the data from an Excel file into a SQL Server 2005 table. A couple of columns are coming in as double precision float, but some values have characters in them, and those values are coming out as null even though I changed the data type from float to Unicode string. Any input on resolving this will be much appreciated.
Thanks,
Manisha
I want to use the sorted data option for the bulk insert task in a DTS package. I want to sort the data on the 1st column. How can I put an order by clause in the sorted data option? I put ORDER (SSNo) but I am getting an error.
Thanks
ss
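For reference, the T-SQL form of the hint takes a column list with an optional direction, and it only helps when the target table has a clustered index on those columns. A sketch (the column name comes from the post, everything else is assumed); if the DTS property still errors, it may be expecting just the column list rather than the full ORDER(...) expression:

-- Sketch: table, path, and terminators are placeholders.
BULK INSERT dbo.TargetTable
FROM 'C:\Data\input.txt'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\n',
    ORDER (SSNo ASC),   -- only useful if the table is clustered on SSNo
    TABLOCK
);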
I am running a heavy bulk insert task against one database. When I checked Activity Monitor I saw it was suspended with wait type PAGEIOLATCH_EX.
So it stopped by itself. Do any experts have a good idea about what is going on?
Appreciate the help.
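For context, PAGEIOLATCH_EX means the session is waiting for a data page to be read from or written to disk, so 'suspended' is a wait rather than a stop. A quick way to watch what the session is doing while it runs (SQL Server 2005 DMVs, nothing assumed beyond VIEW SERVER STATE permission):

-- Shows active requests and what each one is currently waiting on.
SELECT session_id, status, command, wait_type, wait_time, blocking_session_id
FROM sys.dm_exec_requests
WHERE session_id > 50;   -- filters out most system sessions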
Dear All,
For ex:
In SQL Server 2005, when we create a table with 3 columns that have the Allow Nulls option disabled and then try to insert a record with a blank in any one of those columns, SQL Server throws an error; because the column is NOT NULL, it won't save the record without data.
But when we import a csv data file that has 3 columns and contains null data using the Bulk Insert Task, it inserts the data easily, without throwing an error.
My question is: how can we restrict null data from being inserted into the table when using the Bulk Insert Task in SQL Server 2005?
Please suggest me to solve this issue.
Thanks in advance
Karna
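One hedged approach: put the rule on the table as a CHECK constraint and tell the bulk load to honour constraints (the Bulk Insert Task has a 'Check constraints' option; the T-SQL equivalent is sketched below with made-up names), so offending rows fail the load instead of slipping in as blanks or NULLs.

-- Illustrative names only; repeat the constraint for each column that must not be blank.
ALTER TABLE dbo.Target
ADD CONSTRAINT CK_Target_Col1_NotBlank CHECK (Col1 IS NOT NULL AND LTRIM(RTRIM(Col1)) <> '');

BULK INSERT dbo.Target
FROM 'C:\Data\input.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', CHECK_CONSTRAINTS);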
Hi,
I am using the Bulk Insert Task to bulk copy data from a flat file to a SQL Server 2005 table. The flat file contains pipe ( | ) delimited columns and several thousand records which are {CR}{LF} delimited. The file also contains a header record and a trailer record, which consist of several words but also contain a ( | ) symbol among those words.
E.g.:
HEDR | yadi yadi yada
500 | Data Data | More Data | 600
460 | Datum | More More | 705
550 | Data | Data | 603
FOOTR | yadi yadi yada
I need to ignore the header and trailer records and just pick up the proper records. But even though I manually set the First Row property to 2 and the Last Row property to 4, it does not pick up the first true record, i.e. the record which begins with 500; and if I set First Row to 1, it throws an error citing a 'truncation error' or similar. I think it is taking the header row together with the first data row as one record, and since there are now more pipes ( | ) than expected, the error is thrown.
I've tried setting different values for these properties but to no avail... Any help will be deeply appreciated...
Thanks
Gogula
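One thing worth knowing: FIRSTROW and LASTROW skip rows by counting terminators rather than making the load header-aware, so a header line with a different number of pipes can still throw the row boundaries off. With the five-line sample above, the raw statement would look something like this sketch (names and path assumed):

-- Sketch with placeholder names; FIRSTROW = 2 skips the HEDR line and
-- LASTROW = 4 stops before the FOOTR line for a three-record file.
BULK INSERT dbo.Staging
FROM '\\server\share\feed.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\r\n',
    FIRSTROW        = 2,
    LASTROW         = 4
);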
I need to do a bulk insert for every csv file in a particular folder, and I wish to do this programmatically.
I have tried a "foreach" loop and using variables for the filenames, but it does not seem to be working and I just can't figure out why!
hi,
I have a bulk insert task and I want to pass the source connection a different text file name each time. How do I pass the connection's source field a parameter with a new name each time? The name will be accepted as a parameter to the package. Can I do it in a bulk insert task?
thx,
Tomer
Hello all,
I am running into an issue that seems to be corroborated here: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=188886&SiteID=1.
To summarize, I have a Foreach Loop that uses a Foreach File Enumerator. The loop writes the file names to a variable which is used in the Expressions property for a flat file's ConnectionString. The flat file is used as the source in a Bulk Insert Task that is inside the Foreach Loop. The flat file's connection string is not picking up the file name, resulting in this error: "The specified connection "test.txt" is either not valid, or points to an invalid object. To continue, specify a valid connection."
It doesn't work for a file connection manager or a flat file connection manager.
It does work if I replace the Bulk Insert Task with a Data Flow (flat file source -> ole db destination). It also works if I set the file's connection string manually in a script task.
It looks to me as though the Bulk Insert isn't calling whatever method in the connection manager that reevaluates the property expression. Am I missing something or is this a bug? I looked at the connect site, but couldn't find this particular scenario.
Thanks for any insight you might provide!
Jessica
Is there any way to send an Excel file from SSIS using the Send Mail Task without saving the Excel file locally? I need to automate a process which involves loading the Excel file from the database and sending it to some people.
Really don't understand this and have run out of hair to pull out.
I have a DTS package which has a Bulk Insert Task. The task is populated with its source filename dynamically. It has a format file which is stored locally on the box it's running on.
The output file comes from a Unix box and is pipe delimited. Great, dandy. All works a treat as long as I treat everything as either an INT or CHAR. But I have a couple of columns that are DATES in the format 13/02/2004.
I want these inserted into smalldatetime columns. Will settle for datetime columns though.
I get an error on every row Bulk Insert data conversion error (type mismatch) for Row X Column 11 (PostingDate) which happens to be the first date column.
Tried loading it as CHAR, SQLDATETIME and SQLDATETIM4 in the format file, all to no avail. Tried it with the columns in the destination table as DATETIME and SMALLDATETIME.
Anyone have any ideas?
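Dates written 13/02/2004 are day-first, so they only convert cleanly when the date format in effect is dmy; the default for us_english is mdy, which would explain the type mismatch. A common workaround (a sketch, hypothetical names) is to land the column as CHAR in a staging table and convert explicitly with style 103, or to set the date format before a T-SQL load:

-- Style 103 = dd/mm/yyyy. Table and column names are placeholders.
INSERT INTO dbo.Postings (PostingDate)
SELECT CONVERT(smalldatetime, PostingDate, 103)
FROM dbo.Postings_Staging;

-- Alternatively, when the load runs as a T-SQL batch:
SET DATEFORMAT dmy;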
Hi,
I use a "Bulk Insert Task" inside of a "ForEach Container" and set the ConnectionString-Property of the "FlatFile-ConnectionManager" with an expression, pulling from a variable.
The package works fine, but I get an error message in the Output window:
Fehler: 0xC001401E bei Package, Verbindungs-Manager 'FlatFile': Der in der Verbindung angegebene Dateiname 'c:\old.txt' war ungültig.
Translation:
Error: 0xC001401E at Package, connection manager 'FlatFile': The file name 'c:\old.txt' specified in the connection was not valid.
(Sorry, no English version is available in this office.)
'C:\old.txt' is the initial value of the variable.
The new value is 'C:\new.txt' and I can see the new value in the debugger or in a Script Task.
Before the execution of the Bulk Insert Task starts, SSIS performs a validation. I have tried to stop the validation by setting the property DelayValidation = True (on the Package, ForEach Container, Bulk Insert Task, and FlatFile Connection Manager), but it is not possible to stop the validation and the error message.
After printing the error-message, the variable is evaluated and the package works fine. After the first loop, there is no more error-message.
The error-message is a problem, because the operators look at the error-output.
As a workaround I can set the property in a Script Task. Is there another solution?
Dan has a blog post about this problem, but no solution:
http://www.developerdotstar.com/community/node/347
Thanks
Berthold Neumann
I would like to get the rowcount from a bulk insert task to validate that all the data is being inserted correctly. So far the only way that I can see this being done is through a trigger on the target tables. I would like to have this information inside the DTS package. Can anyone help me?
Thanks
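Short of a trigger, one low-tech option (a sketch, hypothetical table name) is an Execute SQL Task that counts the target table just before and just after the Bulk Insert Task, storing each result in a package variable; the difference is the number of rows the bulk insert added.

-- Run from an Execute SQL Task before and after the bulk insert,
-- mapping the single-row result to a package variable each time.
SELECT COUNT_BIG(*) AS CurrentRows
FROM dbo.TargetTable;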
Hi, ALL
I have been confused about the Bulk Insert task for several days.
Here is what Books Online says, which I totally understand:
"The source file that the Bulk Insert task loads can be on the same server as the SQL Server database into which data is inserted, or on a remote server. If the file is on a remote server, you must specify the file name using the Universal Naming Convention (UNC) name in the path."
Now, if my Bulk Insert task (within a package) is on one server, say ETL (so when I run the package, it will run from the ETL server), but my destination table where the data will be inserted is on another server, say PROD, then my OLE DB connection will be pointing to a remote server. Now, if my text file is on the ETL server, do I have to give a UNC path? Or if I have the text file on the same server as my destination (here PROD), do I have to give UNC naming? I found in several places that the Bulk Insert task runs on the server its connection points to.
My scenario: I have multiple SSIS packages. These packages will be on the ETL server, but all the databases will be hosted on the PROD server. For my bulk inserts, I asked them to keep all flat files on the ETL server (where the packages will be). Looks like this will give me a problem later: since all my destinations will be on the PROD server, during the bulk insert SQL Server will go and look for those files on the PROD server rather than the ETL server? Or am I confused here?
Say package.dtsx is on SER1, which uses bulk insert tasks.
My OLE DB connection is to SERV2, database B and table T.
My flat file connection is D:myfilefile.txt.
Now when I run this task, where will the txt file be pulled from? Does it pull from the local server (SER1) or does it go to SERV2 looking for the txt file?
I tried several combinations of moving files to different servers, but I have reached no conclusions.
Thanks, Kumar
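The way I understand it, the BULK INSERT statement the task generates runs on the SQL Server instance the OLE DB connection points to (SERV2/PROD in these examples), and that engine is what opens the file, so a local drive path is interpreted on that server, not on the machine running the package. A UNC path that the destination server's service account can reach avoids the ambiguity (sketch; the share name is made up):

-- Executed by SERV2, so the path has to make sense from SERV2's point of view.
BULK INSERT B.dbo.T
FROM '\\ETLSERVER\Exports\file.txt'   -- hypothetical share on the ETL box
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');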
I have some bulk insert tasks in SSIS inserting into tables with identity set ON in one column. Can the bulk insert task go smoothly?
PS: I cannot find anything in the bulk insert task that can be set to ignore identity columns.
Now my steps are: prepare database -> create database -> bulk insert into tables.
Working on my previous problem, Jamie.
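In raw T-SQL the answer depends on whether the file actually carries values for the identity column: if it does, KEEPIDENTITY preserves them (the task's Options list has an 'Enable identity insert' flag that should map to this, if I remember right); if it does not, the column has to be left out of the mapping, for example with a format file or a view over the non-identity columns. A minimal sketch with assumed names:

-- Only applies when the file contains explicit values for the identity column.
BULK INSERT dbo.SomeTable
FROM 'C:\Load\sometable.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', KEEPIDENTITY);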
I'm trying to use Bulk Insert for the first time and getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNChar for the first column. I've checked that the end of each line is CR LF, therefore the row terminator is correct for line 7, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp
FROM 'M:DataASXImportTest.txt'
WITH (FORMATFILE = 'M:DataASXSQLFormatImport.Fmt')
Hi,
I have a file which contains data as below:
3
123||
456||
789||
I am reading the file using bulk insert and I am inserting these phone numbers into a table having one column, as below.
BULK INSERT TABLE_NAME FROM 'FILE_PATH'
WITH (KEEPNULLS,FIRSTROW=2,ROWTERMINATOR = '||')
But I want to insert the data into a table having two columns. If I try to insert the data into a table having two columns, it does not insert.
Can anyone help me with how to do this?
Thanks,
-Badri
I started getting random LCK_M_SCH_M locks with huge wait times, which hung my ETL process.
The ssis package runs like this:
I have 4 containers that run in parallel and do the same thing:
-Convert a tab delimited file from unicode->utf8
-Truncate the table (within a foreach loop)
-Bulk insert the data
Also transaction option is set to Not Supported.
What could be causing the lock?
The foreach loops do not overlap with regard to tables/files.
Do they contest somehow?
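TRUNCATE TABLE needs a schema modification (Sch-M) lock, so anything still holding even a schema stability lock on the same table makes the truncate wait, which is exactly what LCK_M_SCH_M reports. When it hangs, something like this (SQL Server 2005 DMVs, nothing assumed) shows who is blocking whom and on what:

-- Lists requests that are currently blocked, with the blocker's session id.
SELECT r.session_id, r.blocking_session_id, r.wait_type, r.wait_resource, r.command
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0;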
Hi Guys,
The way I understand a Data Flow Task is that it inserts the rows from the source to the destination one by one. Is there a way to make it act like a bulk insert task? We have been experiencing performance issues when inserting a lot of rows from one table to another. If there's no way to actually do it, can the bulk insert task functionality be scripted? Because what I need is a table to table insert, and the bulk insert task only accepts data files as sources.
Thanks!
Kervy
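Two thoughts, hedged: the OLE DB Destination's 'Table or view - fast load' access mode batches rows instead of inserting them one at a time, so it is worth checking that setting first; and if both tables live on the same server, a set-based statement run from an Execute SQL Task (sketch below, placeholder names) moves the rows without staging them in a file at all.

-- Placeholder names; WITH (TABLOCK) takes a table lock up front,
-- which cuts locking overhead on a large insert.
INSERT INTO dbo.DestinationTable WITH (TABLOCK) (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM dbo.SourceTable;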
I am developing an SSIS package for inserting data into an Oracle database table, but it looks like the Bulk Insert Task in SSIS does not support Oracle databases. I cannot set the destination connection property to use the Oracle database connection.
I have created one OLE DB connection pointing to my Oracle database (using the 'Oracle Provider for OLEDB'; I tried the 'Microsoft OLEDB provider for Oracle' too). This connection is working fine.
When I go to the connection property in the Bulk Insert Task's edit dialogue box, I don't see my Oracle connection in the drop-down list (but a SQL Server database connection existing in the same package can be seen).
Does anyone have solution for this problem?
Hi Guys
The problem is...
When I try to bulk insert a single file it works fine. When I want to loop over a set of files in a folder using a Foreach Loop and a Bulk Insert Task, it fails.
In the flat file connection, when I specify the usage type as existing file, it loads the same file "n" number of times, where n is the number of files in the folder.
When I select the usage type as existing folder I get the error "Cannot bulk load because the file "....folder" could not be opened." Can someone help me out with this?
I have SQL 2005 as a separate instance, beside my 2000 which is the default.
Thanks!
I have an SSIS Package which is designed to import log files. Basically, it loops through a directory, parses text from the log files, and dumps it to the database. The issue I'm having is not with the package reading the files, but when it attempts to write the information to the db. What I'm seeing is that it will hit a file, read 3000 some lines, convert them (using the Data Conversion component), and then "hang" when it tries to write it to the db.
I've run the SQL Server Profiler, and had originally thought that the issue had to do with the collation. I was seeing every char column with the word "collate" next to it. On the other hand, while looking at the Windows performance monitor, I see that the disk queue is maxed at 100% for about a minute after importing just one log file.
I'm not sure if this is due to the size of the db, and having to update a clustered index, or not.
The machine where this is all taking place has 2 arrays- both RAID 10. Each array is 600 GB, and consists of 8 disks. The SSIS package is being executed locally using BIDS.
Your help is appreciated!
Hey All,
Similar to a previous post (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=244646&SiteID=1), I am trying to import data into a SQL Table.
I am trying to program a small application that will import product data obtained through suppliers via CD-ROM. One supplier in particular uses fixed width columns, and the data looks like this:
Example of Data
0124015Apple Crate 32.12
0124016Bananna Box 12.56
0124017Mango Carton 15.98
0124018Seedless Watermelon 42.98
My Table would then have:
ProductID as int
Name as text
Cost as money
How would I go about extracting the data with an XML Format file? I am stumbling over how to tell it where to start picking up data for a specific column.
Is there any way that I could trim the Name column (i.e.: "Mango Carton " --> "Mango Carton")?
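For a fixed-width layout like the sample, an XML format file along these lines might be a starting point; this is a sketch only, the field widths are guesses from the example rows and the column names are taken from the table description above. Trailing spaces in Name can then be cleaned up after the load with UPDATE ... SET Name = RTRIM(Name).

<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <!-- Guessed widths: 7 characters of ProductID, 20 of Name, then the price up to the line break -->
    <FIELD ID="1" xsi:type="CharFixed" LENGTH="7"/>
    <FIELD ID="2" xsi:type="CharFixed" LENGTH="20"/>
    <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="12"/>
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="ProductID" xsi:type="SQLINT"/>
    <COLUMN SOURCE="2" NAME="Name" xsi:type="SQLVARYCHAR"/>
    <COLUMN SOURCE="3" NAME="Cost" xsi:type="SQLMONEY"/>
  </ROW>
</BCPFORMAT>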
I don't know if it makes any difference, but I've been calling SQL from my code by doing this:
Code in C# Form
SqlConnection SqlConnection = new SqlConnection(global::SQLClients.Properties.Settings.Default.ClientPhonebookConnectionString);
SqlCommand cmd = new SqlCommand();
cmd.CommandType = CommandType.Text;
cmd.CommandText = "INSERT INTO PhonebookTable(Name, PhoneNumber) VALUES('" + txtName.Text.ToString() + "', '" + txtPhoneNumber.Text.ToString() + "')";
cmd.Connection = SqlConnection;
SqlConnection.Open();
cmd.ExecuteNonQuery();
SqlConnection.Close();
RefreshData();
I am running Visual Studio C# Express 2005 and SQL Server Express 2005.
Thanks for your time,
Hayden.