I have a folder on an SFTP server in our internal network which gets flat files periodically (almost every minute) from another server. The file structure is like this:
Flat File Format
•Line 1—comma-separated list of field names
•Line 2—comma-separated list of field types
•Line 3—data, comma separated
•Line 4—data, comma separated
and so on...
All the files will be like this, without the .csv extension. There are a lot of fields, and creating a column for each field manually will be a pain. Here's what I am trying to do:
1. Create a process that runs periodically on the server and imports data from the files into a SQL Server table.
2. Use the information in lines 1 and 2 to create the table schema, then ignore those lines in every import (the data should be appended to the existing data).
What are my options?
Thank you all,
Bullpit
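A minimal C# sketch of item 2 above, assuming a simple mapping from the feed's type names to SQL types; the folder, table name, connection string, and type vocabulary below are all placeholders to adjust:

using System;
using System.Data.SqlClient;
using System.IO;
using System.Linq;

class FlatFileImporter
{
    static void Main()
    {
        string connStr = "Data Source=.;Initial Catalog=Staging;Integrated Security=SSPI";
        string file = @"C:\sftp\incoming\somefile";   // hypothetical drop location

        string[] lines = File.ReadAllLines(file);
        string[] names = lines[0].Split(',');   // line 1: field names
        string[] types = lines[1].Split(',');   // line 2: field types

        // Build a CREATE TABLE from lines 1 and 2 on first sight of the format.
        string cols = string.Join(", ",
            names.Select((n, i) => "[" + n.Trim() + "] " + MapType(types[i].Trim())));
        string createSql =
            "IF OBJECT_ID('dbo.Import') IS NULL CREATE TABLE dbo.Import (" + cols + ")";

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            new SqlCommand(createSql, conn).ExecuteNonQuery();

            // Append the data rows (line 3 onward), one parameterized INSERT per row.
            string insertSql = "INSERT INTO dbo.Import VALUES (" +
                string.Join(", ", names.Select((n, i) => "@p" + i)) + ")";
            using (var cmd = new SqlCommand(insertSql, conn))
            {
                foreach (string line in lines.Skip(2))
                {
                    string[] vals = line.Split(',');
                    cmd.Parameters.Clear();
                    for (int i = 0; i < vals.Length; i++)
                        cmd.Parameters.AddWithValue("@p" + i, vals[i]);
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }

    // Guess at the feed's type vocabulary; adjust to the names the files really use.
    static string MapType(string t)
    {
        switch (t.ToLowerInvariant())
        {
            case "int": return "int";
            case "float": return "float";
            case "datetime": return "datetime";
            default: return "varchar(255)";
        }
    }
}

A console program like this can be scheduled with Windows Task Scheduler or a SQL Agent job to cover item 1; for large files, SqlBulkCopy would be faster than row-by-row inserts.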
I have a couple of hundred flat files to import into database tables using SSIS.
The files can be divided into groups by the format they use. I understand that I could import each group of files that have a common format at the same time using a Foreach Loop Container.
However, the example for the Foreach Loop Container has multiple files all being imported into the same database table. In my case, each file needs to be imported into a different database table.
Is it possible to import each set of files with the same format into different tables in a simple loop? I can't see a way to make a Data Flow Destination item accept its table name dynamically, which seems to prevent me from doing this.
I suppose I could make a different Data Flow Destination item for each file, in the Data Flow. Would that be a reasonable solution, or is there a simpler solution, or should I just resign myself to making a separate Data Flow for every single file?
I have over three hundred text files that I need to import to SQL Server. Each is in the exact same format. I want to import them as separate tables. Is there any way to do it in one process? Regards, Ciarán
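For what it's worth, outside a pure SSIS Data Flow a dynamic destination is straightforward with SqlBulkCopy, since DestinationTableName is just a string. A rough sketch, assuming every file shares one format, the first line is a header, and each target table already exists under a hypothetical naming rule (file name = table name):

using System.Data;
using System.Data.SqlClient;
using System.IO;

class PerTableLoader
{
    static void Main()
    {
        string connStr = "Data Source=.;Initial Catalog=Target;Integrated Security=SSPI";
        foreach (string file in Directory.GetFiles(@"C:\imports", "*.txt")) // hypothetical folder
        {
            // Parse the file into a DataTable (simple comma split; no quoting rules).
            var table = new DataTable();
            string[] lines = File.ReadAllLines(file);
            foreach (string col in lines[0].Split(','))
                table.Columns.Add(col.Trim());
            for (int i = 1; i < lines.Length; i++)
                table.Rows.Add(lines[i].Split(','));

            using (var conn = new SqlConnection(connStr))
            using (var bulk = new SqlBulkCopy(conn))
            {
                conn.Open();
                // One table per file: derive the table name from the file name.
                bulk.DestinationTableName =
                    "dbo.[" + Path.GetFileNameWithoutExtension(file) + "]";
                bulk.WriteToServer(table);
            }
        }
    }
}

Within SSIS itself, the OLE DB Destination does offer a "Table name or view name variable" access mode, but the column metadata is still fixed at design time, so it only helps when every table in the loop shares the same layout.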
I have a client who is sending me 800+ excel files each month with sales data. Each of the files is identical in structure, but has sales data for different stores. I receive all these files at the same time.
Is there a method with Data Transformation Services where I can have it work off of all the files in a given directory? I can set up DTS to work off of specific Excel files with no problem, but what I would like to do is set up a DTS package so it could pull from each of the 800+ files.
Is this possible, or do I need to look at a solution outside of SQL to consolidate the Excel files first?
The Excel file would have columns similar to the following: store_id, zip_code, sales, transactions.
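In straight code (rather than one DTS package per file) a directory loop is simple; a sketch, assuming the Jet provider is available and the data sits on Sheet1 with a header row (folder, destination table, and connection strings are placeholders):

using System.Data.OleDb;
using System.Data.SqlClient;
using System.IO;

class ExcelLoader
{
    static void Main()
    {
        string sqlConn = "Data Source=.;Initial Catalog=Sales;Integrated Security=SSPI";
        foreach (string file in Directory.GetFiles(@"C:\monthly", "*.xls")) // hypothetical folder
        {
            string xlsConn = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + file +
                             ";Extended Properties=\"Excel 8.0;HDR=Yes\"";
            using (var src = new OleDbConnection(xlsConn))
            using (var cmd = new OleDbCommand(
                "SELECT store_id, zip_code, sales, transactions FROM [Sheet1$]", src))
            {
                src.Open();
                using (var reader = cmd.ExecuteReader())
                using (var dest = new SqlConnection(sqlConn))
                using (var bulk = new SqlBulkCopy(dest))
                {
                    dest.Open();
                    bulk.DestinationTableName = "dbo.StoreSales"; // appends to existing rows
                    bulk.WriteToServer(reader);
                }
            }
        }
    }
}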
Hi. I have multiple Excel files of the same format in a directory. They are called book1.xls, book2.xls, book3.xls and so on. What is the easiest way to import the tab named Sheet1 from each of the Excel files to a database using SQL Server 2000 Enterprise Edition? Regards, Ciarán
I am trying to import multiple .csv files into Excel sheets using a Script Task in SSIS. I am having trouble adding the reference that allows us to read and write Excel sheets. Can anyone help me create a Script Task that will import multiple .csv files into Excel sheets?
I'm just learning SSIS. As I was following the tutorial on foreach loop container (lesson 2) to export multiple flat files in creating a simple ETL package in SSIS, I get the following error:
SSIS package "Take17.dtsx" starting. Information: 0x4004300A at Extract Cobra EBA, DTS.Pipeline: Validation phase is beginning. Information: 0x4004300A at Extract Cobra EBA, DTS.Pipeline: Validation phase is beginning. Information: 0x40043006 at Extract Cobra EBA, DTS.Pipeline: Prepare for Execute phase is beginning. Information: 0x40043007 at Extract Cobra EBA, DTS.Pipeline: Pre-Execute phase is beginning. Information: 0x402090DC at Extract Cobra EBA, Cobra EBA [1]: The processing of file "" has started. Warning: 0x80070003 at Extract Cobra EBA, Cobra EBA [1]: The system cannot find the path specified. Error: 0xC020200E at Extract Cobra EBA, Cobra EBA [1]: Cannot open the datafile "". Error: 0xC004701A at Extract Cobra EBA, DTS.Pipeline: component "Cobra EBA" (1) failed the pre-execute phase and returned error code 0xC020200E. Information: 0x402090DD at Extract Cobra EBA, Cobra EBA [1]: The processing of file "" has ended. Information: 0x40043009 at Extract Cobra EBA, DTS.Pipeline: Cleanup phase is beginning. Information: 0x4004300B at Extract Cobra EBA, DTS.Pipeline: "component "OLE DB Destination" (194)" wrote 0 rows. Task failed: Extract Cobra EBA Warning: 0x80019002 at Take17: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors. SSIS package "Take17.dtsx" finished: Failure.
I spent some time trying to understand where the error could be, and traced it to the following place. If you look at Lesson 2 of creating a simple ETL package in SSIS, which uses a Foreach Loop container to import files into SQL Server: as soon as I finish the steps for configuring the flat file connection manager to use the variable for the ConnectionString property, the value immediately disappears. I repeated the tutorial three times exactly as written, and each time I reach this step of configuring the connection manager's ConnectionString, it takes the path, but when I start debugging it I get this error. When I go back and check, the ConnectionString value is empty.
Hello, I need to import a bunch of .csv files. The problem I am having is the "non-data" information in the files creating bogus rows and column definitions. Here is an example of the csv file:

CBOT - End-of-Day Futures Bulk Download 2001.
2 Year U.S. Treasury Notes Futures
Date,Symbol,Month Code,Year Code,Open
20010103,ZT,H,2001,102.09375
20010104,ZT,H,2001,102.03125
20010105,ZT,H,2001,102.28125

In this case, there are bogus rows created by the text at the beginning of the file, and the column names also get placed into a row. My question is: how do you import the file and strip out the "non-data" data, so that only the actual data gets inserted into the db?
Thanks,
TGru
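One way to strip the "non-data" rows is to pre-filter the file in code before loading it: skip lines until the known header row appears, then skip the header itself. A sketch under the assumption that the header always starts with "Date" (the path, column names, and table name are placeholders):

using System.Data;
using System.Data.SqlClient;
using System.IO;

class PreambleSkipper
{
    static void Main()
    {
        var table = new DataTable();
        foreach (string col in new[] { "Date", "Symbol", "MonthCode", "YearCode", "Open" })
            table.Columns.Add(col);

        bool inData = false;
        foreach (string line in File.ReadLines(@"C:\feeds\cbot.csv")) // hypothetical path
        {
            if (!inData)
            {
                // Everything up to and including the header row is "non-data".
                if (line.StartsWith("Date")) inData = true;
                continue;
            }
            string[] vals = line.Split(',');
            if (vals.Length == table.Columns.Count) table.Rows.Add(vals);
        }

        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=Quotes;Integrated Security=SSPI"))
        using (var bulk = new SqlBulkCopy(conn))
        {
            conn.Open();
            bulk.DestinationTableName = "dbo.Futures"; // hypothetical table
            bulk.WriteToServer(table);
        }
    }
}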
I have the original copies of the .MDF and .LDF of another database with a bunch of data in it that I'm trying to import into MSSQL Express for a project I'm working on. I no longer have a running version of this database, just the .MDF and .LDF files, and I have no clue what version of MSSQL they were originally created with. If anyone knows a way to import this database, please help.
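If the files came from SQL Server 2005 or earlier, attaching them is usually just a CREATE DATABASE ... FOR ATTACH (Express will upgrade older-version files on attach, though it can never attach files from a newer version than itself). A sketch with placeholder database name and paths:

using System.Data.SqlClient;

class Attacher
{
    static void Main()
    {
        using (var conn = new SqlConnection(
            @"Data Source=.\SQLEXPRESS;Initial Catalog=master;Integrated Security=SSPI"))
        {
            conn.Open();
            // FOR ATTACH rebuilds the database from the existing data/log files;
            // if the log is unusable, FOR ATTACH_REBUILD_LOG can recreate it.
            var cmd = new SqlCommand(
                @"CREATE DATABASE OldDb ON
                    (FILENAME = 'C:\data\olddb.mdf'),
                    (FILENAME = 'C:\data\olddb_log.ldf')
                  FOR ATTACH", conn);
            cmd.ExecuteNonQuery();
        }
    }
}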
I am trying to import a CSV file into a SQL Server table with the OleDbDataReader and SqlBulkCopy objects, like this:

using (OleDbConnection dconn = new OleDbConnection(
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=c:\mystuff\;Extended Properties=""text;HDR=No;FMT=Delimited"""))
{
    using (OleDbCommand dcmd = new OleDbCommand("select * from mytable.csv", dconn))
    {
        try
        {
            dconn.Open();
            using (OleDbDataReader dreader = dcmd.ExecuteReader())
            {
                try
                {
                    using (SqlConnection dconn2 = new SqlConnection(
                        @"data source=MyDBServer;initial catalog=MyDB;user id=mydbid;password=mydbpwd"))
                    {
                        using (SqlBulkCopy bc = new SqlBulkCopy(dconn2))
                        {
                            try
                            {
                                dconn2.Open();
                                bc.DestinationTableName = "dbo.mytable";
                                bc.WriteToServer(dreader);
                            }
                            finally { dconn2.Close(); }
                        }
                    }
                }
                finally { dreader.Close(); }
            }
        }
        finally { dconn.Close(); }
    }
}

A couple of the columns for the destination table use a bit datatype. The CSV files use the strings "1" and "0" to represent these. When I run this code, it throws this exception:

Unhandled Exception: System.InvalidOperationException: The given value of type String from the data source cannot be converted to type bit of the specified target column. ---> System.FormatException: Failed to convert parameter value from a String to a Boolean. ---> System.FormatException: String was not recognized as a valid Boolean.
   at System.Boolean.Parse(String value)
   at System.String.System.IConvertible.ToBoolean(IFormatProvider provider)
   at System.Convert.ChangeType(Object value, Type conversionType, IFormatProvider provider)
   at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType)
   --- End of inner exception stack trace ---
   at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType)
   at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
   --- End of inner exception stack trace ---
   at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()
   at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader)
   at MyClass.Main()

It appears not to accept "1" and "0" as valid strings to convert to booleans. The System.Convert.ToBoolean method appears to work the same way. Is there any way to change this behavior? I discovered that if you change the "1" to "true" and "0" to "false" in the CSV file, it will accept them.
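One workaround that avoids editing the CSV is to stage the rows in a DataTable whose bit columns are typed as Boolean and do the "1"/"0" conversion yourself before the bulk copy, so SqlBulkCopy receives real booleans. A sketch (the bit column names are placeholders; with HDR=No the Jet driver names columns F1, F2, ...):

using System;
using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

class BitFixLoader
{
    static void Main()
    {
        var raw = new DataTable();
        using (var src = new OleDbConnection(
            @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=c:\mystuff\;Extended Properties=""text;HDR=No;FMT=Delimited"""))
        using (var da = new OleDbDataAdapter("select * from mytable.csv", src))
        {
            da.Fill(raw);   // everything arrives as text
        }

        // Re-type the bit columns on an empty copy of the schema.
        var typed = raw.Clone();
        foreach (string bitCol in new[] { "F3", "F7" }) // placeholder column names
            typed.Columns[bitCol].DataType = typeof(bool);

        foreach (DataRow row in raw.Rows)
        {
            DataRow copy = typed.NewRow();
            foreach (DataColumn col in raw.Columns)
            {
                object v = row[col];
                // Convert "1"/"0" to true/false; leave NULLs alone.
                if (typed.Columns[col.ColumnName].DataType == typeof(bool) && v != DBNull.Value)
                    v = Convert.ToString(v) == "1";
                copy[col.ColumnName] = v;
            }
            typed.Rows.Add(copy);
        }

        using (var dest = new SqlConnection(
            @"data source=MyDBServer;initial catalog=MyDB;user id=mydbid;password=mydbpwd"))
        using (var bulk = new SqlBulkCopy(dest))
        {
            dest.Open();
            bulk.DestinationTableName = "dbo.mytable";
            bulk.WriteToServer(typed);
        }
    }
}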
Hi, I've got one more problem with importing CSV files using .NET. The problem is that the CSV file contains double-quotation marks ("). For example, the record looks like:

...,Bearing Double "D" Flange,...

And the result is:

... | Bearing Double | null (all following columns are null)

The code is as follows:

string strCsvConn = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=;Extended Properties='text;HDR=Yes;FMT=Delimited(,)';";
using (OleDbConnection cn = new OleDbConnection(strCsvConn))
{
    string strSQL = "SELECT * FROM " + strFileName;
    OleDbCommand cmd = new OleDbCommand(strSQL, cn);
    cn.Open();
    using (OleDbDataReader dr = cmd.ExecuteReader())
    {
        while (dr.Read())
        {
            string str = Convert.ToString(dr[8]);
        }

        // Bulk copy to SQL Server
        //using (SqlBulkCopy bulkCopy = new SqlBulkCopy(strSqlConn))
        //{
        //    bulkCopy.DestinationTableName = strSqlTable;
        //    bulkCopy.WriteToServer(dr);
        //}
    }
}

Any idea is highly appreciated.
shz
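The Jet text driver treats an opening quote as a field delimiter and has no connection-string switch to turn that off, so one way out is to parse the file yourself and bulk copy the result. A sketch using TextFieldParser with quote handling disabled, so embedded quotes pass through as literal characters (file path and table name are placeholders):

using System.Data;
using System.Data.SqlClient;
using Microsoft.VisualBasic.FileIO;   // reference Microsoft.VisualBasic.dll

class QuoteSafeLoader
{
    static void Main()
    {
        var table = new DataTable();
        using (var parser = new TextFieldParser(@"C:\data\parts.csv")) // hypothetical file
        {
            parser.TextFieldType = FieldType.Delimited;
            parser.SetDelimiters(",");
            // Keep embedded quotes like  Bearing Double "D" Flange  intact.
            parser.HasFieldsEnclosedInQuotes = false;

            string[] header = parser.ReadFields();
            foreach (string col in header) table.Columns.Add(col);
            while (!parser.EndOfData)
                table.Rows.Add(parser.ReadFields());
        }

        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=Parts;Integrated Security=SSPI"))
        using (var bulk = new SqlBulkCopy(conn))
        {
            conn.Open();
            bulk.DestinationTableName = "dbo.Parts"; // hypothetical table
            bulk.WriteToServer(table);
        }
    }
}

The trade-off: with quote handling off, a field that legitimately uses quoting to wrap an embedded comma will now split, so this only works if the files never quote fields that way.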
I want to import multiple text files into a single table. I know I have to use BCP or DTS, but I want to import all the files at once instead of one at a time, and the file names are in sequence, viz. file1, file2, file3, etc. Can anybody tell me how I can achieve this?
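Since the file names are predictable, a small loop issuing one BULK INSERT per file covers this without DTS. A sketch, assuming comma-delimited files and placeholder paths, table name, and file count:

using System.Data.SqlClient;

class BulkInsertLoop
{
    static void Main()
    {
        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=Target;Integrated Security=SSPI"))
        {
            conn.Open();
            for (int i = 1; i <= 10; i++)   // file1 .. file10; adjust the range
            {
                // The path must be visible to the SQL Server service account,
                // since BULK INSERT reads the file server-side.
                string sql = "BULK INSERT dbo.MyTable FROM 'C:\\files\\file" + i + ".txt' " +
                             "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')";
                new SqlCommand(sql, conn).ExecuteNonQuery();
            }
        }
    }
}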
I have to import 18000 text files into a sql database. Each file contains 10 fields and around 5000 records. I am currently doing this with DTS.
What I am wondering is this: is DTS the most efficient, i.e. quickest, way to import all this data, bearing in mind there are about 90 million records to import in all?
I would appreciate the benefit of somebody else's experience when dealing with this type of thing.
I've been trying to import a TXT file into an SQL database and I'm having trouble making it work correctly. It is an ASCII text file with over 100,000 records. The fields vary in the number of characters, from 2 up to 40 (STATE would be 2 characters, CITY is 32 characters, etc.).
I can import the file with DTS. I go in and select exactly where I want the field breaks to be. Then it imports everything as characters, with column headers of Col001, Col002, Col003, etc. My problem is that I don't want everything as characters with Col001-style names every time; I want different column names, and columns of data typed as INT, NUMERIC(x,x), etc. If I change these values to anything other than the default in DTS, it won't import the data correctly.
Also, FWIW, I have an SQL script I wrote that creates the table with the field lengths, data types, etc. exactly the way I want them. This seems to be going nowhere fast.
What am I doing wrong? Should I be using something other than DTS?
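One route that avoids fighting DTS's defaults: parse the fixed-width lines yourself into a DataTable that already carries your column names and types, then bulk copy into the table your SQL script created. A sketch with made-up offsets (STATE at 0 for 2 characters, CITY next for 32, then a hypothetical numeric field) that you would replace with the real breaks:

using System.Data;
using System.Data.SqlClient;
using System.IO;

class FixedWidthLoader
{
    static void Main()
    {
        var table = new DataTable();
        table.Columns.Add("State", typeof(string));
        table.Columns.Add("City", typeof(string));
        table.Columns.Add("Population", typeof(int));   // hypothetical numeric field

        foreach (string line in File.ReadLines(@"C:\data\records.txt")) // placeholder path
        {
            // Fixed offsets: replace with the real start/length of each field.
            string state = line.Substring(0, 2).Trim();
            string city  = line.Substring(2, 32).Trim();
            int pop      = int.Parse(line.Substring(34, 9).Trim());
            table.Rows.Add(state, city, pop);
        }

        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=Target;Integrated Security=SSPI"))
        using (var bulk = new SqlBulkCopy(conn))
        {
            conn.Open();
            bulk.DestinationTableName = "dbo.Records"; // the table your script creates
            bulk.WriteToServer(table);
        }
    }
}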
Hi, I've got 6 files with the same name part, but each has a different extension. The 17 always changes to the current week number:
xbouns.A17 xbouns.B17 xbouns.C17
I want to import these files into a database table. So I've created a Foreach Loop and selected the Foreach File enumerator, but I'm not sure if this is the right way to go about it. Some help would be great, thanks.
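The Foreach File enumerator takes a wildcard, so a pattern like xbouns.* picks up all six extensions regardless of the week. If you want to match only the current week's files, you can compute the week number and enumerate in code; a sketch where the week-numbering rule is an assumption to check against how the feed actually counts weeks:

using System;
using System.Globalization;
using System.IO;

class WeeklyFiles
{
    static void Main()
    {
        // Assumed rule: calendar week, Monday-based; adjust to match the feed.
        int week = CultureInfo.InvariantCulture.Calendar.GetWeekOfYear(
            DateTime.Today, CalendarWeekRule.FirstFourDayWeek, DayOfWeek.Monday);

        // Matches xbouns.A17, xbouns.B17, ... when week is 17.
        foreach (string file in Directory.GetFiles(@"C:\drops", "xbouns.?" + week.ToString("00")))
        {
            Console.WriteLine("would import: " + file);  // hook the load routine in here
        }
    }
}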
I am a bit new to SQL Server but not to DBA or programming per se. I am having difficulties getting either an Excel or Text flat file to import properly.
I guess it would be best to ask: using either SSIS or BULK INSERT, what options need to be entered for a typical Excel or text flat file?
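For a plain delimited text file, the usual BULK INSERT knobs are the field and row terminators plus FIRSTROW to skip a header; a minimal sketch with placeholder table and path (note that a true .xls file can't go through BULK INSERT at all and needs the Excel provider or SSIS instead):

using System.Data.SqlClient;

class TypicalBulkInsert
{
    static void Main()
    {
        const string sql =
            @"BULK INSERT dbo.Target               -- placeholder table
              FROM 'C:\data\extract.txt'           -- placeholder path (read server-side)
              WITH (FIELDTERMINATOR = ',',         -- or '\t' for tab-delimited
                    ROWTERMINATOR   = '\n',
                    FIRSTROW        = 2,           -- skip a header row
                    KEEPNULLS)";
        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=Target;Integrated Security=SSPI"))
        {
            conn.Open();
            new SqlCommand(sql, conn).ExecuteNonQuery();
        }
    }
}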
I have a package with a corresponding configuration file.
If I open SQL Server Management Studio, connect to SSIS, right-click to import a package, and then select to store the package in SQL Server (not on the file system)...
What happens with the configuration file?
Does the import take the values from the configuration file and place them in the package which then is stored in SQL Server?
Or do I need to put the configuration file someplace on the SQL Server where the package is imported so it can access it when it runs?
I'm a bit confused about what goes on there.
For example, I tried using the build command and then running the manifest file to import using the wizard, and when it does that it copies the configuration files to a default location within the C:\Program Files\Microsoft SQL Server directory.
I need to write a generic CSV importer. One set of CSVs defines the formats of all the other CSVs. The format of the second set of CSVs is not known at the time of creating this project/SSIS package.
Imagine having two sets of CSV files.
Each file in the first set of CSVs defines the format of files in the second set of CSVs. The first set always has the same format: each row of the CSV file defines a field with a Name and Type, and the first two columns of each row in the first set of CSVs have a FormatName and index. So a simple file may have:
The second set of CSVs contains records that have to comply with the format defined in the first set. So in this particular case, a CSV file in the second set may have records such as:
The problem I have is creating a generic SSIS package for this. The first task, loading the first set of CSV files, is fairly simple; the CSV format is known. But the second task is a bit trickier.
Assume I have SQL tables to load the data: one 'Fields' record for each row in the CSVs from the first set, one 'Rows' record for each row in the CSVs from the second set, and one 'Values' record for each value in each row in the CSVs from the second set. Something like:
CREATE TABLE Fields (
    FieldID int IDENTITY,
    FormatName varchar(100),
    Position int,
    ColumnName varchar(100),
    ColumnType varchar(20)
)
The options I can see (a sketch of the second appears below):
- Dynamically create an SSIS package based on the content of each CSV file of the first set, and execute that package for each file in the second set, selecting the correct package to execute (all records within a CSV file belong to the same format).
- Write a single SSIS package that iterates through the rows of the second CSVs, does a lookup for each value of each row to find the field name, and makes an insert into the DB.
- Other SSIS method?
- Don't use SSIS to parse the CSV and call a custom C# task?
- Don't use SSIS at all?
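The second option (a single generic loader) is easy to prototype in code before committing to an SSIS design: load the Fields definitions for the format once, then for each data row insert one Rows record and one Values record per column. A sketch against the Fields table above, with assumed Rows(RowID identity, FormatName) and Values(RowID, FieldID, Value) tables and a placeholder format name:

using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.IO;

class GenericCsvLoader
{
    static void Main()
    {
        string connStr = "Data Source=.;Initial Catalog=Generic;Integrated Security=SSPI";
        string formatName = "SimpleFormat";          // placeholder format
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Position -> FieldID lookup for this format.
            var fieldByPos = new Dictionary<int, int>();
            var lookup = new SqlCommand(
                "SELECT Position, FieldID FROM Fields WHERE FormatName = @f", conn);
            lookup.Parameters.AddWithValue("@f", formatName);
            using (var r = lookup.ExecuteReader())
                while (r.Read()) fieldByPos[r.GetInt32(0)] = r.GetInt32(1);

            foreach (string line in File.ReadLines(@"C:\data\second-set.csv")) // placeholder
            {
                // One Rows record per CSV row; assumed schema Rows(RowID identity, FormatName).
                var newRow = new SqlCommand(
                    "INSERT INTO [Rows] (FormatName) VALUES (@f); SELECT SCOPE_IDENTITY();", conn);
                newRow.Parameters.AddWithValue("@f", formatName);
                int rowId = Convert.ToInt32(newRow.ExecuteScalar());

                // One Values record per column; assumed schema Values(RowID, FieldID, Value).
                string[] vals = line.Split(',');
                for (int pos = 0; pos < vals.Length; pos++)
                {
                    var ins = new SqlCommand(
                        "INSERT INTO [Values] (RowID, FieldID, Value) VALUES (@r, @fid, @v)", conn);
                    ins.Parameters.AddWithValue("@r", rowId);
                    ins.Parameters.AddWithValue("@fid", fieldByPos[pos]);
                    ins.Parameters.AddWithValue("@v", vals[pos]);
                    ins.ExecuteNonQuery();
                }
            }
        }
    }
}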
Hi, can anyone help? I need to upload a text file to a SQL database but keep getting errors. I'm creating a page that will allow users to bulk import and update to an MS SQL database. The users provide a text file every so often with new/updated information. So I want to use a DTS package to transform the information and create a table in the database, then check against existing/non-existing records: if the record exists, update it; if not, insert it. I'm using Visual Studio .NET, ASP.NET, and coding in VB.Net.
Anyone know where I can find documentation/code regarding the above? I will be grateful for any help.
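A common pattern for the insert-or-update part is to bulk load the text file into a staging table, then run an UPDATE for matches followed by an INSERT of the leftovers. A sketch in C#/ADO.NET (the VB.Net translation is mechanical), with placeholder table, column, and key names:

using System.Data.SqlClient;

class StagedUpsert
{
    static void Main()
    {
        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=AppDb;Integrated Security=SSPI"))
        {
            conn.Open();
            // 1. Load the uploaded file into an empty staging table.
            new SqlCommand("TRUNCATE TABLE dbo.Staging", conn).ExecuteNonQuery();
            new SqlCommand(@"BULK INSERT dbo.Staging FROM 'C:\uploads\latest.txt'
                             WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')",
                           conn).ExecuteNonQuery();

            // 2. Update records that already exist (matched on a placeholder key column).
            new SqlCommand(@"UPDATE t SET t.Value = s.Value
                             FROM dbo.Target t JOIN dbo.Staging s ON s.KeyCol = t.KeyCol",
                           conn).ExecuteNonQuery();

            // 3. Insert the ones that don't.
            new SqlCommand(@"INSERT INTO dbo.Target (KeyCol, Value)
                             SELECT s.KeyCol, s.Value FROM dbo.Staging s
                             WHERE NOT EXISTS
                                 (SELECT 1 FROM dbo.Target t WHERE t.KeyCol = s.KeyCol)",
                           conn).ExecuteNonQuery();
        }
    }
}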
I have a load (180,000+) of text files whose contents need to go into a SQL Server database. What's the best way of doing this? Using a C# console program, and if so, using FileStream or StreamReader? Or using a feature of SQL Server itself? The text files themselves are less than 1 KB, literally less than 200 characters. The problem is, I've tried a WinForm and although I can detect what files are there, as soon as I attempt to open one for reading, everything stops working and won't insert anything into the database.
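StreamReader (which File.ReadAllText wraps) is fine for files this small; the usual performance killer is one INSERT and one connection per file. Batching everything into a DataTable and writing it with a single SqlBulkCopy is one way to go; a sketch with placeholder folder, column, and table names:

using System.Data;
using System.Data.SqlClient;
using System.IO;

class TinyFileLoader
{
    static void Main()
    {
        var batch = new DataTable();
        batch.Columns.Add("FileName", typeof(string));
        batch.Columns.Add("Contents", typeof(string));

        foreach (string file in Directory.GetFiles(@"C:\drop", "*.txt")) // placeholder folder
        {
            // Files are under 1 KB, so reading the whole contents at once is cheap.
            batch.Rows.Add(Path.GetFileName(file), File.ReadAllText(file));
        }

        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=Docs;Integrated Security=SSPI"))
        using (var bulk = new SqlBulkCopy(conn))
        {
            conn.Open();
            bulk.DestinationTableName = "dbo.FileContents"; // placeholder table
            bulk.WriteToServer(batch);   // one bulk operation for all 180,000 rows
        }
    }
}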
I am new to SQL 7, and I am trying to run a package that will import fixed-width text files in a folder and transform the data to a table. I am using a good example found on sywink.com for dynamically importing files in a directory, but I am getting an error message when there is more than one file in the folder. Also, I am able to display all the files during the loop process, but the file initially set for the source connection is the only one that transfers data. I have the close (Transform) connection checkbox option checked, and have tried other methods of closing the connection before the new source name is called; none have helped. Does anyone have any solutions to this type of problem, or know of other methods for my situation? Thanks, James
I want to write a DTS that will import a file every day. The problem is that the file is not named the same thing every day; there is a naming convention (SOMMDDYY.TRN) that it will follow. I want to import this file (which is a fixed-width file) each day to a table (the table will be empty each day).
After it is imported, I want to look at the NAME of the file and pull out the date portion of it. So, if the file is called SO122603.TRN, I want to pull out 122603 and then update my table with that date for every record. When I am done, I will have a table that represents the file I imported, with one added column: a Date/Time holding the date that was in the filename. How do I do this?
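The date can be pulled out of the name with ParseExact and applied in one UPDATE after the load; a sketch assuming the table has an extra FileDate column (path, table, and column names are placeholders):

using System;
using System.Data.SqlClient;
using System.Globalization;
using System.IO;

class FileDateStamper
{
    static void Main()
    {
        string file = @"C:\drops\SO122603.TRN";   // placeholder path

        // SOMMDDYY.TRN -> drop the "SO" prefix and parse MMddyy (122603 -> 2003-12-26).
        string stamp = Path.GetFileNameWithoutExtension(file).Substring(2);
        DateTime fileDate = DateTime.ParseExact(stamp, "MMddyy", CultureInfo.InvariantCulture);

        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=Target;Integrated Security=SSPI"))
        {
            conn.Open();
            // The table was emptied and reloaded first, so stamp every record.
            var cmd = new SqlCommand("UPDATE dbo.DailyImport SET FileDate = @d", conn);
            cmd.Parameters.AddWithValue("@d", fileDate);
            cmd.ExecuteNonQuery();
        }
    }
}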
Hi! I am importing .txt files. How can I check the errors? I have created a log file, but the problem is that I lose some characters. For example, I import a Code column from a text file:

Code
ABC
FZH
JHN

but sometimes Code can be 4 characters long, and I am importing it as 3 characters long now. When I add the same structured text file with some rows of length 4, it skips the last character, but I get nothing in the log file. Please help.
xgirl
Hi all. Could someone help me with the following problem? Hours of googling yesterday couldn't get me the answer. I'm using SQL 2000 and DTS and trying to import a huge fixed-width text file. The file is >1m rows and >200 columns and is defined by a proprietary (i.e. not bcp-produced) format specification of the form:

Name Start Length
Fld1 0 20
Fld2 19 5
Fld3 24 53

and so on. The only way I've found to define the columns so that DTS can import the file properly is to go through the wizard and click on the starts of each column. I don't want to use bcp if possible (I did enough of that on SQL 7), but surely there's a way to get DTS to read from a format file so I don't have to click 200 times (with all the ensuing errors I could make). Any help greatly appreciated.
Cheers, Rob
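DTS has no way to consume a spec like this, but BULK INSERT (and bcp) can via a format file, and the spec is regular enough to convert mechanically. A sketch that writes a SQL 2000-style non-XML format file from the Name/Start/Length lines, under the assumption that the fields are contiguous, so each field's width is its Length and only the last one carries the row terminator (paths are placeholders):

using System;
using System.IO;
using System.Linq;

class FormatFileGenerator
{
    static void Main()
    {
        // Parse "Name Start Length" lines, skipping the header row of the spec.
        var fields = File.ReadAllLines(@"C:\specs\layout.txt")   // placeholder path
            .Skip(1)
            .Select(l => l.Split(new[] { ' ', '\t' }, StringSplitOptions.RemoveEmptyEntries))
            .Select(p => new { Name = p[0], Length = int.Parse(p[2]) })
            .ToList();

        using (var fmt = new StreamWriter(@"C:\specs\layout.fmt"))
        {
            fmt.WriteLine("8.0");            // bcp format-file version for SQL 2000
            fmt.WriteLine(fields.Count);
            for (int i = 0; i < fields.Count; i++)
            {
                // Fixed width: terminator "" everywhere except the row end.
                string term = i == fields.Count - 1 ? "\"\\r\\n\"" : "\"\"";
                fmt.WriteLine("{0}\tSQLCHAR\t0\t{1}\t{2}\t{0}\t{3}\t\"\"",
                              i + 1, fields[i].Length, term, fields[i].Name);
            }
        }
        // Then: BULK INSERT dbo.Target FROM 'C:\data\big.txt'
        //       WITH (FORMATFILE = 'C:\specs\layout.fmt')
    }
}

Note the overlap in the example spec (Fld1 covers 0-19 but Fld2 starts at 19) suggests the Start values may be 1-based; worth confirming against the real layout before trusting the generated widths.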
I'm trying to import a few thousand flat files into a few thousand tables in a SQL database. I'm using Integration Services with a Foreach Loop to read all the files in a directory. The problem is I can only insert the data from all the files into one table. Does anyone know a way to do multiple tables? Maybe using some sort of variable?
My company is running Livestats.XSP v8. We have been trying to import log files for a website for a few days now, but Livestats keeps getting stuck randomly throughout the import process. What I mean by this is that it will list all of the log files by date that have been imported, then it stops and displays "Not Imported.....".
We really need to get this remedied as soon as possible, and it seems that support for Livestats is slim. Does anyone have any ideas that could help me? Thanks in advance.
I receive 4 .csv file downloads periodically (3 times per day) via email from our corporate database. I open each file in Excel, save them as Excel files, import them to Access, replace the previous tables, run action queries, and generate reports combined with production data from CSRs and supervisors. ALL DONE MANUALLY!!!
Here's where I am now:
- I've recently switched over to SQL Server Express.
- Used SSMA to bring tables over from the previous Access database.
- Exercised the option to LINK these tables to the original Access database (I'll explain why in a moment).
- Created ADPs for front-end data entry use.
- Imported old Access database forms into the new ADPs previously used in Access.
- Connected to the new Server Express.
- I've eliminated my concurrent user problems by doing this.
HOWEVER, I am still bound to using the old Access/Jet database to generate reports based on periodic downloads from corporate .csv files.
Here's the question: what is the best way to import the csv files being sent to me via email into SQL Server Express? I've tried DTS; it seems to me you can't save AND actually reuse the packages later since it's the Express edition. Manually importing 4 files, 3 times per day, is a very tedious option I'd like to avoid. Any ideas? Oh, by the way: corporate has told me they would be willing to post the files to an FTP site instead of emailing them. That's about as much help as I'm going to get from them. Can SQL Server Express be set up to run stored procedures (triggers) on a hot folder? Thanks for your help.
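Express has no SQL Agent, but a small always-on watcher (console app or Windows service, or a task scheduled 3 times a day) can play the "trigger on a hot folder" role: FileSystemWatcher fires when the FTP drop lands and the handler bulk loads the file. A sketch with placeholder folder, table, and connection string:

using System;
using System.Data.SqlClient;
using System.IO;

class HotFolderWatcher
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\ftp\drop", "*.csv"); // placeholder folder
        watcher.Created += (s, e) =>
        {
            // Give the FTP transfer a moment to finish writing the file.
            System.Threading.Thread.Sleep(5000);
            using (var conn = new SqlConnection(
                @"Data Source=.\SQLEXPRESS;Initial Catalog=Reports;Integrated Security=SSPI"))
            {
                conn.Open();
                string sql = "BULK INSERT dbo.CorporateData FROM '" + e.FullPath +
                             "' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', FIRSTROW = 2)";
                new SqlCommand(sql, conn).ExecuteNonQuery();
            }
        };
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching... press Enter to stop.");
        Console.ReadLine();
    }
}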
My company is running Livestats.XSP v7. We have been trying to import log files for a website for a few days now, but Livestats keeps getting stuck randomly throughout the import process. What I mean by this is that it will list all of the log files by date that have been imported, then on one date its status will show as "importing...".
We really need to get this remedied as soon as possible, and it seems that support for Livestats is slim. Does anyone have any ideas that could help me? Thanks in advance.