SSIS Importing Files With Zero-Length Varchar() Instead Of NULLs
Mar 31, 2008
hi,
I am importing files using SSIS, but I notice that attributes that are empty are imported as zero-length character strings instead of NULLs.
Is there some option I can choose to maintain NULLs instead of an empty char() value? I thought this was the default behavior with DTS.
many thanks,
Nicolas
sample file with pipe delimiter:
|some data|another data||previous with no data|
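The Flat File Source has a "Retain null values from the source as null values in the data flow" option (the RetainNulls property); with it unchecked, which is the default, empty fields come through as zero-length strings. If checking it doesn't cover every column, a minimal post-load cleanup sketch in T-SQL, assuming the rows land in a staging table first (the table and column names are illustrative):
Code Snippet
-- turn zero-length strings back into NULLs after the load
UPDATE dbo.Staging
SET Attribute1 = NULLIF(Attribute1, ''),
    Attribute2 = NULLIF(Attribute2, '')
WHERE Attribute1 = '' OR Attribute2 = ''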
Nov 20, 2006
For the life of me I cannot figure out why SSIS will not convert varchar data. Instead of using the table-to-table method, I wrote a SQL query so that I could transform the ntext datatype to varchar(512), understanding that natively MS is going toward all-Unicode applications.
The source fields from Access are int, int, int and varchar(512). The same is true of the destination within SQL Server 2005. The field 'Answer' is the varchar field in question...
I get the following error:
Validating (Error)
Messages
Error 0xc02020f6: Data Flow Task: Column "Answer" cannot convert between unicode and non-unicode string data types.
(SQL Server Import and Export Wizard)
Error 0xc004706b: Data Flow Task: "component "Destination - Query" (28)" failed validation and returned validation status "VS_ISBROKEN".
(SQL Server Import and Export Wizard)
Error 0xc004700c: Data Flow Task: One or more component failed validation.
(SQL Server Import and Export Wizard)
Error 0xc0024107: Data Flow Task: There were errors during task validation.
(SQL Server Import and Export Wizard)
DTS used to be a very strong tool, but a simple import such as this is causing me extreme grief, and I'm wondering if SQL 2005 is ready for primetime. FYI, SP1 is installed. I am running this from a workstation and not on the server, if that makes a difference...
Any help would be appreciated.
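The usual fix for error 0xc02020f6 is a Data Conversion transformation between source and destination that converts the Unicode column (DT_WSTR) to a non-Unicode string (DT_STR) with the right code page. If staying in T-SQL is easier, a hedged alternative sketch: land the Access data in an nvarchar staging table and convert afterwards (table and column names are illustrative):
Code Snippet
CREATE TABLE dbo.AnswerStaging (
    Field1 int,
    Field2 int,
    Field3 int,
    Answer nvarchar(512) )  -- accepts the Unicode data as-is

INSERT INTO dbo.AnswerTarget (Field1, Field2, Field3, Answer)
SELECT Field1, Field2, Field3,
       CONVERT(varchar(512), Answer)  -- explicit non-Unicode conversion
FROM dbo.AnswerStaging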
May 5, 2006
I saw this post by dterrie in the Wishlist thread and I just wanted to second it:
"How about bringing back a simple dBase import. The SSIS guys are clearly FAR out of touch with reality if they think people who handle data no longer need to work with dbf files. I've seen a lot of dumb stuff in my day, but this is just sheer brilliance. I just love the advice of first importing into Access and then importing the Access table. Gee, why didn't I think of such a convenient solution. I could have had a V-8."
I've been struggling with this the last couple days and finally decided to import the dBase III file into Access and then import that into SQL Server 2005. Imagine my surprise when I discovered this was the current recommended method.
That's just ridiculous. Can someone tell me why they would reduce some of the functionality of SQL Server from 2000 to 2005? This was a very easy process in SQL Server 2000...
Mar 3, 2008
Hi,
I'm just learning SSIS. As I was following the tutorial on the Foreach Loop container (Lesson 2 of creating a simple ETL package in SSIS) to import multiple flat files, I get the following error:
SSIS package "Take17.dtsx" starting.
Information: 0x4004300A at Extract Cobra EBA, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Extract Cobra EBA, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Extract Cobra EBA, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Extract Cobra EBA, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Extract Cobra EBA, Cobra EBA [1]: The processing of file "" has started.
Warning: 0x80070003 at Extract Cobra EBA, Cobra EBA [1]: The system cannot find the path specified.
Error: 0xC020200E at Extract Cobra EBA, Cobra EBA [1]: Cannot open the datafile "".
Error: 0xC004701A at Extract Cobra EBA, DTS.Pipeline: component "Cobra EBA" (1) failed the pre-execute phase and returned error code 0xC020200E.
Information: 0x402090DD at Extract Cobra EBA, Cobra EBA [1]: The processing of file "" has ended.
Information: 0x40043009 at Extract Cobra EBA, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at Extract Cobra EBA, DTS.Pipeline: "component "OLE DB Destination" (194)" wrote 0 rows.
Task failed: Extract Cobra EBA
Warning: 0x80019002 at Take17: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Take17.dtsx" finished: Failure.
I spent some time trying to understand where the error could be, and I traced it to the following place. If you look at Lesson 2 of creating a simple ETL package in SSIS, which uses a Foreach Loop container to import files into SQL Server: as soon as I finish the steps for configuring the flat file connection manager to use the variable for the ConnectionString, the value immediately disappears. I repeated the tutorial three times, exactly as written, and each time I reach the step of configuring the connection manager's ConnectionString, it takes the path, but when I start debugging I get this error. When I go back and check if everything is OK, the ConnectionString value is empty.
Why does the value in ConnectionString disappear?
Thank you.
Nagu
Apr 24, 2008
Hi.
There is a "text" file generated by a mainframe that has to be uploaded to SQL Server. I've reproduced the situation with a smaller sample. Suppose the file looks like the following:
A17 123.17 first row
BB29 493.19 second
ZZ3 18947.1 third row is longer
And in hex format (non-printable bytes shown as · in the character column):
00: 41 31 37 20 20 20 20 20 │ 31 32 33 2E 31 37 20 20   A17     123.17
10: 66 69 72 73 74 20 72 6F │ 77 0D 0A 42 42 32 39 20   first row··BB29
20: 20 00 20 34 39 33 2E 31 │ 39 20 20 73 65 63 6F 6E    · 493.19  secon
30: 64 0D 0A 5A 5A 33 20 20 │ 20 20 20 31 38 39 34 37   d··ZZ3     18947
40: 2E 31 20 74 68 69 72 64 │ 20 72 6F 77 00 69 73 20   .1 third row·is
50: 6C 6F 6E 67 65 72       │                           longer
I wrote "text" in quotes because, strictly speaking, it is not a pure text file - non-text binary zeros (0x00) sometimes appear instead of spaces (0x20).
The table is:
CREATE TABLE eng (
src varchar (512)
)
When I upload this file into SQL 2000 using DTS or the Import Wizard, the table contains:
select src, substring(src,9,8), len(src) from eng
< src ><substr> <len>
A17 123.17 first row 123.17 25
BB29 493.19 22
ZZ3 18947.1 third row 18947.1 35
As one can see, everything was imported, including the binary zeros. And though SELECT * in SSMS truncates the strings upon reaching the 0x00's, all the information is still stored in the table - the SUBSTRINGs show that.
When I upload this file into SQL 2005 using SSIS or the Import Wizard, the result is the following:
< src ><substr> <len>
A17 123.17 first row 123.17 25
BB29 4
ZZ3 18947.1 third row 18947.1 25
This time the table is half-empty - all characters after the binary zeros in the respective rows are lost.
I stumbled upon this problem while migrating my DTSes to SSIS packages. Do you think there is some workaround, or do I need to turn on some checkbox, or could something else help? Please...
Jul 31, 2006
I'm trying to use a DTS package to import data from an excel file. A few tables keep throwing errors about not being able to insert null values.
Is there any way to just skip a row when a column is null? I know there are certain columns that will never be null, so if they are null, I know that I just have a messed up row in Excel.
I could probably do it by supplying a SQL query and having it ignore null rows, but I can't find anything useful on SQL syntax where Excel files are concerned.
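If the DTS source is switched from the worksheet to a SQL query, the Jet dialect that reads Excel accepts a WHERE clause, so the bad rows can be filtered before they ever reach the destination; a hedged sketch, where Sheet1 and OrderID are illustrative names:
Code Snippet
SELECT *
FROM [Sheet1$]
WHERE [OrderID] IS NOT NULL  -- skip rows missing a never-null column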
May 15, 2008
I have two SSIS packages that import from the same flat file into the same SQL 2005 table. I have one flat file connection (to a comma delimited file) and one OLE DB connection (to a SQL 2005 Database). Both packages use these same two Connection Managers. The SQL table allows NULL values for all fields. The flat file has "empty values" (i.e., ,"", ) for certain columns.
The first package uses the Data Flow Task with the "Keep nulls" property of the OLE DB Destination Editor unchecked. The columns in the source and destination are identically named, so the mapping is assigned automatically based on ordinal position (which is equivalent to the mapping used by Bulk Insert). When this task is executed, no null values are inserted into the SQL table for the "empty values" from the flat file; empty string values are inserted instead of NULL.
The second package uses the Bulk Insert Task with the "KeepNulls" property for the task (shown in the Properties pane when the task is selected in the Control Flow window) set to "False". When the task is executed, NULL values are inserted into the SQL table for the "empty values" from the flat file.
So using the Data Flow Task, "" (i.e., a blank) is inserted. Using the Bulk Insert Task, NULL is inserted (i.e., nothing is inserted; the field is skipped; the value for the record is omitted).
I want to have the exact same behavior on my data in the Bulk Insert Task as I do with the Data Flow Task.
Using the Bulk Insert Task, what must I do to have the Empty String values inserted into the SQL table where there is an "empty value" in the flat file? Why & how does this occur automatically in the Data Flow Task?
From a SQL Profiler trace comparison of the two methods, I do not see where the syntax of the insert command or the statements for the preceding captured steps dictates this change in the behavior of the inserted "" value for the recordset. Please help me understand what is going on here and how to accomplish this using the Bulk Insert Task.
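On the "why": the Flat File Source hands the pipeline zero-length strings, not NULLs, for empty fields (unless it is told to retain NULLs), so the Data Flow Task inserts ''. BULK INSERT, by contrast, applies the column default to empty fields when KEEPNULLS is off, and a column with no default gets NULL. So a hedged way to make the Bulk Insert Task match the Data Flow behavior is to give the columns an empty-string default (names are illustrative):
Code Snippet
-- with KeepNulls = False, empty fields now pick up the '' default
-- instead of NULL during the bulk load
ALTER TABLE dbo.Target
    ADD CONSTRAINT DF_Target_Col1 DEFAULT '' FOR Col1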
Jul 19, 2007
I am trying to import an Excel spreadsheet into SQL 2005. There is a column in the spreadsheet that has both character values and numbers. I have formatted the numbers as text on the spreadsheet, and I have declared the column on the table as char/varchar/nchar, but whatever I do, the numbers don't get imported into the table and show up as nulls instead. Any idea why?
Thanks
Mangala
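This is usually the Jet provider's type guessing rather than anything in the table definition: it samples the first several rows (TypeGuessRows, 8 by default), decides the column is numeric, and returns NULL for the values it cannot coerce. A hedged workaround is to force mixed columns to text with IMEX=1 in the Extended Properties of the Excel connection string (the path is illustrative):
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\files\book.xls;Extended Properties="Excel 8.0;HDR=YES;IMEX=1"
Raising the TypeGuessRows value under the Jet engine's Excel registry key is the other commonly cited knob.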
Jun 27, 2006
I have a couple of hundred flat files to import into database tables using SSIS.
The files can be divided into groups by the format they use. I understand that I could import each group of files that have a common format at the same time using a Foreach Loop Container.
However, the example for the Foreach Loop Container has multiple files all being imported into the same database table. In my case, each file needs to be imported into a different database table.
Is it possible to import each set of files with the same format into different tables in a simple loop? I can't see a way to make a Data Flow destination accept its table name dynamically, which seems to prevent me from doing this.
I suppose I could make a different Data Flow Destination item for each file, in the Data Flow. Would that be a reasonable solution, or is there a simpler solution, or should I just resign myself to making a separate Data Flow for every single file?
Nov 20, 2007
I have looked far and wide and have not found anything that works to allow me to resolve this issue.
I am moving data from DB2 using the MS OLEDB Provider for DB2. The OLEDB source sees the column of data as DT_TEXT. I set up a destination to SQL Server 2005 and everything looks good until I try to run the package.
I get the error:
[OLE DB Source [277]] Error: An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record is available. Source: "Microsoft DB2 OLE DB Provider" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
[OLE DB Source [277]] Error: Failed to retrieve long data for column "LIST_DATA_RCVD".
[OLE DB Source [277]] Error: There was an error with output column "LIST_DATA_RCVD" (324) on output "OLE DB Source Output" (287). The column status returned was: "DBSTATUS_UNAVAILABLE".
[OLE DB Source [277]] Error: The "output column "LIST_DATA_RCVD" (324)" failed because error code 0xC0209071 occurred, and the error row disposition on "output column "LIST_DATA_RCVD" (324)" specifies failure on error. An error occurred on the specified object of the specified component.
[DTS.Pipeline] Error: The PrimeOutput method on component "OLE DB Source" (277) returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Any suggestions on how I can get the large string data in the varchar column in DB2 into the varchar(max) column in SQL Server 2005?
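One hedged workaround is to CAST the column to a bounded VARCHAR in the DB2 source query, so it arrives as ordinary inline string data instead of "long data" the provider has to stream (the schema/table names and the 32000 cap are illustrative; DB2's VARCHAR limit is a little over 32K):
Code Snippet
SELECT CAST(LIST_DATA_RCVD AS VARCHAR(32000)) AS LIST_DATA_RCVD
FROM MYSCHEMA.MYTABLE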
May 14, 2008
I have the following code that imports the contents of a text file into a varchar(max) field.
Unfortunately, the CR/LFs are stripped out when I look at the field.
How can I preserve them?
Code Snippet
DECLARE @obj VARCHAR(MAX)

SELECT @obj = BulkColumn
FROM OPENROWSET(BULK 'C:\qsi\ObjectCreation.sql', SINGLE_CLOB) AS ExternalFile

INSERT INTO scriptor (script) VALUES (@obj)
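Before concluding the CR/LFs are gone, note that the SSMS results grid flattens them for display; a hedged check that inspects what is actually stored:
Code Snippet
-- count the CR and LF characters stored in the column
SELECT DATALENGTH(script) AS total_bytes,
       DATALENGTH(script) - DATALENGTH(REPLACE(script, CHAR(13), '')) AS cr_count,
       DATALENGTH(script) - DATALENGTH(REPLACE(script, CHAR(10), '')) AS lf_count
FROM scriptor

If the counts are nonzero, the line breaks survived the import and only the display is dropping them.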
Feb 16, 2007
With DTS it was easy to import a DBF file, but it's a huge pain with SSIS. Is there any way on the horizon to import DBF files as easily as DTS did?
Jul 23, 2005
Hello,
I need to import a bunch of .csv files. The problem I am having is the "non-data" information in the files creating bogus rows and column definitions. Here is an example of the csv file:

CBOT - End-of-Day Futures Bulk Download 2001.
2 Year U.S. Treasury Notes Futures
Date,Symbol,Month Code,Year Code,Open
20010103,ZT,H,2001,102.09375
20010104,ZT,H,2001,102.03125
20010105,ZT,H,2001,102.28125

In this case, there are bogus rows created by the text at the beginning of the file, and the column names get placed into a row as well. My question is: how do you import the file and strip out the "non-data" data, so that only the actual data gets inserted into the db?
Thanks,
TGru
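A hedged way to strip the banner and header lines in one go is BULK INSERT's FIRSTROW option, which simply does not load the rows before it (the path and table name are illustrative):
Code Snippet
BULK INSERT dbo.Futures
FROM 'C:\data\cbot.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 4 )  -- lines 1-3 are the banner and the column headers

DTS exposes the same idea as the "skip rows" setting on the text file source.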
Jul 20, 2005
What is the best way to import a structured .prt export file from an accounting package into SQL Server? Tabs, or fixed column lengths?
Sep 14, 2006
I have the original copies of the .MDF and .LDF files of another database that has a bunch of data in it, which I'm trying to import into MSSQL Express for a project I'm working on. I no longer have a running version of this database, just the .MDF and .LDF files, and I have no clue what version of MSSQL they were originally created with. If anyone knows a way to import this database, please help.
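If the files came from SQL Server in the first place, attaching them is usually simpler than importing; a minimal sketch with illustrative names and paths (attaching a database from an older version upgrades it in place):
Code Snippet
EXEC sp_attach_db
    @dbname    = N'OldData',
    @filename1 = N'C:\data\OldData.mdf',
    @filename2 = N'C:\data\OldData_log.ldf'

CREATE DATABASE ... FOR ATTACH is the newer equivalent if sp_attach_db is not available.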
May 2, 2007
I am trying to import a CSV file into a SQL Server table with the OleDbDataReader and SqlBulkCopy objects, like this:
using (OleDbConnection dconn = new OleDbConnection(@"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=c:\mystuff\;Extended Properties=""text;HDR=No;FMT=Delimited"""))
{
using (OleDbCommand dcmd = new OleDbCommand("select * from mytable.csv", dconn))
{
try
{
dconn.Open();
using (OleDbDataReader dreader = dcmd.ExecuteReader())
{
try
{
using (SqlConnection dconn2 = new SqlConnection(@"data source=MyDBServer;initial catalog=MyDB;user id=mydbid;password=mydbpwd"))
{
using (SqlBulkCopy bc = new SqlBulkCopy(dconn2))
{
try
{
dconn2.Open();
bc.DestinationTableName = "dbo.mytable";
bc.WriteToServer(dreader);
}
finally
{
dconn2.Close();
}
}
}
}
finally
{
dreader.Close();
}
}
}
finally
{
dconn.Close();
}
}
}
A couple of the columns in the destination table use a bit datatype. The CSV file uses the strings "1" and "0" to represent these. When I run this code, it throws this exception:

Unhandled Exception: System.InvalidOperationException: The given value of type String from the data source cannot be converted to type bit of the specified target column. ---> System.FormatException: Failed to convert parameter value from a String to a Boolean. ---> System.FormatException: String was not recognized as a valid Boolean.
   at System.Boolean.Parse(String value)
   at System.String.System.IConvertible.ToBoolean(IFormatProvider provider)
   at System.Convert.ChangeType(Object value, Type conversionType, IFormatProvider provider)
   at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType)
   --- End of inner exception stack trace ---
   at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType)
   at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
   --- End of inner exception stack trace ---
   at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()
   at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader)
   at MyClass.Main()

It appears not to accept "1" and "0" as valid strings to convert to booleans. The System.Convert.ToBoolean method appears to work the same way. Is there any way to change this behavior? I discovered that if you change the "1" to "true" and "0" to "false" in the CSV file, it will accept them.
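Since the reader is fed by a Jet SQL query, one hedged workaround is to coerce the flag columns to booleans in the SELECT itself, so SqlBulkCopy receives Boolean values rather than strings. With HDR=No, Jet names the columns F1, F2, ... by position; the positions and aliases below are illustrative:
Code Snippet
SELECT F1, F2, CBOOL(F3) AS IsActive, CBOOL(F4) AS IsDeleted
FROM mytable.csv

CBool accepts "1"/"0" (and "True"/"False"), which sidesteps the stricter Boolean.Parse rules SqlBulkCopy runs into.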
Sep 14, 2007
Hi,
I've got one more problem with importing csv files using .NET. The problem is that the csv file contains double quotation marks (""). For example, the record looks like:
...,Bearing Double "D" Flange,...
And the result is: ... | Bearing Double | null (all following columns are null)
The code is as following:
string strCsvConn = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=;Extended Properties='text;HDR=Yes;FMT=Delimited(,)';";
using (OleDbConnection cn = new OleDbConnection(strCsvConn))
{
string strSQL = "SELECT * FROM " + strFileName;
OleDbCommand cmd = new OleDbCommand(strSQL, cn);
cn.Open();
using (OleDbDataReader dr = cmd.ExecuteReader())
{
while (dr.Read())
{
string str = Convert.ToString(dr[8]);
}
// Bulk Copy to SQL Server
//using (SqlBulkCopy bulkCopy = new SqlBulkCopy(strSqlConn))
//{
// bulkCopy.DestinationTableName = strSqlTable;
// bulkCopy.WriteToServer(dr);
//}
}
}
Any idea is highly appreciated.
shz
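The Jet text driver takes its quote handling from a schema.ini file in the same folder as the csv. If fields are never legitimately wrapped in quotes, a hedged fix is to declare that there is no text delimiter, so embedded quotes pass through as ordinary characters (the file name is illustrative):
[data.csv]
Format=Delimited(,)
ColNameHeader=True
TextDelimiter=none
With no text delimiter, a value like Bearing Double "D" Flange is read intact instead of being cut off at the second quote.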
Apr 10, 2001
Hi,
I want to import multiple text files into a single table. I know I have to use BCP or DTS, but I want to import all the files at once, instead of one at a time. The file names are in sequence, viz. file1, file2, file3, etc.
Can anybody tell me how I can achieve this?
Thanks
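Since the names are sequential, a hedged T-SQL sketch that builds one BULK INSERT per file loads them in a single pass (the path, file count, and table are illustrative):
Code Snippet
DECLARE @i int, @sql varchar(500)
SET @i = 1
WHILE @i <= 3  -- number of files
BEGIN
    SET @sql = 'BULK INSERT dbo.Target FROM ''C:\data\file'
             + CONVERT(varchar(10), @i)
             + '.txt'' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
    EXEC (@sql)
    SET @i = @i + 1
END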
Jul 7, 2000
Has anyone written a SQL script to import ASCII comma- or space-delimited files?
I am trying to load a PROGRESS exported file (Unix) into SQL Server 6.5 (NT).
Much appreciated,
Nickd
Sep 19, 2001
I have to import 18000 text files into a sql database.
Each file contains 10 fields and around 5000 records.
I am currently doing this with DTS.
What I am wondering is this: is DTS the most efficient, i.e. quickest, way to import all this data, bearing in mind there are about 90 million records to import in all?
I would appreciate the benefit of somebody else's experience when dealing with this type of thing.
Cheers,
Brookesy
May 10, 2004
Hello,
I've been trying to import a TXT file into a SQL database and I'm having trouble making it work correctly. It is an ASCII text file with over 100,000 records. The fields vary by the number of characters: anywhere from 2 up to 40 (STATE would be 2 characters, CITY is 32 characters, etc.).
I can import the file with DTS. I go in and select exactly where I want the field breaks to be. Then it imports everything as characters with column headers of Col001, Col002, Col003, etc. My problem is that I don't want everything as characters or Col001 etc. I want different column names and the columns of data to be INT, NUMERIC(x,x), etc. instead of characters every time. If I change these values to anything other than the default in DTS, it won't import the data correctly.
Also, I have a SQL script that I wrote for a table where I can create the field lengths, data types, etc. the way I want them to look, FWIW. This seems to be going nowhere fast.
What am I doing wrong? Should I be using something else than DTS?
Any suggestions are greatly appreciated.
Thanks,
JB
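One approach that usually tames the wizard: create the destination (or a staging) table yourself with the names and types you want, import everything as characters, and do the typing in T-SQL where you control the conversion; a minimal sketch with illustrative names:
Code Snippet
CREATE TABLE dbo.ImportTarget (
    State      char(2),
    City       varchar(32),
    Population int,
    Rate       numeric(5,2) )

-- landing in an all-character staging table first, then converting:
INSERT INTO dbo.ImportTarget (State, City, Population, Rate)
SELECT Col001, Col002,
       CONVERT(int, Col003),
       CONVERT(numeric(5,2), Col004)
FROM dbo.Staging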
Apr 25, 2008
Hi, I've got 6 files with the same name part, but each has a different extension. The 17 always changes to the current week number:
xbouns.A17
xbouns.B17
xbouns.C17
I want to import these files into a database table. So I've created a Foreach Loop and selected the Foreach File enumerator, but I'm not sure if this is the right way to go about it - some help would be great, thanks.
Aug 5, 2005
I have over three hundred text files that I need to import to SQL Server. Each is in the exact same format. I want to import them as separate tables. Is there any way to do it in one process?
Regards,
Ciarán
Apr 28, 2008
Hi, I've got 6 files with the same name part, but each has a different extension. The 17 always changes to the current week number:
xbouns.A17
xbouns.B17
xbouns.C17
I'd like to do this within an SSIS package - some help in getting this to work would be great.
Thanks
Jan 24, 2008
Hello Everyone.
I am a bit new to SQL Server, but not to DBA work or programming per se. I am having difficulty getting either an Excel or text flat file to import properly.
I guess it would be best to ask: using either SSIS or BULK INSERT, what options need to be entered for a typical Excel flat file?
Feb 28, 2006
I have a package with a corresponding configuration file.
If I open SQL Server Management Studio, connect to SSIS, right-click to import a package, and then select to store the package in SQL Server (not on the file system)...
What happens with the configuration file?
Does the import take the values from the configuration file and place them in the package which then is stored in SQL Server?
Or do I need to put the configuration file someplace on the SQL Server where the package is imported so it can access it when it runs?
I'm a bit confused about what goes on there.
For example, I tried using the build command and then running the manifest file to import using the wizard, and when it does that it copies the configuration files to a default location within the C:\Program Files\Microsoft SQL Server directory.
Thx.
R-
Jan 29, 2008
I need to write a generic CSV importer. One set of CSVs defines the formats of all the other CSVs. The format of the second set of CSVs is not known at the time of creating this project/SSIS package.
Imagine having two sets of CSV files.
Each file in the first set of CSVs defines the format of files in the second set of CSVs.
The first set always has the same format. Each row of the CSV file defines a field with a Name and Type.
The first two columns of each row in the first set of CSVs have a FormatName and index.
So a simple file may have:
Format1,1,Name,string
Format1,2,Surname,string
Format1,3,DOB,Date
Format1,4,Email,string
The second set of CSVs contains records that have to comply with the format define in the first set.
So in this particular case, a CSV file in the second set may have records such as:
Format1,John,Doe,2007-01-01,john@doe.com
Format1,Jane,Doe,2007-02-02,jane@doe.com
The problem I have is creating a generic SSIS package for this.
The first task, loading the first set of CSV files, is fairly simple: the CSV format is known.
But the second task is a bit trickier.
Assuming I have SQL tables to load the data:
One 'Fields' record for each row in the CSVs from the first set.
One 'Rows' record for each row in the CSVs from the second set.
One 'Values' record for each value in each row in the CSVs from the second set.
Something like (Values is a reserved word, hence the brackets):

CREATE TABLE Fields (
FieldID int IDENTITY,
FormatName varchar(100),
Position int,
ColumnName varchar(100),
ColumnType varchar(20) )

CREATE TABLE Rows (
RowID int IDENTITY )

CREATE TABLE [Values] (
ValueID int IDENTITY,
RowID int,
FieldID int,
ColumnValue varchar(100) )
What would be the best/easiest SSIS approach:
- Dynamically create an SSIS package based on the content of each CSV file in the first set, and execute that package for each file in the second set, selecting the correct package to execute (all records within a CSV file belong to the same format).
- Write a single SSIS package that iterates through the rows of the second set of CSVs, does a lookup for each value of each row to find the field name, and makes an insert into the DB (see the sketch after this list).
- Other SSIS method?
- Don't use SSIS to parse the CSV and call a custom C# task?
- Don't use SSIS at all ?
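For a feel of what the second option implies per parsed row, a hedged T-SQL sketch against the tables above (it assumes the format name and the parsed values for one row are already in hand; 'John' stands in for one parsed value):
Code Snippet
DECLARE @RowID int
INSERT INTO Rows DEFAULT VALUES
SET @RowID = SCOPE_IDENTITY()

-- repeated (or made set-based) once per value in the parsed row
INSERT INTO [Values] (RowID, FieldID, ColumnValue)
SELECT @RowID, FieldID, 'John'
FROM Fields
WHERE FormatName = 'Format1' AND Position = 1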
Thanks
Jun 2, 2008
I have a folder on an SFTP server in our internal network which gets flat files periodically (almost every minute) from another server. The file structure is like this:
Flat File Format:
• Line 1 - list of field names, comma separated
• Line 2 - list of field types, comma separated
• Line 3 - data, comma separated
• Line 4 - data, comma separated
and so on...
All the files will be like this, without the .csv extension. There are a lot of fields, and creating a column for each field manually will be a pain. Here's what I am trying to do:
1. Create a process that runs periodically on the server and imports data from files into the SQL server table.
2. Use the information in lines 1 and 2 to create the schema of the table, and then ignore those lines in every import (the data should be appended to the existing data).
What are my options?
Thank you all,
Bullpit
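For the "ignore those lines in every import" part of step 2, once the table exists the recurring loads can skip the two schema lines with FIRSTROW; a hedged sketch (path and table are illustrative):
Code Snippet
BULK INSERT dbo.FeedTable
FROM 'D:\sftp\incoming\feedfile'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 3 )  -- lines 1-2 hold the field names and types

Building the table itself from lines 1 and 2 would take a small script (or dynamic SQL) that reads those two lines and emits a CREATE TABLE.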
Sep 20, 2004
Hi,
Can anyone help? I need to upload a text file to a SQL database but keep getting errors.
I'm creating a page that will allow users to bulk import and update records in an MS SQL database. The users provide a text file every so often with new/updated information. So I want to use a DTS package to transform the information and create a table in the database, then check against existing records: if the record exists, update it; if not, insert it. I'm using Visual Studio .NET, ASP.NET, and coding in VB.NET.
Does anyone know where I can find documentation/code regarding the above?
I will be grateful for any help.
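For the update-else-insert step, the usual pattern once the file has been landed in a staging table is two set-based statements; a hedged sketch with illustrative names:
Code Snippet
-- update rows that already exist
UPDATE t
SET t.Info = s.Info
FROM dbo.Target t
JOIN dbo.Staging s ON s.KeyCol = t.KeyCol

-- insert the rest
INSERT INTO dbo.Target (KeyCol, Info)
SELECT s.KeyCol, s.Info
FROM dbo.Staging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Target t WHERE t.KeyCol = s.KeyCol)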
Jun 6, 2005
I have a load (180,000+) of text files whose contents need to go into a SQL Server database. What's the best way of doing this? Using a C# console program, and if so, using FileStream or StreamReader? Or using a feature of SQL Server itself? The text files themselves are less than 1K each - literally less than 200 characters. The problem is, I've tried a WinForm and although I can detect what files are there, as soon as I attempt to open one for reading, everything stops working and nothing gets inserted into the database.
Aug 4, 2000
I am new to SQL 7 and I am trying to run a package that will import fixed-width text files from a folder and transform the data into a table. I am using a good example found on sywink.com for dynamically importing files in a directory, but I am getting an error message when there is more than one file in the folder. Also, I am able to display all the files during the loop process, but the file initially set for the source connection is the only one that transfers data. I have the Close (Transform) connection checkbox option checked, and have tried other methods of closing the connection before the new source name is called; none have helped. Does anyone have any solutions to this type of problem, or know of other methods for my situation?
Thanks
James
Oct 27, 2006
I have a client who is sending me 800+ excel files each month with sales data. Each of the files is identical in structure, but has sales data for different stores. I receive all these files at the same time.
Is there a method with Data Transformation Services where I can have it work off of all the files in a given directory? I can set up DTS to work off of specific Excel files with no problem, but what I would like to do is set up a DTS package that could pull from each of the 800+ files.
Is this possible, or do I need to look at a solution outside of SQL to consolidate the Excel files first?
The Excel file would have columns similar to the following: store_id, zip_code, sales, transactions.
Dec 30, 2003
I want to write a DTS package that will import a file every day. The problem is that the file is not named the same thing every day, but there is a naming convention (SOMMDDYY.TRN) that it follows. I want to import this file (which is a fixed-width file) each day into a table (the table will be empty each day).
After it is imported, I want to look at the NAME of the file and pull out the date portion. So, if the file is called SO122603.TRN, I want to pull out 122603 and then update my table with that date for every record. When I am done, I will have a table that represents the file I imported, with one added column: a datetime column holding the date that was in the filename. How do I do this???
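A hedged sketch of the post-import date stamp, assuming the package can pass the current file name into a SQL task (the table and column are illustrative; style 1 is mm/dd/yy):
Code Snippet
DECLARE @FileName varchar(50), @FileDate datetime
SET @FileName = 'SO122603.TRN'  -- supplied per run
SET @FileDate = CONVERT(datetime,
                    SUBSTRING(@FileName, 3, 2) + '/'   -- MM
                  + SUBSTRING(@FileName, 5, 2) + '/'   -- DD
                  + SUBSTRING(@FileName, 7, 2), 1)     -- YY

UPDATE dbo.DailyImport SET FileDate = @FileDate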