Help Needed On Loading Different Files To Different Tables
Sep 16, 2007
What I am trying to do here is load different flat files into different tables:
For example, if the file name starts with "enrollment", then it goes to the "enrollment" table;
if the file name starts with "student", then it goes to the "student" table.
For now, I created a Foreach Loop container for each of the different files, so I ended up using several Foreach Loop containers. I am wondering if there is a way to use just one Foreach Loop container to process the loading.
Can anyone shed some light on this? Thank you very much for your help!
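One way to collapse this into a single loop is to route each file to its table by name prefix. A minimal T-SQL sketch of the routing idea (which an Execute SQL Task inside one Foreach Loop could run per file) is below; the path, prefixes, and BULK INSERT options are assumptions, and it presumes the target tables already match the file layouts:

-- Hypothetical: derive the target table from the file name, then bulk load it.
DECLARE @FileName nvarchar(260), @BaseName nvarchar(260)
DECLARE @Table sysname, @sql nvarchar(4000)
SET @FileName = N'C:\Feeds\enrollment_20070916.txt'   -- supplied by the loop variable
SET @BaseName = RIGHT(@FileName, CHARINDEX('\', REVERSE(@FileName)) - 1)
SET @Table = CASE WHEN @BaseName LIKE 'enrollment%' THEN 'enrollment'
                  WHEN @BaseName LIKE 'student%'    THEN 'student'
             END
IF @Table IS NOT NULL
BEGIN
    SET @sql = N'BULK INSERT ' + QUOTENAME(@Table)
             + N' FROM ''' + @FileName + N''''
             + N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
    EXEC sys.sp_executesql @sql
END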
I am pretty new to SQL Server 2005 and SSIS. I am trying to develop a package that will dynamically load files into SS2005 based on the contents of a configuration table. The configuration table (see below for example) contains the path to the file, a flag indicating whether or not to process the file, the type of file (specifying the nature of the data -- financial, order, etc.) and some parameters specific to each file.
FileName      ProcessFlag  Type  ExcelTab  Param1
C:\File1.xls  TRUE         1     Sheet1$
C:\File2.xls  TRUE         1     Sheet1$
C:\File3.xls  TRUE         1     Sheet1$
C:\File4.xls  TRUE         2     Sheet1$
C:\File5.xls  FALSE        2     Sheet1$
C:\File6.csv  TRUE         3
C:\File7.txt  TRUE         4
Right now I basically have a separate sequence for each of the file types. The tasks in each sequence are virtually identical with the exception of the data flow source in the data flow task (since the source file could be .xls, .txt, or .csv). The first sequence ran fine in isolation, but when I linked in a second sequence I started getting a Package Validation Error: DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. I use the same variables (FileName, ExcelTab) for each of the sequences, so I am not sure if that causes the error. I also tried changing the ValidateExternalMetadata setting to false, since the connection variable won't be applicable until that sequence is being processed. I am not sure where to go from here ... should I rearchitect the package altogether?
Is there a better/more efficient way to architect this package to handle the different file types (develop a package for each type which is called by a master package, create one package with a different sequence container for each type, etc.)?
Currently we are trying to load XML files into SQL Server tables using SSIS 2012. We receive the XML files as a column in a source table, so we have to push these XML documents into the destination tables.
I'm following the approach below to perform this activity:
[URL]
But we have a standard XSD structure for all the XML files, and only if an XML file matches the XSD structure should we load it; otherwise it should skip to the next XML file.
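One way to get the validate-then-load behavior in T-SQL is to register the XSD as an XML schema collection (CREATE XML SCHEMA COLLECTION) and let the cast to typed XML do the checking. A minimal sketch, assuming a schema collection dbo.StandardXsd and invented table/column names:

DECLARE @id int, @doc xml
DECLARE @typed xml(dbo.StandardXsd)   -- hypothetical schema collection
DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT SourceId, XmlColumn FROM dbo.SourceTable   -- hypothetical names
OPEN c
FETCH NEXT FROM c INTO @id, @doc
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        SET @typed = @doc   -- raises an error if the document does not match the XSD
        INSERT INTO dbo.DestinationTable (SourceId, XmlDoc) VALUES (@id, @doc)
    END TRY
    BEGIN CATCH
        PRINT 'Skipping document ' + CAST(@id AS varchar(10)) + ': failed XSD validation'
    END CATCH
    FETCH NEXT FROM c INTO @id, @doc
END
CLOSE c
DEALLOCATE c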
I have a table with a structure like this:
EmpID | LastName | FirstName | Emp_Picture
100   | x        | T         | <BINARY>
200   | W        | W         | <BINARY>
...
This table has 935 rows in it, with Emp_Picture blank.
How do I insert the JPEG files into the Emp_Picture column? Do we have to run an update statement for each and every employee, or is there a way to get around this problem?
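On SQL Server 2005+, OPENROWSET ... SINGLE_BLOB reads a file into a single varbinary value, so one UPDATE per file does the job without client code. A minimal sketch (table name and path are assumptions):

UPDATE dbo.Employee   -- hypothetical table name
SET Emp_Picture = (SELECT BulkColumn
                   FROM OPENROWSET(BULK N'C:\Photos\100.jpg', SINGLE_BLOB) AS img)
WHERE EmpID = 100

Because the OPENROWSET path must be a literal, covering all 935 rows means generating one such statement per file with dynamic SQL (the TIFF post further down shows that looping pattern).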
Is it possible to take a text file that contains multiple record types through Data Transformation Services in MS SQL 7.0 and load each different record type into a separate table?
I am new to SQL Server. I have some TIFF files to load into a SQL database. The server is 2005.
Can I do this without using any application like ASP.NET/C#? Is there a way to upload TIFF files into tables using just SQL? The size of each image is approx. 200-300 KB.
I have tables with member information. Each TIFF file name is the same as the member ID, so I have to upload each image to the column in the member table with the same ID.
Can you guys please help me with this or suggest some articles/URLs that use SQL to upload TIFF files?
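Since the file names follow the member IDs, a cursor plus dynamic SQL can do the whole batch server-side with OPENROWSET ... SINGLE_BLOB. A hedged sketch, assuming a dbo.Member table with MemberId and a varbinary(max) MemberImage column, and files under C:\Tiff\ visible to the server:

DECLARE @MemberId int, @sql nvarchar(4000)
DECLARE mcur CURSOR LOCAL FAST_FORWARD FOR
    SELECT MemberId FROM dbo.Member
OPEN mcur
FETCH NEXT FROM mcur INTO @MemberId
WHILE @@FETCH_STATUS = 0
BEGIN
    -- OPENROWSET requires a literal path, hence the dynamic SQL
    SET @sql = N'UPDATE dbo.Member SET MemberImage = '
             + N'(SELECT BulkColumn FROM OPENROWSET(BULK N''C:\Tiff\'
             + CAST(@MemberId AS nvarchar(10)) + N'.tif'', SINGLE_BLOB) AS t) '
             + N'WHERE MemberId = ' + CAST(@MemberId AS nvarchar(10))
    EXEC (@sql)
    FETCH NEXT FROM mcur INTO @MemberId
END
CLOSE mcur
DEALLOCATE mcur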
I am writing an SSIS package to load a lot of Excel files. I use a SQL statement to select the Excel data. However, I found it hard to dynamically set the table name (the Excel tab name) -- the users name the tabs differently.
In MySQL, I use "LOAD DATA INFILE 'my_path/data_file.txt'" to load data from a plain text file. Of course, the actual statement is a bit more complex once one considers the various options (e.g. comma delimited vs tab delimited, record termination strings, etc.).
My problem is that I have yet to find the equivalent within MS SQL Server. I did find a LOAD statement in T-SQL, but at first glance it seems to do something completely different.
How does one normally load data from a plain text file into a table in MS SQL? This needs to be relatively efficient since, once in production, it will be used to load tens of megabytes of data into the database (a feed from a data provider). Is it flexible enough to allow me to specify whether the fields are tab delimited vs comma delimited, optionally enclosed by quotes, record termination characters, etc.?
All I really need is direction to the right part of the T-SQL reference (MS SQL Server 2005). Anything else, such as examples, is icing on the cake.
Thanks
Ted
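The closest T-SQL equivalent of LOAD DATA INFILE is BULK INSERT (look under "BULK INSERT" in the T-SQL reference). A minimal sketch for a tab-delimited file, with placeholder table and path:

BULK INSERT dbo.MyTable
FROM 'C:\my_path\data_file.txt'
WITH (
    FIELDTERMINATOR = '\t',   -- use ',' for comma-delimited files
    ROWTERMINATOR   = '\n',   -- record termination string
    TABLOCK                   -- helps performance on large loads
)

Delimiters and row terminators are directly configurable as shown; fields optionally enclosed in quotes are the weak spot and generally need a format file or preprocessing.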
I'm using the For Each loop container to load multiple XML data files into SQL Server, and noticing some peculiar behavior and need some advice.
The pattern I'm trying to accomplish is this: Iterate over a collection of XML files in a specific folder, loading each in turn into SQL Server. If the file has already been loaded, delete the records first before the load. After the load succeeds, move the file into an Archive folder.
To accomplish this, I've set up a For Each Loop container using the For Each File enumerator, and retrieve just the file name and extension into a variable. The first task in this is an Execute SQL task that uses a SQL DML statement to delete records based on a field in the table containing the file name (DELETE FROM table WHERE PROG_NAME = ?), and maps a variable to the parameter. The next task is the data flow task that uses an XML source with the variable as the file name, and a SQL Server destination. I use a derived column task in between to plug the variable holding the file name into the PROG_NAME field. So far, so good. This works.
But now comes the peculiar part. I initially had the XSD files in the same folder as the XML files, but wanted to put them in their own directory, so moved them, and made the change to the XML source adapter for the new path to the XSD file. The next time I ran my package, it failed. For some reason, as the For Each Loop tried to iterate over the directory, it was using the XSD path assigned in the XML source instead of the path for the XML files. Unusual...
My question is, why when choosing the File name & Extension retrieval type (as opposed to the fully qualified name) will the task try to use the XSD location to find the files? Is my variable getting reassigned somewhere?
Here's my dilemma: I have a file that's 308 bytes wide by 5.7 million records. The record length is fixed, and the position and width of each field within the record are known. When I run DTS I receive this error from the MS DTS flat file provider: "Error creating file mapping view: not enough storage is available to process this command." Then when I try to continue with the wizard, it will not allow me to separate the data into the format that I need. Is there any other way to import this file using DTS?
I have 6000+ text files, average size 400 KB, that I need to load into one table in SQL Server 2000. Does anyone know of an easy way to do this? I thought I would just write a little VB app to loop through all the files in the directory and insert the data into an existing table, but there must be an easier way.
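One possible pure-T-SQL approach is to list the directory with xp_cmdshell and BULK INSERT each file. A sketch, assuming xp_cmdshell is enabled, the folder is visible to the SQL Server service account, and invented names throughout:

CREATE TABLE #files (fname varchar(260))
INSERT INTO #files EXEC master..xp_cmdshell 'dir /b C:\TextFiles\*.txt'
DELETE FROM #files WHERE fname IS NULL   -- drop the trailing null row dir emits

DECLARE @f varchar(260), @sql varchar(1000)
DECLARE fcur CURSOR LOCAL FAST_FORWARD FOR SELECT fname FROM #files
OPEN fcur
FETCH NEXT FROM fcur INTO @f
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = 'BULK INSERT dbo.TargetTable FROM ''C:\TextFiles\' + @f
             + ''' WITH (FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''\n'')'
    EXEC (@sql)
    FETCH NEXT FROM fcur INTO @f
END
CLOSE fcur
DEALLOCATE fcur
DROP TABLE #files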
Hi. I need to give my customer a SQL file that they can run in Query Analyzer. All the stuff they need to run is in a set of existing files. I'd like to just tell them to load this one file (this is Oracle syntax):
@file1.sql
@file2.sql
@file3.sql
Is there some way of calling these files (that are in the same dir) from a master SQL file?
Thanks
Jeff Kish
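Query Analyzer itself has no include directive, but the sqlcmd utility that ships with SQL Server 2005 has :r, which works like Oracle's @. A sketch of such a master file (run with sqlcmd -S server -d database -i master.sql; relative paths resolve against the directory sqlcmd is started from):

-- master.sql
:r file1.sql
:r file2.sql
:r file3.sql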
How do I download files from a webpage before loading them into SQL Server tables? I have the following URL, and under the Downloads & Resources section there are different file formats.
By hovering over the download tab for each file type, I can see that there is a link associated with it, just like the following:
For CSV - [URL] .... For XML - [URL] ....
The above is just an example for your reference/understanding. In the sample data from the internal website I have, I need to do a similar operation. The only difference is that I will have multiple XLS files, each with a description.
<li> <sub>Sales for Calendar Year 2015--All Countries </sub> <a href="/Data/Downloads/Documents/Sales/Sales_Quarter1.xlsx"> <sub>[XLS]</sub></a><sub> , <a href="/Data/Downloads/Documents/Sales/Sales_Quarter1.pdf"><sub>[PDF]</sub></a><sub></sub></sub> </li>
I need to download the file based on the month/quarter every time.
Hi there -- can anyone advise on the following issue? We have recently performed some server-side tracing on a particular SQL instance over a 24-hour period. We are now attempting to load the trace files into a database for analysis. Here lies the problem.
When we load the profiler trace files (one at a time) into the database, the transaction log grows at an excessive rate, even though the database is in SIMPLE recovery mode.
We are loading the traces using the command:
INSERT INTO sqlTableToLoad SELECT * FROM ::fn_trace_gettable('MytraceFileName', DEFAULT)
Can anyone advise how we could get around this issue, as we're running out of space due to the transaction log?
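One way to sidestep the fully logged INSERT ... SELECT is SELECT ... INTO a new table, which can be minimally logged under the SIMPLE recovery model. A sketch using the same trace function:

SELECT *
INTO sqlTableToLoad_staging   -- must not already exist
FROM ::fn_trace_gettable('MytraceFileName', DEFAULT)

Alternatively, loading the trace in batches with CHECKPOINTs between them keeps the active portion of the log bounded.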
I have a load (180,000+) of text files whose contents need to go into a SQL Server database. What's the best way of doing this? Using a C# console program, and if so, using FileStream or StreamReader? Or using a feature of SQL Server itself? The text files themselves are less than 1 KB and literally less than 200 characters. The problem is, I've tried a WinForm and although I can detect what files are there, as soon as I attempt to open one for reading, everything stops working and won't insert anything into the database.
Hi all. Could someone help me with the following problem? Hours of googling yesterday couldn't get me the answer. I'm using SQL 2000 and DTS and trying to import a huge fixed-width text file. The file is >1m rows and >200 columns and is defined by a proprietary (i.e. not bcp-produced) format specification of the form:
Name  Start  Length
Fld1  0      20
Fld2  19     5
Fld3  24     53
and so on. The only way I've found to define the columns so that DTS can import the file properly is to go through the wizard and click on the start of each column. I don't want to use bcp if possible (I did enough of that on SQL 7) -- but surely there's a way to get DTS to read from a format file so I don't have to click 200 times (with all the ensuing errors I could make). Any help greatly appreciated.
Cheers
Rob
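If stepping outside DTS is acceptable, a bcp-style format file can be generated mechanically from the spec and fed to BULK INSERT, avoiding the clicking entirely. For fixed-width fields the pattern is prefix length 0, data length equal to the field width, and an empty terminator (the last field consumes the line break). A sketch from the three fields above, with an assumed collation:

8.0
3
1  SQLCHAR  0  20  ""      1  Fld1  SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  5   ""      2  Fld2  SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR  0  53  "\r\n"  3  Fld3  SQL_Latin1_General_CP1_CI_AS

It could then be loaded with BULK INSERT dbo.Target FROM 'C:\bigfile.txt' WITH (FORMATFILE = 'C:\bigfile.fmt'). Format files do not allow comments, so the column meanings are: field order, data type, prefix length, data length, terminator, table column order, column name, collation.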
We are trying to load flat text files with upwards of 7 million records into a table on SQL. The table has a clustered index on 3 fields. We are sometimes able to complete smaller tables (500,000-750,000 records) and build the indexes prior to importing the data, however when we try the larger tables an error occurs :
Error at Destination for row number 6785496. Errors encountered so far in this task: 1
Location: somerge.c:1573 Expression: mrP->mrStatus!=MERGERUN::NONE SPID: 11 Process ID: 173
None of the records end up importing. The row number it gives is always the total number of records that was in the text file I was trying to import. I tried to import the text files first and then build the clustered indexes, but a table with only 300,000 records ran for nearly 4 days without completing before we killed it.
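A common pattern for loads this size is to bulk load into an unindexed heap and build the clustered index once at the end, rather than maintaining it row by row during the load. A hedged sketch with placeholder names; BATCHSIZE commits in chunks so each batch (and the log) stays bounded:

BULK INSERT dbo.StagingHeap
FROM 'C:\Feeds\bigfile.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n',
      BATCHSIZE = 500000, TABLOCK)

CREATE CLUSTERED INDEX IX_StagingHeap_Key
    ON dbo.StagingHeap (Field1, Field2, Field3)
    WITH SORT_IN_TEMPDB

If the input file is already sorted on the key, bcp's ORDER hint can also let the engine skip the sort during an indexed load.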
I have got an XML file with a size of more than 2 GB. I have to load this file into tables. On a 32-bit platform, I am unable to load this file using SSIS. RAM is 8 GB, but it is still bombing out. As I understand it, SSIS uses the XML DOM parser and tries to shred the file in memory, and because of the memory limitation, it fails. I have already written code in C# using the XmlTextReader object (an implementation of a SAX-style parser) to load the data into tables, but I want to keep this loading process within the limits of the DBAs.
I am stuck. Can someone guide me through the situation?
Hi! I have a large project that is due to complete this week. In order to complete it I need SQL Server 2000 installed on a remote server. My disk is corrupt and to order another media disk would damage my deadline. I have the licence and serial key, but now need good install files. I am even ready to buy another retail box, if I can find a supplier that would give me a download site for the media while I wait for the shipment!
Please PLEASE help!
Regards,
Barry
I am pretty new to SQL Server 2005 and SSIS. I am trying to dynamically load files into SQL Server 2005 using files/paths contained in tables. I have a key table in my SSIS package that defines which files should be processed, some default values that will be associated with each file (i.e. CompanyKey and PeriodKey) and the file path (see below for example):
CompanyKey  PeriodKey  ProcessFlag  FileName
2           61         FALSE        2005TB200501.xls
2           62         FALSE        2005TB200502.xls
2           63         FALSE        2005TB200503.xls
2           64         TRUE         2005TB200504.xls
2           65         TRUE         2005TB200505.xls
2           66         TRUE         2005TB200506.xls
2           67         TRUE         2005TB200507.xls
2           68         TRUE         2005TB200508.xls
2           69         TRUE         2005TB200509.xls
2           70         TRUE         2005TB200510.xls
2           71         TRUE         2005TB200511.xls
2           72         TRUE         2005TB200512.xls
The package I am developing is supposed to loop through this table checking for files with ProcessFlag = TRUE. For those files, it will load all of the records/columns (prefaced with the company and period keys) into a common table.
Do I have to manually create data sources for each file or is there a way to dynamically define the connection and process the connection?
Any assistance you could provide would be greatly appreciated!
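You should not need one data source per file. A common pattern is: an Execute SQL Task fills an object variable from the key table above, a Foreach Loop with the ADO enumerator walks the rows into variables, and the Excel connection manager's ConnectionString is set by an expression built from the FileName variable, with DelayValidation = True so validation does not fire before the variable is populated. A sketch of the driving query (the table name is a stand-in for the key table above):

SELECT CompanyKey, PeriodKey, FileName
FROM dbo.FileControl          -- hypothetical name for the key table
WHERE ProcessFlag = 'TRUE'
ORDER BY PeriodKey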
I built an application that connects to MS SQL Server 2005 using the native client driver (sqlncli.msi), which I installed from the MS site. I need to deploy my application to the end users, and I would like to know what files I need to deploy to make sure the application will run okay on the client PCs.
I searched the registry for the driver and found "sqlncli.dll" -- is that enough, or do I need to include more files?
I'm having a very irritating time trying to migrate data from a COBOL system to SQL Server.
One of the A/R Master files has approx. 200 columns.
I can export this file any number of ways that will (sometimes) load partially into my database, but whenever the load succeeds, columns 75 through N simply contain NULL, even though there is data in the file. When the load fails in DTS, the error is always a missing column delimiter. Using BULK INSERT, the error is always something like "data too long at column 75".
Putting all this together, I have deduced that something isn't working if I try to load a staging table with more than 74 columns of data. This seems like a way-too-low threshold for a robust database, especially since SQL Server is supposed to be able to handle up to 1,024 columns per table.
I am having trouble loading tables (within the same data flow) that have a foreign key relationship defined between them. For instance:
Table A is a parent (one side of the relationship) to Table B (many side of the relationship).
I am trying to load Table A first within the data flow and then Table B after, but I get the following error:
[OCMD EntityRole Insert [2666]] Error: An OLE DB error has occurred. Error code: 0x80040E2F. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "The statement has been terminated.". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "The INSERT statement conflicted with the FOREIGN KEY constraint "FK_EntityRole_Entity". The conflict occurred in database "ODS", table "dbo.Entity", column 'EntityGuid'.".
I am currently using OLE DB Command components to perform the inserts. I load Table A and then move on to load Table B; I can see the records in Table A before trying to load Table B, but for some reason the Table B load still fails.
I was thinking maybe this has something to do with the transaction setting or isolation level, but have played with this to no avail (currently everything is the default -- Supported/Serializable). Also, I am thinking that maybe because the OLE DB commands create two separate connections (they use the same connection manager), the second one is unable to see the transactions from the other (first) connection (Table A)?
Is there a way around this without dropping (disabling) foreign keys before the load and adding them back in afterwards? I would like to avoid that.
I would also like to avoid reading the data source multiple times. Everything I need is in the one source so I would like to populate multiple tables from the one source data stream instead of reading the same data 2,3 or 4 times etc.
It seems to me there must be a simple explanation/solution for this, but I'm stuck at this point.
P.S.
I was initially using OLE DB destinations (because they are much faster) and was having the same issue, which made sense because the OLE DB destinations do not let you pass the data stream on, so I had to multicast to the destinations and they were loading at the same time. I would rather use the OLE DB destinations, so if you have any ideas about how I could do this using those components, that would be appreciated too!
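If the load order inside one data flow cannot be guaranteed, a fallback that stops short of dropping the key is to disable the constraint around the load and re-enable it WITH CHECK afterwards, which re-validates the loaded rows. A sketch, assuming from the error text that the child table is dbo.EntityRole:

ALTER TABLE dbo.EntityRole NOCHECK CONSTRAINT FK_EntityRole_Entity

-- ... run the load of Table A and Table B ...

ALTER TABLE dbo.EntityRole WITH CHECK CHECK CONSTRAINT FK_EntityRole_Entity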
Hi, I have a question about how to load data into tables linked by foreign keys in MSDE/SQL Server. Example: I have 2 tables linked by a foreign key. One table:
ITEM:
ID  ITEM NAME  ITEM CATEGORY (FK)
1   cheese     2
And another:
CATEGORY:
ID  CATEGORY NAME
1   household
2   food
3   general
How do I enter the data to load? Do I have to enter it as (1, cheese, 2), or is there some way of entering it as (1, cheese, food)?
The DTS wizard does not allow me to transfer into views/queries, which I thought would be the normal way, as I would enter data into a view (the equivalent of an Access form) and it would update the related tables. When I wrote SQL to do it, it said I cannot update my view because too many tables would be affected (my lookup tables were empty at that point, though). I am doing it by number, using the DTS wizard to transfer directly into the main table, but there must be a better way.
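You can get the (1, cheese, food) effect by looking the category key up by name at insert time instead of hard-coding it. A minimal sketch against the two tables above (bracketed column names follow the example; adjust to your real schema):

INSERT INTO ITEM (ID, [ITEM NAME], [ITEM CATEGORY])
SELECT 1, 'cheese', c.ID
FROM CATEGORY AS c
WHERE c.[CATEGORY NAME] = 'food'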
Is there a way (and if so, what is the syntax?) to set up a DTS package that loads a table that has an identity column? I am trying to load the data from another table, leaving the identity field unmapped and de-selecting "enable identity insert" on the advanced tab of the Data Transformation Properties window. I keep getting errors due to the table not allowing null values. I tried using the SET IDENTITY_INSERT command, but this still did not work. Any help would be appreciated. TB
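For reference, the T-SQL behind "enable identity insert" looks like the sketch below (names invented). If instead you want the target to generate its own identity values, exclude the identity column entirely and make sure every other NOT NULL column is mapped, since an unmapped NOT NULL column produces exactly the null-value errors described:

SET IDENTITY_INSERT dbo.TargetTable ON

INSERT INTO dbo.TargetTable (ID, Col1, Col2)   -- identity column listed explicitly
SELECT ID, Col1, Col2 FROM dbo.SourceTable

SET IDENTITY_INSERT dbo.TargetTable OFF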
The reason it is poorly designed is that the table is used to hold questions and answers, all with a 1:1 relationship. Instead of having ID, ProductType, Question, Answer, they have unfortunately adopted the approach below, i.e.:
id: 1,  thisID: 3,   parentID: NULL  DESCRIPTION: this is a question
id: 20, thisID: 3_1, parentID: 3    DESCRIPTION: this is the answer to the question above
So I am writing a sproc that does this using a temp table. I got this far:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author: Spencer
-- =============================================
ALTER PROCEDURE [dbo].[GetFAQs]
    -- Add the parameters for the stored procedure here
    @ProductType varchar(255)
AS
BEGIN
    SET NOCOUNT ON;

    -- Insert statements for procedure here
    CREATE TABLE #TEMP
    (
        ID           int,
        thisID       varchar(50) NULL,
        parentID     varchar(50) NULL,
        Title        varchar(255) NULL,
        DescriptionQ varchar(8000) NULL,
        DescriptionA varchar(8000) NULL,
        ProductType  varchar(255) NULL
    )

    SELECT ID, thisID, parentID, Title, DescriptionQ, DescriptionA, ProductType
    FROM A2Z
    WHERE ProductType = @ProductType AND parentID IS NULL
END
GO
This gets all my questions for that product type.
What I need to do is load the questions into my temp table and then run through the A2Z table again, getting the answers to the questions (the parentID holds the question ID). The answers will then also get loaded into the temp table.
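A hedged sketch of those remaining steps, building on the procedure above: load the question rows into #TEMP, then join back to A2Z on parentID = thisID to fill in the answers. Which A2Z column actually holds the answer text is an assumption here; adjust as needed:

INSERT INTO #TEMP (ID, thisID, parentID, Title, DescriptionQ, ProductType)
SELECT ID, thisID, parentID, Title, DescriptionQ, ProductType
FROM A2Z
WHERE ProductType = @ProductType AND parentID IS NULL

UPDATE t
SET t.DescriptionA = a.DescriptionQ   -- swap in whichever column holds the answer text
FROM #TEMP AS t
JOIN A2Z AS a ON a.parentID = t.thisID

SELECT ID, thisID, Title, DescriptionQ, DescriptionA
FROM #TEMP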
I have a problem loading data from multiple Excel sheets into corresponding tables in a database. All the Excel sheets are in one folder, and from that folder I have to access all the Excel sheets. For this I am using a Script Task and one Data Flow Task, but the error occurs in the Script Task: I am not able to put the path in the script.
Is this the correct way to do it, or is there another way?
Can you please tell me the solution for this? Thanks in advance to everyone who responds.