Processing Multiple Unzipped Files
Feb 26, 2008
The requirement is to process two zip files in turn.
Each zip will contain either 1 or 2 csv files, but both zips will contain the same number of csv files. The structure of the csvs inside the zips is identical.
I have set up a Foreach Loop container with a File Enumerator to look for *.zip files. The zip file name is successfully retrieved into my 'Zipfile' user variable, and I use this in an Execute Process task to unzip the zip file. So far so good.
However what I now need is to get hold of the name(s) of the csv file(s) that have been unzipped, and use those in my flat file connection strings.
I considered trying to use a nested foreach loop, the outer loop enumerating the zips, and the inner enumerating the csvs. However this is no good because if 2 csvs are present, I need them both available in the control flow at the same time as they need to be merged together.
I'm sure there's a way to do this in SSIS but I'm struggling to identify it. Anyone got any ideas at all?
Thanks guys
KO
View 2 Replies
Sep 26, 2007
I previously posted a problem with result set bindings but I have not been able to resolve my problem. I guess all this comes with being new to programming in this environment! Anyway, I am trying to figure out how to process multiple rows with multiple columns from an ADO.NET connection. I have to read and manipulate each row. I was originally looking at using a foreach loop but have not been able to get it to work. One reply to my previous post suggested I should be using a data task to accomplish this. Could someone tell me the best way to handle this situation? As a note, I am new to programming in SSIS and basically trying to learn it as I go so please bear with me! Thanks in advance!
View 1 Replies
View Related
Sep 21, 2006
Hi,
I'm just starting to learn SSIS and would like some advice on how to handle something I encounter frequently. I often have to connect to a remote FTP site which contains a large number of files. Each day a number of files are added (old files are not purged). So each day I need to download all files which have been added since last time I checked, store on a file server, then load to a SQL database.
I've written a DataFlow using a script which generates a list of files and dumps it to a raw file, and doing the same for the FTP directory listing wouldn't be too hard - I could then feed these into a third DataFlow to work out what the new files were, then a fourth to download the files to a temp directory etc.
But is there a cleaner way of doing this (I'm not averse to scripting or even writing my own components - but for maintainability reasons I'd like to keep this as "out of box" as possible)? How are other people approaching this task?
Dave
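One fairly out-of-box way to find the new files is to keep a small control table of already-processed file names and anti-join the current FTP listing against it. A minimal T-SQL sketch, where the table and column names are assumptions:
Code Snippet
-- Hypothetical control table recording files already downloaded and loaded
CREATE TABLE dbo.ProcessedFiles
(
    FileName      nvarchar(260) NOT NULL PRIMARY KEY,
    ProcessedDate datetime      NOT NULL DEFAULT GETDATE()
);

-- #FtpListing would be populated from the FTP directory-listing step
CREATE TABLE #FtpListing (FileName nvarchar(260) NOT NULL);

-- Files on the FTP site that have not yet been processed
SELECT f.FileName
FROM #FtpListing f
LEFT JOIN dbo.ProcessedFiles p ON p.FileName = f.FileName
WHERE p.FileName IS NULL;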
View 7 Replies
View Related
May 8, 2007
hi guys,
I have a unique problem here. I get four different types of flat files, which I need to pick up and process in parallel. This I am doing using four different Foreach containers for the four load processes, as I can get multiple files (I had other issues like rollback etc. for using Foreach containers).
my problem is,my input file names will have format like this
yyyymmdd_salesdataforproduct_yyyymmdd_hhmmss.txt
Here the first date is the business date and the second one is the system date. If I get multiple files I have to process them considering the second date, but by default the Foreach loop is considering the first date. What can I do to ensure that only the second date is used to process my files?
thanks in advance
srikanth
View 1 Replies
View Related
Mar 10, 2008
Hello, I have some flat files that contain CSV records with different numbers of fields, but the first 4 fields of each record type are the same for each record. E.g. there would be an entry of one record that has eight fields and another that has 6 fields. Which of the items in the toolbox can I use to filter the records based on the entry in the first 4 fields, so I can process the filtered records?
Thank you
Kenalex
View 5 Replies
View Related
Mar 12, 2008
We have a scenario to process the last created/modified files from a location using an SSIS package, even though the folder contains multiple files with the same name and extension.
Kindly respond to this if anyone has worked on this.
Regards,
Sajesh
View 7 Replies
View Related
Apr 15, 2008
Hello, I'm at a loss with this problem....
I have written many reports with multiple datasets but today I have something bizarre happening. I have three stored procedures (datasets) in this report. This is a rolling twelve-month report that looks at three numbers (restraints, census, and percent of census) by month and year...pretty simple. The first dataset is for the graph for all twelve months. The second and third datasets are broken out by MonthNum in matrix tables to show the first and then the second six months in separate tables for display purposes...so that they don't have to scroll far over to the right, since I'm printing all three values per month.
This should be no problem. What's happening is that when I refresh the report not all three datasets are displaying properly. Sometimes the data in the graph shows and sometimes it doesn't. Another odd thing is that in my matrix tables the restraint number and then percent values sometimes double or sometimes they are 0. What makes it even more strange is the census number is always there and is always accurate.
All the aggregates are calculated in the T-SQL stored procedures. So, all I'm doing in RS is displaying the values. It almost appears to be cycling through the datasets when I refresh the report, other than the fact that the census value is always there.
I rebuilt the report with only the graph...no problem. Once I add another data set, every other time I refresh the report the graph has no data. I can't imagine what I'm missing as I've done this numerous times.
Has anyone come across anything like this?
Thanks!! Jamie
View 3 Replies
View Related
Jun 16, 2015
I have a requirement wherein I have around 15 different flat files; the filenames are fixed but the folder path can be changed (I think I should use a variable for the folder path). These 15 files' data should go to their respective tables in the database.
Do I need to create a separate data flow task for each file, or a separate package? In addition to this, for example: while importing product data into the product table, if a product ID already exists, we need to ignore it and upload only the new records.
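For the "ignore existing product IDs" part, one common pattern is to load the file into a staging table and insert only rows that are not already in the target; a minimal sketch, where the table and column names are assumptions:
Code Snippet
-- Staging table is assumed to be loaded by the data flow from the flat file
INSERT INTO dbo.Product (ProductID, ProductName)
SELECT s.ProductID, s.ProductName
FROM dbo.Product_Staging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Product p WHERE p.ProductID = s.ProductID);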
View 4 Replies
View Related
Jun 27, 2006
I have a couple of hundred flat files to import into database tables using SSIS.
The files can be divided into groups by the format they use. I understand that I could import each group of files that have a common format at the same time using a Foreach Loop Container.
However, the example for the Foreach Loop Container has multiple files all being imported into the same database table. In my case, each file needs to be imported into a different database table.
Is it possible to import each set of files with the same format into different tables in a simple loop? I can't see a way to make a Data Flow Destination item accept its table name dynamically, which seems to prevent me doing this.
I suppose I could make a different Data Flow Destination item for each file, in the Data Flow. Would that be a reasonable solution, or is there a simpler solution, or should I just resign myself to making a separate Data Flow for every single file?
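One alternative, if the Data Flow route proves awkward, is to drive BULK INSERT with dynamic SQL from an Execute SQL Task or a stored procedure, pairing each file with its target table. A rough sketch only; the mapping table, paths, and file options are assumptions:
Code Snippet
-- dbo.ImportFileList is a hypothetical table pairing each file with its target table
DECLARE @FileName nvarchar(260), @TableName sysname, @sql nvarchar(max);

DECLARE file_cursor CURSOR FOR
    SELECT FileName, TableName FROM dbo.ImportFileList;

OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @FileName, @TableName;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'BULK INSERT ' + QUOTENAME(@TableName)
             + N' FROM ''' + @FileName
             + N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2)';
    EXEC (@sql);
    FETCH NEXT FROM file_cursor INTO @FileName, @TableName;
END
CLOSE file_cursor;
DEALLOCATE file_cursor;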
View 9 Replies
View Related
Aug 14, 2012
I am trying to restore multiple .bak backup SQL database files onto a new server. However, I have found that it will not allow me to restore multiple databases at once. Is there a way to do this so that I do not have to manually upload one at a time? I tried adding all the .bak files at once to the backup device window but it only did the first one listed. It would be so much easier to restore them all at once so that I do not have to continue this manual process. I am restoring them via device.
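One way around the one-at-a-time GUI restore is to generate the RESTORE statements in T-SQL, e.g. by listing the .bak files with xp_cmdshell; a rough sketch, with the folder path and the file-name-equals-database-name rule as assumptions (each restore may also need WITH MOVE clauses if the data/log paths differ from the source server):
Code Snippet
-- Requires xp_cmdshell to be enabled; the backup folder is illustrative
CREATE TABLE #BakFiles (FileName nvarchar(260));
INSERT INTO #BakFiles
EXEC master..xp_cmdshell 'dir /b "D:\Backups\*.bak"';
DELETE FROM #BakFiles WHERE FileName IS NULL;

DECLARE @file nvarchar(260), @db sysname, @sql nvarchar(max);
DECLARE bak_cursor CURSOR FOR SELECT FileName FROM #BakFiles;
OPEN bak_cursor;
FETCH NEXT FROM bak_cursor INTO @file;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Assumes each database should be named after its file (MyDb.bak -> MyDb)
    SET @db  = LEFT(@file, LEN(@file) - 4);
    SET @sql = N'RESTORE DATABASE ' + QUOTENAME(@db)
             + N' FROM DISK = ''D:\Backups\' + @file + N''' WITH RECOVERY';
    PRINT @sql;   -- review the generated statements, then EXEC (@sql)
    FETCH NEXT FROM bak_cursor INTO @file;
END
CLOSE bak_cursor;
DEALLOCATE bak_cursor;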
View 13 Replies
View Related
Feb 15, 2008
I need to be able to bulk insert a bunch of tables from their corresponding flat file. I have created an XML file (see below) which has the file name/table name pair at each node. I then created a ForEachLoop task and used the Node enumeration type and the following OuterXpathString: ReferenceFiles/File. At this point I get lost. How do I pass the 2 inside node values (file name and table name) to variables which I can then use as expressions for the bulk insert task inside the Foreach?
Here is XML file:
Code Snippet
<ReferenceFiles>
<File>
<FileName>Ref_Categories.txt</FileName>
<TableName>Ref_Categories</TableName>
</File>
<File>
<FileName>Ref_Configs.txt</FileName>
<TableName>Ref_Configs</TableName>
</File>
</ReferenceFiles>
Thanks.
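As an aside, if the package ends up calling an Execute SQL Task anyway, the same file/table pairs can be shredded from the XML directly in T-SQL (SQL Server 2005+); a sketch, assuming the XML is held in a variable:
Code Snippet
DECLARE @config xml;
SET @config = N'<ReferenceFiles>
  <File><FileName>Ref_Categories.txt</FileName><TableName>Ref_Categories</TableName></File>
  <File><FileName>Ref_Configs.txt</FileName><TableName>Ref_Configs</TableName></File>
</ReferenceFiles>';

-- One row per <File> node: the FileName/TableName pair that would drive each BULK INSERT
SELECT f.node.value('(FileName)[1]', 'nvarchar(260)') AS FileName,
       f.node.value('(TableName)[1]', 'nvarchar(128)') AS TableName
FROM @config.nodes('/ReferenceFiles/File') AS f(node);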
View 1 Replies
View Related
Nov 29, 2007
I used the data export wizard to export a single table to a single flat file (multiple wasn't allowed). I saved the package as a *.dtsx file which I'm attempting to edit to add the additional tables.
Creating additional sources is fairly easy: copy the first source and change the table name.
I've tried copying the destination connection and changing to a new text file, but can't get past having to add each column manually to the new destination.
How can I duplicate the mapping that must be taking place in the wizard in the *.dtsx editing environment?
This seems like a simple / common task, but I've been unable to find a solution.
Thanks, Richard
View 1 Replies
View Related
Jun 1, 2007
Hi,
I have searched but not found quite the best way to look at this so far..
I have an application that outputs data to several text files (up to 30). These have commonality by an object name, but then contain completely different column data.
In DTS I had each of the source text file connections going to one OLE DB connection and then individual transform data tasks pointing to the one OLE DB connection.
Looking at SSIS, it would appear that I would need to have one source and one destination for each of these and therefore 30 parallel data flows?
Just wondering if there is a neater way of doing this??
It is a regular data import that happens a few times a day - the text files are named the same as the SQL tables - ie app_userdata.txt goes to app_userdata table.
Hope that explains ok and thanks in advance.
Mike
View 3 Replies
View Related
Aug 4, 2004
I am trying to write (my first, unfortunately) DTS package, and am having some problems.
I need to be able to import multiple flatfiles (all in the same format, just with different schema), each one going into a different table. I have written an application to call my DTS, sending it variables for the tablename and the filename. This works fine when I test it on a single flatfile.
My problem is, the Transformation object does not reset after each DTS call, so I get "Column does not exist" errors after the first successful import. I can go into the DTS Manager and reset the Transformation options, but that would defeat the purpose of automation. Is there any way to reset the Transformation object, or another technique, so that it will continuously work on files that use different schemas?
I am very new at DTS, so please consider me "ignorant" when replying.
Thanks in advance.
- Jordan
View 4 Replies
View Related
Dec 17, 2003
I have multiple .sql files; executing each one manually is a pain, so how do you run multiple .sql files all at once? Besides creating a batch file, are there T-SQL commands that could execute .sql files?
Thanks!
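Besides a batch file, one T-SQL-only option is to shell out to sqlcmd (or osql on SQL Server 2000) with xp_cmdshell, once per script; a rough sketch, with the folder path and connection switches as assumptions:
Code Snippet
-- Requires xp_cmdshell; folder path and connection switches are illustrative
CREATE TABLE #SqlFiles (FileName varchar(260));
INSERT INTO #SqlFiles
EXEC master..xp_cmdshell 'dir /b "C:\Scripts\*.sql"';
DELETE FROM #SqlFiles WHERE FileName IS NULL;

DECLARE @file varchar(260), @cmd varchar(1000);
DECLARE sql_cursor CURSOR FOR SELECT FileName FROM #SqlFiles ORDER BY FileName;
OPEN sql_cursor;
FETCH NEXT FROM sql_cursor INTO @file;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Run each script against the local server with Windows authentication
    SET @cmd = 'sqlcmd -S . -E -i "C:\Scripts\' + @file + '"';
    EXEC master..xp_cmdshell @cmd;
    FETCH NEXT FROM sql_cursor INTO @file;
END
CLOSE sql_cursor;
DEALLOCATE sql_cursor;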
View 4 Replies
View Related
May 18, 2004
I have a rather large sale transaction DB. Basic header, and detail tables. I am providing a third party company with daily sales information, and I need to give them back data from about 8 or 9 months ago. I currently have a DTS package that gets sales for the current day, but since I have to go back, I have to manually edit the query in the DTS package, and change the date range...UNLESS ...
Blah, blah, blah. The problem is that they can only take the data in daily files. So, there would be ONE file for each day. I really don't need to be manually running these jobs, so I'm wondering if someone could point me to a way of writing a package (maybe ActiveX, not sure) that would run through a loop of dates, basically, and create a separate file for each day, versus having to edit a generic DTS package and changing the date range 350 times...
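Rather than editing the package 350 times, one possibility is a plain T-SQL loop over the date range that writes one file per day with bcp; a rough sketch, with the table, column, date range, paths, and server name all being assumptions:
Code Snippet
-- Hypothetical sales header table with a SaleDate column; paths and server name are illustrative
DECLARE @d datetime, @end datetime, @cmd varchar(1000);
SET @d   = '20030901';
SET @end = '20040518';

WHILE @d < @end
BEGIN
    SET @cmd = 'bcp "SELECT * FROM SalesDb.dbo.SalesHeader WHERE SaleDate >= '''
             + CONVERT(varchar(8), @d, 112) + ''' AND SaleDate < '''
             + CONVERT(varchar(8), @d + 1, 112)
             + '''" queryout "C:\Extract\sales_' + CONVERT(varchar(8), @d, 112)
             + '.txt" -c -T -S MYSERVER';
    EXEC master..xp_cmdshell @cmd;   -- one extract file per day
    SET @d = @d + 1;                 -- next day
END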
View 6 Replies
View Related
Jan 24, 2007
I need to restore a database that has a bak file and many log files. Is there a way I can do this with one recovery step?
Or do I need to use a different restore object for each file?
MTmace
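There is no single-statement restore for a full backup plus a chain of log backups, but the whole sequence can be scripted as one batch: the full backup WITH NORECOVERY, each log WITH NORECOVERY, and the last log WITH RECOVERY. A sketch with illustrative names and paths:
Code Snippet
-- Restore the full backup first, leaving the database in a restoring state
RESTORE DATABASE MyDb
FROM DISK = 'D:\Backups\MyDb_full.bak'
WITH NORECOVERY;

-- Apply each transaction log backup in sequence, still WITH NORECOVERY
RESTORE LOG MyDb FROM DISK = 'D:\Backups\MyDb_log_01.trn' WITH NORECOVERY;
RESTORE LOG MyDb FROM DISK = 'D:\Backups\MyDb_log_02.trn' WITH NORECOVERY;
-- ...repeat for the remaining log backups...

-- The final log backup brings the database online
RESTORE LOG MyDb FROM DISK = 'D:\Backups\MyDb_log_99.trn' WITH RECOVERY;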
View 1 Replies
View Related
Jun 7, 2006
Using an expression to set the log filename to include the date and time results in 3 log files being created.
Ummm. Why? I only ran the package once. Is SSIS not sharing my log file connection among the different components?
On first thought my workaround would involve using a script task to "set" the log file name to a variable and use an expression to set the log connection to the variable. But the problem with that is that logging starts BEFORE the script task is run...
View 8 Replies
View Related
Apr 20, 2007
I have a job with a single step that executes a stored procedure that performs the following steps:
1. Checks for the existence of a file A in a folder A
2. If it exists,
a. executes the cmdshell to run a DTS package to drop a table, recreate it and load the data in the file A to table X
b. runs other stored procedures that use the data in table X to create other tables Y and Z
c. executes the cmdshell to remove and rename the file A from Folder A into Folder B
What I'd like to do is use this same stored procedure if possible, but create a job or another store procedure that would loop thru and process multiple files in Folder A instead of just one.
Any suggestions would be greatly appreciated
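One way to reuse the existing procedure as-is is a wrapper that lists Folder A with xp_cmdshell and calls it once per file; a rough sketch, where the folder path and the procedure name/parameter are assumptions:
Code Snippet
-- List the files currently sitting in Folder A (path is illustrative)
CREATE TABLE #Files (FileName varchar(260));
INSERT INTO #Files
EXEC master..xp_cmdshell 'dir /b "D:\FolderA\*.*"';
DELETE FROM #Files WHERE FileName IS NULL OR FileName = 'File Not Found';

DECLARE @file varchar(260);
DECLARE file_cursor CURSOR FOR SELECT FileName FROM #Files;
OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @file;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Hypothetical: the existing procedure, parameterised by file name
    EXEC dbo.usp_ProcessSalesFile @FileName = @file;
    FETCH NEXT FROM file_cursor INTO @file;
END
CLOSE file_cursor;
DEALLOCATE file_cursor;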
View 1 Replies
View Related
Jul 17, 2007
Hi,
I have about 300-400 XML files I want to load into my SQL database (2005). The following code will load one (1) file. How do I do this for multiple files?
INSERT INTO MEL (DATA) SELECT * FROM OPENROWSET (BULK 'C:\Temp\CHAPTER1.xml', SINGLE_BLOB) AS TEMP
Thanks,
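OPENROWSET(BULK ... SINGLE_BLOB) only reads one file per call, so one approach is to list the files with xp_cmdshell and run the same INSERT in a loop with dynamic SQL; a sketch, with the folder path assumed:
Code Snippet
-- Requires xp_cmdshell; folder path is illustrative
CREATE TABLE #XmlFiles (FileName nvarchar(260));
INSERT INTO #XmlFiles
EXEC master..xp_cmdshell 'dir /b "C:\Temp\*.xml"';
DELETE FROM #XmlFiles WHERE FileName IS NULL;

DECLARE @file nvarchar(260), @sql nvarchar(max);
DECLARE xml_cursor CURSOR FOR SELECT FileName FROM #XmlFiles;
OPEN xml_cursor;
FETCH NEXT FROM xml_cursor INTO @file;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'INSERT INTO MEL (DATA) SELECT * FROM OPENROWSET (BULK ''C:\Temp\'
             + @file + N''', SINGLE_BLOB) AS TEMP';
    EXEC (@sql);
    FETCH NEXT FROM xml_cursor INTO @file;
END
CLOSE xml_cursor;
DEALLOCATE xml_cursor;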
View 4 Replies
View Related
Sep 13, 2000
Usually, our in-house ERP software has 1 database and 1 database file. After an upgrade from MS SQL 6.5 to MS SQL 7.0 I have a database whose properties show that it is made up of multiple data files. What is the easiest and safest method to return this database to only having 1 data file?
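If the extra files belong to the same filegroup, the usual route is to empty each secondary file into the remaining file(s) and then drop it; a minimal sketch, with the logical file name as an assumption (sp_helpfile shows the real names), and worth testing against a backup first:
Code Snippet
-- Move all data out of the secondary data file into the other file(s) in its filegroup
DBCC SHRINKFILE ('MyDb_Data2', EMPTYFILE);

-- Once empty, the file can be removed from the database
ALTER DATABASE MyDb REMOVE FILE MyDb_Data2;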
View 11 Replies
View Related
Sep 8, 2000
Is it possible to take a text file that contains multiple record types through the Data Transformation Services in MS SQL 7.0 and load each different record type into a separate table?
Thanks in advance.
View 2 Replies
View Related
Jan 25, 2005
We have a large database (91 GB) that is currently in one large data file. Now that we have multiple disk arrays I can split it up onto, I would like to have a couple of data files. My question is, what is the best way to split this up? Should I keep one primary filegroup and just create another file, or should I create a filegroup for indexes and put those on that? This database is used for reporting only, so it doesn't really have any writes being done on it.
Thanks much.
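Either layout is scripted with ALTER DATABASE; for a read-only reporting database, one common pattern is to add a filegroup on the new array and rebuild the large indexes onto it so reads are spread across the arrays. A sketch, with the names, paths, and sizes below being assumptions:
Code Snippet
-- Add a filegroup and a data file on the new disk array (names and paths are illustrative)
ALTER DATABASE ReportingDb ADD FILEGROUP FG_Indexes;

ALTER DATABASE ReportingDb
ADD FILE
(
    NAME = ReportingDb_Indexes1,
    FILENAME = 'E:\SQLData\ReportingDb_Indexes1.ndf',
    SIZE = 20480MB,
    FILEGROWTH = 1024MB
)
TO FILEGROUP FG_Indexes;

-- Rebuilding a clustered index onto the new filegroup moves the table's data there
CREATE CLUSTERED INDEX IX_FactSales_SaleDate
ON dbo.FactSales (SaleDate)
WITH DROP_EXISTING
ON FG_Indexes;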
View 10 Replies
View Related
Sep 3, 2007
I have 8GB of text files which are basically log files from the past few years.
There are 24 text files per directory, which are labeled for every day (so they are not all in 1 folder).
It would make reading them much easier if I could import them to SQL but I only seem to be able to import 1 at a time? (with the wizards :eek: )
Surely there is a way to mass import without all the costly applications that google searches give me?
cheers :P
View 5 Replies
View Related
May 22, 2008
When having multiple files for the log, do they get filled one after the other (like Oracle) or do they fill proportionately?
------------------------
I think, therefore I am - Rene Descartes
View 2 Replies
View Related
Nov 4, 2014
I have a script which imports the contents of a csv file from our CRM system and updates a table in my database. This works OK but the problems I have are that a) sometimes there is more than one file in the folder, and b) that I wish to move any csv files that have been imported into an archive folder. The csv files arrive with a time/datestamp and I currently rename them manually to FREXPORT before importing (the name is in the format FREXPORT_20141101_1217.csv).How do I:
1) get it to process the file without me having to manually rename the file(s) each time,
2) if there is more than 1 file in the folder, process all the files, and
3) move the correctly processed files to an archive folder, which is: importarchive?
Ultimately, I would like the script to be run as a scheduled job, so it also has to deal with the fact that sometimes there will be no files to import too.
Code Snippet
create table #import
(worknumber nvarchar(12), date_done smalldatetime)

BULK INSERT #import
from '\\fork04-hq-dc01\dataimport\FREXPORT.csv'
...
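A rough sketch of the loop-and-archive part in T-SQL, using xp_cmdshell to list and then move each FREXPORT_*.csv (the UNC paths below are assumptions based on the snippet above, and #import is assumed to exist as created there); when the folder is empty, the loop simply does nothing:
Code Snippet
-- List the export files currently in the import folder (paths are assumptions)
CREATE TABLE #csv (FileName nvarchar(260));
INSERT INTO #csv
EXEC master..xp_cmdshell 'dir /b "\\fork04-hq-dc01\dataimport\FREXPORT_*.csv"';
DELETE FROM #csv WHERE FileName IS NULL OR FileName LIKE '%File Not Found%';

DECLARE @f nvarchar(260), @sql nvarchar(max), @cmd nvarchar(1000);
DECLARE csv_cursor CURSOR FOR SELECT FileName FROM #csv;
OPEN csv_cursor;
FETCH NEXT FROM csv_cursor INTO @f;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Load the file into the staging table without renaming it first
    SET @sql = N'BULK INSERT #import FROM ''\\fork04-hq-dc01\dataimport\' + @f
             + N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
    EXEC (@sql);

    -- Move the processed file into the archive folder
    SET @cmd = N'move "\\fork04-hq-dc01\dataimport\' + @f
             + N'" "\\fork04-hq-dc01\dataimport\importarchive\"';
    EXEC master..xp_cmdshell @cmd;

    FETCH NEXT FROM csv_cursor INTO @f;
END
CLOSE csv_cursor;
DEALLOCATE csv_cursor;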
View 2 Replies
View Related
Dec 18, 2007
Hi, I have an FTP task in my SSIS package where I want to select 3 files from a directory. This directory already contains other files, but I only want 3 of them. How do I pull down only those 3 files from the FTP site? I can't seem to find an option on the FTP task to select which files I want from the directory.
View 3 Replies
View Related
Aug 5, 2005
I have over three hundred text files that I need to import to SQL Server. Each is in the exact same format. I want to import them as separate tables. Is there any way to do it in one process? Regards, Ciarán
View 5 Replies
View Related
Feb 19, 2008
Hi,
I have a situation where every morning files are downloaded from an ftp site.
Problem is, I don't know beforehand how many files there will be. Usually it's one or two, but might be more. All files need to be loaded in the database.
How can I account for these multiple files, whose number I do not know beforehand? That is, in the file connection manager, how do I tell it which files to load?
Help.
View 1 Replies
View Related
Jan 24, 2008
Is there any control flow task or any task in SSIS that is able to get 50 XML files
from a directory on a server and insert them into a table/column of xml datatype
in a sql server database?
I know I can use the Foreach Loop container and specify the directory path to pick up
the xml files but what do you do after that?
I also know of the OPENROWSET Function but that does 1 file at a time only!
??
View 5 Replies
View Related
Oct 12, 2006
Hi,
When do we need to have multiple log files for one database?
What performance counters should I measure before deciding to go for multiple log files?
Any comments / suggestions are welcome.
Thanks,
Loonysan
View 1 Replies
View Related
May 10, 2007
Hi All,
New to the forums, so, hello.
I want to import data from multiple XML files into SQL 2005 using SSIS. I know how to set up a flat data source etc... but am unsure how I would go about importing from multiple files.
Anyone got any ideas?
Thanks,
Drammy
View 6 Replies
View Related
Dec 14, 2006
Hi everyone,
How do I dynamically change the connection properties for a DTSX package?
Imagine that you launch an SSIS package and, under several rules, the same connection is reused with four or five different databases.
I know that I can attach a configuration file (or more than one), but how do I tell SSIS which one to use at any given moment?
Thanks in advance and regards,
View 3 Replies
View Related