Flat Data Length Handling In IS Package
Apr 11, 2008
Hello all,
I have a flat file with a column defined as length 24. Whenever that column holds more than 24 characters, my package fails. Is there any way to let the package continue in this situation and send the bad row to an output table?
Any help will be appreciated.
Thanks
Mar 27, 2008
For those of you who would like to reference my exact issue, I'm dealing with the RSExecution SSIS package at the "Update Parameters" data flow task, at the Script Component.
The script tries to split parameter data into name and value. Unfortunately, I have several reports that pass very large parameter sets. One example has over 65,000 characters, all in the normal "&paramname=value&parm2=value..." format.
The code in the script works fine until it gets to one of these very large parameter sets. I have figured out what is causing the issue. Here's some code:
Dim paramBlob As Byte()
paramBlob = Row.BlobColumn.GetBlobData(0, Row.BlobColumn.Length)
The second parameter of the GetBlobData function takes an Integer as its count. Therefore, no matter what data type I give the string that the script will later split, it appears to be limited to 32,767 characters.
THIS IS A PROBLEM!!!
Does anyone know a workaround for this issue? I need all of the parameter data to be reported, and I would hate to have to skip over rows like this. Also, if I'm missing something, please fill me in!
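For what it's worth, one commonly suggested pattern is to read the blob in fixed-size chunks and append them to a StringBuilder, so no single GetBlobData call ever needs a large count. A minimal sketch only, assuming the (offset, count) overload used above and a Unicode-encoded blob; the column name and encoding must be adjusted to the actual package:
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Read the BLOB in fixed-size chunks instead of one GetBlobData call.
    Const ChunkSize As Integer = 8000
    Dim totalLength As Long = Convert.ToInt64(Row.BlobColumn.Length)
    Dim offset As Long = 0
    Dim sb As New System.Text.StringBuilder()

    While offset < totalLength
        Dim count As Integer = CInt(Math.Min(CLng(ChunkSize), totalLength - offset))
        Dim chunk As Byte() = Row.BlobColumn.GetBlobData(CInt(offset), count)
        sb.Append(System.Text.Encoding.Unicode.GetString(chunk))
        offset += count
    End While

    Dim paramString As String = sb.ToString()
    ' ...split paramString on "&" and "=" exactly as the existing script does...
End Sub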
Thanks for your help in advance,
LOSTlover
Nov 3, 2006
Hello,
I have a package that contains 22 data flow tasks, one for each flat file that I need to process and import. I decided against making each import a separate package because I am loading the package in an external application and calling it from there.
Now, everything works beautifully when all my text files are exported from a data source beyond my control. I have an application that processes a series of files encoded using EBCDIC, and I am not always guaranteed that all the flat files will be exported. (There may not have been any data for the day.)
I am looking for suggestions on how to handle files that do not exist. I have tried a package-level error handler (Script Task) that checks the error code ("System::ErrorCode") and, if it tells me that the file cannot be found, returns Dts.TaskResult = Dts.Results.Success, but that is not working for me; the package still fails. I have also thought about programmatically disabling the tasks that do not have a corresponding flat file, but that seems like overkill.
So I guess my question is this: if the file does not exist, how can I either a) skip the task in the package, or b) quietly handle the error and move on without failing the package?
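One pattern that is often suggested (a sketch only, not this package's actual code): have a Script Task test for the file and write the result into a Boolean variable, then gate each data flow with a precedence constraint whose expression checks that variable, for example @[User::FileExists] == TRUE. The missing-file case is then skipped without raising an error. The variable names below are hypothetical and must exist in the package (and be listed in the task's ReadOnly/ReadWrite variables):
Public Sub Main()
    ' Check whether the expected flat file exists and record the answer.
    Dim path As String = Dts.Variables("User::FlatFilePath").Value.ToString()
    Dts.Variables("User::FileExists").Value = System.IO.File.Exists(path)
    Dts.TaskResult = Dts.Results.Success
End Sub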
Thanks!
Lee.
Apr 17, 2007
I have a package set up basically with two consecutive data flows. The first flow takes data from an OLE DB Source and stores it into a Flat File Destination. The second flow uses this same flat file as a source, alters the data, and stores the data in the same flat file, overwriting the old file. I set DelayValidation to True on the flat file. Still, here are the error messages I am receiving:
Error: 0xC020200E at DO, Flat File Destination [7676]: Cannot open the datafile "C:\Temp.txt".
Error: 0xC004701A at DO, DTS.Pipeline: component "Flat File Destination" (7676) failed the pre-execute phase and returned error code 0xC020200E.
I am new to SSIS, so I'm sure I have a setting wrong or something. Is the problem that SSIS is trying to write to a file from which it is simultaneously reading data?
Thank you.
Oct 12, 2007
Hi,
I am reading a flat file into SSIS. In the script, I want to process rows differently depending on the length of the row. However, since I am new to .NET, I can't figure out the syntax to find the length of the row.
Can anyone tell me the syntax?
Thanks,
Linda
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    'Dim d As Double
    If Row.MemberSourceLine.ToString.Length.Equals(267) Then   ' <-- my latest try, but it does not work...
        Row.WholesalerCode = Row.MemberSourceLine.ToString.Substring(1, 5)
        Row.UPCNumber = Row.MemberSourceLine.ToString.Substring(34, 12)
        'Row.SKUNumber = Row.MemberSourceLine.ToString.Substring(6, 14)
    End If
End Sub
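For reference, a minimal sketch of how the length test is usually written, assuming MemberSourceLine arrives as a single string column. If the test never matches, the line may be carrying a trailing delimiter character, in which case trimming it first (or checking the actual length in a message box or log) is worth trying:
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' A DT_STR/DT_WSTR column is already a String, so Length can be read directly.
    If Row.MemberSourceLine.Length = 267 Then
        ' Substring(startIndex, length) is zero-based: startIndex 1 is the second character.
        Row.WholesalerCode = Row.MemberSourceLine.Substring(1, 5)
        Row.UPCNumber = Row.MemberSourceLine.Substring(34, 12)
    End If
End Sub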
Aug 21, 2006
Hi,
I'm trying to extract data from a flat file which is as fixed-length as they come. The file has a header, which simply contains the number of records in the file, followed by the records, with no header delimiter (no CR/LF, nothing).
For example a file would look like the following:
00000003Name1Address1Name2Address2Name3Address3
So this has 3 records (indicated by the first 8 characters), each consisting of a Name and Address.
I can't see a way to extract the data using a flat file connection unless we add a delimiter for the header (not possible at this stage). Am I wrong?
Any suggestions on a possible solution would be much appreciated. I'm thinking I'll have to write a script to parse the file manually.
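If it does come down to a script, a minimal sketch of the manual parse follows. The record layout (8-character count, then 5-character names and 8-character addresses) is taken from the example above, and the file path is hypothetical:
Imports System.IO

Module ParseFixedFile
    Sub Main()
        ' Read the whole file, then walk it by fixed offsets.
        Dim text As String = File.ReadAllText("C:\data\input.dat")
        Dim recordCount As Integer = Integer.Parse(text.Substring(0, 8))
        Const recordLength As Integer = 13   ' 5 (Name) + 8 (Address), per the example

        For i As Integer = 0 To recordCount - 1
            Dim record As String = text.Substring(8 + i * recordLength, recordLength)
            Dim name As String = record.Substring(0, 5)
            Dim address As String = record.Substring(5, 8)
            Console.WriteLine("{0} | {1}", name, address)
        Next
    End Sub
End Module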
Thanks in advance,
Scott
Dec 8, 2006
When I use SQL 2000 DTS Export to create a fixed-length flat file, the data rows are delimited by carriage return-line feed. That means that when I open the flat file in a text editor like UltraEdit or WordPad, the data rows are broken out nicely (each row ends at the max row length position and the next row starts at position 0).
But when I use SSIS to create the file, the whole file is displayed as one line in WordPad. The data rows don't end at the max row length position in UltraEdit either. From the Flat File Connection Manager's Preview page, I can see the data rows displayed properly.
Now I wonder if the flat file destination is a true fixed length file.
Jul 24, 2015
I have three tables in the database:
customer
product
sales
And I want an SSIS package to dynamically load data from the database into three separate flat files, one table per file.
I know I need to use a Foreach Loop with the ADO.NET schema rowset enumerator and an OLE DB connection manager, selecting the table or view name variable from the access mode list. But here is the problem: since the table name is dynamic, the flat file connection must also be dynamic. I am using Visual Studio 2013...
Sep 14, 2007
How do I make the output columns padded with extra spaces? I intentionally set my output width larger than the input width, but the generated file still jams all the columns next to each other.
Feb 5, 2015
After performing a join operation on two tables I get the resultset below:
pid, fname, typename, pname, pcost
1, cad, bars, product-1, 100
2, har, witte, product-2, 120
3, nes, bars, product-3, 119
Now I need to create files from the obtained resultset as follows:
Column 'fname' is the folder name and 'typename' is the file name within that folder.
For example, the first record should be written to the file 'bars.txt' in the folder 'cad', and the third record should be written to the file 'bars.txt' in the folder 'nes'.
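One way to sketch this is a small script (for example inside a Script Task) that loops over the joined resultset and appends each row to the file derived from its fname and typename values. Everything named below (connection string, source view name, output root) is an assumption:
Imports System.Data.SqlClient
Imports System.IO

Module WriteFilesPerFolder
    Sub Main()
        Dim root As String = "C:\output"
        Using cn As New SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI")
            cn.Open()
            Dim cmd As New SqlCommand("SELECT pid, fname, typename, pname, pcost FROM dbo.JoinedResult", cn)
            Using rdr As SqlDataReader = cmd.ExecuteReader()
                While rdr.Read()
                    ' Folder comes from fname; file name comes from typename.
                    Dim folder As String = Path.Combine(root, rdr("fname").ToString())
                    Directory.CreateDirectory(folder)   ' no-op if the folder already exists
                    Dim filePath As String = Path.Combine(folder, rdr("typename").ToString() & ".txt")
                    Dim line As String = String.Join(",", New String() {rdr("pid").ToString(), rdr("pname").ToString(), rdr("pcost").ToString()})
                    File.AppendAllText(filePath, line & Environment.NewLine)
                End While
            End Using
        End Using
    End Sub
End Module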
May 10, 2006
Hi,
I've just started looking at SSIS and have encountered what should hopefully be a simple problem to solve. I have a pipe-separated source file that looks like this (I've added Line numbers for simplicity):
Ln 01: HDR|FEED_CODE|31-MAR-2006
Ln 02: Tom|100|Jones|ZZ1 1ZZ|USA
Ln 03: Tom|200|Singer|
Ln 04: Tom|305||Red|Porche ||Lanzarote |Apple|Carrot| | |
Ln 05: Dick|100|Van Dyke|ZZ1 1ZZ|USA
Ln 06: Dick|200|Actor|
Ln 07: Dick|305||Blue|Ford||California |Tomato | |||Beef
Ln 08: Harry|100|Houdini|ZZ1 1ZZ|GBR
Ln 09: Harry|200|Escapologist|
Ln 10: Harryk|305| |Green ||Triumph |Poland|Banana|Sprout| | |
Ln 11: TRL|9
In addition to a header and footer records, this file contains three record types for each person.
Record types are identified by the second column.
Each record type has a different number of columns:
Type 100 has 5 columns
Type 200 has 4 columns
Type 305 has 12 columns
The Row delimiter for all records is the {CR}{LF} character
I've set up a flat file input source and specified {CR}{LF} as the row delimiter for both header and data rows and the "|" character as the field delimiter.
It appears that SSIS assumes that because the first data row has 5 columns, every row must fit that format. So the {CR}{LF} that separates lines 02 and 03 is interpreted as text rather than as a row separator, and all the remaining | field separators after 305 are interpreted as text contained in the fifth column. SSIS also complains that the last row is incomplete.
A bit like this (I've used tildes to indicate column separation):
Tom~100~Jones~ZZ1 1ZZ~USA
Tom~200~Singer~{CR}{LF}Tom~305||Red|Porche ||Lanzarote |Apple|Carrot| | |
I've seen one other reference to this behaviour, but the response seemed to be that SSIS doesn't know which columns are missing. In this scenario we don't have missing columns; rather, we have different types of record in a single file. In DTS I would effectively parse the file once for each record type, thus:
if cStr(DTSSource("Col002")) = "100" then
DTSDestination("in_Name") = trim(DTSSource("Col001"))
...
Main = DTSTransformStat_OK
else
Main = DTSTransformStat_SkipInsert
end if
...not the most efficient solution I know but the load only runs once a month so this was an acceptable workaround.
DTS was never this fussy, but I'm sure this is user error rather than an SSIS limitation. Can someone please put me straight?
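In SSIS, the usual suggestion for a mixed-record-type file is to define the flat file connection with a single wide column (just the {CR}{LF} row delimiter, no column delimiter), then split it downstream, either with a Conditional Split on SUBSTRING(...) or with a Script Component that has one output per record type. A hedged Script Component sketch; the column name (RawLine) and output names are assumptions, and the synchronous outputs typically need a shared non-zero ExclusionGroup for the DirectRowTo methods to be generated:
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    Dim fields As String() = Row.RawLine.Split("|"c)

    If fields.Length < 2 Then
        Row.DirectRowToUnmatched()
        Return
    End If

    Select Case fields(1)          ' the second field carries the record type
        Case "100"
            Row.DirectRowToType100()
        Case "200"
            Row.DirectRowToType200()
        Case "305"
            Row.DirectRowToType305()
        Case Else
            Row.DirectRowToUnmatched()   ' header, trailer and anything unexpected
    End Select
End Sub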
Many thanks,
Greg
Sep 30, 2015
I have a requirement to develop a dynamic package for inserting data from a flat file into a table.
A few points for clarification:
1) If I change the flat file values and name in the source variable, the target table name should also change based on a variable value.
2) The columns should be mapped dynamically against the source file, since we have to insert the data into the target table.
See below diagram for more clarification.
May 24, 2007
Hi All
I am facing this problem while loading data from a text file into a table.
The scenario is:
The text file may contain spaces where values are null.
When I try to run my SSIS package, it fails.
How can I avoid this problem? I want NULL to be inserted when the field contains only spaces in the text file.
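One approach that is commonly suggested is to convert blank values to NULL before they reach the destination, either with a Derived Column expression or with a Script Component along these lines. This is only a sketch: "MyColumn" is a hypothetical column name, it must be marked ReadWrite in the component, and the availability of the _IsNull setter depends on that setting:
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Turn blank/whitespace-only values into NULL before the destination.
    If Not Row.MyColumn_IsNull AndAlso Row.MyColumn.Trim() = String.Empty Then
        Row.MyColumn_IsNull = True
    End If
End Sub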
Thanks,
Anshu
Jul 23, 2005
Hi,
Currently we get data from more than 200 different sources, and all of our vendors provide data in different file formats. The problem is we have more than 100 DTS packages now and the maintenance is very difficult. Every time a vendor changes the format, we have to change multiple DTS packages.
Does anybody know what the right way of reducing the number of DTS packages would be?
The types of file formats we get are .xls, .txt, .dat, .csv etc., and the .txt and .dat files come with different delimiters. The number of columns also varies from file to file. Is it possible to have a DTS package which can handle different file formats and load the data into a staging table, and from there, based on the source of the file, move the data into the respective tables and columns?
We are using SQL SERVER 2000.
Thanks in advance.
Subodh
Dec 9, 2007
Hello,
I am reading the ETL part of the Microsoft Project REAL reference implementation. I find that it uses neither transactions nor checkpoints throughout the packages. I am wondering, if something goes wrong and the process aborts in the middle, how are we going to handle this? Is there a particular reason for not using transactions? Is it considered best practice to skip transactions for performance and log volume reasons? I am new to SSIS and would like to follow best practice. Thanks for any advice.
Regards
Jerry
Mar 24, 2006
I have a main package that calls several other packages using Execute Package Tasks. I also have OnPreExecute, OnPostExecute, and OnError event handlers at the package level to audit the beginning and completion of each package. I want to prevent each task's events from bubbling up to its own package-level event handlers as well as to the main package's event handlers. I've tried setting the Propagate variable to False in each of the event handlers and setting the DisableEventHandlers property of each task to True, but neither solution seems to work. Is there a way to do this that I'm missing?
Jun 10, 2015
I have a table #Sample like below:
=================================
#Sample
id int,
SSN varchar(20),
State varchar(2)
Sample Data:
ID SSN STATE
1 999-000-000 AB
2 979-000-000 BC
3 995-000-000 CD
=================================
We used filter logic based on the SSN & State.
We are passing these values through variables like
Declare @State varchar(2)
Declare @SSN varchar(20)
At run time, suppose the values are @SSN = '999-000-000' and @State = 'ABC'.
The result is then displayed with the state truncated to 'AB':
Output: 1 999-000-000 AB
Instead, it should raise a system-generated error.
Here I have 2 questions:
1. Why does it take only the first 2 characters?
2. Why is there no system-generated error for the length?
I can do validation with the LEN function for these 2 variables; however, if I have 100 variables that is not a feasible approach. So what is the reason behind this behaviour?
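For context, assigning a value to a T-SQL variable silently truncates it to the variable's declared length; unlike inserting into a table column, no truncation error is raised. A quick repro:
-- Assignment truncates silently, by design.
DECLARE @State varchar(2);
SET @State = 'ABC';
SELECT @State AS TruncatedValue;   -- returns 'AB'; no error is raised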
Jul 29, 2015
I am trying to import data from an Excel sheet into a SQL database using OPENROWSET. After the import I found that all the cells containing data longer than 2000 characters were truncated to 255 characters. While looking for a solution I found that data longer than 255 characters needs to appear in the first 8 rows of the Excel sheet, and that did work for me. But in the real scenario I can't do that manual work on the Excel file. I also tried a .NET utility and an SSIS package, but the truncation is still an issue.
INSERT INTO tmp_Test
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=D:\Book1.xlsx',
                [Sheet1$])
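The 255-character behaviour comes from the ACE/Jet driver guessing the column type from the first few rows (its TypeGuessRows setting). The commonly cited workaround, hedged here because results vary by environment, is to combine TypeGuessRows = 0 in the driver's registry settings with IMEX=1 in the provider string:
-- Sketch of the usual workaround: IMEX=1 plus a raised/zeroed TypeGuessRows
-- setting so the column is treated as memo text rather than a 255-character string.
INSERT INTO tmp_Test
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;IMEX=1;Database=D:\Book1.xlsx',
                [Sheet1$]);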
Nov 16, 2007
OK everybody, I am new to SQL. I have an MS SQL staging database that pulls data from a MySQL database. Once a day I run an SSIS package that moves the data to a live database, creates a flat file that is posted to an FTP site, and then truncates the table. One problem I am running into: if the MS SQL staging table has no records, the flat file is still created. How do I stop it?
Jan 4, 2007
During development of an SSIS package I've noticed that when I create two control flows, each pulling data from a separate table into its own flat file, the second keeps the column-name attributes of the first. So when I create my second flat file, it has not only the names of its own columns but also the column names of the first flat file.
I hope I've explained that correctly. I'll provide more info, or I can provide the package code, if anyone would like.
Jun 30, 2014
Is there any way, or a tool, to identify whether a procedure parameter's length was declared shorter than the table column it is compared against?
I have a table:
CREATE TABLE TEST001 (KeyName varchar(100))
and a procedure:
CREATE PROCEDURE SpFindNames ( @KeyName VARCHAR(40) )
AS
BEGIN
SELECT KeyName FROM TEST001
WHERE KeyName = @KeyName
END
Here the table column KeyName, with a length of 100 characters, is compared with the SP parameter @KeyName, which has a length of only 40 characters.
Is there any way to find all such usages across all the procedures in the database?
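There is no built-in check that ties a parameter to the column it is compared against, but a catalog query can flag likely candidates by matching parameter names to column names. A sketch only; the name match is a heuristic, not a real usage analysis, and max_length is in bytes (and -1 for varchar(max)):
SELECT  p.name        AS ProcedureName,
        pr.name       AS ParameterName,
        pr.max_length AS ParameterLength,
        t.name        AS TableName,
        c.name        AS ColumnName,
        c.max_length  AS ColumnLength
FROM    sys.procedures p
JOIN    sys.parameters pr ON pr.object_id = p.object_id
JOIN    sys.columns c     ON '@' + c.name = pr.name
JOIN    sys.tables t      ON t.object_id = c.object_id
WHERE   pr.max_length < c.max_length
  AND   TYPE_NAME(pr.user_type_id) LIKE '%char%';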
Oct 7, 2007
I have a few flat files that will be retrieved from an SFTP server. One of the flat files acts as a terminal file: it specifies the total number of records expected in each of the other flat files.
Data in the terminal.txt
FileName TotalRecords
File1 1000
File2 1500
File3 2000
So, before transforming the data from the flat file sources into the target destination, I wish to do a row count check for each flat file source to make sure that the number of records in the file tallies with the number specified in the terminal.txt file. I'm able to get the number of records in each flat file by using the Row Count component, but I don't know how to get the data out of the terminal.txt file in order to make the comparison.
Can anyone help me with this? Or is there any other way to make sure that the flat file sources are alright before proceeding with the data transformation task?
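One option (a sketch only; the folder variable, file extensions and terminal.txt layout below are assumptions) is a Script Task that reads terminal.txt itself, counts the lines in each data file, and records the result in a variable that a precedence constraint or failure path can test before the data flows run:
Public Sub Main()
    Dim folder As String = Dts.Variables("User::SourceFolder").Value.ToString()
    Dim countsMatch As Boolean = True

    Dim lines As String() = System.IO.File.ReadAllLines(System.IO.Path.Combine(folder, "terminal.txt"))
    For i As Integer = 1 To lines.Length - 1   ' skip the "FileName TotalRecords" header
        Dim parts As String() = lines(i).Split(New Char() {" "c, ControlChars.Tab}, StringSplitOptions.RemoveEmptyEntries)
        If parts.Length < 2 Then Continue For
        Dim dataFile As String = System.IO.Path.Combine(folder, parts(0) & ".txt")
        Dim expected As Integer = Integer.Parse(parts(1))
        If System.IO.File.ReadAllLines(dataFile).Length <> expected Then countsMatch = False
    Next

    Dts.Variables("User::CountsMatch").Value = countsMatch
    Dts.TaskResult = Dts.Results.Success
End Sub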
Thanks!
Apr 18, 2007
I'm trying to figure out the best method of reading in a number of flat files, all with different numbers of columns and data types, and outputting them to a database.
Here's the problem: they are EBCDIC encoded and some of the columns are packed decimal. I've set up one package that takes the flat file, unpacks the decimals (using an UnpackDecimal component) and then sends the rest through a second component to go from EBCDIC to ASCII.
What I need is a way to do this for every flat file based on the schema for that flat file. One current idea is to write a script/app that creates the .dtsx XML file and then executes it for each flat file. It appears this may be possible, but I haven't gotten far enough to know for sure. So my questions are these:
1) Is there an easier way to do this (i.e. somehow feed the schema to the package and use it to dynamically set up the column markers and determine which columns get fed to the unpack decimal component)?
2) If there isn't a better way, will dynamically creating the .dtsx XML file based on the necessary input/output columns for each flat file work? If so, what is a good source of information on this (how the .dtsx XML file is structured, what needs to be changed and what doesn't, etc.)?
Thanks,
Travis
Jan 22, 2001
Hi,
The SQL introduction book, on page 269, says that the maximum amount of data contained in a single row is 8060 bytes. But I have no difficulty creating a table with two varchar(8000) columns. How can this be? Can somebody explain how to interpret the statement about the 8060-byte maximum, please?
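The limit applies to the data actually stored in a row, not to the sum of the declared column sizes. On SQL Server 2000 (the version in question here) the table is created with only a warning, and a row is rejected only when its combined data would exceed 8060 bytes. A brief illustration of that distinction:
-- The table can be created (SQL Server 2000 raises a warning), but a row whose
-- actual data exceeds 8060 bytes cannot be stored on that version.
CREATE TABLE dbo.WideRow (a varchar(8000), b varchar(8000));

-- Fits: the actual row data is well under 8060 bytes.
INSERT dbo.WideRow (a, b) VALUES (REPLICATE('x', 4000), REPLICATE('y', 2000));

-- Fails on SQL Server 2000: the stored row would exceed the 8060-byte limit.
INSERT dbo.WideRow (a, b) VALUES (REPLICATE('x', 8000), REPLICATE('y', 8000));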
Thanks.
Mar 28, 2007
Hello,
I have a problem with reading from XML when the XML is too large. The program declares 1-n variables to hold the document, but the declarations can be no longer than length 8000 :((
drop table tblBooksEx
CREATE TABLE [tblBooksEx] (
[Row_ID] [int] IDENTITY (1, 1) NOT NULL ,
[BooksData] [text] COLLATE Polish_CI_AS NULL ,
CONSTRAINT [PK__tblBooksEx__17036CC0] PRIMARY KEY CLUSTERED ([Row_ID]) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
insert into tblBooksEx(booksdata) values('')
exec master..xp_cmdshell 'TextCopy.exe /S serv /U usr /P psv /D Northwind /T tblBooksEx /C BooksData /F c:\SCN.xml /W "WHERE Row_ID=1" /I'
/*PART 1*/
DECLARE @id int
DECLARE @idoc int
SET @id = 1 -- or whatever the id
DECLARE @datalen int
DECLARE @sql varchar(8000)
DECLARE @sql1 varchar(8000)
DECLARE @cnt int
-- get the length
SELECT @datalen = DATALENGTH(booksdata) / 4000 + 1 FROM tblBooksEx WHERE row_id = @id
-- phase 1: collect into @sql declarations of @str1, @str2, ... @strn
SET @cnt = 1
SET @sql = 'DECLARE '
SET @sql1 = ''
WHILE (@cnt <= @datalen)
BEGIN
SELECT @sql = @sql + CASE @cnt WHEN 1 THEN '' ELSE ', ' + CHAR(13) END
    + ' @str' + CONVERT(varchar(10), @cnt) + ' VARCHAR(4000)'
SET @cnt = @cnt + 1
END
-- phase 2: collect into @sql the selection of chunks (taking care of length)
SET @cnt = 1
WHILE (@cnt <= @datalen)
BEGIN
IF LEN(@sql) < 7850
SELECT @sql = @sql + CHAR(13) +
    'SELECT @str' + CONVERT(VARCHAR(10), @cnt) + ' = REPLACE(SUBSTRING(booksdata, ' +
    CONVERT(VARCHAR(30), (@cnt - 1) * 4000 + 1) + ', 4000), CHAR(39), CHAR(39) + CHAR(39)) ' +
    'FROM tblBooksEx ' + 'WHERE row_id = ''' + cast(@id as varchar) + ''''
ELSE
SELECT @sql1 = @sql1 + CHAR(13) +
    'SELECT @str' + CONVERT(VARCHAR(10), @cnt) + ' = REPLACE(SUBSTRING(booksdata, ' +
    CONVERT(VARCHAR(30), (@cnt - 1) * 4000 + 1) + ', 4000), CHAR(39), CHAR(39) + CHAR(39)) ' +
    'FROM tblBooksEx ' + 'WHERE row_id = ''' + cast(@id as varchar) + ''''
SET @cnt = @cnt + 1
END
/*PART 2*/
-- phase 3: preparing the 2nd level dynamic sql
SELECT @sql1 = @sql1 + CHAR(13) + 'EXEC (' + CHAR(13) + '''DECLARE @idoc int' + CHAR(13) +
    'EXEC sp_xml_preparedocument @idoc OUT, '''''' + '
SET @cnt = 1
WHILE (@cnt <= @datalen)
BEGIN
SELECT @sql1 = @sql1 + CHAR(13) + '@str' + CONVERT(varchar(10), @cnt) + '+'
SET @cnt = @cnt + 1
END
SET @sql1 = @sql1 + ' '''''' '
SET @sql1 = @sql1 + CHAR(13) + 'DECLARE idoc_cur CURSOR FOR SELECT @idoc''' + CHAR(13) + ')'
-- debug code
/*
PRINT @sql
PRINT '@sql length=' + convert(varchar(5), datalength(@sql))
PRINT '----------'
PRINT @sql1
PRINT '@sql1 length=' + convert(varchar(5), datalength(@sql1))
*/
EXEC (@sql + @sql1)
OPEN idoc_cur
FETCH NEXT FROM idoc_cur into @idoc
DEALLOCATE idoc_cur
select * from OpenXML(@idoc, '//transfer/body', 2) WITH (ng int, nk int, dwn varchar(50))
--When Complete--
/*
exec sp_xml_removedocument @idoc
--*/
How can this problem be solved??
Best Regards
AJA
Jan 2, 2008
I have a peculiar requirement. I need the unmatched data to be passed to a bad-data table as well as tied to a dimension member "Others". Please advise how to satisfy these two conditions simultaneously.
Jul 25, 2007
I have an SSIS package that takes in a flat file and pushes the data into a table. However, once in a while the file will have some bad data in it (for example, this particular time I have too many delimiters on one line). I want the package to redirect the row to an error table and keep going with the processing for the rest of the data. To do this, I have hooked up an error output from the flat file source to an OLE DB destination and set all the rows to Redirect Row in the Error Output section of the flat file source. Unfortunately, it does not work: the flat file source still raises an error and stops. Is this because something different has to happen when the problem is with the whole row? Any help would be appreciated, thanks.
Feb 24, 2008
I am trying to narrow down this problem. Basically, I added 3 columns to my article table, which holds the article id, article text, author and so on. I tested my program before adding the additional fields to the code. The program works fine: I can add an article and edit the same article even though it skips over the 3 new fields in the database; it just puts nulls into those columns. So now I have added one of the new column names to the code. I changed my business-logic article.vb code and addarticle.aspx, as well as the New Article area in addarticle.aspx.vb. The form now has an additional textbox field for ShortDesc, which is a short description of the article. This is the problem now: commandParameters.Length is 9 and there are 10 parameter values. Right in the middle of the 10 values is the #4 value which I inserted into the code. It says Nothing when I hover my mouse over the code after my program throws the exception at line 17 below. Why is commandParameters.Length set to 9 instead of 10? Why isn't it reading the information for value 4 like all the other values, placing its value there and calculating 10 instead of 9? Where are these set in the program? It sounds to me like they are hard-coded somewhere and I need to change them to match everything else.
1 ' This method assigns an array of values to an array of SqlParameters.
2 ' Parameters:
3 ' -commandParameters - array of SqlParameters to be assigned values
4 ' -array of objects holding the values to be assigned
5 Private Overloads Shared Sub AssignParameterValues(ByVal commandParameters() As SqlParameter, ByVal parameterValues() As Object)
6
7     Dim i As Integer
8     Dim j As Integer
9
10     If (commandParameters Is Nothing) AndAlso (parameterValues Is Nothing) Then
11         ' Do nothing if we get no data
12         Return
13     End If
14
15     ' We must have the same number of values as we have parameters to put them in
16     If commandParameters.Length <> parameterValues.Length Then
17         Throw New ArgumentException("Parameter count does not match Parameter Value count.")
18     End If
19
20     ' Value array
21     j = commandParameters.Length - 1
22     For i = 0 To j
23         ' If the current array value derives from IDbDataParameter, then assign its Value property
24         If TypeOf parameterValues(i) Is IDbDataParameter Then
25             Dim paramInstance As IDbDataParameter = CType(parameterValues(i), IDbDataParameter)
26             If (paramInstance.Value Is Nothing) Then
27                 commandParameters(i).Value = DBNull.Value
28             Else
29                 commandParameters(i).Value = paramInstance.Value
30             End If
31         ElseIf (parameterValues(i) Is Nothing) Then
32             commandParameters(i).Value = DBNull.Value
33         Else
34             commandParameters(i).Value = parameterValues(i)
35         End If
36     Next
37 End Sub ' AssignParameterValues
Sep 6, 2007
Hi everyone,
There is a small problem I encountered while creating a package in SQL Server 2005.
I am using a flat file which has 820 rows and 2 columns, separated by a line feed (for rows) and a tab (for columns). After importing, I found that only 800 rows were imported into the table.
After verifying the input file I found out that there are some null values in the second column, so there is no line feed for those values.
Can anyone please help me with how to give multiple delimiters for the same input flat file?
Sep 20, 2007
Hello!
I want to make a very simple package: Export all rows in a table to a flat file.
This package I can create pretty much by only using the wizards.
Now to my problems:
1) I need the output to have this format:
H20070920161522
DS3 Plastpall trippelkrage 40 1
E00000000003
H is a header record, in this case with the date and time following.
D is a detail record, one for each row that was exported.
E is an end record, containing only the number of rows in the file, including the H and E records.
2) I need to set the file name dynamically, preferably using date and time to name the file.
I've done this very same thing in T-SQL, like so:
Code Snippet
USE AVK
GO
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
GO
SELECT *
FROM tempProducts
GO
CREATE VIEW EXPORT_ORDERS
AS
SELECT 1 AS ROW_ORDER, 'H' + REPLACE(CONVERT(char(8), GETDATE(), 112) + CONVERT(char(8), GETDATE(), 108), ':', '') AS Data_Line
UNION ALL
SELECT 2 AS ROW_ORDER, 'D' + COALESCE (CONVERT(char(10), LBTyp), '') + COALESCE (CONVERT(char(50), Description), '') + COALESCE (CONVERT(char(5),
Volume), '') AS Data_Line
FROM dbo.tempProducts
UNION ALL
SELECT 3 AS ROW_ORDER, 'E' + RIGHT('0000000000' + RTRIM(CONVERT(char(13), COUNT(*) + 2)), 11) AS Data_Line
FROM dbo.tempProducts AS tempProducts_1
GO
IF @@ROWCOUNT > 0
BEGIN
BEGIN TRANSACTION
SELECT *
FROM tempProducts
DECLARE @date char(8)
DECLARE @time char(8)
DECLARE @sql VARCHAR(150)
SELECT @date = CONVERT(char(8), getdate(),112)
SELECT @time = CONVERT(char(8), getdate(),108)
SELECT @time = REPLACE(@time,':','')
DECLARE @dt char(14)
SELECT @dt = @date + '_' + @time
SELECT @sql = 'bcp "SELECT Data_Line FROM avk..EXPORT_ORDERS ORDER BY ROW_ORDER" queryout "c:\AVK_' + @dt + '.txt" -c -t -U sa -P dalla'
EXEC master..xp_cmdshell @sql
--WAITFOR DELAY '0:00:10';
DELETE
FROM tempProducts
COMMIT TRANSACTION
END
DROP VIEW EXPORT_ORDERS
GO
But I'm sure it can be done in SSIS as well, giving me some nice options for things like error handling.
Pointers, please!
Apr 19, 2007
Hi all,
I am passing a flat file source as a variable to the dtexec utility (like package.variables[User::varFileName].Value;"D:\sourcedata.txt").
The destination table has one extra column.
I want to add a custom value to that column at run time through a dtexec parameter (User::varDate).
I don't know how to do it; please help me.
Madhukar
Feb 23, 2008
Hello Guys,
I am working at a company and am currently assigned to a new project for data migration from company X to our company Y using SSIS. I am totally new to this, and I have just completed the 5 tutorials given on the MSDN website.
Basically, the client is going to send us a first flat file with 1 million records, containing header, detail and trailer records.
I want to create a package in such a way that it dumps this entire first load into 7 to 8 different tables at a time.
We also have to include functionality for validation and error checking.
On a successful load, the error file should contain only the header and trailer, but no detail records.
If there are any errors, then the error file should contain the header, the detail records which failed to load, plus the trailer, which we have to send back to the client.
When the 2nd file comes, we have to check whether each record is new or a change (update), depending on a flag which tells us.
This is basically the high-level idea of the package I need to create. If you guys have any questions, let me know.
I know you guys are very experienced. Could anyone please give me some detailed ideas on it? I would really appreciate it.
I have a very limited timeline for it.
Thanks
Shah