I could not find exact details on these forums about how to create an SSIS Script Task that FTPs files, so I am adding my code to save time for anyone else who wants to do something similar. Here is the VB code for my Script Task that FTPs files (hope this helps someone):
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
'Set the properties like username & password
cm.Properties("ServerName").SetValue(cm, "Enter your Server Name here")
cm.Properties("ServerUserName").SetValue(cm, "Enter your FTP User Name here")
cm.Properties("ServerPassword").SetValue(cm, "Enter your FTP Password here")
cm.Properties("ServerPort").SetValue(cm, "21")
cm.Properties("Timeout").SetValue(cm, "0") 'The 0 setting will make it not timeout
'create the FTP object that sends the files and pass it the connection created above.
Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
'Connects to the ftp server
ftp.Connect()
'Build an array of all the file names that are going to be FTPed (in this case only one file)
Dim files(0) As String
files(0) = "Drive:FullPathYourFileName"
'ftp the file
'Note: I had a hard time finding the remote path directory. I found it by accident: if you create both the FTP connection and an FTP task in the SSIS package, the FTP task defaults its remote path setting to it.
ftp.SendFiles(files, "/Enter Your Remote Path", True, False) 'True = overwrite existing files; False = binary transfer (not ASCII)
'Close the connection once the transfer is done
ftp.Close()
Dts.TaskResult = Dts.Results.Success
Catch ex As Exception
Dts.TaskResult = Dts.Results.Failure
End Try
End Sub
End Class
Is it possible to ftp files using code in a Script Task? I need to read the contents of an XML file and, if it contains a specific file name, FTP the corresponding PDF file, which is in the same location as the XML file. However, I cannot do this using the provided FTP Task in SSIS; I would need to use code, as there are close to 50 XML files I need to read, uploading the corresponding PDF file if it meets a certain criterion.
I do not see a way of looping through all the files in a folder unless I do this in a Script Task. Any input or alternative suggestions on doing this will be appreciated.
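Not a tested answer, just a rough sketch of how this could be done in a single Script Task. It assumes the XML files and their matching PDFs sit in a local folder held in an assumed User::SourceFolder variable, that an FTP connection manager named "FTP Server" already exists in the package, and the element check and remote path are made up for illustration:
Imports System
Imports System.IO
Imports System.Xml
Imports System.Collections.Generic
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Try
            'Folder that holds the xml files and their matching pdf files (assumed variable)
            Dim sourceFolder As String = Dts.Variables("SourceFolder").Value.ToString()
            Dim pdfsToSend As New List(Of String)
            'Loop through every xml file in the folder
            For Each xmlFile As String In Directory.GetFiles(sourceFolder, "*.xml")
                Dim doc As New XmlDocument()
                doc.Load(xmlFile)
                'Hypothetical check: does the xml mention the file name we care about?
                If doc.OuterXml.Contains("SomeFileName") Then
                    'The pdf has the same base name and sits in the same folder
                    Dim pdfFile As String = Path.ChangeExtension(xmlFile, ".pdf")
                    If File.Exists(pdfFile) Then pdfsToSend.Add(pdfFile)
                End If
            Next
            If pdfsToSend.Count > 0 Then
                'Reuse an existing FTP connection manager instead of creating one in code
                Dim cm As ConnectionManager = Dts.Connections("FTP Server")
                Dim ftp As New FtpClientConnection(cm.AcquireConnection(Nothing))
                ftp.Connect()
                ftp.SendFiles(pdfsToSend.ToArray(), "/remote/path", True, False)
                ftp.Close()
            End If
            Dts.TaskResult = Dts.Results.Success
        Catch ex As Exception
            Dts.TaskResult = Dts.Results.Failure
        End Try
    End Sub
End Class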
When my ForEach Loop runs and a file does not exist on the server, I get a "File does not exist" error. I would prefer to write a message to my log and then move on to the next step successfully. When I go to Event Handlers and select OnTaskFailed, what do I want to do from here?
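One way to get that behaviour (sketch only, names assumed): put a small Script Task before the step that fails, have it test for the file, write an informational message to the log, and set a boolean variable such as User::FileExists (listed in the task's ReadWriteVariables, with User::FilePath in ReadOnlyVariables); an expression on the precedence constraint then decides whether the next step runs instead of letting it fail.
Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Dim filePath As String = Dts.Variables("FilePath").Value.ToString()
        Dim fireAgain As Boolean = True
        If File.Exists(filePath) Then
            Dts.Variables("FileExists").Value = True
        Else
            'Write a message to the log instead of failing the task
            Dts.Events.FireInformation(0, "File check", "File not found, skipping: " & filePath, String.Empty, 0, fireAgain)
            Dts.Variables("FileExists").Value = False
        End If
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class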
I'm using the FTP Task in SSIS to send files. The task succeeds, but the files get uploaded to the wrong folder: instead of being sent to the cg268301 folder they are being sent to cg268300, despite my selecting /cg268301 from the remote path field in the FTP Task editor.
I've tried uploading this file to the /cg268301 folder using an Execute Process task and it works fine. I just don't know why it won't work with the FTP Task.
Hi JayH (or anyone). Another week... a new set of problems. I obviously need to learn .NET syntax, but because of project deadlines in converting from DTS to SSIS it is hard for me to stop and do that. So, if someone could help me with some easy syntax, I would really appreciate it.
In DTS, there was a VBScript that copied a set of flat files from one directory to an archive directory after modifying the file name. In SSIS, the directory and archive directory will be specified in the config file. So, I need a .net script that retrieves a file, renames it and copies it to a different directory.
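Here is a minimal sketch of that kind of Script Task, assuming User::SourceDir and User::ArchiveDir variables are populated from the config file and listed on the task; the date-stamp suffix is just one example of "modifying the file name":
Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Try
            Dim sourceDir As String = Dts.Variables("SourceDir").Value.ToString()
            Dim archiveDir As String = Dts.Variables("ArchiveDir").Value.ToString()
            For Each sourceFile As String In Directory.GetFiles(sourceDir, "*.txt")
                'Build the new name, e.g. MyFile.txt -> MyFile_20240101.txt
                Dim newName As String = Path.GetFileNameWithoutExtension(sourceFile) & _
                    "_" & Date.Today.ToString("yyyyMMdd") & Path.GetExtension(sourceFile)
                'Copy to the archive folder under the modified name
                File.Copy(sourceFile, Path.Combine(archiveDir, newName), True)
            Next
            Dts.TaskResult = Dts.Results.Success
        Catch ex As Exception
            Dts.TaskResult = Dts.Results.Failure
        End Try
    End Sub
End Class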
I have been struggling to find a solution to convert XLSX files with multiple sheets to CSV files.
Requirements: >> Convert an XLSX file with multiple sheets to CSV files >> CSV file names: XLSX filename + '_' + sheet name >> The script has to be in VB as I am using SSIS 2005 >> I started developing the script using Microsoft.Office.Interop.Excel.dll; this DLL is referenced from the Script Task >> I found this web link useful... [URL] ....
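Not a tested solution, just a sketch of the interop approach under the stated requirements. It assumes Excel is installed on the machine that runs the package, Microsoft.Office.Interop.Excel is referenced from the Script Task, and the workbook path comes from an assumed User::XlsxPath variable:
Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime
Imports Excel = Microsoft.Office.Interop.Excel
Public Class ScriptMain
    Public Sub Main()
        Dim app As New Excel.Application()
        app.DisplayAlerts = False 'suppress the overwrite prompt on SaveAs
        Dim xlsxPath As String = Dts.Variables("XlsxPath").Value.ToString()
        Dim book As Excel.Workbook = app.Workbooks.Open(xlsxPath)
        Try
            Dim baseName As String = Path.GetFileNameWithoutExtension(xlsxPath)
            Dim folder As String = Path.GetDirectoryName(xlsxPath)
            'Save every sheet as <xlsx filename>_<sheet name>.csv
            For Each sheet As Excel.Worksheet In book.Worksheets
                Dim csvPath As String = Path.Combine(folder, baseName & "_" & sheet.Name & ".csv")
                sheet.SaveAs(csvPath, Excel.XlFileFormat.xlCSV)
            Next
            Dts.TaskResult = Dts.Results.Success
        Catch ex As Exception
            Dts.TaskResult = Dts.Results.Failure
        Finally
            book.Close(False)
            app.Quit()
        End Try
    End Sub
End Class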
I have developed a SQL Server 2005 Integration Services (SSIS) package that includes a Script Task on a 32-bit machine. The PrecompileScriptIntoBinaryCode property is set to True. After I build the package, the .dtsx file includes a <BinaryItem> element for that Task. Package runs fine on the dev machine, both in BIDS and as SQL Server Agent job. When I deploy the package to a 64-bit server, it runs fine when I execute the package ad hoc from SQL Server Management Studio. However, when I schedule the package for execution as a SQL Server Agent job, the package fails with the message: "the script files failed to load." I have reviewed posts on this error from late 2005, but the solutions don't work in this case. Specifically: 1. The Precompile property is already set to True. 2. I have already verified that the script was compiled. Any further suggestions?
In the first step of my SSIS package I need to get files from FTP and dump it/them in a local directory, but it's more than that, the logic is like this: 1. If no file(s) found, stop executing and send email saying no file(s) found; 2. If file(s) found, then compare it/them with existing files in our archive folder; if file(s) already exist in archive folder, stop executing and send email saying file(s) already existed, if file(s) not in archive folder yet, then transfer it/them to the local directory for processing.
I know I have to use a Script Task to do this, and I did some research and found examples for each of the two steps above but not both combined, so I need some help getting the logic incorporated correctly (a combined sketch follows after the two examples below).
Thanks in advance for the help, and I apologize for the long lines of code!
example for step 1: ----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
    Dim cDataFileName As String
    Dim cFileType As String
    Dim cFileFlgVar As String
    WriteVariable("SCFileFlg", False)
    WriteVariable("OOFileFlg", False)
    WriteVariable("INFileFlg", False)
    WriteVariable("IAFileFlg", False)
    WriteVariable("RCFileFlg", False)
    cDataFileName = ReadVariable("DataFileName").ToString
    cFileType = Left(Right(cDataFileName, 4), 2)
    cFileFlgVar = cFileType.ToUpper + "FileFlg"
    WriteVariable(cFileFlgVar, True)
    Dts.TaskResult = Dts.Results.Success
End Sub
Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
    Try
        Dim vars As Variables
        Dts.VariableDispenser.LockForWrite(varName)
        Dts.VariableDispenser.GetVariables(vars)
        Try
            vars(varName).Value = varValue
        Catch ex As Exception
            Throw ex
        Finally
            vars.Unlock()
        End Try
    Catch ex As Exception
        Throw ex
    End Try
End Sub
Private Function ReadVariable(ByVal varName As String) As Object
    Dim result As Object
    Try
        Dim vars As Variables
        Dts.VariableDispenser.LockForRead(varName)
        Dts.VariableDispenser.GetVariables(vars)
        Try
            result = vars(varName).Value
        Catch ex As Exception
            Throw ex
        Finally
            vars.Unlock()
        End Try
    Catch ex As Exception
        Throw ex
    End Try
    Return result
End Function
End Class
example for step 2: -------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
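Putting the two steps together, here is a rough, untested sketch of the combined logic only. It assumes an FTP connection manager named "FTP Server" already exists in the package, assumed local working and archive folders, and two boolean variables (User::NoFilesFound and User::FileAlreadyArchived) that downstream Send Mail tasks can test with precedence-constraint expressions:
Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Try
            Dim cm As ConnectionManager = Dts.Connections("FTP Server")
            Dim ftp As New FtpClientConnection(cm.AcquireConnection(Nothing))
            ftp.Connect()
            ftp.SetWorkingDirectory("/remote/path")
            'Step 1: see what is out there
            Dim folderNames() As String = Nothing
            Dim fileNames() As String = Nothing
            ftp.GetListing(folderNames, fileNames)
            If fileNames Is Nothing OrElse fileNames.Length = 0 Then
                Dts.Variables("NoFilesFound").Value = True   'email branch: nothing found
            Else
                'Step 2: compare against the archive folder before downloading
                Dim archiveDir As String = "C:\Archive"      'assumed path
                Dim localDir As String = "C:\Working"        'assumed path
                Dim alreadyArchived As Boolean = False
                For Each remoteFile As String In fileNames
                    If File.Exists(Path.Combine(archiveDir, Path.GetFileName(remoteFile))) Then
                        alreadyArchived = True
                    End If
                Next
                If alreadyArchived Then
                    Dts.Variables("FileAlreadyArchived").Value = True  'email branch: already processed
                Else
                    ftp.ReceiveFiles(fileNames, localDir, True, False) 'transfer for processing
                End If
            End If
            ftp.Close()
            Dts.TaskResult = Dts.Results.Success
        Catch ex As Exception
            Dts.TaskResult = Dts.Results.Failure
        End Try
    End Sub
End Class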
I cannot set a breakpoint in a Script task and have it complete successfully. I am running Vista 32-bit and only the workstation tools for SQL Server 2005 SP2. The steps to recreate are:
1) Create a new SSIS project/package. 2) Add a Script Task. 3) Set a breakpoint in the Script Task. 4) Run the package.
When I remove the breakpoint, the script task succeeds. When I put it back the task fails. The execution results say "Error: The script files failed to load." I completely uninstalled all SQL Server 2005-related entries using the Programs and Features control panel, rebooted, reinstalled the Workstation Tools, then applied a freshly-downloaded SQL Server 2005 SP2 and rebooted. If I change "PrecompileScriptIntoBinaryCode" to false the script fails whether there's a breakpoint or not. If it's true (the default), the script only fails when a breakpoint is set. I would like to be able to set a breakpoint to assist in debugging the package. For the time being, I can move the package to a different (non-Vista) OS, where it works fine, but I would like to be able to develop and debug on Vista.
Brief overview... Running Windows Server 2003 Enterprise 64-bit (all service packs and patches current) and SQL Server 2005 Enterprise Edition 64-bit, build: Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2).
I cannot import any SSIS packages nor create any new folders under Stored Packages. I have googled the newsgroups and looked at BOL to no avail. HELP!!!!
I am using the "Transfer SQL Server Objects Task" to copy some tables from database A to database B including data.
The tables, primary key constraints, foreign keys, and data all transfer nicely, except for the DEFAULT constraints on the tables.
I have failed to find any option in the "Transfer SQL Server Objects Task" to explicitly say "copy default constraints". So I guess logically it should happen automatically, but it doesn't. I hope it is not a bug :-)
In short, does the "Transfer SQL Server Objects Task" support distributed transactions?
I am trying to use a "Transfer SQL Server Objects Task" in a container using a transaction on the container. The task is set to support the transaction. It is set up to copy table data from several tables from a non-domain server (SQL Server 2000) to a domain-based server (SQL Server 2005). I get an error stating, "This task can not participate in a transaction".
I am wondering if it means exactly what it says - this task in SSIS can't participate at all - or does it mean that it won't in this scenario for some reason? I attempted a simple copy of data from MSSQL 2005 to MSSQL 2005 (same server) and the task still failed. MSDTC appears to be running properly on my machine (I can do a simple distributed transaction across a linked server to the 2000 server in Query Analyzer (QA)). Also, MSDTC appears to be working on both servers with distributed transaction query tests in QA.
Here's the error info...
SSIS package "Development BusinessContacts and Products Migration.dtsx" starting. Information: 0x4001100A at Copy BusinessContacts Data: Starting distributed transaction for this container. Error: 0xC002F319 at Copy BusinessContacts database table data 1, Transfer SQL Server Objects Task: This task can not participate in a transaction. Task failed: Copy BusinessContacts database table data 1 Information: 0x4001100C at Copy BusinessContacts database table data 1: Aborting the current distributed transaction. Information: 0x4001100C at Copy BusinessContacts Data: Aborting the current distributed transaction. SSIS package "Development BusinessContacts and Products Migration.dtsx" finished: Failure. The program '[4700] Development BusinessContacts and Products Migration.dtsx: DTS' has exited with code 0 (0x0).
A common issue that I run across with clients is that they only want to process a file once it has finished transmitting to the server. This SQL Server 2005 task reads the properties of a file and writes the values to a series of variables. For example, you can use this task to determine whether the file is in use (still being uploaded or written to) and then conditionally run the Data Flow task to load the file if it's not being used. You can also use it to determine when the file was created in order to decide whether it must be archived.
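For reference, a plain Script Task can do the same "is the file still being written" check without a custom task. This is just a sketch, with an assumed User::FilePath string and User::FileIsReady boolean listed on the task; the idea is to try to open the file exclusively and treat a failure as "still in use", then gate the Data Flow task with a precedence-constraint expression on User::FileIsReady.
Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Dim filePath As String = Dts.Variables("FilePath").Value.ToString()
        Dim isReady As Boolean = False
        Dim fs As FileStream = Nothing
        Try
            'Opening with FileShare.None fails while another process is still writing the file
            fs = New FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.None)
            isReady = True
        Catch ex As IOException
            isReady = False
        Finally
            If Not fs Is Nothing Then fs.Close()
        End Try
        Dts.Variables("FileIsReady").Value = isReady
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class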
I've created my own posting for this. The original post was here, I apologize: http://forums.microsoft.com/forums/ShowPost.aspx?PostID=2906512&SiteID=1
According to the poster it's not possible. But there has to be some way to do it? Reflection (don't know how)?
I need to get a reference to the task host in an SSIS Task component.
Basically the scenario is this:
I have a custom task I have created. However, I would like to validate that the ExecValueVariable is in fact a string variable during the Validate event of the task. I know how to verify it's a string variable, but I can't figure out how to read what the user selected (such as User::Myvariable). I have only been able to figure out one way to do it, and it only works if you open my custom task UI.
What I did is this:
I've implemented IDtsTaskUI and during the initialize section I wrote:
Sub Initialize(ByVal taskHost As TaskHost, ByVal serviceProvider As IServiceProvider) Implements IDtsTaskUI.Initialize
    ' Store the TaskHost of the task.
    Me.taskHostValue = taskHost
    Dim myTask As CustomTask = CType(taskHost.InnerObject, CustomTask)
    myTask.myTaskHost = taskHost
End Sub
My Task is named: CustomTask. I have a public variable in my task as follows:
Public NotInheritable Class CustomTask
    Inherits Task
    Implements IDTSComponentPersist
    Public myTaskHost As TaskHost = Nothing
Therefore I pass back the taskhost value to the CustomTask class, and voila I have it.
Problem is, this only works if the custom task calls the initialize method, and this only happens when you open the custom editor.
I then do the validation in my CustomTask class and it works fine, but myTaskHost is Nothing until you actually open the custom task UI.
How do you specify a log file for scheduled tasks? I have consistency checks done at night, and when one fails I cannot find out why because no log file is created. I checked for a log file in /MSSQL/LOGS/*.log. The maintenance wizard creates a log file, but how does a regular T-SQL scheduled task create a log file?
Hi, I have an FTP Task in my SSIS package where I want to pull 3 files from a directory. This directory already contains other files, but I only want 3 of them. How do I pull down only those 3 files from the FTP site? I can't seem to find an option on the FTP Task to select which files I want from the directory.
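The FTP Task itself has no file picker, but a Script Task can pull an explicit list. A sketch, assuming an FTP connection manager named "FTP Server" and made-up remote file names and local folder:
Imports System
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Dim cm As ConnectionManager = Dts.Connections("FTP Server")
        Dim ftp As New FtpClientConnection(cm.AcquireConnection(Nothing))
        ftp.Connect()
        'Name exactly the three files you want; nothing else in the directory is touched
        Dim remoteFiles() As String = New String() {"/remote/path/file1.txt", "/remote/path/file2.txt", "/remote/path/file3.txt"}
        ftp.ReceiveFiles(remoteFiles, "C:\LocalFolder", True, False)
        ftp.Close()
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class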
I'm trying to copy files from an FTP address; the problem is that sometimes the FTP folder is empty, and then the FTP Task fails. Why does it fail if there are no files? Any suggestion on how to avoid the error?
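One way to avoid the failure (sketch only, names assumed): check the remote folder from a Script Task first, set an assumed User::FilesFound boolean, and only let the FTP Task run when a precedence-constraint expression on that variable is true.
Imports System
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Dim cm As ConnectionManager = Dts.Connections("FTP Server")
        Dim ftp As New FtpClientConnection(cm.AcquireConnection(Nothing))
        ftp.Connect()
        ftp.SetWorkingDirectory("/remote/path")
        Dim folderNames() As String = Nothing
        Dim fileNames() As String = Nothing
        ftp.GetListing(folderNames, fileNames)
        ftp.Close()
        'Only let the FTP Task run when there is something to copy
        Dts.Variables("FilesFound").Value = (Not fileNames Is Nothing AndAlso fileNames.Length > 0)
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class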
I'm using package configurations to store my server/database name in an XML file. I need to be able to dynamically change the database at runtime when I execute the package. Can I use the XML Task in another package to make a change to that XML file? I'm not familiar at all with XML, so I don't really know the syntax for XPath or XSLT or anything. Basically I'm just looking for an example of how to do this with XML, but I haven't found anything on the web to explain this to me. BOL isn't very helpful with the XML Task.
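If the goal is just to rewrite the value stored in the .dtsConfig before the other package runs, a Script Task with System.Xml may be simpler than the XML Task. This is only a sketch that assumes the standard dtsConfig layout; the file location, Path attribute, and new value are made up for illustration:
Imports System
Imports System.Xml
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Dim configFile As String = "C:\Configs\MyPackage.dtsConfig"  'assumed location
        Dim doc As New XmlDocument()
        doc.Load(configFile)
        'A dtsConfig stores each setting as <Configuration Path="..."><ConfiguredValue>...</ConfiguredValue></Configuration>
        Dim node As XmlNode = doc.SelectSingleNode( _
            "//Configuration[@Path='\Package.Connections[MyDatabase].Properties[InitialCatalog]']/ConfiguredValue")
        If Not node Is Nothing Then
            node.InnerText = "NewDatabaseName"
            doc.Save(configFile)
        End If
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class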
I have one Data Flow task where I'm getting data from an OLE DB source, doing some scripting using a Script Component, and then generating a file. Now I need to get the same data, apply some different logic, and generate another file. Can I use this same task to do the secondary work? If so, how would I put it in place? I would need to get the same data but use a separate script and generate a separate file.
I have created a package within SQL Server SSIS which includes an FTP Task, deployed it to our SQL Server (2005 SP1) msdb database and am running this job under SQL Agent on Windows Server 2003. Due to company security requirements this job has to be run under a service account within SQL Agent. The problem with this is that even though a directory is specified within the FTP Task to place any downloaded files into, the files are first written to the TIF (Temporary Internet Files) directory of "Default User" which is on the system drive. Based on corporate standards the system drive (C:) on our servers are only configured with enough space for the OS and other system files. All of the files being transferred are compressed, but some are still well over 1GB in size. The result is that many of our downloads are failing due to the system drive running out of space.
I have attempted to run IE by using "Run As" with the service account credentials, and have changed the location of the TIFs to a different drive, rebooted and verified the settings. When the SQL Agent job was run again, the files were still being written to the "Default User" directory on the system drive. I also created a new template account with the TIFs pointing to a non-system drive and used the User Profiles functionality of System Properties to copy the new template account to "Default User", but still the files are being written to the system drive.
My questions are:
1. Is there a way to stop the FTP Task from using TIF (i.e. just write the file directly to the location specified)?
2. Is there a best practice for setting up a service account so it creates a proper user profile that can be managed separately from "Default User"?
3. Short of specifying it during the OS install, is there a way to move the "Default User" profile directory to a different drive?
I have created an FTP Task that logs into an FTP server and receives files, and scheduled it to run every 15 minutes. However, it fails when there are no files on the FTP server. How would I check if the files exist? How can I catch the FTP task error and compare it to Hresults.NoFilesFound in a Script Task?
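I can't speak to comparing against Hresults.NoFilesFound, but one option is to do the whole download from a Script Task and treat an empty listing as a non-event, so there is no error to catch. A sketch, with the connection manager name, remote path, and local folder assumed:
Imports System
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Dim ftp As FtpClientConnection = Nothing
        Try
            Dim cm As ConnectionManager = Dts.Connections("FTP Server")
            ftp = New FtpClientConnection(cm.AcquireConnection(Nothing))
            ftp.Connect()
            ftp.SetWorkingDirectory("/remote/path")
            'List the folder first; an empty listing means there is nothing to receive
            Dim folderNames() As String = Nothing
            Dim fileNames() As String = Nothing
            ftp.GetListing(folderNames, fileNames)
            If Not fileNames Is Nothing AndAlso fileNames.Length > 0 Then
                ftp.ReceiveFiles(fileNames, "C:\Downloads", True, False)
            End If
            Dts.TaskResult = Dts.Results.Success
        Catch ex As Exception
            'Any real FTP error still fails the task and surfaces the message
            Dts.Events.FireError(0, "FTP download", ex.Message, String.Empty, 0)
            Dts.TaskResult = Dts.Results.Failure
        Finally
            If Not ftp Is Nothing Then ftp.Close()
        End Try
    End Sub
End Class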
I have two FTP Tasks configured in my SSIS package. One is for "Receive files" and the other is set for "Delete remote files." Both use variables for the source/destination paths. My remote path variable contains a wild card in the name field such as /usr/this/is/my/path/*.ext and it is working to FTP all the .ext files to my working directory. I then rename the files and want to remove the original files from the FTP server. I use the same variable as the remote path variable in the delete as I do in the receive.
Using the same FTP connection manager for both tasks, I am always getting a failure on the delete. The FTP connection manager is set up to use the root user. Using a terminal I am able to open an FTP connection to the server and remove the files manually. There doesn't seem to be any detailed documentation on the FTP Task configured for Delete remote files, so I'm hoping someone might have some insight into the problem.
I receive the same message for each of the files that were downloaded: Error: 0xC001602A at MyPackage, Connection manager "FTP Connection Manager": An error occurred in the requested FTP operation. Detailed error description: 550 /usr/this/is/my/path/datafile1.ext: No such file or directory. The attempt to delete file "/usr/this/is/my/path/datafile1.ext" failed. This may occur when the file does not exist, the file name was spelled incorrectly, or you do not have permissions to delete the file.
Since I am using the root user and can remove the files manually, I don't understand the permissions explanation; the file does exist and is spelled correctly.
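One workaround (sketch only, run before renaming the local copies, with the working directory and remote path assumed to match the post): skip the wildcard on the delete and pass DeleteFiles the explicit remote names built from what was just downloaded.
Imports System
Imports System.IO
Imports System.Collections.Generic
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Dim cm As ConnectionManager = Dts.Connections("FTP Connection Manager")
        Dim ftp As New FtpClientConnection(cm.AcquireConnection(Nothing))
        ftp.Connect()
        'Build fully qualified remote names from the files that landed in the working directory
        Dim remoteNames As New List(Of String)
        For Each localFile As String In Directory.GetFiles("C:\Working", "*.ext")
            remoteNames.Add("/usr/this/is/my/path/" & Path.GetFileName(localFile))
        Next
        If remoteNames.Count > 0 Then ftp.DeleteFiles(remoteNames.ToArray())
        ftp.Close()
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class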
For selecting different load targets we use package configurations to override the connection parameters in the connection manager. To organize the different tasks we split our load processes into multiple package files (dtsx), which are then put together into a workflow using the "Execute Package Task" component. With the /CONFIGFILE option it's possible to override the configuration file for the package that is called with DTExec. Unfortunately it does not override the configuration of subpackages called by the "Execute Package Task". Are there some ways to configure subpackages on the command line?
We are downloading files from an FTP site using the FTP Task and we need to remove those files after downloading.
We need to delete files with specific names, like "Names_*.TXT" (all the TXT files whose names start with "Names_"), so we are using Delete remote files and specifying the remote path as "/Names_*.TXT".
This doesn't work; it's not able to delete the files. So we tried using "*.*" and also gave a specific filename like "Names_A1.TXT", but it still doesn't delete the remote files.
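If the wildcard isn't being honoured by the Delete remote files operation in this setup, a Script Task alternative (sketch only, connection manager name and working directory assumed) is to list the folder and filter the names yourself before calling DeleteFiles:
Imports System
Imports System.Collections.Generic
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Dim cm As ConnectionManager = Dts.Connections("FTP Server")
        Dim ftp As New FtpClientConnection(cm.AcquireConnection(Nothing))
        ftp.Connect()
        ftp.SetWorkingDirectory("/")
        Dim folderNames() As String = Nothing
        Dim fileNames() As String = Nothing
        ftp.GetListing(folderNames, fileNames)
        'Keep only the names that match the Names_*.TXT pattern
        Dim toDelete As New List(Of String)
        If Not fileNames Is Nothing Then
            For Each name As String In fileNames
                If name.ToUpper() Like "*NAMES_*.TXT" Then toDelete.Add(name)
            Next
        End If
        If toDelete.Count > 0 Then ftp.DeleteFiles(toDelete.ToArray())
        ftp.Close()
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class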
I don't know if anyone has faced this issue. We are having a strange problem. Our process was working well when it was implemented on a 32-bit processor. It ran perfectly for 6 months without a problem. But when we moved the packages to a 64-bit machine, this issue, along with some other issues, started to show up.
The issue is we are missing files in the source folder.
Our process is designed so that a source process brings in a file and updates a status for the file in an audit table. The ETL process picks up the file, assigns the status 'running' when the SRC process is complete, loads into the target DB, and updates the ETL status to complete. But the current problem is that the ETL is losing files after it assigns the status 'running'. When we looked into the DB to see whether the data was loaded, we could not find any data related to these files.
We have mapping-level parameters for the source path and target path.
We are using a For Each Loop task and processing files (which are simple flat files) in the source path. The file name is stored in the mapping-level parameter. Once a file is processed we move it into the target path.
Our source and target file paths are on the same drive; we just have a src folder, and inside the src folder we have a processed folder and a failed folder. So files are picked from the source folder and moved into the processed folder after processing. The files are not even moved to the failed folder.
There is a lot of other processing going on this box, and the trend observed is that when more processes are running at peak hour, the missing files' count is higher.
Right now we are refetching those files as a workaround, but does anyone have any suggestion as to why this is happening, or any better implementation suggestions?
I am new to SSIS. I need to read the files in a remote server folder and FTP them to another remote server folder.
I know I have to set up an FTP Task, and the FTP Task must use a variable to provide the path information, but I am not sure how to do this. Can anyone suggest how to populate the variable with a Script Task and the steps involved?
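A sketch of just the variable-population part, with assumed names (User::RemotePath listed in the Script Task's ReadWriteVariables, and a date-based path purely as an example); the FTP Task then points its remote path at that variable:
Imports System
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        'Build whatever path logic you need and hand it to the FTP Task via the variable
        Dts.Variables("RemotePath").Value = "/source/folder/" & Date.Today.ToString("yyyyMMdd") & "/"
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class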
I have a FTP task in my control flow that download files from a FTP server. This ftp task is inside a foreach container that loops over a ADO recordset for the file name. The files that the ftp task pulls are huge. If the FTP task fails then I want the FTP task to restart and only download those files that have not been downloaded. Is this possible?
What configurations do I have to make to the ForEach container and the FTP task to do this?
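One possible approach (sketch only, variable names assumed): keep the ForEach Loop, but put a small Script Task before the FTP task that checks whether the file from the current recordset row is already on disk and sets a User::AlreadyDownloaded boolean; an expression on the precedence constraint then skips the FTP task for that file, so a restart only downloads what is still missing.
Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        'User::FileName comes from the ForEach ADO enumerator; User::LocalDir is the download folder
        Dim localPath As String = Path.Combine(Dts.Variables("LocalDir").Value.ToString(), _
                                               Dts.Variables("FileName").Value.ToString())
        Dts.Variables("AlreadyDownloaded").Value = File.Exists(localPath)
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class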