Developing A Custom Task Using OLE DB Destination Adapter
Jul 25, 2007
I am writing a custom task to import data from delimited files into SQL tables. I use the standard Flat File Source adapter, a custom transformation that adds a URN column and a filename column to the data, and the standard OLE DB Destination adapter.
Most of my test data files work fine except for ones with a lot of columns (around 350 columns). I get an error when I call the ReinitializeMetaData() method for the destination adapter.
Q1) Is there a restriction on the number of columns (or data row size) that can be imported into an OLEDB Destination Adapter?
Q2) The reason I use this adapter rather than the SQL Server Destination Adapter is that I need to set the destination table name using a variable. I don't believe I can do this with the SQL Server Destination Adapter. Is this the case?
Q3) Anyone know of a better/alternative way of achieving the above? One way I have thought of is to create a custom destination adapter using the SQL Server Destination adapter as the base, but I'm not sure whether this is a) possible and b) worth the hassle.
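For context on Q2, here is a minimal sketch of how I understand the variable-based table name maps onto the OLE DB destination's properties when a pipeline is built in code. The AccessMode values, the class ID, and the dataFlow (MainPipe) variable are my assumptions, so treat it as illustrative rather than definitive:

    ' dataFlow is assumed to be the MainPipe of a Data Flow task.
    Dim dest As IDTSComponentMetaData90 = dataFlow.ComponentMetaDataCollection.New()
    dest.ComponentClassID = "DTSAdapter.OLEDBDestination.1"

    Dim design As CManagedComponentWrapper = dest.Instantiate()
    design.ProvideComponentProperties()

    ' AccessMode 4 is assumed to mean "OpenRowset from variable using fast load"
    ' (1 = from variable without fast load); OpenRowsetVariable names the
    ' package variable that holds the destination table name.
    design.SetComponentProperty("AccessMode", 4)
    design.SetComponentProperty("OpenRowsetVariable", "User::DestinationTable")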
I have built a custom flat file destination adapter but it appears that the code is not working. When I debug the process I notice that the ProcessInput section is called multiple times. The first time it looks like everything is working; then it is called again and there is no input from the DTSInput90.
Will someone please tell me how to use a destination adapter in a stored procedure?
I'm using SQL Server 2005. Instead of using BCP I have to use the SQL Server Destination adapter in my stored procedure. Is it possible to implement a destination adapter in stored procedures? If it is, please let me know how.
I have a conditional split on a Boolean variable: when the value is true, the Case 1 output goes to an OLE DB Destination, and the default output goes to a SQL Server Destination.
When I run the package with the value "true" the package runs, but when I run the package with the value "false", the data is inserted through the OLE DB Destination but the SQL Server Destination fails.
Can anyone tell me why?
[SQL Server Destination [292]] Error: An OLE DB error has occurred. Error code: 0x80040E14.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Reading from DTS buffer timed out.".
I am currently writing a custom source adapter that extracts data from a JD Edwards OneWorld system. In the custom user interface of the source component I need to allow the user to set a query (a custom property), and then refresh a list of output columns that will be extracted based upon that query (similar to the list shown in the advanced editor).
My question is, can I apply the custom property change to the component and build my output column list without closing and restarting my custom UI form? I understand that the IDTSComponentUI interface being implemented allows for transactional editing of the component, in that the changes to the component are not applied until I have returned a result in the implemented Edit() method. However is there a way to apply changes without returning this result (and closing my UI)?
Essentially I am looking to have similar behaviour to that of the 'Refresh' button in the advanced component editor form.
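For what it's worth, here is a minimal sketch of the Refresh handler I have in mind. It assumes the form was handed the component's IDTSComponentMetaData90 in Initialize(), and that m_componentMetaData, txtQuery, the "Query" property, and PopulateColumnList are illustrative names, not real API. The idea is that edits made through the design-time interface hit the metadata immediately, and the designer's transaction around Edit() still rolls everything back if Edit() later returns False:

    Private Sub btnRefresh_Click(ByVal sender As Object, ByVal e As EventArgs) _
            Handles btnRefresh.Click
        ' Apply the pending query to the component through its design-time
        ' interface; this changes the component metadata immediately.
        Dim design As CManagedComponentWrapper = m_componentMetaData.Instantiate()
        design.SetComponentProperty("Query", txtQuery.Text)

        ' Rebuild the output columns from the new query, as the advanced
        ' editor's Refresh button does.
        design.ReinitializeMetaData()
        PopulateColumnList(m_componentMetaData.OutputCollection(0).OutputColumnCollection)
    End Sub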
Hi, I'm getting some unexpected behaviour from my SSIS packages when targeting tables that are referenced by indexed views. There are two separate issues:
1. When writing into a pair of tables with a SuperType / SubType relationship concurrently with a pair of SS destinations I'm getting deadlocks between the two. Removing the index on the view that references both of these fixes the problem.
2. Much odder, I'm getting some extremely long waits (10 times longer than the whole package should take to run!) from an SS Destination adapter even when there's no data in the flow for it to bulk insert. Again, removing the indexed views that reference the destination table fixes the problem.
The views aren't mine, and (apparently) are required by the reporting app (BO), so removing them isn't really an option. I realise that there's quite a lot of overhead to maintaining indexed views, but unfortunately, the project is on a very tight timeline, so I can't look into it in as much detail as I'd like. I was wondering if anyone's experienced any similar issues, or would have any ideas as to where to start investigating?
I didn't want to deal with any truncation issues so I edited the SQL created by the Import Wizard. I made the fields VARCHAR instead of NVARCHAR and changed the size from the default 50 to 250.
Now I have a triangle warning on the OLE DB Destination control. So I am wondering, is it limited to only handling 50 characters?
It's always said that using the SQL Server Destination adapter is much faster when loading data to a SQL Server database. My question is, if I have a SQL 2000 database, can I still use the SQL Server Destination adapter, or is it only for SQL 2005 db? Is OLE DB Destination adapter my best bet? My data files (flat files), SSIS package and destination SQL 2000 db are all in the same physical server.
For making a custom task in SSIS, is it possible to reuse the existing code base? For example, if I need to append some functionality to the Lookup transform, can I inherit the Lookup transform class?
OK. I give up and need help. Hopefully it's something minor ...
I have a dataflow which returns email addresses to a recordset.
I pass this recordset into a ForEachLoop configuring the enumerator as (Foreach ADO Enumerator). I also map the email address as a variable with index 0.
I then have a Execute SQL task which receives this email address as a varchar variable (parameter 0) which I then use in my SQL command to limit the rows returned. I have commented out the where clause and returned all rows regardless of email address to try to troubleshoot this problem. In either event, I then use a resultset to store the query result of type object and result name 0.
I then pass this resultset into a script variable to start parsing the sql rows returned as type object. ( I assume this is the correct way to do this from other prior posts ...).
The script appears to throw an exception at the following line. I assume it's because I'm either not passing in the values properly or the query doesn't return anything. However, I am certain the query works as it executes just fine at the command prompt.
My intent is to email the query results to each email address, with the following type of data, by passing the parsed data from the script to a Send Mail task. Email works fine and sends out messages, but the content is empty. I pass the parsed data as string values to the MessageSource and define the MessageSourceType as a variable in the mail task.
part number    leadtime
x              5
y              9
....
Does anyone have any idea what I might be doing wrong?
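For reference, this is roughly how the script shreds the Object-typed result set into rows before building the message body. This is a minimal sketch; the variable and column names (User::QueryResult, User::MessageSource, partnumber, leadtime) are illustrative:

    Imports System.Data
    Imports System.Data.OleDb

    ' The Execute SQL task's full result set arrives as an ADODB recordset
    ' inside the Object variable; OleDbDataAdapter.Fill can load it straight
    ' into a DataTable.
    Dim dt As New DataTable()
    Dim da As New OleDbDataAdapter()
    da.Fill(dt, Dts.Variables("User::QueryResult").Value)

    ' Build the message body from the rows.
    Dim body As New System.Text.StringBuilder()
    For Each row As DataRow In dt.Rows
        body.AppendLine(row("partnumber").ToString() & "  " & row("leadtime").ToString())
    Next
    Dts.Variables("User::MessageSource").Value = body.ToString()

If dt.Rows.Count comes back as 0 here, the problem is upstream in the Execute SQL task's parameter mapping rather than in the script.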
Everything I've read says that custom data flow components are built by inheriting from the Microsoft.SqlServer.Dts.Pipeline.PipelineComponent class.
But the stock components such as the Derived Column data flow transformation must each be implemented by their own class. So how do I base my custom components on those classes? The documentation for the PipelineComponent class doesn't list any such subclasses.
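For reference, a minimal sketch of the standard inheritance route (the attribute values are illustrative). As far as I know, the stock components such as Derived Column ship as native code, so they expose no managed classes to derive from; wrapping or re-implementing their behaviour is the usual fallback:

    Imports Microsoft.SqlServer.Dts.Pipeline

    <DtsPipelineComponent(DisplayName:="MyTransform", ComponentType:=ComponentType.Transform)> _
    Public Class MyTransform
        Inherits PipelineComponent

        Public Overrides Sub ProvideComponentProperties()
            ' Set up inputs, outputs and custom properties here; there is no
            ' Derived Column base class to call into.
            MyBase.ProvideComponentProperties()
        End Sub
    End Class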
I am writing a custom task that has some custom properties. I would like to parameterize these properties, i.e. read them from variables, so I can change those variables from a config file at runtime.
I read the documentation and it says if we set the ExpressionType to CPET_NOTIFY, it should work, but it does not seem to work. Not sure if I am missing anything. Can someone please help me?
In the editor of my custom task, under the custom properties section, I expected a button with three dots to click, and a pop-up where we can specify the expression, or at least have it evaluate the variable if we enter @[User::VariableName].
I am writing a custom dataflow destination that writes data to a named pipe server running in an external process. At runtime, in PreExecute, I check to see if the pipe exists. If it does not exist I want to be able to fail the component and gracefully complete execution of my package.
Question is, how do I do this? If I make a call to ComponentMetaData.FireError(), that will only create an OnError event and not actually stop the execution. So how do I halt the execution of my component and return the error message?
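A minimal sketch of the pattern I'm considering, assuming a managed component (the PipeExists helper and m_pipeName member are hypothetical). FireError only raises the OnError event, so an exception is thrown as well so the runtime actually fails the component:

    Public Overrides Sub PreExecute()
        If Not PipeExists(m_pipeName) Then
            Dim cancel As Boolean
            ' Report a descriptive error to the package first...
            ComponentMetaData.FireError(0, ComponentMetaData.Name, _
                "Named pipe '" & m_pipeName & "' was not found.", String.Empty, 0, cancel)
            ' ...then throw so the runtime fails the component and stops the data flow.
            Throw New InvalidOperationException("Named pipe not available.")
        End If
    End Sub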
Here is the situation. I have created a package that takes 50 columns from a comma-delimited flat file. I then validate and clean the data. Next I add two columns that were not in the original source file. These two columns need to be in the 5th and 9th column positions when the file is re-written to a text file. How do I get those two columns to write out in the desired order? Any ideas?
I am writing a Custom Destination component with a custom UI. The UI contains a combo box which contains the connection names of type "FLATFILE". I have also provided a button which creates a new connection of type "FLATFILE" by making a call to the CreateConnection method of IDtsConnectionService. The combo box gets properly updated, showing the connections of type "FLATFILE", but on clicking the New Connection button the application hangs. Am I missing something, or is there some other way to do it?
The functions below are the event handlers called by the UI.
if (connectionService != null)
{
    ArrayList temp_Connections = connectionService.GetConnectionsOfType("FLATFILE");

    args.AvailableColumns = new AvailableColumnElement[temp_Connections.Count];
    for (int i = 0; i < temp_Connections.Count; i++)
    {
        ConnectionManager runtimeConnection = (ConnectionManager)temp_Connections[i];
        args.AvailableColumns[i].AvailableColumn =
            new DataFlowElement(runtimeConnection.Name, runtimeConnection);
    }
}
I am running SQL 2005 9.0.1399 and VS 2005 8.0.50727.42 (RTM.50727.4200) on Windows Server 2003 Enterprise Edition SP1. Any suggestions would be welcome.
The code sample is an extension to the RemoveDuplicates sample (Dec 2005) that ships with SQL Server.
I have created a custom script destination using an article found on TechNet. The script executes fine and completes, but with one small problem. The last package execution log entry states "wrote 0 rows". This is a problem because I use this number to verify that all of the database records have been properly transferred.
I have monitored the database while the script is executing and have verified that the data is being transferred properly.
Is there some function that I should be calling for each row processed in the custom script destination to "tally" the row count, so that the script component displays the proper row count?
Hi, All the examples I've seen for creating custom data destinations (scripts or components) show that in the ProcessInput method you need to iterate through each record and execute a commit for each row. Here's a sample from http://forums.microsoft.com/msdn/showpost.aspx?siteid=1&postid=70469
Public Overrides Sub MyAddressInput_ProcessInputRow(ByVal Row As MyAddressInputBuffer)
    With sqlCmd
        .Parameters("@addressid").Value = Row.AddressID
        .Parameters("@city").Value = Row.City
        .ExecuteNonQuery()
    End With
End Sub
This works for me but is extremely slow when processing lots of rows. Has anyone come across a better/faster example for committing rows to a database when creating a custom data destination component? I was thinking about using the ADO.NET batch update. Any thoughts on this or other approaches?
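A hedged sketch of one faster alternative: buffer rows into a DataTable in ProcessInputRow, then push them in a single bulk operation in PostExecute with SqlBulkCopy (System.Data.SqlClient, .NET 2.0). The table, column, and member names here are illustrative, not from the original sample:

    ' Requires: Imports System.Data and Imports System.Data.SqlClient.
    ' m_connectionString is an illustrative member holding the target
    ' database's connection string.
    Private m_table As DataTable

    Public Overrides Sub PreExecute()
        m_table = New DataTable()
        m_table.Columns.Add("AddressID", GetType(Integer))
        m_table.Columns.Add("City", GetType(String))
    End Sub

    Public Overrides Sub MyAddressInput_ProcessInputRow(ByVal Row As MyAddressInputBuffer)
        ' No database round trip here; just buffer the row.
        m_table.Rows.Add(Row.AddressID, Row.City)
    End Sub

    Public Overrides Sub PostExecute()
        ' One bulk write instead of one ExecuteNonQuery per row.
        Using bulk As New SqlBulkCopy(m_connectionString)
            bulk.DestinationTableName = "dbo.Address"
            bulk.WriteToServer(m_table)
        End Using
    End Sub

For very large inputs, the buffer could be flushed every few thousand rows from ProcessInputRow instead of holding everything in memory until PostExecute.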
I wrote a custom destination component. Everything works fine, except there is a logging message that is displayed that I cannot get rid of or correct. Here is the end of the output of a package containing my component:
Information: 0x40043009 at Data Flow Task, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x0 at Data Flow Task, MyDestination: Inserted 40315 rows into C:\tempfile.txt
Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "MyDestination" (9)" wrote 0 rows.
SSIS package "Package.dtsx" finished: Success.
I inserted a custom information message that contains the correct number of rows written by the component. I would like to either get rid of the last message "... wrote 0 rows", or figure out what to set to put the correct number of rows into that message.
This message seems to happen in the Cleanup phase. It appears whether I override the Cleanup method of the Pipeline component and do nothing, or not. Any ideas?
How do I retrieve the connections (connection managers) collections from Custom Data Flow destination? ComponentMetadata.RuntimeConnectionCollection is empty. I would like to be able to access all the connections defined in the package from the custom data flow task.
I came across code in which it was possible to access the Connections collection using IDtsConnectionService for a custom task (destination). A custom task has access to a service provider, which can be used to get the IDtsConnectionService interface, but a custom data flow task does not.
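For reference, a minimal sketch of the declaration I believe the collection needs before it stops coming back empty, assuming a managed pipeline component (the connection name "Destination" and the m_connection member are illustrative). A data flow component only sees connections it has declared; the designer then lets the user bind one of the package's connection managers to that slot:

    ' IDTSRuntimeConnection90 comes from Microsoft.SqlServer.Dts.Pipeline.Wrapper.
    Public Overrides Sub ProvideComponentProperties()
        MyBase.ProvideComponentProperties()
        Dim conn As IDTSRuntimeConnection90 = _
            ComponentMetaData.RuntimeConnectionCollection.New()
        conn.Name = "Destination"
    End Sub

    Public Overrides Sub AcquireConnections(ByVal transaction As Object)
        ' The wrapper hands back whatever the bound connection manager returns.
        m_connection = ComponentMetaData.RuntimeConnectionCollection("Destination") _
            .ConnectionManager.AcquireConnection(transaction)
    End Sub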
Has anyone had any success sending or receiving files with either the Script or FTP task? I've Googled and found examples, but had no luck. The main idea is to send a file from a local PC/server to a mainframe.
I've used the workaround below (an SSIS Script Task), but no good.
SQL Server Feedback Workarounds 281893, "SSIS FTP Task - Mainframe": When you try to connect to a mainframe (OS/390) to FTP-receive a file, you get an error message stating that the path does not begin with a "/". Active feedback entered 6/7/2007 by EWisdahl.
Add a script task (as follows) to download the desired files...

    ' Microsoft SQL Server Integration Services Script Task
    ' Write scripts using Microsoft Visual Basic
    ' The ScriptMain class is the entry point of the Script Task.
    Imports System
    Imports System.Data
    Imports System.Math
    Imports Microsoft.SqlServer.Dts.Runtime

    Public Class ScriptMain
        ' The execution engine calls this method when the task executes.
        ' To access the object model, use the Dts object. Connections, variables,
        ' events, and logging features are available as static members of the
        ' Dts class. Before returning from this method, set the value of
        ' Dts.TaskResult to indicate success or failure.

        Public Sub Main()
            Try
                ' Create the connection to the FTP server.
                Dim cm As ConnectionManager = Dts.Connections.Add("FTP")

                ' Set the properties, such as user name and password.
                cm.Properties("ServerName").SetValue(cm, "myServer")
                cm.Properties("ServerUserName").SetValue(cm, "myUserName")
                cm.Properties("ServerPassword").SetValue(cm, "myPassword")
                cm.Properties("ServerPort").SetValue(cm, "21")
                cm.Properties("Timeout").SetValue(cm, "0")      ' 0 = never time out
                cm.Properties("ChunkSize").SetValue(cm, "1000") ' 1000 KB
                cm.Properties("Retries").SetValue(cm, "1")

                ' Create the FTP object that receives the files, passing it the
                ' connection created above.
                Dim ftp As FtpClientConnection = _
                    New FtpClientConnection(cm.AcquireConnection(Nothing))

                ' Connect to the FTP server.
                ftp.Connect()
                ftp.SetWorkingDirectory("MyFolder.MySubFolder.MySubSubFolder")

                Dim files(0) As String
                files(0) = "MyfileName"
                ftp.ReceiveFiles(files, "C:\temp", True, True)

                ' Close the FTP connection.
                ftp.Close()

                ' Store the retrieved file name for use in the data flow.
                Dts.Variables.Item("FILENAME").Value = files(0)

                Dts.TaskResult = Dts.Results.Success
            Catch ex As Exception
                Dts.TaskResult = Dts.Results.Failure
            End Try
        End Sub
    End Class
I have a strange problem which I never encountered before. I have a data flow with one source and multiple destinations; I am also using Multicast and Derived Column transformations. However, when I run the package (either from the command line or the designer) the data flow task hangs when it comes to inserting data into the target tables, and it keeps waiting forever. Initially I thought it was my data set; however, even with very few records the problem persists. I have also tried removing the table lock from the destination and increasing the rows per batch to 10000 and the commit size to 10000, but with no change. I am using the fast load option in my destination. When I run sp_who2 in Management Studio I see that the status of all the processes is sleeping and the command is "AWAITING COMMAND". Any help appreciated.
I've created a custom task in VB6 and compiled the DLL. When I run the task in the DTS designer as a step it runs OK; however, when I try to execute the whole package nothing happens. Why?
I have an SSIS package in which, at the end, an Excel destination task exports a table's data to an Excel file, which is then sent as an attachment using the Send Mail task, but I am facing the issues mentioned below:
1. The SSIS package runs as a SQL Server Agent job once a day. Every time the job runs I need to create a new Excel file (or delete the older one and create it again), import data into that file using the Excel destination, and then send it as an attachment using the Send Mail task. How can I dynamically change the Excel destination to point to a template file or a new file? (See the sketch after this list.)
2. With the Send Mail task I am able to send only one attachment. I tried separating the file names with a semicolon but it gives the error "You don't have permission to access the file or file does not exist".
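For point 1, a sketch of one common workaround: keep a template workbook alongside the package and copy it over the working file in a Script Task before the data flow runs. The paths and the User::ReportFile variable are illustrative:

    ' Requires: Imports System.IO
    Public Sub Main()
        Dim template As String = "\\server\templates\DailyReport.xls"
        Dim workbook As String = CStr(Dts.Variables("User::ReportFile").Value)
        ' Overwrite yesterday's file so the Excel destination always starts
        ' from the clean template; the Excel connection manager's
        ' ExcelFilePath can then be driven by the same variable through a
        ' property expression.
        File.Copy(template, workbook, True)
        Dts.TaskResult = Dts.Results.Success
    End Sub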
Hi there, I'm still pretty new to SSIS; what I am really battling with is getting a custom task to work. I have developed a custom task, digitally signed it, added it to the Global Assembly Cache, and written the UI for it, pretty much as it is done here: http://technet.microsoft.com/en-us/library/ms136080.aspx. But my problem is that when I drag it from the toolbox into the control flow pane, I get this message: "The task user interface specified by type name 'XXXXXXXXXXXXXXXXXXX' could not be loaded. (Microsoft.DataTransformationServices.Design)".
Does anybody have an idea of what I am doing wrong here?
We have developed some custom tasks which we currently deploy the standard way. That is, we install the custom task assemblies in the DTSTasks folder and we also install the assemblies in the GAC. Due to some deployment practices we are trying to implement we would like to be able to remove the assemblies from the GAC and install them in some other location. I have tried installing the assemblies in the DTSBinn folder, but this does not appear to be working. When I drop the custom control flow tasks onto the package's control flow tab I get the "Cannot create a task with the name ..." error.
Is it possible to not have the custom task assemblies in the GAC? Are there any tricks to getting it to work?
FYI, our assemblies are signed. I'm not sure if that has anything to do with our troubles.
Has anyone coded a custom task UI and used the PropertyGrid like most of the SSIS components do?
I am having an issue trying to get the connections from my UI form.
The way I did it was to get the parent of the form (the package) and then get the connections, but that only works for the form; if I use the property from the properties panel it throws an error because, obviously, I don't have the form or its parent.
I know I should use the service provider and get the IDtsConnectionService, but somehow it always comes back as Nothing.
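For what it's worth, a minimal sketch of the wiring I believe is intended, with illustrative member and form names. IDtsTaskUI.Initialize hands the task UI class the service provider; caching the connection service there and passing it into the form avoids walking Parent properties at all:

    ' Assumes: Imports Microsoft.SqlServer.Dts.Runtime,
    ' Imports Microsoft.SqlServer.Dts.Runtime.Design,
    ' Imports System.Windows.Forms.
    Public Sub Initialize(ByVal taskHost As TaskHost, _
                          ByVal serviceProvider As IServiceProvider) _
            Implements IDtsTaskUI.Initialize
        m_taskHost = taskHost
        ' Query the designer's service provider, not the form's parent chain.
        m_connectionService = CType( _
            serviceProvider.GetService(GetType(IDtsConnectionService)), _
            IDtsConnectionService)
    End Sub

    Public Function GetView() As ContainerControl Implements IDtsTaskUI.GetView
        ' Hand the cached service to the form so it works regardless of how
        ' the editor was launched.
        Return New MyTaskForm(m_taskHost, m_connectionService)
    End Function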