SSIS OLE DB Source SQL Command From Variable DataType Problem
Sep 12, 2006
Hi,
I have to read data from an MDB file into SQL Server. Simple, but I need to place a literal into every row. In my OLE DB Source I would use:
SELECT '001' AS col1, foo, bar FROM Table1
But the problem is that I pass my value for col1 into the package from C# using Managed DTS.
So I set up my OLE DB Source using "SQL command from variable." I have two variables: one to receive the value for col1 (col1) and one that holds an evaluated expression with a dynamic SQL statement (dynSql).
col1 is a String as is dynSql. dynSql looks like this:
"SELECT " + @[col1] + " as col1, foo, bar FROM Table1"
Now, as long as I set the value of col1 to '001', that should work, right? Well, not exactly. The destination for col1 in SQL Server is a char(3), but the package makes the col1 column from the source a DT_WSTR, which causes the famed "Text was truncated or one or more characters had no match in the target code page."
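For reference, this is the shape of expression I would expect to need if the quoting moves into the expression itself rather than into the variable value (just a sketch; the literal still comes through as Unicode DT_WSTR from the provider):
"SELECT '" + @[col1] + "' AS col1, foo, bar FROM Table1"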
Is there a better solution?
Thanks. Kenneth.
View 1 Replies
Apr 24, 2008
Good afternoon,
I have an issue with an ssis variable datatype.
The scenario is as follows:
I have a stored procedure:
PROCEDURE [dbo].[sp_newTransaction]
@sourceSystem varchar(50),
@txOut NUMERIC(18,0) OUTPUT
AS
insert into scn_transaction (sourceSystemName) values(@sourceSystem);
SELECT @txOut = @@identity
Whose purpose is to perform an insert into a table and return me the identity value of the inserted record, which I'll then use throughout the rest of my package. The identity column in the inserted table is numeric(18,0).
I execute the stored proc with the following sql with an OLE DB connection manager:
exec sp_newTransaction ?, ?
The first parameter is a string variable from earlier in the package, and the second is the output parameter. I have the following parameter mappings to the execute sql task:
User::systxId output numeric 1 -1
User::sourceSys input varchar 0 -1
The proc is correctly called, and the row inserted; however, I get a type conversion error when SSIS attempts to map the return parameter to my package variable... I've tried all sorts of combinations, and can't seem to get it to execute.
At one point I wasn't returning a numeric, but rather an int from the stored proc, and all was well until I went to use the variable in a derived column later in the package, and the type was converted quite incorrectly (a 1 was 77799789080 or some such), indicating a type conversion error likely related to the encoding of the number.
I'd like to keep the datatypes as numeric and make ssis use those - any pointers are greatly appreciated as to what type my package variable should be to allow proper assignment of a sql server numeric type to it.
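One workaround I am considering (an untested sketch; @tx and the TxId alias are names I made up) is to do the cast on the SQL side and return the value as a single-row result set, so the package variable can simply be an Int64:
DECLARE @tx numeric(18, 0);
EXEC dbo.sp_newTransaction ?, @tx OUTPUT;
-- return the new id as bigint so it maps cleanly to an Int64 package variable
SELECT CAST(@tx AS bigint) AS TxId;
with ResultSet set to Single row and TxId mapped to the package variable.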
Thanks much,
B
View 6 Replies
View Related
Apr 15, 2008
I read several posts on this topic and I would like to confirm my understanding.
This question has to do with parameterizing the select statement in the OLE DB Source editor. Initially, I thought I could use Data access mode: SQL command, and somehow use a user variable in there to build my Select statement.
But in researching further, it looks like I need to store the entire SQL command in one long string variable (let's call it A), where A is built off of another variable that holds my parameter. I then use A in Data access mode: SQL command from variable.
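If I understand it correctly, A would be defined by an expression along these lines (a sketch; MyParam stands in for the variable that holds my parameter value):
"SELECT foo, bar FROM Table1 WHERE col1 = '" + @[User::MyParam] + "'"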
Am I correct? And is there not a way to accomplish the same thing with only one variable, the one that holds your parameter value?
It seems a bit clumsy to store the entire SQL expression in a string variable.
View 10 Replies
View Related
Apr 10, 2008
Using Jamie's, John's and Rafae's posts I have got this far.
My scenario demands an incremental load. So I have followed your article and have created one variable, vDate (Data Flow Task scope, DateTime, 3/3/2008), and another variable, SourceSQL (Data Flow Task scope, String). I have set SourceSQL's EvaluateAsExpression property to True and have added this:
"select * from stagingperfdata where startdatetime = "+ (DT_WSTR,10) @[vDate].
I want to call this within an OLE DB source, so I set the data access mode to SQL command from variable. My variable name is visible, I am able to select it, and then save and debug. I also have a derived column transformation to increment the variable vDate by one day.
The flow is successful, but no records are written to the target.
I tried the same query (available in the variable's value column) in SSMS and it retrieves 12 records.
I went back to the OLE DB source and wanted to preview the data, but I get a "Query timeout expired (Microsoft SQL Native Client)" error.
Am I missing something fundamental?
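One thing I keep second-guessing is the quoting: a sketch of the expression with the date wrapped in quotes inside the generated SQL (assuming the DT_WSTR cast yields a literal like 2008-03-03):
"select * from stagingperfdata where startdatetime = '" + (DT_WSTR, 10) @[User::vDate] + "'"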
View 7 Replies
View Related
Nov 4, 2006
Hi All,
I am using an OLE DB Source in my data flow and want to select rows from the source based on the name I enter at execution time. I have created two variables:
enterName - String, package level (will store the name I enter)
myVar - String, package level (to store the query)
I am assigning this query to the myVar variable, "Select * from db.Users where (UsrName = " + @[User::enterName] + " )"
Now in the OLE DB source, I have selected SQL command from variable, and I am shown the variable enterName. I select that, and when I click OK I get this error.
Error at Data Flow Task [OLE DB Source [1]]: An OLE DB error has occurred. Error code: 0x80040E0C.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E0C Description: "Command text was not set for the command object.".
Can someone tell me where I am going wrong?
For the myVar variable, I have set the EvaluateAsExpression property to True too.
Please let me know where I am going wrong.
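For completeness, the expression I believe myVar needs, with the name value quoted inside the generated SQL, is along these lines (a sketch):
"SELECT * FROM db.Users WHERE UsrName = '" + @[User::enterName] + "'"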
Thanks in advance.
View 12 Replies
View Related
Oct 10, 2007
I created a Stored Proc to compute all the data I need to export to a CSV file.
I use the provider MS OLE DB Provider for SQL Server.
It's a very simple package with a single Data Flow Task.
This flow task is using an OLE DB Source to call a simple SQL Command : Exec MyStoredProc.
There are no params.
This Stored Proc is using table variables to compute the data.
It takes about 10 seconds to return anything.
The problem is that the mapping doesn't work with the OLE DB Source.
There are no fields at all shown in the mapping screen.
I tried replacing the stored proc with a version that only returns the fields and no data.
Then the mapping would work just fine.
The package is then compiling and working fine.
But every time I put back the real stored proc, even without changing the SQL command, the SSIS package breaks at execution.
It keeps saying :
"Error: 0xC0202005 at Data Flow Task, OLE DB Source [477]: Column "RecordType" cannot be found at the datasource."
My guess is that SSIS doesn't wait the 10 secs and thinks the Stored Proc is not returning anything.
What can I do to make this work ?
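One workaround I have seen suggested (I have not verified that it applies here) is to put a never-executed SELECT at the top of the stored proc so the source can pick up the column metadata without running the whole thing; a sketch, with placeholder column names and types:
IF 1 = 0
BEGIN
-- dummy result set that only describes the columns the proc really returns
SELECT CAST(NULL AS varchar(10)) AS RecordType, CAST(NULL AS int) AS SomeOtherColumn
END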
Thanks,
Vincent
View 3 Replies
View Related
Sep 3, 2007
Hi,
I am migrating one of my DTS package to SSIS.
My task is to read the filename from a database table and transfer the flat file data in to a table.
In SSIS, I am able to fetch the file name using a Data Reader Source, but how do I pass this fileName parameter to the Flat File Source?
In DTS I used an ActiveX script to pass the filename variable as flatfilecon.Source.
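From what I have read so far, the SSIS equivalent may be a property expression on the Flat File connection manager's ConnectionString property, something like (a sketch; FileName is just my name for the variable populated from the table):
ConnectionString = @[User::FileName]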
Any help ?
Thanks,
Ravi
View 4 Replies
View Related
Apr 3, 2007
Thanks to anyone who can give me some help.
I am trying to transfer some tables' data from one database server to another. I created a package in SSIS, and I use a variable to pass each table name. In the data flow, I use an OLE DB Source, but I cannot set the data access mode to Table name or view name variable. Every time, I get the following error: "===================================
Error at Data Flow Task [OLE DB Source [31]]: A destination table name has not been provided.
(Microsoft Visual Studio)
===================================
Exception from HRESULT: 0xC0202042 (Microsoft.SqlServer.DTSPipelineWrap)
------------------------------
Program Location:
at Microsoft.SqlServer.Dts.Pipeline.Wrapper.CManagedComponentWrapperClass.ReinitializeMetaData()
at Microsoft.DataTransformationServices.DataFlowUI.DataFlowComponentUI.ReinitializeMetadata()
at Microsoft.DataTransformationServices.DataFlowUI.DataFlowAdapterUI.connectionPage_SaveConnectionAttributes(Object sender, ConnectionAttributesEventArgs args)".
Can someone tell me what the reason is, or give me some examples?
Thanks in advance.
View 7 Replies
View Related
Dec 20, 2005
Hi,
I want to transform an XML flow into an HTML flow. For this, I create an XmlDataDocument, add to it all that I want, and store it in a file (in a data flow task).
After that, in an xml task, I set the input properties like that:
Operation type: xslt
SourceType: File connection
Source: ras.xml
All works fine. But now, I want to use a variable to store my xml data. So, instead of storing my data in ras.xml, I store it in a variable like that:
Dim v As Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSVariables90
Me.VariableDispenser.LockOneForWrite("RAS", v)
v("RAS").Value = doc.OuterXml
v.Unlock()
and in the xml task, I set configuration like that:
Operation type: xslt
SourceType: Variable
Source: user::RAS
But when I execute, I got this following exception:
Error: 0xC002F304 at Format HTML Mail, XML Task: An error occurred with the following error message: "Root element is missing.".
Error: 0xC002928F at Format HTML Mail, XML Task: Property "New Source" has no source Xml text; Xml Text is either invalid, null or empty string.
Furthermore, when I debug it by replacing the XML task with a script task, I can see that my RAS variable holds the XML data, which is well formed.
<?xml version="1.0" encoding="UTF-8"?>
<root> ...
</root>
(When I store it in a file, I can see it in IE)
Have you got an idea ?
View 3 Replies
View Related
Nov 27, 2011
I need to create a bulk upload utility using ASP.Net and SQL Server. Below is the process for the uploads:
Excel Template wherein user will enter the details. A Tab-delimited output file will be generated using the VBA.
There are 2 tables - one is a Temp Table, which is a replica of the final table, and the second is the final table.
Using File.OpenText(filePath).ReadLine() - All the Rows from the tab delimited data file will be inserted into DataTable.
Using SqlBulkCopy, the tab-delimited file's data will be inserted into the Temp Table.
Data will be validated based on what was inserted into the temp table. If the data has errors, the temp table will be cleared; otherwise the data will be inserted from the temp table into the final table.
My issue is that in both tables there is a column (Name: PeopleKey (int, primary key)). If the user enters an alphabetic value, the bulk utility fails. Below are the two options in my mind:
1. I can change the datatype in the Temp table from int to VARCHAR. The data can then be inserted first, and I can validate it and get it corrected (see the sketch below). But I am not sure whether this is the right way to fix the issue, as the source and target tables' columns would then differ.
2. Validate the data when it is inserted into the DataTable in step 3. Once the data is in the DataTable I can validate it there, so the source and target tables' datatypes stay the same.
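To make option 1 concrete, a rough sketch of the validation step against the temp table (TempPeople and FinalPeople are placeholder names, and ISNUMERIC is only an approximate check):
IF NOT EXISTS (SELECT 1 FROM TempPeople WHERE ISNUMERIC(PeopleKey) = 0)
BEGIN
-- all keys convert cleanly, move the data across
INSERT INTO FinalPeople (PeopleKey)
SELECT CAST(PeopleKey AS int) FROM TempPeople;
END
ELSE
BEGIN
-- bad rows present, clear the staging data so the user can correct and re-upload
TRUNCATE TABLE TempPeople;
END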
View 1 Replies
View Related
Jun 14, 2007
Hi,
I am trying to load the output of count(X) and sum(salesamt) into the same column. If I am using a data transformation task, what datatype should I convert the two outputs to, to accommodate a result like the following (see the sketch after the sample):
10.00 --count
234.00 --saleamt
22.00 --count
1000.00 --saleamt
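If I am reading my own requirement right, both values need to be cast to a common numeric type before being stacked into one column; a sketch in SQL (SomeTable is a placeholder name):
SELECT CAST(COUNT(*) AS numeric(18, 2)) AS ResultValue FROM SomeTable
UNION ALL
SELECT CAST(SUM(salesamt) AS numeric(18, 2)) FROM SomeTable;
In SSIS terms that would be DT_NUMERIC (or DT_DECIMAL) rather than an integer type, since the sum carries decimals.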
View 3 Replies
View Related
Mar 21, 2008
I am trying to migrate data from MySQL 5.0 to SQL Server 2005. The MySQL database has a table which stores the profile description in different languages (Arabic, Spanish, etc.). I use the MySQL ODBC 5.1 driver to create the ODBC connection and create an ADO connection in SSIS using that ODBC connection. The DataReader source connection is set to this ADO connection.
When I view the properties of the columns in the DataReader source, it shows Unicode, which is good. But when I migrate to SQL Server 2005 I get junk data instead of the data in Arabic, Spanish, etc.
Am I missing something or is there any other alternative to do the data transfer correctly?
View 10 Replies
View Related
Jul 21, 2006
The DATEDIFF(s, time1, time2) function output is of the 'datetime' datatype.
How can I alter the 'datetime' to 'int' (or 'bigint') datatype to make it compatible with other variables in my calculations?
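For concreteness, a sketch of the explicit conversion I am after (the time values are made up):
DECLARE @time1 datetime, @time2 datetime;
SELECT @time1 = '2007-07-21 10:00:00', @time2 = '2007-07-21 10:05:30';
SELECT CAST(DATEDIFF(s, @time1, @time2) AS bigint) AS ElapsedSeconds; -- 330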
View 7 Replies
View Related
Aug 29, 2005
I have an image type column. I'm trying to do the following:
DECLARE @Data varbinary(16)
SET @Data = (select imageCol from Table1 where id=3)
As the image datatype returns a varbinary value, I want to store the image column's value in a varbinary variable (or any other type of variable, e.g. varchar). But I get the following error:
Server: Msg 279, Level 16, State 3, Line 2
The text, ntext, and image data types are invalid in this subquery or aggregate expression.
Is there any way to store an image datatype value in a variable? Cheers.
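If this happens to be SQL Server 2005, one thing that may be worth trying (a sketch, untested) is varbinary(max), which can hold the whole value:
DECLARE @Data varbinary(max);
SELECT @Data = CAST(imageCol AS varbinary(max)) FROM Table1 WHERE id = 3;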
View 2 Replies
View Related
Mar 12, 2003
I've created a stored procedure that converts an input string in richtext format (input as type TEXT) to plain text. I would like to be able to return this newly converted string, but I need to have some way of storing it in a local variable. My problem is that since I can't use the TEXT datatype as a local variable, I have no way of storing the large amounts of text I converted within the procedure. The VARCHAR(8000) just isn't large enough for my purposes. Anyone have any suggestions on how to go about doing this?
View 2 Replies
View Related
May 30, 2007
hi everyone,
How do I declare a global variable in my package that takes a numeric value, like User::VAR1 = 200402, and later work on it?
Later, in the properties of the Data Flow, I want to have this expression:
"select * from " + "TAB1" + " where Date=" + @[User::VAR1]
Here I want to subtract 190000 from @[User::VAR1] to get it into my format, i.e. the DATE format in the table.
Only the String datatype seems to work; the other datatypes won't allow me to do any kind of manipulation in the expression.
And to be more specific, what are:
Int16
Int32
Int64
Double
I tried to use all of the above, but the expression doesn't allow me to, as it says:
TITLE: Microsoft Visual Studio
------------------------------
Nonfatal errors occurred while saving the package:
Error at Extract: The data types "DT_WSTR" and "DT_I8" are incompatible for binary operator "+". The operand types could not be implicitly cast into compatible types for the operation. To perform this operation, one or both operands need to be explicitly cast with a cast operator.
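For what it's worth, a sketch of the expression with the numeric part cast to a string (assuming VAR1 is an Int32 or Int64 holding 200402):
"select * from TAB1 where Date = " + (DT_WSTR, 20)(@[User::VAR1] - 190000)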
Any help would be appreciated..
thanks,
ravi
Nothing much that I can do..!!
View 2 Replies
View Related
Mar 13, 2004
Why can a varchar datatype variable hold only 4000 bytes?
For example:
in a storedprocedure
declare @aa varchar(8000)
......
while <some condition>
begin
select @aa = @aa + @otherinfo
end
When the length is more than 4000, the data at the end is lost.
View 1 Replies
View Related
Aug 22, 2007
Hello guys!
I am looking for the best datatype for storing strings of variable length (from 15 to 300 characters, rarely longer).
Thanks a lot for any help !
Regards,
Fabianus
My favorite host is ASPnix: www.aspnix.com!
View 3 Replies
View Related
May 3, 2006
Hi,
I am trying to set up the OLE DB Source Editor. I am using the following option for the data access mode:
SQL command from variable, and it uses my sqlQuery variable.
The sqlQuery variable contains a SQL statement, but I am getting the following error:
0x80040E0C, Command text was not set for the command object,
Additional information HRESULT 0xC0202009
Does it mean I have to give the variable name containing the stored procedure?
Please Guide
View 2 Replies
View Related
Sep 14, 2007
Hi
I have a huge XML file with nested sections.
I also have an XSD file for that XML.
I have a destination table that the data from the XML should be loaded into.
I am using the XML Source transformation, but to get all the data I need to use multiple Merge Joins to get the data into a single row that I can insert into the destination. I was not quite convinced about using so many joins.
So I tried using a Script source transformation, where I use XML objects to get the nodes and dynamically construct the data row; the output is then inserted into the destination.
On comparing the two approaches, the one using the Script source works much faster than the XML Source transformation.
I wanted to know whether there is any limitation to using the Script source to parse XML files.
I would also like to know of any better way of getting the data from the XML source without using the joins.
Hari
View 7 Replies
View Related
Aug 10, 2006
After several hours of trying, I throw in the towel and come here to ask a question.
The source system uses a timestamp column in the transaction tables, which is equivalent to a non-nullable binary(8) datatype (SQL 2000 BOL).
What I want to do is get the timestamp at the start and at the end of the data transfer, and store these in a control table.
I try to do this in two Execute SQL tasks:
SQL task 1: "select @@DBTS AS SourceTimestamp", and map the result set to a variable. Here comes the first problem: what variable type to use?
DBNULL works (meaning it doesn't give errors) (BTW: is there a way to put a variable as a watch when debugging sql tasks ?)
Int64 and UInt64 don't work: I get an error message that the types for the column and the parameter are different.
STRING works
Then I want to store this variable back in a table of a different data source
sqltask2: "insert into controltable values(getdate(), ?)" and make an input parameter that takes the previous timestamp ...
If I take DBNull as the type for the variable, there doesn't seem to be a single parameter type that works.
If I take String as the type for the variable, I have to modify the SQL to do the explicit conversion from string to binary, so I change it to CAST(? as binary). It doesn't return any error, but the value stored in the table is 0x00000000000 and not the actual timestamp.
Any help on this one? Why is Int64/bigint not working here, when you can perfectly well do a convert(bigint, timestampfield) in SQL?
How come the SQL datatypes, the variable datatypes, and the parameter datatypes are so badly aligned with each other (and all seem to use different names)?
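One direction I have not fully tested (a sketch; it assumes the control table's second column is binary(8)) is to do the conversions on the SQL side so the package variable can simply stay Int64:
-- SQL task 1: read the current database timestamp as a bigint
SELECT CAST(@@DBTS AS bigint) AS SourceTimestamp;
-- SQL task 2: convert the Int64 parameter back to binary(8) when storing
INSERT INTO controltable VALUES (GETDATE(), CAST(CAST(? AS bigint) AS binary(8)));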
tx for any help
Dirk
View 6 Replies
View Related
May 8, 2008
Hi,
I have a script component that contains a script similar to that shown below;
Public Sub Main()
Dim sqlConn As New SqlConnection("Server=myserver;Database=mydb;Trusted_Connection=True;")
Dim SQLStmt As String
SQLStmt = "select top 10 CountID from mytable"
Dim sqlComm As New SqlCommand(SQLStmt, sqlConn)
sqlConn.Open()
Dim r As SqlDataReader = sqlComm.ExecuteReader()
While r.Read()
Dim MyValue As Integer
If r("CountID") IsNot DBNull.Value Then
MyValue = CInt(r("CountID "))
' HERE I WANT TO WRITE MyValue TO A PACKAGE VARIABLE AS A ROW
End If
End While
r.Close()
sqlConn.Close()
Dts.TaskResult = Dts.Results.Success
End Sub
Once I have a value assigned to MyValue, I want to write it as a row into a package variable (data type Object, value System.Object), which will act like an ADO recordset destination that I can then loop through later.
Many thanks for any help
Phil Harbour
View 1 Replies
View Related
Oct 3, 2007
I have a data source that I access via odbc in a DataReader Source component in SSIS. I can access the data fine. However, I am having problems with certain fields that are numeric (specifically home prices ranging from 100,000.00 to 99,999,999.00). In the advanced editor for my data reader source under the input and output properties tab, in data reader output under the external columns and output columns, these fields for some reason default to numeric data types with a precision of 4 and a scale of zero, not large enough to hold the data that is coming in. This causes errors that make the data come in as null (after i specify to ignore the errors).
I can change the precision and scale to 18 and 4 in the external columns, but when I try to change the datatype, precision or scale in the output columns I get the following message:
Property Value is not valid.
The details are:
Error at Import DataReader Source: The data type of output columns on the component "DataReader Source" cannot be changed.
Error at DataReader Source: System.Runtime.InteropServices.COMException (0xC020837D)
at Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter.SetOutputColumnDataTypeProperties(Int32 iOutputID, Int32 iOutputColumnID, DataType eDataType, Int32 iLength, Int32 iPrecision, Int32 iScale, Int32 iCodePage)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostSetOutputColumnDataTypeProperties(IDTSManagedComponentWrapper90 wrapper, Int32 iOutputID, Int32 iOutputColumnID, DataType eDataType, Int32 iLength, Int32 iPrecision, Int32 iScale, Int32 iCodePage)
Any help is greatly appreciated.
Dave
View 1 Replies
View Related
Aug 17, 2015
I'm writing a custom source component that reads data from a SharePoint list with dynamic mapping to output columns. It's my first custom component, and it's based on several samples and tutorials from the Internet.
Output columns are not created by the component itself; they must be added by the user at design time. The component dynamically makes an association between SharePoint fields and the available output columns at run time (based on a mapping table).
I made a very basic skeleton and I encounter a problem when I add a column to the output: it has no datatype, and when I try to set one I get the error: Property value is not valid. The component xxxxxx does not allow setting output column datatype properties.
Imports System
Imports Microsoft.SqlServer.Dts.Pipeline
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
<DtsPipelineComponent(ComponentType:=ComponentType.SourceAdapter,
DisplayName:="SharePoint Dynamic Assoc List Source",
[Code] ....
View 4 Replies
View Related
Nov 15, 2007
Hi all,
I want to pass a parameter into my OLE DB source. For example, I want to set the condition:
SELECT * FROM "tablename" where ID = parameter <-----------------
I do not know how to do this. Can it be done using a package variable?
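From what I can tell, the SQL command access mode also accepts a ? placeholder that you map to a package variable via the Parameters button, so one sketch of the command would be (the variable behind the ? would be mine to define):
SELECT * FROM tablename WHERE ID = ?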
View 2 Replies
View Related
Mar 21, 2008
Hi All,
I have a requirement to create a dynamic SQL Command in an OLE DB Source due to the fact that I need to read data from another database based on a date range. For example, the SQL command would look like
SELECT * FROM Table1 WHERE DateField BETWEEN '17/03/2008' and '21/03/2008'
and I need to change the dates '17/03/2008' and '21/03/2008' to different dates when the package is deployed to production. How do I do that?
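One approach I am considering (a sketch; StartDate and EndDate are variable names I would add) is to parameterize the command and drive the two variables from a package configuration at deployment time:
SELECT * FROM Table1 WHERE DateField BETWEEN ? AND ?
with User::StartDate and User::EndDate mapped to the two ? markers in the OLE DB Source.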
Regards
Ash
View 12 Replies
View Related
Oct 25, 2006
I'm working on an SSIS package that uses a VB.NET script to grab some XML from a web service (I'd explain why I'm not using a Web Service task here, but I'd just get angry), and I wish to then assign the XML string to a package variable, which then gets sent along to a Data Flow Task that contains an XML Source pointing at said variable. When I copy the XML string into the variable value in the script and do a QuickWatch on the variable (as in Dts.Variables("MyXML").Value), it looks as though the new value has been copied to the variable, but when I step out of that task and look at the Package Explorer, the variable has its original value.
I think the problem is that the dataflow XML source has a lock on the variable and so the script task isn't affecting it. Does anyone have any experience with this kind of problem, or know a workaround?
View 1 Replies
View Related
Mar 6, 2008
I have a SQL Task that updates running totals on a record inserted using a Data Flow Task. The package runs without error, but the actual row does not calculate the running totals. I suspect that the inserted record is not committed until the package completes and the SQL Task is seeing the previous record as the current. Here is the code in the SQL Task:
DECLARE @DV INT;
SET @DV = (SELECT MAX(DateValue) FROM tblTG);
DECLARE @PV INT;
SET @PV = @DV - 1;
I've not been successful in passing a SSIS global variable to a declared parameter, but is it possible to do this:
DECLARE @DV INT;
SET @DV = ?;
DECLARE @PV INT;
SET @PV = @DV - 1;
I have almost 50 references to these parameters in the query so a substitution would be helpful.
Dan
View 4 Replies
View Related
Mar 2, 2008
I'm having difficulty passing a date parameter in an SSIS OLE DB connection to a stored procedure using SQL command, with the error message "Attempted to read or write protected memory ...", and if I create a variable, I get the error "must declare the scalar variable..". Here is the OLE DB command: "EXEC dbo.mic_TGDrop ?", where ? requires a date parameter @ReportDate. Why does the call to the stored procedure fail? What would be the correct syntax of a variable for use with SQL command from variable? SQL BOL does not cover this; is there another resource that has a simple example? Thanks.
View 2 Replies
View Related
Mar 29, 2006
Why can't I use the USE command with a variable?
declare @banco varchar(20)
select @banco='xpto'
USE @banco
I have tried another way, without success:
DECLARE @banco varchar(50)
declare @SQL varchar(50)
select @banco='xpto'
print @banco
select @SQL='USE '+@banco
print @SQL
exec(@SQL)
Help me, please.
Rodrigo
View 7 Replies
View Related
Oct 21, 2007
It keeps saying "missing expression".
In the variable expression I've written the following command:
"SELECT * FROM MY_TABLE WHERE T1.DATE >= @[User::Date]"
I've clicked Evaluate Expression and it accepts the expression.
However, when I go to the data flow to select SQL command from variable, it brings back "missing expression".
View 13 Replies
View Related
Feb 27, 2008
I'm new to SSIS, but have been programming in SQL and ASP.Net for several years. In Visual Studio 2005 Team Edition I've created an SSIS package that imports data from a flat file into the database. The original process worked, but did not check the creation date of the import file. I've been asked to add logic that will check that date and verify that it's more recent than a value stored in the database before the import process executes.
Here are the task steps.
[Execute SQL Task] - Run a stored procedure that checks to see if the import is running. If so, stop execution. Otherwise, proceed to the next step.
[Execute SQL Task] - Log an entry to a table indicating that the import has started.
[Script Task] - Get the create date for the current flat file via the reference provided in the file connection manager. Assign that date to a global value (FileCreateDate) and pass it to the next step. This works.
[Execute SQL Task] - Compare this file date with the last file create date in the database. This is where the process breaks. This step depends on 2 variables defined at a global level. The first is FileCreateDate, which gets set in step 3. The second is a global variable named IsNewFile. That variable needs to be set in this step based on what the stored procedure this step calls finds out on the database. Precedence constraints direct behavior to the next proper node according to the TRUE/FALSE setting of IsNewFile.
If IsNewFile is FALSE, direct the process to a step that enters a log entry to a table and conclude execution of the SSIS.
If IsNewFile is TRUE, proceed with the import. There are 5 other subsequent steps that follow this decision, but since those work they are not relevant to this post.
Here is the stored procedure that Step 4 is calling. You can see that I experimented with using and not using the OUTPUT option. I really don't care if it returns the value as an OUTPUT or as a field in a recordset. All I care about is getting that value back from the stored procedure so this node in the decision tree can point the flow in the correct direction.
CREATE PROCEDURE [dbo].[p_CheckImportFileCreateDate]
/*
The SSIS package passes the FileCreateDate parameter to this procedure, which then compares that parameter with the date saved in tbl_ImportFileCreateDate.
If the date is newer (or if there is no date), it updates the field in that table and returns a TRUE IsNewFile bit value in a recordset.
Otherwise it returns a FALSE value in the IsNewFile column.
Example:
exec p_CheckImportFileCreateDate 'GL Account Import', '2/27/2008 9:24 AM', 0
*/
@ProcessName varchar(50)
, @FileCreateDate datetime
, @IsNewFile bit OUTPUT
AS
SET NOCOUNT ON
--DECLARE @IsNewFile bit
DECLARE @CreateDateInTable datetime
SELECT @CreateDateInTable = FileCreateDate FROM tbl_ImportFileCreateDate WHERE ProcessName = @ProcessName
IF EXISTS (SELECT ProcessName FROM tbl_ImportFileCreateDate WHERE ProcessName = @ProcessName)
BEGIN
-- The process exists in tbl_ImportFileCreateDate. Compare the create dates.
IF (@FileCreateDate > @CreateDateInTable)
BEGIN
-- This is a newer file date. Update the table and set @IsNewFile to TRUE.
UPDATE tbl_ImportFileCreateDate
SET FileCreateDate = @FileCreateDate
WHERE ProcessName = @ProcessName
SET @IsNewFile = 1
END
ELSE
BEGIN
-- The file date is the same or older.
SET @IsNewFile = 0
END
END
ELSE
BEGIN
-- This is a new process for tbl_ImportFileCreateDate. Add a record to that table and set @IsNewFile to TRUE.
INSERT INTO tbl_ImportFileCreateDate (ProcessName, FileCreateDate)
VALUES (@ProcessName, @FileCreateDate)
SET @IsNewFile = 1
END
SELECT @IsNewFile
The relevant Global Variables in the package are defined as follows:
Name : Scope : Data Type : Value
FileCreateDate : (Package Name) : DateTime : 1/1/2000
IsNewFile : (Package Name) : Boolean : False
Setting the properties in the "Execute SQL Task Editor" has been the difficult part of this. Here are the settings.
General
Name = Compare Last File Create Date
Description = Compares the create date of the current file with a value in tbl_ImportFileCreateDate.
TimeOut = 0
CodePage = 1252
ResultSet = None
ConnectionType = OLE DB
Connection = MyServerDataBase
SQLSourceType = Direct input
IsQueryStoredProcedure = False
BypassPrepare = True
I tried several SQL statements, suspecting it's a syntax issue. All of these failed, but with different error messages. These are the 2 most recent attempts based on posts I was able to locate.
SQLStatement = exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
SQLStatement = exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
Parameter Mapping
Variable Name = User::FileCreateDate, Direction = Input, DataType = DATE, Parameter Name = 0, Parameter Size = -1
Variable Name = User::IsNewFile, Direction = Output, DataType = BYTE, Parameter Name = 1, Parameter Size = -1
Result Set is empty.
Expressions is empty.
When I run this in debug mode with this SQL statement ...
exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
... the following error message appears.
SSIS package "MyPackage.dtsx" starting.
Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "No value given for one or more required parameters.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
When the above is run tbl_ImportFileCreateDate does not get updated, so it's failing at some point when calling the procedure.
When I run this in debug mode with this SQL statement ...
exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
... the tbl_ImportFileCreateDate table gets updated. So I know that data piece is working, but then it fails with the following message.
SSIS package "MyPackage.dtsx" starting.
Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC001F009 at GLImport: The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.
". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (3) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
The IsNewFile global variable is scoped at the package level and has a Boolean data type, and the Output parameter in the stored procedure is defined as a Bit. So what gives?
The "Possible Failure Reasons" message is so generic that it's been useless to me. And I've been unable to find any examples online that explain how to do what I'm attempting. This would seem to be a very common task. My suspicion is that one or more of the settings in that Execute SQL Task node is bad. Or that there is some cryptic, undocumented reason that this is failing.
Thanks for your help.
View 5 Replies
View Related