I'm probably not looking in the right place, but all I could find when creating a data flow task was OLE DB Commands. I was trying to use a data access layer piece of code that we use everywhere in our projects, but because it uses ADO and not OLE DB, it caused an issue between the column data types.
Is there an ADO command object available, or are we forced to use the OLE DB command object? All I was looking to do was execute a SQL command. There's an object at the Control Flow level to do that, but not at the Data Flow level, and I'm not sure why that is.
I am trying to use a CTE in an OLE DB Command data flow transformation. However, when I enter the CTE and its corresponding query in the SqlCommand field of the OLE DB Command editor dialog, I get a syntax error. Can CTEs be used in data flow objects? I have been able to use them in an Execute SQL control flow task, but not in any data flow item.
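For reference, a minimal sketch of the kind of CTE statement in question (the table and column names here are only placeholders), which runs without complaint from an Execute SQL task:

-- Hypothetical CTE example; works as an Execute SQL task statement
WITH RecentOrders AS
(
    SELECT OrderID, CustomerID, OrderDate
    FROM dbo.SalesOrder
    WHERE OrderDate >= '20070101'
)
SELECT CustomerID, COUNT(*) AS OrderCount
FROM RecentOrders
GROUP BY CustomerID;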
I'm just testing an SSIS package and am having issues with dealing with locked records.
My situation is as follows:
My source table is Oracle and my destination table is in SQL Server. My data flow is a very simple update with a Lookup transformation and then two OLE DB Commands, one for the update and one for the insert.
On each of the OLE DB Commands I have set the CommandTimeout to 5 seconds (just for testing purposes). Each OLE DB Command also has a failure path that outputs to a flat file. I'm expecting that if the destination table/records is/are locked, then after 5 seconds the record will be output to the flat file.
To test this I begin a transaction on the destination table and don't commit it, then I start the SSIS job. It doesn't appear to even get to the OLE DB Commands; it appears to stop at the beginning of the data flow task. The output window shows this:
"Information: 0x40043007 at Import from Phoenix, DTS.Pipeline: Pre-Execute phase is beginning."
but it just hangs there indefinitely. The progress tab tells me that it gets through the validating stage and past the prepare-for-execute stage but hangs on pre-execute at 0 percent.
I've set CommandTimeout = 5 on everything I can find. I've experimented with all the possible ValidateExternalMetadata properties, even though I was only guessing that they might be the cause. Is there anything I'm missing? Where should I look next?
(Yes, it does work perfectly when there is no transaction locking the target table.)
I developed a custom data flow task in .NET 2.0 using Visual Studio 2005. I installed it into the GAC using gacutil and also copied it into the pipeline directory. This task runs absolutely fine when I run it on my local machine, both in BIDS and from a script, in a Windows 2000 environment. However, when I deployed this package to a Windows 2003 server, the package fails at the custom task. I checked the GAC in the windows\assembly directory and the assembly is present. I also copied the file into the pipeline directory and verified that it is the correct pipeline directory by checking the registry. The version of the assembly is still a Debug build. I looked up the documentation on MSDN, but there is very little information about the errors I am seeing.
The error I get is pasted below. Can somebody please help me, as I am currently stuck and running out of ideas to fix this problem.
Code: 0xC0047067
Source: DFT Raw File DFT Raw File (DTS.Pipeline)
Description: The "component "_" (2546)" failed to cache the component metadata object and returned error code 0x80131600.
Code: 0xC004706C
Source: DFT Raw File DFT Raw File (DTS.Pipeline)
Description: Component "component "_" (2546)" could not be created and returned error code 0xC0047067. Make sure that the component is registered correctly.
In my SSIS data flow task, I have a query that retrieves data based on a couple of date parameters. Is there a way to pass/use the variables defined in the SSIS package in the query?
(I am assigning values to those variables from C# code.)
The query should look like this:
select ordernumber, customerid from salesorder
where statecode=3 and datefulfilled between @variable1 and @variable2
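If the query runs in an OLE DB source, a sketch of the parameterized form is shown below; each ? marker is then mapped to one of the package variables through the source editor's Parameters button, in the order the markers appear:

-- Parameterized form for an OLE DB source; the ? markers map to package variables
select ordernumber, customerid
from salesorder
where statecode = 3
  and datefulfilled between ? and ?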
I am creating a custom data flow component for SSIS for use in a package. I've got some custom properties that I am exposing using the supplied advanced editor (no custom property editor here).
Some of my properties are enumerated types, and I have deciphered how to get those properties to show as dropdown lists of their respective enumerations. (For those of you who may be looking as hard as I did as to how to accomplish this, see the end of this post.)
I also have a few properties which expect SSIS package variable names, such as a file name variable. However, I can't figure out how to tell the advanced editor that the property is looking for an SSIS variable, so that it can show a dropdown list of package variables, much like virtually any other Microsoft-supplied data flow component can.
Is there a Type Converter I could specify for those custom properties? Is there another way to instruct SSIS that my custom property is expecting a variable? Or do I need to code a custom UI for editing my Data Flow Task?
To create a dropdown list of values for a custom property that represents an enum, do the following:
1. Create your enum definition, such as "public enum ThisIsMyEnum { one, two }"
2. Create a new class that inherits from TypeConverter, such as "public class MyEnumConverter : TypeConverter"
3. Override "CanConvertFrom", and return true if "sourceType == typeof(string)"
4. Override "CanConvertTo", and return true if "destinationType == typeof(string)"
5. Override "ConvertFrom", and return the enum value (such as "one" or "two" in my example) that corresponds to the string passed in the parameter "value"
6. Override "ConvertTo", and return a string that corresponds to the enum value passed in the parameter "value"
7. Override "GetStandardValuesSupported" and return true
8. Override "GetStandardValuesExclusive" and return true to indicate that ONLY the enum values should be accepted
9. Override "GetStandardValues", and return a new StandardValuesCollection constructed with Enum.GetValues() of your enum, such as "return new StandardValuesCollection(Enum.GetValues(typeof(ThisIsMyEnum)));"
10. Just above your "public enum" declaration, add a "TypeConverter" attribute to link your type converter to your enum, such as "[TypeConverter(typeof(MyEnumConverter))]"
11. In "ProvideComponentProperties", after you've created your custom property like this: "IDTSCustomProperty90 propEnum = ComponentMetaData.CustomPropertyCollection.New()", add another line to set the TypeConverter property of that custom property to the assembly-qualified name of your type converter, like so: "propEnum.TypeConverter = typeof(MyEnumConverter).AssemblyQualifiedName;"
Su writes: "I'm trying to use a stored procedure to dynamically update a table whenever staff in other departments make changes to their databases, and thanks to your web site I learned how to pass table names as parameters. But I still have problems with the SQL command. You have an example in your article ('dynamic sql 2') showing how to do a SQL SELECT using a table name and a local variable, but that SQL command only uses a local variable of varchar type. I'm trying to do an INSERT with local variables of different data types. For example:
CREATE PROCEDURE KPISU_F_TotalByF
    @inputT_From varchar(10),
    @inputT_To varchar(10),
    @TableName varchar(1000)
AS
-----------------------------------------------------
-- input variables
-----------------------------------------------------
DECLARE @inputTerm_From varchar(10),
        @inputTerm_To varchar(10),
        @sql_empty varchar(2000),
        @sql_refresh varchar(2000)
-----------------------------------------------------
IF EXISTS (select * from tempdb.dbo.sysobjects
           where id LIKE object_id('tempdb..#tmpOTLTotalByF'))
    DROP TABLE #tmpOTLTotalByF
Server: Msg 245, Level 16, State 1, Procedure KPISU_F_TotalByF, Line 256
Syntax error converting the varchar value 'INSERT KPISU_F_OTLTotalByF06 VALUES (14-19 Academy, 2005, ' to a column of data type int.
I guess I could change all the columns in the table to the varchar data type, but is there any other way to solve this problem?
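One way to keep the columns' original data types is to convert every non-varchar value to a string explicitly, and wrap character values in doubled-up quotes, before concatenating the dynamic INSERT. A minimal sketch with hypothetical variable names and values:

-- Sketch: build a dynamic INSERT from values of mixed data types
DECLARE @TableName varchar(1000),
        @AcademyName varchar(50),
        @YearValue int,
        @sql varchar(2000)

SET @TableName = 'KPISU_F_OTLTotalByF06'
SET @AcademyName = '14-19 Academy'
SET @YearValue = 2005

SET @sql = 'INSERT ' + @TableName + ' VALUES ('''
         + REPLACE(@AcademyName, '''', '''''') + ''', '
         + CONVERT(varchar(10), @YearValue) + ')'

PRINT @sql   -- inspect the generated statement; use EXEC (@sql) once it matches the real column list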
I need to pass a parameter from the control flow to the data flow. The data flow will use this parameter to get data from an Oracle source.
I have an Execute SQL task in the control flow to assign a value to the parameter; the next step is a data flow that needs to use the parameter in the SQL statement that queries the Oracle source.
The SQL Looks like this:
select * from ccst_acctsys_account
where to_char(LAST_MODIFIED_DATE, 'YYYYMMDD') > ?
The problem is that the OLE DB Source editor doesn't have anything for mapping the parameter.
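One workaround (a sketch, not the only approach) is to build the whole statement in a string package variable with an expression and set the OLE DB source's data access mode to "SQL command from variable", so no parameter mapping is needed; the variable then ends up holding something like:

-- Hypothetical contents of the string variable once the date value has been substituted in
select * from ccst_acctsys_account
where to_char(LAST_MODIFIED_DATE, 'YYYYMMDD') > '20070115'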
I have an Execute SQL Task that returns a Full Rowset from a SQL Server table and assigns it to a variable objRecs. I connect that to a foreach container with an ADO enumerator using objRecs variable and Rows in first table mode. I defined variables and mapped them to the columns.
I tested this by placing a Script task inside the foreach container and displaying the variables in a messagebox.
Now, for each row, I want to write a record to an MS Access table and then update a column back in the original SQL Server table from which I retrieved the data in the Execute SQL task (I have the primary key). If I drop a Data Flow Task inside my foreach container, how do I pass the variables as input to an OLE DB Destination in the Data Flow?
Also, how would I update the original source table where source.id = objRecs.id?
Thank you for your assistance. I have spent the day trying to figure this out (and thought it would be simple), but I am just not getting SSIS. Sorry if this has been covered.
Dear all! My package has a Data Flow Task. In the Data Flow Task, I use a Script Component and an OLE DB Destination to transform data from a txt file into the database. Within the Data Flow Task, I want to call a File System Task to move the file to a folder, or indeed any task from the Control Flow tab. Does SSIS support this? Please show me if so. Thanks.
I'm currently setting variables at the package level with an ExecuteSQL task. This works fine. However, I'm now starting to think about restartability midway through a package. It would be nice to have the variable(s) needed in a data flow set within the data flow so that I only have to restart that task.
Is there a way to do that using an SQL statement as the source of the value in a data flow?
OR, when using checkpoints will it save variable settings so that they are available when the package is restarted? This would make my issue a moot point.
Hi all! I recently started working with SSIS and one of the things that is puzzling me the most is what's the best way to go:
A small control flow with large data flow tasks, or
a control flow with more, but smaller, data flow tasks?
Any help will be greatly appreciated. Thanks, Ricardo
Hi, I'm trying to implement an incremental data pull (Oracle to SQL) based on Andy's blog: http://sqlblog.com/blogs/andy_leonard/archive/2007/07/09/ssis-design-pattern-incremental-loads.aspx
My development machine is decent: a 1.86 GHz Intel Core 2 CPU with 3 GB of RAM. However, the data flow task seems to hang whenever I test the package against the ~6 million row source, as can be seen from these screenshots. I have no memory limitations on the lookup transformation. After the rows have been cached, nothing happens. Memory for the dtsdebug process hovers around 1.8 GB and it uses 1-6 percent of CPU continuously. I am not using fast load to insert new records into my SQL target table. (In the screenshots I am right-clicking Sequence Container 3 and executing this container, NOT the entire package.)
The same package works fine against a similar test table with 150k rows. http://i248.photobucket.com/albums/gg168/boston_sql92/7.jpg http://i248.photobucket.com/albums/gg168/boston_sql92/8.jpg
The weird thing is that it only takes 24 minutes to do a full refresh of the entire source table from Oracle to the SQL target table. Any hints or advice would be appreciated.
Hi all,
I have a table called PTRANS with a few columns (see create script below). I have created a view on top of this table, vwTransactions (see below). I can now run this query without a problem:

select * from dbo.vwTransactions
where AssetNumber = '101001'
and TransactionDate <= '7/1/2003'

But when I create an index on the PTRANS table using the command below:

CREATE INDEX IDX_PTRANS_CHL# ON PTRANS(CHL#)

the same query that ran fine before fails with the error:

Server: Msg 242, Level 16, State 3, Line 1
The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value.

I can run the same query by commenting out the AssetNumber clause and it works fine. I can also run the query commenting out the TransactionDate column and it works fine. But when I have both conditions in the WHERE clause, it gives me this error. Dropping the index solves the problem. Can anyone tell me why an index would cause a query to fail?
Thanks a lot in advance,
Amir

CREATE TABLE [PTRANS] (
    [CHL#] [varchar] (100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [CHCENT] [numeric](2, 0) NOT NULL,
    [CHYYMM] [numeric](4, 0) NOT NULL,
    [CHDAY] [numeric](2, 0) NOT NULL,
    [CHTC] [char] (2) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL
) ON [PRIMARY]
GO

CREATE VIEW dbo.vwTransactions
AS
SELECT CONVERT(datetime,
           dbo.udf_AddDashes(
               REPLICATE('0', 2 - LEN(CHCENT)) + CONVERT(varchar, CHCENT) +
               REPLICATE('0', 4 - LEN(CHYYMM)) + CONVERT(varchar, CHYYMM) +
               REPLICATE('0', 2 - LEN(CHDAY)) + CONVERT(varchar, CHDAY)), 20) AS TransactionDate,
       CHL# AS AssetNumber,
       CHTC AS TransactionCode
FROM dbo.PTRANS
WHERE (CHCENT <> 0) AND (CHTC <> 'RA')
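If the cause turns out to be evaluation order (the new index can let the optimizer apply the view's datetime conversion to rows before the CHCENT/CHTC filters have removed the ones with bad date parts), a hedged way to make the conversion safe is to guard it with ISDATE. A simplified, self-contained illustration of the idea, not the real table:

-- Sketch: guard a char-to-datetime conversion so bad values give NULL instead of an error
CREATE TABLE #t (d char(10))
INSERT #t VALUES ('2003-07-01')
INSERT #t VALUES ('0000-00-00')   -- would make a plain CONVERT fail

SELECT CASE WHEN ISDATE(d) = 1 THEN CONVERT(datetime, d, 20) END AS SafeDate
FROM #t

DROP TABLE #t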
I can populate a DataTable with type double (C#) values of, say, 1055.01; however, when I save these to the SQL CE 3.5 database using a float (CE 3.5) column, I lose the decimal portion. The 'offending' code is:
We have some columns in a table where the date is stored as 19980101 (YYYYMMDD). The data type for this column is NUMBER(8) in Oracle.
I need to copy rows from Oracle to SQL Server using SSIS. I used the Data Conversion transformation editor to change it to DT_DATE, but the rows are not being inserted to the destination.
On error, if I fail the component, then the error is:
There was an error with input column "ORDER_DATE_CONV" (1191) on input "OLE DB Destination Input" (29). The column status returned was: "Conversion failed because the data value overflowed the specified type.".
I am wondering whether it is possible to use SSIS to split a data set into a training set and a test set and feed them directly to my data mining models, without having to save them somewhere first, since that occupies too much space. I really need guidance on this.
I am working on importing an Excel workbook, saved as multiple CSV flat files, that has both group level data and related detail row on the same sheet. I have been able to import the group data into a table. As part of the Data Flow task, I want to be able to save the key value for the group, which I will use when I insert the detail rows.
My data flow has the following components: the flat file with the data, which goes to a Derived Column transformation to strip out extraneous dashes, which leads to the OLE DB Destination component.
I want to save the value as a package-level variable, so that I can reference it in another data flow.
Is this possible, and if so, at what point do I save the value?
I am new to ASP.NET, C#, and SQL Server, coming from a PHP and ODBC/Access background (trying to leave all that behind). Anyway, I have a textbox that will be used to INSERT a date into a SQL datetime field. The date coming from the textbox will be formatted like "MM/dd/yyyy". For my purposes the day, month, and year are all that I require; the time is irrelevant. Then, from a different page in the site, I will be using a variable containing the current date derived from DateTime.Now.ToString("MM/dd/yyyy") to select records with a matching date. Does anyone have an example showing both inserting into a datetime field and extracting records based on the date format that I have specified? Would it be better to use a different format? In the past I have stored dates as a Julian value in an int field, but I would rather learn to use the datetime field type in SQL. Any help would be greatly appreciated.
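On the T-SQL side, a hedged sketch of both halves (the table and column names are made up, and the values would normally arrive as command parameters rather than literals). Matching on a whole day is done with a half-open range so any time portion of the stored value never matters:

-- Hypothetical table with a datetime column
CREATE TABLE dbo.Orders (OrderID int IDENTITY(1,1), OrderDate datetime)

-- Insert: the yyyymmdd literal form is unambiguous regardless of server language settings
INSERT dbo.Orders (OrderDate) VALUES ('20080115')

-- Select all rows for a given day, ignoring any time component
DECLARE @d datetime
SET @d = '20080115'
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE OrderDate >= @d AND OrderDate < DATEADD(day, 1, @d)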
I am using OLEDB Command transformation in a data flow to update the table. Find the columns for the table EMP as follows.
EmpID - int EmpName - varchar(40) EmpSal - float Status - varchar(20)
I am using the following command to update the table.
Update EMP set status = 'Disabled' where EmpID = ? and EmpSal = ?
When I use the above condition, the type of the second parameter is inferred as double-precision float, since the incoming input column type is double-precision float. This is fine.
But when I use the following condition, the type of the parameter is inferred as something different.
Update EMP set status = 'Disabled' where EmpID = ? and ( EmpSal + ? ) = 0
It is inferred as a four-byte signed integer even though the incoming input column type is double-precision float, so I am losing precision and getting an error when the value exceeds the integer limit. If I use 0.0 instead, it is inferred as Numeric.
If I use the CONVERT function, like (EmpSal + ?) = Convert(float, 0), then it is inferred as double-precision float:
Update EMP set status = 'Disabled' where EmpID = ? and ( EmpSal + ? ) = Convert(float, 0)
My question is why it behaves this way in the above situations. Can anybody clarify, please?
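One hedged variant that might also pin down the type (untested against this particular table) is to cast the parameter marker itself rather than the literal, so the inferred parameter type comes from the CAST instead of the surrounding expression:

-- Sketch: cast the parameter marker so it is described as float
Update EMP set status = 'Disabled'
where EmpID = ? and ( EmpSal + CAST(? AS float) ) = 0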
Hello all. I have a form that includes two textboxes (Date and Version). When I try to insert the record I get the following error message; it seems that something is wrong with my conversion (data type): "The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value. The statement has been terminated."
In my SQL database I have the date field as datetime and the version as nvarchar(max). This is the code in the VB page. Can you please tell me how to solve this problem?

Imports System.Data.SqlClient
Imports System.Web.Configuration
Partial Class Admin_emag_insert
    Inherits System.Web.UI.Page

    Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
        Record_DateTextBox.Text = DateTime.Now
    End Sub

    Protected Sub clearButton_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles clearButton.Click
        Me.VersionTextBox.Text = ""
    End Sub

    Protected Sub addButton_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles addButton.Click
        Dim objConnection As SqlConnection
        Dim objDataCommand As SqlCommand
        Dim ConnectionString As String
        Dim record_date As Date
        Dim version As String
        Dim emagSQL As String

        'save form values in variables
        record_date = Record_DateTextBox.Text
        version = VersionTextBox.Text

        'Create and open the connection
        objConnection = New SqlConnection(ConnectionString)
        objConnection.Open()
        emagSQL = "Insert into E_Magazine (Record_Date, Version) " & _
                  "values('" & record_date & "','" & version & "')"

        'Create and execute the command
        objDataCommand = New SqlCommand(emagSQL, objConnection)
        objDataCommand.ExecuteNonQuery()
        objConnection.Close()

        AddMessage.Text = "A new emagazine was added successfully"
    End Sub
End Class
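The failing conversion is usually in the string that the concatenation produces rather than in the column itself: whether a literal like '30/01/2008' converts depends on the session's date format, while the ISO yyyymmdd form always converts (and passing the date as a command parameter avoids the question entirely). A small T-SQL illustration of that difference:

-- Under mdy session settings a dd/mm/yyyy string is not recognized as a date
SET DATEFORMAT mdy
SELECT ISDATE('30/01/2008') AS AmbiguousFormat,   -- 0
       ISDATE('20080130')   AS IsoFormat          -- 1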
Hi, I would like to convert a dollar amount ($1,500) to its written form, "Fifteen hundred dollars and 00/100 cents", for SQL reporting purposes. Is this possible, and can I incorporate the statement into an existing left outer join query? Thanks in advance, Gavin
I have a field that is currently stored as the data type nvarchar(10), and all of the data in this field is in the format mm/dd/yyyy or NULL. I want to convert this field to the smalldatetime data type. Is this possible? I've tried to use CAST in the following way (rsbirthday is the field name, panelists is the table), but to no avail.
SELECT rsbirthday CAST(rsbirthday AS smalldatetime)
FROM panelists
The error returned is "Incorrect syntax near 'rsbirthday'".
I'm rather new to all things SQL, so I only have the vaguest idea of what I'm actually doing.
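For illustration, a hedged corrected form: the syntax error is simply the missing comma (or AS alias) between the two items in the SELECT list, CONVERT with style 101 makes the mm/dd/yyyy interpretation explicit, and the WHERE clause skips NULLs and unparseable values:

SELECT rsbirthday,
       CONVERT(smalldatetime, rsbirthday, 101) AS rsbirthday_converted
FROM panelists
WHERE rsbirthday IS NOT NULL
  AND ISDATE(rsbirthday) = 1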
CREATE PROCEDURE fts_insert_service_tasks (
    @status_no int output,
    @status_text nvarchar(255) output,
    @fts_employee char(100),
    @fts_SCCode bigint,
    @fts_TaskDescription ntext
)
AS

declare @str_err nvarchar(255)
declare @err_no int

set @err_no = 0

if ( isnumeric(@fts_SCCode) = 0 )
begin
    set @str_err = 'The fts Sccode is not a number'
    set @status_text = @str_err
    set @err_no = @err_no + 1
    return
end

if ( @fts_SCCode = '' )
begin
    set @str_err = 'The fts Sccode can not be null'
    set @status_text = @str_err
    set @err_no = @err_no + 1
    return
end

if ( len(@fts_employee) > 100 )
begin
    set @str_err = 'Maximum Employee length allowed is 100 characters'
    set @status_text = @str_err
    set @err_no = @err_no + 1
    return
end

if ( @fts_employee = '' )
begin
    set @str_err = 'The employee field can not be null'
    set @status_text = @str_err
    set @err_no = @err_no + 1
    return
end

if ( @err_no = 0 )
begin
    INSERT INTO fts_ServiceTasks (fts_employee, fts_Sccode, fts_taskdescription)
    VALUES (@fts_employee, @fts_SCCode, @fts_taskdescription)

    set @status_no = 0
    set @status_text = 'Add Service Task Ok'
end
else
begin
    set @status_no = @err_no
    set @status_text = @str_err
end
GO
and I called it from the ASP
<%
function Add_Service_Task(fts_employee, fts_sccode, fts_TaskDescription)
    cm.ActiveConnection = m_conn
    cm.CommandType = 4
    cm.CommandText = "fts_insert_service_tasks"
    cm.Parameters.refresh
    cm.Parameters(3).Value = fts_employee
    cm.Parameters(4).Value = fts_sccode
    cm.Parameters(5).Value = fts_TaskDescription
    on error resume next
    cm.Execute
    if cm.Parameters(1) = 0 then
        exec_command = cm.Parameters(2).Value
    else
        call obj_utils.ErrMsg(cm.Parameters(2).Value, 3000)
        Response.End
    end if
    if err.number <> 0 then
        call obj_utils.ErrMsg("System error at " & err.number & err.Description & ", please contact the administrator", 5000)
        Response.End
    end if
    Add_Service_Task = exec_command
end function
%>
I tested with SQL Server 2000 on Windows 2003 and it works OK, but with Windows 2000 I get:
Error Type: Microsoft OLE DB Provider for SQL Server (0x80040E30) Type name is invalid. /fmits/classes/cls_servicecall.asp, line 256
update tblPact_2008_0307
set student_dob = '30/01/1996'
where student_rcnumber = 1830

When entering an update date in a format such as ddmmyyyy, I know the date format entered in the SQL query should be mmddyyyy. Is there any way to change the date format entered to ddmmyyyy in the SQL query?
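Two hedged options in T-SQL: either tell the session to read date literals as day/month/year, or convert the literal explicitly with style 103 (dd/mm/yyyy):

-- Option 1: set the session date format before the update
SET DATEFORMAT dmy
update tblPact_2008_0307 set student_dob = '30/01/1996' where student_rcnumber = 1830

-- Option 2: convert the literal explicitly with style 103 (dd/mm/yyyy)
update tblPact_2008_0307
set student_dob = CONVERT(datetime, '30/01/1996', 103)
where student_rcnumber = 1830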
I'm using SQL Server RS 2005 and have a report where we want to run a different stored procedure depending on whether a condition is true. I've set my command type to stored procedure and can type in the name of a stored procedure. If I type in just one stored procedure's name, it runs fine. But if I try to use =IIF(check condition, if true run stored proc 1, if false run stored proc 2), then the exclamation (run) button is greyed out. Does anyone know how I can do this? Thanks.
The table in SQL has a column Availability decimal(8,8).
Code in C# using SqlBulkCopy is trying to insert values like 0.0000, 0.9999, and 29.999 into the Availability field. We tried the data type float, but it converts values to scientific notation (e.g. 8E-05) and the values displayed in reports are then in scientific notation, which is not what we want; we need to store the values as-is.
Error: base {System.SystemException} = {"The given value of type SqlDecimal from the data source cannot be converted to type decimal of the specified target column."}
"System.InvalidOperationException: The given value of type SqlDecimal from the data source cannot be converted to type decimal of the specified target column. ---> System.InvalidOperationException: The given value of type SqlDecimal from the data source cannot be converted to type decimal of the specified target column. ---> System.ArgumentException: Parameter value '1.0000' is out of range. --- End of inner exception stack trace --- at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata) --- End of inner exception stack trace --- at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata) at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal() at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount) at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table, DataRowState rowState) at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table) at MS.Internal.MS COM.AggregateRealTimeDataToSQL.SqlHelper.InsertDataIntoAppServerAvailPerMinute(String data, String appName, Int32 dateID, Int32 timeID) in C:\VSTS\MXPS Shared Services\RealTimeMonitoring\AggregateRealTimeDataToSQL\SQLHelper.cs:line 269"
Code in C#:

SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConnection, SqlBulkCopyOptions.Default);
DataRow dr;
DataTable dt = new DataTable();
DataColumn dc;

try
{
    dc = dt.Columns.Add("Availability", typeof(decimal));
    ...
    dr["Availability"] = Convert.ToDecimal(s[2]);   // I also tried SqlDecimal
    ...
During our DR drill we found that the same code, which used to run perfectly fine in our primary data centre, is running very slowly on the disaster recovery DB server, and there are lots of open transactions with a sleeping status and a wait type of 'AWAITING COMMAND'. CPU, memory, and disk utilization are good. The ping reply time between the app server and the DB server is well within limits, there is no blocking, and nothing is reported in the error logs. We are using SQL Server 2014 Standard 64-bit on Windows Server 2012.
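A hedged diagnostic sketch for looking at those sleeping sessions and their open transactions, using standard DMVs rather than anything environment-specific:

-- Sessions with open transactions, their last statement and timing information
SELECT s.session_id,
       s.status,
       s.host_name,
       s.program_name,
       s.open_transaction_count,
       s.last_request_start_time,
       t.text AS last_sql_text
FROM sys.dm_exec_sessions AS s
LEFT JOIN sys.dm_exec_connections AS c ON c.session_id = s.session_id
OUTER APPLY sys.dm_exec_sql_text(c.most_recent_sql_handle) AS t
WHERE s.is_user_process = 1
  AND s.open_transaction_count > 0
ORDER BY s.last_request_start_time;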