Hi
I was told that using DTS will allow me to schedule stored procedures to keep a SQL database up to date. For example, if a user registers but does not activate the registration, his details will be removed by a stored procedure which is scheduled to run every 24 hours. I used to use the global.asax file to fire an update: using a file containing the date of the last update, and by adding 24 hours to it, it would execute an SP to delete unwanted data.
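For reference, the kind of cleanup procedure I mean might look like this (a sketch; the table and column names are just examples):

CREATE PROCEDURE dbo.usp_PurgeUnactivatedRegistrations
AS
BEGIN
    -- Remove registrations that were never activated within 24 hours
    DELETE FROM dbo.Registration
    WHERE Activated = 0
      AND RegisteredOn < DATEADD(hh, -24, GETDATE());
END
GO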
I have tried to install DTS with no success. I am running the following:
Visual Web Studio Express
SQL Server 2005 Express (from SQLEXPR.exe), and I have told it to install all the extra components
Installed SQLEXPR_Toolkit.exe with all its options
Installed SQLServer2005_DTS.MSI
When I go into the SQL Server using MS SQL Server Management Studio Express, I cannot see the Data Transformation Services node. I have also just installed server reports, which I had no problems installing.
Can somebody please help me?
My vendor requires data to be sent in Excel format. Some of my tables have over 65,536 rows, so I need to use Excel 2007 (max of 1,048,576). Right now my data sits in SQL 2000. I am using MS SQL Enterprise Manager 8.0 to prepare the data. Is there some kind of add-on or selection I am missing to use DTS to export from SQL to Excel 2007? Thanks in advance.
I'm using a DTS package to transfer data from a txt file to a database (Bulk Insert).
The Bulk Insert task provides an efficient way to copy large amounts of data into a SQL Server table or view. It seems that the Bulk Insert task supports only OLE DB connections for the destination database, but I want to use SQL Server authentication, and the OLE DB connection appears to require Windows authentication.
So can the bulk insert be done using SQL Server authentication? If yes, then please help me.
I have given the code snippet below.
Code Sample:
Dim oPackage As New DTS.Package2()
Dim oConnection As DTS.Connection
Dim oStep As DTS.Step2
Dim oTask As DTS.Task
Dim oCustomTask As DTS.BulkInsertTask

Try
    oConnection = oPackage.Connections.New("SQLOLEDB")
    oStep = oPackage.Steps.New
    oTask = oPackage.Tasks.New("DTSBulkInsertTask")
    oCustomTask = oTask.CustomTask

    With oConnection
        .Catalog = "pubs"
        .DataSource = "(local)"
        .ID = 1
        .UseTrustedConnection = True
        .UserID = "Tony Patton"
        .Password = "Builder"
    End With

    oPackage.Connections.Add(oConnection)
    oConnection = Nothing

    With oStep
        .Name = "GenericPkgStep"
        .ExecuteInMainThread = True
    End With

    With oCustomTask
        .Name = "GenericPkgTask"
        .DataFile = "c:\dts\authors.txt"
        .ConnectionID = 1
        .DestinationTableName = "pubs..authors"
        .FieldTerminator = "|"
        .RowTerminator = " "
    End With

    oStep.TaskName = oCustomTask.Name

    With oPackage
        .Steps.Add(oStep)
        .Tasks.Add(oTask)
        .FailOnError = True
    End With

    oPackage.Execute()
Catch ex As Exception
    MsgBox("Error: " & CStr(Err.Number) & vbCrLf & _
           Err.Description, vbExclamation, oPackage.Name)
Finally
    oConnection = Nothing
    oCustomTask = Nothing
    oTask = Nothing
    oStep = Nothing
    If Not (oPackage Is Nothing) Then
        oPackage.UnInitialize()
    End If
End Try
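If SQL Server authentication is in fact supported here, I assume the connection block would change to something like this sketch (the login and password are placeholders):

With oConnection
    .Catalog = "pubs"
    .DataSource = "(local)"
    .ID = 1
    ' SQL Server authentication: turn off the trusted connection
    ' and supply a SQL login instead (placeholders below)
    .UseTrustedConnection = False
    .UserID = "sqlLogin"
    .Password = "sqlPassword"
End With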
I am working on an SSIS package and I have to do a data transformation, but this one is a bit tricky. For example:
7/6/2015 is my date and when I apply SUBSTRING(ColumnName,5,4) + "-" + SUBSTRING(ColumnName,3,1) + "-" + SUBSTRING(ColumnName, 1,1)
I get 2015-6-7 which is good
But then when the date is like 10/12/2015, I would have to modify my code.
My question is: is there anything I can add to the code to make the single digits have a zero before them, so I can use the same code throughout? My updated code would then be
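perhaps something like this sketch (FINDSTRING locates the slashes and RIGHT zero-pads; it assumes the dates always arrive as day/month/year separated by slashes):

SUBSTRING(ColumnName, FINDSTRING(ColumnName, "/", 2) + 1, 4) + "-" +
RIGHT("0" + SUBSTRING(ColumnName, FINDSTRING(ColumnName, "/", 1) + 1,
    FINDSTRING(ColumnName, "/", 2) - FINDSTRING(ColumnName, "/", 1) - 1), 2) + "-" +
RIGHT("0" + SUBSTRING(ColumnName, 1, FINDSTRING(ColumnName, "/", 1) - 1), 2)

For "7/6/2015" this yields "2015-06-07", and for "10/12/2015" it yields "2015-12-10", so the same expression covers one- and two-digit days and months.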
Hi there, I have never used DTS before. Now I am reading a textbook for a special requirement involving DTS, but the textbook does not say very much about the details. It seems the easy way to finish this task is the DTS wizard, but my requirement seemingly can't be done by the DTS wizard. Here are my requirements (move the online database to an offline database):
1. The data retention time has to reference each firm's history data backup time (for example, company A preserves data for 6 months, company B preserves data for 12 months, and so on).
2. We will have only two kinds of retention time: one is 6 months, the other is 12 months.
3. The online DB only keeps 6 months of data (for example, when we run the DTS on 11/1, we will only keep the data from 5/1~10/31); all data except the past 6 months has to move to the offline DB.
4. After the data movement is finished, we have to reference the history data retention time to delete data.
These requirements look very difficult to me because I have never used DTS before. Can you please give me a simple example or maybe some article I can reference?
From SQL Server 2014, using SQL Server Data Tools for Visual Studio - BI, I'm trying to edit a Script Component within an SSIS Data Flow Task. The 'Edit Script...' button is enabled and turns a nice shade of blue when moused over, but a click has no effect. Perhaps I'm missing a component of VSTA? Everything else seems to work correctly. What might I be missing?
I am importing the values for field Atype from a .csv file as DT_STR, 13 and I need to fit them into a bit type CType field.
When I write the conditional split ((ISNULL(Atype)?"a":Atype)!=(ISNULL(CType)?"9":CType)), it says that the DT_WSTR and DT_I4 types are incompatible and that I need to explicitly cast with a cast operator. I haven't been able to make it work. How do I explicitly cast?
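For what it's worth, the cast operator is a data type in parentheses placed in front of the operand; a sketch of the condition with CType explicitly cast to a string (assuming CType is the non-string side of the comparison):

(ISNULL(Atype) ? "a" : Atype) != (ISNULL(CType) ? "9" : (DT_WSTR, 13)CType)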
I am running a DTS package in SQL Server 2005 Management Studio, from Management > Legacy > Data Transformation Services.
Once the DTS package has run, I get this error message: "Error Source: Microsoft Data Transformation Services (DTS) Package. Error Description: Error accessing Windows Event Log."
In my package I am using a lookup to get new and similar records. I want to filter the rows for the Lookup reference data set by using a variable value.
I have created a variable @[User::CustId] with the Int32 data type and a default value of 2. When I try to evaluate the query below, I get an error:
"select CustId,PartNm,LocId,LocTyp from loc where CustId= "+ @[User::CustId]
Error: The data types "DT_WSTR" and "DT_I4" are incompatible for binary operator "+". The operand types could not be implicitly cast into compatible types for the operation. To perform this operation, one or both operands need to be explicitly cast with a cast operator.
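A sketch of the usual fix, casting the Int32 variable to a string inside the expression:

"select CustId,PartNm,LocId,LocTyp from loc where CustId = " + (DT_WSTR, 10)@[User::CustId]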
We are using the cache transformation in our project. While doing the cache transformation, our disk space goes to 0 MB free and the SSIS package execution does not complete even after 3 hours. Initially we have around 34 GB of free space on the C: drive. Our server has 64 GB of RAM. We are caching the data from a table which contains around 21 million records. We changed the paths in the properties ("BLOBTempStoragePath", "BufferTempStoragePath") of the Data Flow task of the SSIS package in which we are using the cache transformation.
I have the following 2 fields that are sourced from an Excel spreadsheet:
DocNumber - a 10-digit number
PostingRow - a number between 1 and 999
I would like to produce a new column that is a concatenation of these two fields, but PostingRow needs to be a three-digit number, e.g. 1000256153-001.
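A sketch of a Derived Column expression that might produce this (it assumes both columns arrive as numbers; if they are already strings, the casts can be dropped):

(DT_WSTR, 10)DocNumber + "-" + RIGHT("00" + (DT_WSTR, 3)PostingRow, 3)

For DocNumber 1000256153 and PostingRow 1, this yields "1000256153-001".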
In SSIS I use the DQS Cleansing transformation component. I've got a knowledge base (KB) in place, and this KB holds various domains. My data source has more input columns than I would like to use for a particular clean-up operation; I want to map only some of the input columns against some domains in the KB. It is my understanding that it should be possible to select only the required input columns, but all I can do is select all input columns.
Is it possible to parameterize the connection of a Lookup transformation task, specifically the table/view name? I would like to be able to dynamically set the table that the Lookup transformation is connecting to at runtime. I've looked into the "Use results of an SQL query" option on the connection screen (which correlates to the "SqlCommand" property), but I'm unable to pass in a parameter this way. I've also looked into SqlCommandParam, but that doesn't allow me to use a parameter in the "FROM" clause of the SQL syntax.
I am using SSIS in SQL Server Enterprise 2005. I have two OLE DB data sources from two disparate databases (IBM DB2 and Microsoft SQL Server), some columns from each of which are to be included in the merged output results. I have noted the various requirements in the forum postings with regard to sorting the OLE DB sources and specifying the output source columns as sorted, as well as the requirement that the join fields in the two sources be close/exact matches. Yet, when I run this in VS, while the work area reflects the expected number of rows being input into the Merge Join transformation, no count is reflected as output from that transformation into the final destination table. Specifically, my two data sources (IBM DB2 and MS SQL) are configured as follows:
The IBM DB2 source contains an SQL statement that uses Cast operations to create the result columns, and an ORDER BY clause to ensure that the output is sorted by the desired two columns. The OLE DB source property IsSorted is set to true; the Output Columns folder column definitions for "key_source_dtsy" and "key_source_dtrt" have their SortKeyPosition properties set to 1 and 2, respectively. Those fields are both defined as data type DT_STR, with lengths of 4 and 2, respectively. Below is the path metadata from the Data Flow Path editor for the path from this source:
IBM DB2 source:
Name             Data Type  Precision  Scale  Length  Code Page  Sort Key Position  Comparison Flags  Source Component
ID_CODE          DT_STR     0          0      10      1252       0                  ""                Source F0005 User Defined Codes
CODE_DESCR_1     DT_STR     0          0      30      1252       0                  ""                Source F0005 User Defined Codes
CODE_DESCR_2     DT_STR     0          0      30      1252       0                  ""                Source F0005 User Defined Codes
key_source_dtsy  DT_STR     0          0      4       1252       1                  ""                Source F0005 User Defined Codes
key_source_dtrt  DT_STR     0          0      2       1252       2                  ""                Source F0005 User Defined Codes
The MS SQL source contains an SQL statement that takes the columns as they are in the MS SQL table (no Cast operations needed); it also uses an ORDER BY clause to ensure the output is sorted by the join columns. The OLE DB source property IsSorted is set to true; the Output Columns folder columns for "key_source_dtsy" and "key_source_dtrt" have their SortKeyPosition properties set to 1 and 2, respectively. Those fields are both defined as data type DT_STR, with lengths of 4 and 2, respectively. Below is the path metadata from the Data Flow Path editor for the path from this source:
MS SQL source:
Name             Data Type  Precision  Scale  Length  Code Page  Sort Key Position  Comparison Flags  Source Component
id_code_name     DT_I2      0          0      0       0          0                  ""                Source CodeName in db dwVdFY
key_source_dtsy  DT_STR     0          0      4       1252       1                  ""                Source CodeName in db dwVdFY
key_source_dtrt  DT_STR     0          0      2       1252       2                  ""                Source CodeName in db dwVdFY
The Merge Join transformation specifies an INNER JOIN using the columns named "key_source_dtsy" and "key_source_dtrt" from the respective data sources. I know there are alternative ways of accomplishing my intent (Lookup, porting the MS SQL table to IBM DB2 so the join can occur in a SELECT statement, etc.); however, I'd like to use this functionality and assume that it should work.
If you have two synchronous transformation components and the input of the second is connected to the output of the first, does the first transformation process (loop through) all rows in the buffer before outputting these rows to the second transformation? Or does the first transformation output each individual row to the second transformation as soon as it has finished processing it?
My source has 2.2 million records. I'm performing an incremental load. In the Lookup transformation I used the destination table as the reference, using full cache mode. The first time, the package executed successfully, but when I executed the package a second time, it suddenly hung while running. Then I truncated the data in the destination table and restarted the SQL Server services. After doing all this I executed the package again and it worked, but when I executed the package a second time, again the package hung. I have a laptop with 8 GB RAM and an i5 2.5 GHz processor.
We are transferring data between AS/400 and SQL Server 7.0 using DTS. Some of these transfers may need to be very close to real time. It doesn't seem like a continuously running job is the best solution for that.
Do you know any tools or utilities that can help us to move the data?
name  type        amount
====  ==========  ======
mary  saving      123.00
mary  chequing    246.00
mary  investment  135.00
john  saving      678.00
john  chequing    987.00
john  investment    0.00
What should I do to present the data in the following format?
name  saving  chequing  investment
====  ======  ========  ==========
mary  123.00    246.00      135.00
john  678.00    987.00        0.00
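One way to get there (a sketch in plain T-SQL, assuming the rows live in a table called account) is a CASE-based cross-tab, which works even on older SQL Server versions:

SELECT name,
       SUM(CASE WHEN type = 'saving'     THEN amount ELSE 0 END) AS saving,
       SUM(CASE WHEN type = 'chequing'   THEN amount ELSE 0 END) AS chequing,
       SUM(CASE WHEN type = 'investment' THEN amount ELSE 0 END) AS investment
FROM account
GROUP BY name;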
I have an Access database that I imported into SQL Server 2000, and that worked great, but now I have to get it into 2005. My question is: how can I get the tables and all the info in the tables into a SQL script, so I can run that script on the 2005 server?
The SQL 2000 is on my dev server and I have all the tools (Enterprise Manager, Query Analyzer, etc.), but the 2005 server is GoDaddy's, and they only have the basic web interface. I can run SQL files and create databases and tables, but that's about it.
I have begun using SSIS and I am a little taken aback by its complexity, especially since I just want to do a simple data transformation such as in DTS. Are there any tutorials for data transformation with SSIS on the web or this forum? And what if I want to do a simple transformation from Access to SQL Server?
I tried transferring data from one server to another using DTS, and I got the error below. This is the full description of the error dialog:

Error Source: Pump Data Step
Details: The Data Transformation Services cannot copy or transform data from a Desktop or server to a standard, Enterprise, or small business server version of SQL Server unless your destination server is in per user licensing mode.
Source: Microsoft Data Transformation Services (DTS) Package
Code: 0x80040428
Description: Package failed because step 'Pump data step' failed.
Error Message: IDispatch error #552
I'm running a DTS package on SQL Server. The source is MS Access and the target is Oracle. On a "Drop Table" command the process just hangs. There are no foreign keys on the table. Several tables have already been processed successfully by this time. I think I've ruled out corruption by dropping and recreating the target database on Oracle. Any ideas?
M Man
I have a lookup transformation that retrieves a key for a certain column of values, in this case, a name. So, I go into the lookup table with a name and come out with its key. I had it working, and then I added new entries to the lookup table for a bunch of new names. Now, for some reason, I am not getting matches for the new names. But I am still getting the matches for the names that existed before I added the new ones.
I'm wondering if the lookup transformation is using the old set of data and somehow not picking up the new names. Do I have to trigger something in the lookup transformation to let it know that the lookup table data has changed?
I have a student table that needs some clean-up. My first task is to remove all the periods (.) from the middle name column. Some people have two-character middle initials with two periods. Can this be done via an SSIS package (I'm sure it can, but I don't know how)? I can't think of a simple update statement that would accomplish the same thing.
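For what it's worth, REPLACE may be the simple update statement in question (a sketch; the table and column names are assumptions):

UPDATE dbo.Student
SET MiddleName = REPLACE(MiddleName, '.', '')
WHERE MiddleName LIKE '%.%';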
I am developing a custom transformation component where I am building a custom object, and I want it to be transferred from the component UI to the component. I explored this issue and learned that we can make use of the SaveToXML and LoadFromXML methods of the IDTSPersist90 interface. The problem is that I have not been able to make use of this interface. If anybody has faced the same issue and found a solution, please let me know.
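In managed code, the usual counterpart of IDTSPersist90 is the IDTSComponentPersist interface; below is a minimal sketch under that assumption (the CustomState node and Setting attribute are hypothetical stand-ins for the custom object's state):

using System.Xml;
using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

public class MyTransform : PipelineComponent, IDTSComponentPersist
{
    // Stand-in for the custom object's state (hypothetical)
    private string customSetting = "";

    public void SaveToXML(XmlDocument doc, IDTSInfoEvents events)
    {
        // Write the custom state as the component's persisted XML node
        XmlElement node = doc.CreateElement("CustomState");
        node.SetAttribute("Setting", customSetting);
        doc.AppendChild(node);
    }

    public void LoadFromXML(XmlElement node, IDTSInfoEvents events)
    {
        // Read the custom state back when the package loads
        if (node.Name == "CustomState")
        {
            customSetting = node.GetAttribute("Setting");
        }
    }
}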
I am having a seemingly simple difficulty with a transformation component that I have created.
I followed all of the relevant documentation to create the component, UI and such, but when I run a program that transfers data through my component (without it doing anything to the data) to a flat file destination, all I get in the flat file is ,,,,,,,,,,
This means that the data is not actually being passed through my component to the destination, though it does recognize the correct number of columns. I get this warning for each column: [DTS.Pipeline] Warning: The output column "PurchaseOrderDetailID" (20) on output "OLE DB Source Output" (11) and component "Source - PurchaseOrderDetail" (1) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.
To fix this issue, all I need to do is "check" the boxes in the advanced editor for my component. However, I want to be able to have these boxes "checked" automatically.
My question is, when you "check" a box next to a column name in the advanced editor, what does that exactly do that allows you to transfer the data? What do I need to program in order for it to replicate that? I actually want it to happen automatically, so that by default all data in all columns is transferred through my component out the other side. I just need to know how to do this in code, not how to replicate the UI.
My component, I thought, did that automatically, as I specified everything that I thought was required based on all of the documentation I read. Obviously, I am missing something. Here are the methods that I believe would all be involved.
Please let me know what I am missing.
int[] inputColumnBufferIndexes;   // populated in PreExecute
int[] outputColumnBufferIndexes;  // populated in PreExecute
PipelineBuffer outputBuffer;      // used in ProcessInput

public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
    //base.ProcessInput(inputID, buffer);
    if (!buffer.EndOfRowset)
    {
        IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
        while (buffer.NextRow())
        {
            // TODO: Examine the columns in the current row.
            // Add a row to the output buffer.
            outputBuffer.AddRow();
            for (int x = 0; x < inputColumnBufferIndexes.Length; x++)
            {
                // Copy the data from the input buffer column to the output buffer column.
                outputBuffer[outputColumnBufferIndexes[x]] = buffer[inputColumnBufferIndexes[x]];
            }
        }
    }
    else
    {
        // EndOfRowset on the input buffer is true.
        // Set EndOfRowset on the output buffer.
        outputBuffer.SetEndOfRowset();
    }
}
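On the "checked box" question: checking a column in the advanced editor ends up calling SetUsageType on the virtual input, which maps that upstream column into the component's input column collection. A sketch of doing the same automatically when the path is attached (assuming a managed component derived from PipelineComponent, using the SSIS 2005 wrapper names):

public override void OnInputPathAttached(int inputID)
{
    base.OnInputPathAttached(inputID);

    IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
    IDTSVirtualInput90 vInput = input.GetVirtualInput();

    // Select every upstream column -- the code equivalent of checking
    // each box in the advanced editor.
    for (int i = 0; i < vInput.VirtualInputColumnCollection.Count; i++)
    {
        IDTSVirtualInputColumn90 vColumn = vInput.VirtualInputColumnCollection[i];
        SetUsageType(inputID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
    }
}

The output columns and the buffer index mapping built in PreExecute still have to be created to match, but this is the piece that replicates the check-box behavior.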
I am trying to use a CTE in an OLE DB Command data flow transformation object. However, when I enter the CTE and its corresponding query in the SqlCommand field of the OLE DB Command editor dialog, I get a syntax error. Can CTEs be used in data flow objects? I have been able to use them in an Execute SQL control flow item, but not in any data flow item.