How To Add New Values In SQL 2000 DTS Enumeration Task
Apr 10, 2008
Hi. I am using SQL 2000 and a DTS package to transfer data between two databases. In my DTS package, I need to create an enumeration task with an enumeration data task. However, when I tried to add enumeration values, I could not find any way to create new values, either by right-clicking to select from the drop-down menu or by using Control + New in the enumeration properties. Could anyone point out how I can add new values to the DTS Enumeration type?
throw new Exception("Record " + intCount + " cannot be imported : ", ex);
}
}
I did a search on MSDN, and I found the options available are:
Code Snippet
Member name - Description
Insensitive - The ResultSet does not detect changes made to the data source.
None - No ResultSet options are specified.
Scrollable - The ResultSet can be scrolled both forward and backward.
Sensitive - The ResultSet detects changes made to the data source.
Updatable - The ResultSet allows updates.
SqlCeResultSet.Scrollable Property : True if the ResultSet is scrollable; otherwise, false.
SqlCeResultSet.Updatable Property : True if the values in the record can be modified; otherwise, false.
SqlCeResultSet.Sensitivity Property :
The sensitivity of the ResultSet indicates whether the ResultSet is aware of changes to the data source. A ResultSet that is sensitive is aware of changes; an insensitive ResultSet is unaware of changes. If no sensitivity is set, the ResultSet is asensitive and will use the optimal configuration based on other settings. The default value is asensitive.
Since I only insert, does that mean I don't need "Scrollable" (I think an insert is merely a forward scroll)?
But what about "Updatable"? Which of SIUD (Select/Insert/Update/Delete) needs to use "Updatable"?
I planned to write a VB.NET SMO app that enumerates all tables in a given database, all fields in each table, and the properties (length, NULL, PK, etc.) of each field, and finally stores the result in a table. I think that should be quite doable.
But I noticed that the ForEach Loop editor has a SMO enumerator. (I don't know if it can be used to do exactly what I want.) By clicking on the options in the ForEach Loop editor I can get the Enumerator as follows:
Database[@Name='AdventureWorks']/Table[@Name='AWBuildVersion' and @Schema='dbo']/SMOEnumObj[@Name='Columns']/SMOEnumType[@Name='Names']
but it's not clear to me how to work with this thing! (It must return a collection of items that I access through a variable.) Does anyone know of an example illustrating SMO enumeration?
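Not a ForEach Loop answer, but as a point of comparison, here is a rough sketch (C#) of the plain SMO enumeration the first paragraph plans. The server name is an assumption, the database name is taken from the enumerator string above, and the Microsoft.SqlServer.Smo and Microsoft.SqlServer.ConnectionInfo assemblies need to be referenced.
Code Snippet
using System;
using Microsoft.SqlServer.Management.Smo;

class DumpColumns
{
    static void Main()
    {
        // Assumed local default instance and the database named in the enumerator string above.
        Server server = new Server("localhost");
        Database db = server.Databases["AdventureWorks"];

        foreach (Table table in db.Tables)
        {
            if (table.IsSystemObject) continue;
            foreach (Column col in table.Columns)
            {
                Console.WriteLine("{0}.{1}.{2}  type={3}({4})  nullable={5}  inPK={6}",
                    table.Schema, table.Name, col.Name,
                    col.DataType.Name, col.DataType.MaximumLength,
                    col.Nullable, col.InPrimaryKey);
            }
        }
    }
}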
I am trying to move files from one directory to another using the For Each Loop Container and a File System Task. However, on the FIRST iteration of the Foreach Loop, the variable that I am mapping to index 0 of the Foreach Loop returns a valid directory with no file (path only) from who knows where.
On the second iteration of the Foreach Loop everything works as expected: I get the full path and file name, and it iterates through all the files in the directory OK.
So I have had to put in a hack to skip the first iteration of the loop and then execute the File System Task on the second iteration, but I don't see anyone else doing this.
Where in the heck is it getting this odd directory from on the first iteration (it's not my user variable, because I am initializing it to \nowhereofile)? What am I doing wrong?
For the first time I'm testing this task and surprisingly, when I try the "Edit Package" option, I get:
1) The DTS host failed to load or save the package properly
2) The selected package cannot be opened
3) Error HRESULT E_FAIL has been returned from a call to a COM component
But after these messages you can see all the tasks, except they have no names!
It seems as if the RCW mechanism between managed and unmanaged code has partially failed.
I don't dare to do anything more; I don't know whether that package is properly loaded or not from there.
Hi, this is my save procedure. Please check it and give me some advice: is the looping needed or not? I get the error "Collection was modified; enumeration operation may not execute". Please help me.
--------------------------------------------------------------------------------
Private Sub btnSave_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles btnSave.Click
    Dim con As SqlConnection
    Dim cmd As SqlCommand
    Dim da As SqlDataAdapter
    Dim userid As String
    Try
        userid = Request.QueryString("userid")
        con = New SqlConnection(ConfigurationSettings.AppSettings("strcon"))
        ' con.ConnectionString = ConfigurationSettings.AppSettings("strcon")
        con.Open()
        cmd = New SqlCommand("dbo.sp_AddAns", con)
        cmd.CommandType = CommandType.StoredProcedure
        For Each Item As DataListItem In dlAQ.Items
            Dim paramid As New SqlParameter("@id", SqlDbType.Int, 4)
            paramid.Value = userid.Trim
            cmd.Parameters.Add(paramid)
            Dim paramans As New SqlParameter("@proansw", SqlDbType.NVarChar, 50)
            Dim txtbox As New TextBox
            txtbox = CType(Item.FindControl("txtAns"), TextBox)
            paramans.Value = txtbox.Text.Trim
            cmd.Parameters.Add(paramans)
            Dim paramprodesc As New SqlParameter("@prodesc", SqlDbType.NVarChar, 50)
            Dim lbldesc1 As New Label
            lbldesc1 = CType(Item.FindControl("lbldesc"), Label)
            paramprodesc.Value = lbldesc1.Text.Trim
            cmd.Parameters.Add(paramprodesc)
            Dim paramproid As New SqlParameter("@proid", SqlDbType.Int, 4)
            Dim lblproid1 As New Label
            lblproid1 = CType(Item.FindControl("lblproid"), Label)
            paramproid.Value = lblproid1.Text.Trim
            cmd.Parameters.Add(paramproid)
            Dim paramreso As New SqlParameter("@proreso", SqlDbType.NVarChar, 50)
            Dim lblreso1 As New Label
            lblreso1 = CType(Item.FindControl("lblreso"), Label)
            paramreso.Value = lblreso1.Text.Trim
            cmd.Parameters.Add(paramreso)
            Dim paramchk As New SqlParameter("@chk", SqlDbType.Int, 4)
            paramchk.Value = "2"
            cmd.Parameters.Add(paramchk)
            'Dim rowaffected As Integer
            cmd.ExecuteNonQuery()
            bindData()
OK, a new package, with a Foreach container enumerating CSV files in a directory.
I create the container pointing it at the directory and retrieving the fully qualified name, and create a variable (called 'CSVFiles') with a package scope, but no value.
Inside the container is a bulk insert task. The destination db/table is set, and the input flat file connection manager for the CSV files is defined with the connection string set to the variable created above.
As it iterates through the files, the variable is correctly set to the next file in the directory (I put a message box in the stream to display the file name/variable). It resembles 'C:\temp\Location1.csv'.
But when it gets to the bulk insert, I get this error message:
[Bulk Insert Task] Error: The specified connection "CSVFiles" is either not valid, or points to an invalid object. To continue, specify a valid connection.
What's going on here? Can I not use a Bulk Insert task in the container? Or does some other parameter need to be set?
We migrate data from a legacy system to a new system using SSIS. The primary key of the legacy system is a user-defined SQL Server type which holds alpha-numeric values. The primary key of the new system is a bigint (sequential numbers).
When we migrate data, we generate a sequential number for each legacy key (the primary key of the legacy data) and insert the data into the new system's tables. The newly generated sequential numbers and the legacy keys are persisted in an intermediate table for lookup operations on child tables.
We are facing a problem when we try to migrate tables which have self-referencing columns. For example, a table called Employee has a column ManagerKey which refers to the Key column of the Employee table. We are stuck on defining data flow tasks to replace the legacy ManagerKey column values with the new (sequential) values generated during the migration process.
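One conceptual approach (not tied to any particular data flow task) is a second pass over Employee once all rows have been inserted and the legacy-to-new key map exists: resolve each legacy ManagerKey through the map and update the new row. A rough C# sketch of that lookup step, with the key map assumed to be loaded from the intermediate table described above and the sample keys invented:
Code Snippet
using System;
using System.Collections.Generic;

class ManagerKeyFixup
{
    static void Main()
    {
        // keyMap: legacy key -> new sequential key, as persisted in the intermediate lookup table.
        var keyMap = new Dictionary<string, long>
        {
            { "EMP-A1", 1 },
            { "EMP-B2", 2 }
        };

        // (legacy Key, legacy ManagerKey) pairs read from the legacy Employee table;
        // a null ManagerKey means the employee has no manager.
        var legacyEmployees = new List<KeyValuePair<string, string>>
        {
            new KeyValuePair<string, string>("EMP-A1", null),
            new KeyValuePair<string, string>("EMP-B2", "EMP-A1")
        };

        foreach (var emp in legacyEmployees)
        {
            long newKey = keyMap[emp.Key];
            long? newManagerKey = emp.Value != null && keyMap.ContainsKey(emp.Value)
                ? (long?)keyMap[emp.Value]
                : null;

            // In the real migration this pair would drive an UPDATE of Employee.ManagerKey,
            // keyed on the new surrogate key.
            Console.WriteLine("Employee {0}: ManagerKey -> {1}", newKey,
                newManagerKey.HasValue ? newManagerKey.Value.ToString() : "NULL");
        }
    }
}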
I have a task I wrote which does not always update the property value (as seen in the Properties pane). Basically, I change something on the form, then update the task host property with: this.taskHostValue.Properties["Duration"].SetValue(this.taskHostValue, Convert.ToInt32(spnDuration.Value));
Stepping through this, it does exactly what it is supposed to. Having a look at the property value, it confirms it has changed. Reopening the UI and resetting all the controls returns the expected results.
The package however does not realise it has changed. There is no * next to the package name in the top tabs. As long as the package thinks it is unchanged, SaveXML does not get called either so the tasks do not persist.
Changing the value on the properties pane works fine though.
The frustrating thing is this is slightly random. Slight in the sense that sometimes it works but most of the time it does not.
The sample code I used was the MS IncrementTask download (which works, BTW), so I can't see it being a VS / SSIS bug but rather something I am / am not doing. Three tasks I have written all behave the same way. I have to "nudge" them before saving the package.
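I can't say this is the cause, but one thing that may be worth looking at: in designer-hosted UIs, the usual way to make the host notice a programmatic property change is to raise it through IComponentChangeService. Whether the SSIS designer actually exposes that service to a task UI through the IServiceProvider it hands to Initialize is an assumption here, so treat this as a sketch rather than a fix:
Code Snippet
using System;
using System.ComponentModel.Design;

public static class DesignerDirtyHelper
{
    // serviceProvider: the IServiceProvider handed to the task UI's Initialize call.
    // component: the TaskHost whose property was just changed in code.
    public static void NotifyChanged(IServiceProvider serviceProvider, object component)
    {
        var changeService =
            serviceProvider.GetService(typeof(IComponentChangeService)) as IComponentChangeService;
        if (changeService != null)
        {
            // Raising ComponentChanged is what normally marks the designer document dirty.
            changeService.OnComponentChanged(component, null, null, null);
        }
    }
}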
I have one package that contains an Execute SQL Task in which I have placed a stored procedure. Now I want to pass values to the stored procedure's parameters dynamically, from a database table. For this I am trying to use a Script Task. How can I pass those table column values to the stored procedure using the Script Task?
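A rough sketch of the logic, shown as a standalone C# console program (a SQL Server 2005 Script Task itself would be written in VB.NET, but the ADO.NET calls are the same). All object names here - the connection string, dbo.ProcParameters, dbo.MyStoredProcedure and the parameter names - are placeholders, not taken from the post:
Code Snippet
using System.Data;
using System.Data.SqlClient;

class RunProcFromTable
{
    static void Main()
    {
        const string connStr = @"Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI;";
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // 1) Read the parameter values stored in a table.
            object param1, param2;
            using (var read = new SqlCommand(
                "SELECT TOP 1 Param1Value, Param2Value FROM dbo.ProcParameters", conn))
            using (SqlDataReader r = read.ExecuteReader())
            {
                r.Read();
                param1 = r["Param1Value"];
                param2 = r["Param2Value"];
            }

            // 2) Pass them to the stored procedure.
            using (var exec = new SqlCommand("dbo.MyStoredProcedure", conn))
            {
                exec.CommandType = CommandType.StoredProcedure;
                exec.Parameters.AddWithValue("@Param1", param1);
                exec.Parameters.AddWithValue("@Param2", param2);
                exec.ExecuteNonQuery();
            }
        }
    }
}
An Execute SQL Task with parameter mappings from package variables is another way to get a similar effect without writing a script at all.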
I have a data transform from a flat-file to a SQL server database. Some of the flat-file fields have NULL values. The SQL table I'm importing into does not allow NULL values in any field, but each field has a Default value specified.
I need it so that if a null value comes across in a field during the data transform, the field takes the table default on import. I could have sworn I had this working a few days ago, but now I get errors stating that I'm violating table constraints. Has anyone done this before?
I have an object variable named "UserList" which gets populated through an Execute SQL Task (stored procedure). I wish to concatenate all the values in "UserList" using an SSIS Script Task, in VB. The output should look like (User1,User2,User3,...).
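A sketch of the usual pattern (shown in C#; the equivalent calls exist in VB.NET): fill a DataTable from the ADO recordset held in the Object variable, then join the first column. The assumption that the wanted value is in the first column, and any variable names mentioned in the comments, are mine rather than the post's:
Code Snippet
using System.Collections.Generic;
using System.Data;
using System.Data.OleDb;

static class UserListHelper
{
    // recordset: the value of the SSIS Object variable,
    // e.g. Dts.Variables("User::UserList").Value inside the Script Task.
    public static string Concatenate(object recordset)
    {
        var adapter = new OleDbDataAdapter();
        var table = new DataTable();
        adapter.Fill(table, recordset);        // loads the full result set into the DataTable

        var names = new List<string>();
        foreach (DataRow row in table.Rows)
        {
            names.Add(row[0].ToString());      // assumes the user name is the first column
        }
        return "(" + string.Join(",", names.ToArray()) + ")";
    }
}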
I am working on a project that is creating a new application, my area of the project being the migration of data from the old application database, transforming it, and populating the new database.
The transformations to the old data are done in a staging database.
At the end of the process, the staging database ends up with a lot of the new application's tables, populated with the migrated legacy data.
We need to move these tables from the staging database to (initially) our test databases, but ultimately what will be the live database.
We have tried using the "Transfer SQL Server Objects Task" in SSIS, but have run into a problem: a lot of the database tables have default values for columns.
These default values are not brought over.
Example: tables contain a "GUID" field, which has a default value of newid().
Right-clicking the table and generating the CREATE script produces
[GUID] [uniqueidentifier] ROWGUIDCOL NOT NULL CONSTRAINT [DF_tbCRM_Client_GUID] DEFAULT (newid()),
However, the Transfer objects task does not create this default of newid()
Examining the SQL generated by the Import / Export Wizard when investigating this shows that the wizard generates this column as
[GUID] uniqueidentifier NOT NULL
and the column default value is lost.
Is there something I should be setting somewhere to force SSIS to bring these column definitions over correctly?
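The Transfer SQL Server Objects Task has a number of Copy options that affect what comes across, so those are worth checking first. As a fallback, here is a hedged sketch (C#) of scripting the staging tables with SMO while explicitly asking for the DRI defaults, so that the newid() constraint ends up in the generated script. Server and database names are placeholders:
Code Snippet
using System;
using Microsoft.SqlServer.Management.Smo;

class ScriptTablesWithDefaults
{
    static void Main()
    {
        var server = new Server("localhost");
        var db = server.Databases["StagingDb"];

        var options = new ScriptingOptions
        {
            DriDefaults = true,       // include column DEFAULT constraints such as newid()
            DriPrimaryKey = true,
            DriForeignKeys = true,
            SchemaQualify = true
        };

        foreach (Table table in db.Tables)
        {
            if (table.IsSystemObject) continue;
            foreach (string line in table.Script(options))
            {
                Console.WriteLine(line);
            }
        }
    }
}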
I am writing my first custom SSIS task, and I can see that if I put a public property on the task, that property appears in the standard Properties window. If I add a property of type String, I can set a value on that property in the task code, and when I instantiate the task in a package, I can see the value I entered.
To try to get a drop-down list property in the Properties window, I declared a property of type ComboBox, and indeed a drop-down list appears for that property.
My problem is trying to get values into that property. I have used the test items:
   Property.Items.Add("Fred")
   Property.Items.Add("Jim")
But I do not see the values in the drop-down list. All I see is one item with a value of "(none)".
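As far as I know, a ComboBox-typed property will not carry its Items into the grid; the usual pattern for a drop-down in the Properties window is a plain String property decorated with a TypeConverter that supplies the standard values. A minimal sketch (C#), with the value list and property name purely illustrative:
Code Snippet
using System.ComponentModel;

public class ModeListConverter : StringConverter
{
    public override bool GetStandardValuesSupported(ITypeDescriptorContext context)
    {
        return true;   // tells the property grid to show a drop-down
    }

    public override bool GetStandardValuesExclusive(ITypeDescriptorContext context)
    {
        return true;   // restrict input to the listed values
    }

    public override StandardValuesCollection GetStandardValues(ITypeDescriptorContext context)
    {
        return new StandardValuesCollection(new[] { "Fred", "Jim" });
    }
}

// On the task, the property would then look something like:
// [TypeConverter(typeof(ModeListConverter))]
// public string Mode { get; set; }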
Hi, DTS in SQL 2000 now has a Dynamic Properties task, which at first glance is pretty cool. However, I do not know if it would sort out a particular issue I have.
I need a generic DTS package that pulls data from different locations based on the project selected by a user. The source is a Btrieve database. The names of the source tables change with the project, so for example the F24 project would have its source table named F24TACT and the IFO project would have its source table named IFOTACT.
I need to be able to dynamically retrieve the table name from a local SQL table and assign this name to the data pump for the data extraction.
I don't know if this is achievable in DTS. I can retrieve the value into a lookup variable, but how can I set it at runtime?
I'm converting a package from DTS to IS. My manager does not want to spend the time rewriting it to make use of the Script Task, so that's why I'm using an Execute DTS 2000 Package task which contains some ActiveX.
The tasks themselves run fine when I'm in the designer and click the Run button, but they fail when run as part of the entire package. I searched on the error message and nothing comes up.
This is the error message: Error: System.Runtime.InteropServices.COMException (0x8004043B): Exception from HRESULT: 0x8004043B at DTS.PackageClass.Execute() at Microsoft.SqlServer.Dts.Tasks.Exec80PackageTask.Exec80PackageTask.ExecuteThread()
I've set them to run on the main thread but that doesn't seem to help.
I have imported a series of SQL 2000 DTS packages to our 64-bit AMD SQL 2005 server. I then created SSIS packages to run each SQL 2000 package using the Execute DTS 2000 Package task. Then I deployed these packages to the Yukon server as well.
I can run these packages when launching them via Integration Services from Management Studio on my workstation, but when I try to run them via a SQL job or the DTExec command on the server, I get this error. Does anyone know of a workaround?
OnError,xxxxxx,xxxxxxx,Execute DTS 2000 Package Task,{94769783-575C-4D1E-90F6-C2BDB3EA3CE2},{E05551C1-CDDE-40E3-87B4-C65D4E1B3A53},4/26/2006 3:01:37 PM,4/26/2006 3:01:37 PM,0,0x,This task does not support native Win64 environment. Please run the package in 32-bit WOW environment instead.
OnError,xxxxxx,xxxxxxx,Execute DTS 2000 Package Task,{94769783-575C-4D1E-90F6-C2BDB3EA3CE2},{E05551C1-CDDE-40E3-87B4-C65D4E1B3A53},4/26/2006 3:01:37 PM,4/26/2006 3:01:37 PM,-1073594105,0x,There were errors during task validation.
Is the DTS 2000 Package Task not supported on Win64?
I have installed the SQL Server 2000 DTS components from the November feature pack on both my workstation and the server.
I am building a website in ASP.NET 1.1 with VB.NET 2003 which will show the standings of the teams in our baseball league. Below is the database table I have created.
ID (int)   home_team (nvarchar)   away_team (nvarchar)   win_teampf (nvarchar)   lose_teampf (nvarchar)
1          Elmwood                Murdock                7                       2
2          Louisville             Manley                 4                       3
3          Manley                 Elmwood                9                       8
ID is the primary key. What I am attempting to do is add up each instance of Elmwood on the winning side to output the total number of wins for Elmwood, and do the same for Elmwood on the losing side to output the total number of losses. The result will look something like this: Elmwood: 1 Win 1 Loss. .500. Thanks for your reply.
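Going by the win_teampf / lose_teampf column names, the first team in each row looks like the winner; on that assumption (and with a made-up table name and connection string), a parameterized query along these lines would produce the "Elmwood: 1 Win 1 Loss. .500" style of output:
Code Snippet
using System;
using System.Data.SqlClient;

class TeamStandings
{
    static void Main()
    {
        const string connStr = @"Data Source=.;Initial Catalog=League;Integrated Security=SSPI;";
        const string sql = @"
            SELECT @team AS team,
                   SUM(CASE WHEN home_team = @team THEN 1 ELSE 0 END) AS wins,
                   SUM(CASE WHEN away_team = @team THEN 1 ELSE 0 END) AS losses
            FROM dbo.games
            WHERE home_team = @team OR away_team = @team;";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@team", "Elmwood");
            conn.Open();
            using (SqlDataReader r = cmd.ExecuteReader())
            {
                // Assumes the team has at least one game recorded.
                if (r.Read())
                {
                    int wins = r.GetInt32(1);
                    int losses = r.GetInt32(2);
                    double pct = (double)wins / (wins + losses);
                    Console.WriteLine("{0}: {1} Win {2} Loss. {3}",
                        r.GetString(0), wins, losses, pct.ToString("#.000"));
                }
            }
        }
    }
}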
How can I remove -1.#IND values from a float column in SQL Server 2000 (8.00.2039 SP4)?
So far I've tried:
- Using Query Analyzer, setting the value to a proper float value. I receive the error "A floating point exception occurred".
- Using Query Analyzer, deleting the record with the invalid float. I receive the error "A floating point exception occurred".
- Using Enterprise Manager, setting the value to a proper float value. I receive the error "A floating point exception occurred".
- Using Enterprise Manager, deleting the record with the invalid float. I receive the error "A floating point exception occurred".
- Previously I was able to convert the column to nvarchar, but now this fails because the table is being replicated.
I don't really care what happens to the records, I just need them to go away.
I am running a package on a 64-bit server using the 32-bit dtexec. It contains an embedded Execute DTS 2000 Package task. I deployed the package to the server using the sa account. I set up a SQL Agent job that runs under an account that should have complete admin privileges. The network guys tell me that the Legacy components have been installed (although I believe that shouldn't be necessary, because SSIS is installed). This is SQL Server 2005, SP2.
When I execute this job, I receive this error message:
Executed as user: Domainuser. ...age Utility Version 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 11:13:18 AM Error: 2007-12-17 11:13:35.65 Code: 0xC0010018 Source: Execute DTS 2000 Package Task Description: Error loading a task. The contact information for the task is "Execute DTS 2000 Package Task;Microsoft Corporation; Microsoft SQL Server v9; © 2004 Microsoft Corporation; All Rights Reserved;http://www.microsoft.com/sql/support/default.asp;1". This happens when loading a task fails. End Error Error: 2007-12-17 11:13:35.71 Code: 0xC0010026 Source: Execute DTS 2000 Package Task Description: The task has failed to load. The contact information for this task is "Execute DTS 2000 Package Task;Microsoft Corporation; Microsoft SQL Server v9; © 2004 Microsoft Corporation; All Rights Reserved;http://www.microsoft.com/sql/support/default.asp;1". End Error Error: 2007-12-17 11:13:35.71 Code: 0xC0024. The step failed.
OK. I give up and need help. Hopefully it's something minor ...
I have a dataflow which returns email addresses to a recordset.
I pass this recordset into a ForEach Loop, configuring the enumerator as a Foreach ADO Enumerator. I also map the email address to a variable with index 0.
I then have a Execute SQL task which receives this email address as a varchar variable (parameter 0) which I then use in my SQL command to limit the rows returned. I have commented out the where clause and returned all rows regardless of email address to try to troubleshoot this problem. In either event, I then use a resultset to store the query result of type object and result name 0.
I then pass this resultset into a script variable to start parsing the SQL rows returned as type object. (I assume this is the correct way to do this, from other prior posts.)
The script appears to throw an exception at the following line. I assume it's because I'm either not passing in the values properly or the query doesn't return anything. However, I am certain the query works, as it executes just fine at the command prompt.
My intent is to email the query results to each email address, with the following type of data, by passing the parsed data from the script to a Send Mail task. Email works fine and sends out messages, but the content is empty. I pass the parsed data as string values to the MessageSource and define the MessageSourceType as a variable in the mail task.
part number leadtime
x 5
y 9
....
Does anyone have any idea what I might be doing wrong?
I've got the following C# function to add a customer record to the database. The record gets added without any problems but the OUTPUT PARAMETER (Parameter[10]) is always NULL and I can't see why. I'm also using the Microsoft Data Application Block.
Here's the C# function:
public void SaveCustomer(int customerId,string customerName,string address1, string address2, string town, string county, string postcode, string webSiteAddress, string mainTelNo, string mainFaxNo)
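Not the poster's function body, but a hedged sketch (C#) of the part that usually bites with output parameters: the parameter has to be declared with ParameterDirection.Output (or InputOutput), and its Value is read only after the command has executed; the stored procedure must also declare the parameter as OUTPUT and assign it. Shown here with plain ADO.NET; the Data Access Application Block's helpers accept the same SqlParameter objects. The procedure and parameter names are invented:
Code Snippet
using System;
using System.Data;
using System.Data.SqlClient;

class SaveCustomerSketch
{
    static int InsertCustomer(string connectionString, string customerName)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.usp_SaveCustomer", conn))   // invented procedure name
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@CustomerName", customerName);

            // The output parameter must be marked as such, otherwise it comes back NULL.
            var newIdParam = new SqlParameter("@NewCustomerId", SqlDbType.Int);
            newIdParam.Direction = ParameterDirection.Output;
            cmd.Parameters.Add(newIdParam);

            conn.Open();
            cmd.ExecuteNonQuery();

            // Read the value only after execution has completed.
            return (int)newIdParam.Value;
        }
    }

    static void Main()
    {
        Console.WriteLine("New id: {0}",
            InsertCustomer(@"Data Source=.;Initial Catalog=Customers;Integrated Security=SSPI;", "Acme Ltd"));
    }
}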
Hi. I've got a quandary - I have a detailed database that handles advert orders between a design agency and printers / magazines etc. I want to add up the total spent by the client and put the results into a field.
I've actually done that using a query table in Access - it should be quite simple, as I can bind the 'total amount' to my table - the only thing it does not currently do is filter the total based on the month selected.
For example, if you look at http://www.daneverton.com/dg2data/months/2006-12.asp, the data there is filtered by the issue equalling Dec-2006. The actual order total is £13,622, but the column is showing the total for all entries to date (a year's worth = £422,048).
I'm sure that there is only a basic tweak required, but I'm banging my head over what to do.
The SQL is "SELECT * FROM monnodraught, q_monodraught_total WHERE [Issue / Edition] LIKE ? ORDER BY Publication ASC".
Any help gladly received.
After setting up the linked server connection at the standby server, I tried to xcopy a file through a SQL Server 2000 scheduled task to the standby server's shared directory. But it keeps giving me the error message 'Invalid Drive Specification'.
My whole process includes:
1) Set up the linked server connection on the standby server.
2) Set up a job to xcopy the file as an operating system command in SQL Server from the production box (xcopy c:\directory\file.bak \\standby_server\e$\directory /c).
3) Test, but not successful - I am already running the whole scheduled task as a Windows user with Admin authority.
What did I do wrong, or did I miss something? Thanks in advance.
I have a "Execute DTS Package 2000 " task in SSIS. The SQL 2000 DTS has one task which precedence is "completion". Using SQL2000 it works properly, but when I invoke it from SSIS it doesn€™t respect the precedence. So, when the task above fails it ends the DTS execution.
Is it possible to configure the task to respect the precedences?