I have a SQL 2000 SP4 server running on Windows 2003 SP2. About 6 months ago I started experiencing problems editing existing DTS packages. At that time it was just one package. (The first time I hit the problem, I verified that I could edit every other package except that one.)
But now it seems like any time we experience import problems, I cannot edit the package to verify that the import file is good. (So far the initial problem is always with the import file, and once we figure out what that is and resolve it, the package runs OK again.)
The problem is that I'm planning to migrate to SQL 2005, and since I can't import the DTS packages, I'm going to have to rewrite them as Integration Services projects. If I can't edit the workflow tasks in DTS, it will be time consuming to figure out what the actual import file is. Also, most of these imports come from our AS400, which we are phasing out, so I have to rewrite them to work with the new ERP package we are moving to as we bring on each division.
I have not installed any post-SQL 2000 SP4 hotfixes, so if anyone is aware of a hotfix that resolves this issue, please let me know.
I have several DTS packages that connect to various Oracle databases. An upgrade was recently done on one of the databases, from 7.3 to 8i; the other databases were always 8i. Last week I could edit data transfer tasks normally; this week DTS hangs and I have to use Task Manager to kill the process. I can successfully run the packages, I just can't edit them. I have no trouble editing or running packages that connect to databases other than the one recently upgraded. I have tried both OLE DB and ODBC connections with the same results. Does anyone have any ideas on how to fix this?
First post, new to SQL Server, the usual apologies apply ;-)
I used the Import/Export Wizard in SSIS to define a source and target for data migration. I accidentally omitted ticking the box to force the drop and re-creation of the target table. I can't find any way to edit the package once I've finished with the wizard! If someone would point me in the right direction, I'd appreciate it.
Trying to set up a transform task between a MySQL db (using an ADO.NET connection) and SQL Server. My query to pull from the MySQL db is something like "select x,y,z from table where last_updated > " + @User::LastUpdated. This command is set up as an expression on the Data Flow Task and is the value for [DataReader Source].[SqlCommand].
I have two questions.
Why does the package attempt a query against the MySQL database all the time? And why is the query attempting to pull the entire table instead of having any regard for my WHERE clause?
I've even added where last_updated > greatest('2006-08-15', '" + @User::LastUpdated to try to give it a WHERE clause even when the variable isn't set yet.
What is the trick? This is not feasible when pulling from multi-million row tables.
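For reference, here is a sketch of the kind of property expression I mean, set on the Data Flow Task as the expression for [DataReader Source].[SqlCommand] (assuming User::LastUpdated is a DateTime variable with a sensible design-time default; the table and column names are just placeholders):

"SELECT x, y, z FROM table WHERE last_updated > '" + (DT_WSTR, 25) @[User::LastUpdated] + "'"

The cast builds the date into the SQL string so the DataReader source sends a complete WHERE clause to MySQL. One guess: if the variable's design-time value is empty or very old, any validation pass that evaluates the command could still end up scanning the whole table.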
I have a package that I was able to edit a week ago. But now it pegs the CPU at 100% and does not let me edit the package (when I try to edit it, Visual Studio says it is busy, even after waiting an hour).
Even though I have not changed anything, the package is behaving like this.
I am having problems with my DTS package hanging when run through SQL Server Agent as a scheduled job. If I execute the job manually, it runs fine. If I run the package manually, it runs fine. Of course, there is no way to actually tell what is hanging. Any ideas?
I have a very simple SSIS package that executes a .bat file. Here's the actual file it executes:
@Echo Off
c:
F:
cd ReportingServicesScripts
rs -i NoteBlankSnapshots.rss -S http://10.90.160.13/ReportServerTest
EXIT
The .bat file executes a Reporting Services .rss script using the rs.exe utility. When I run the command from the command prompt, it runs just fine. When I execute the .bat file manually, it runs and completes. When I execute the actual package, it also completes. But when I schedule the package as a job, it just hangs...it never finishes.
The owner of the job is an administrator in SQL Server. I have the SQL Server Agent configured to interact with the desktop - although my .bat file requires no input from the user.
I've created other jobs that just execute plain old SQL using the same owner, and those jobs complete just fine.
Dear Folks, I have a package that calls WinZip to extract files (command-line usage) in an Execute Process task. The package runs fine if I am logged in to the server; otherwise it hangs on the WinZip task. The package is stored on the server (as opposed to the file system) and is run under a proxy account using the SQL Server Agent. I tried adding folder and WinZip32.exe permissions for the domain user the proxy account was created under, to no avail. Any ideas? Thanks for your help!
When I set the transaction option to Required, my package just hangs when I try to execute it. The status bar says "Validating" and then the name of the first destination data flow component (whatever that happens to be). I've let it sit for long periods and nothing happens. Any suggestions?
Hello everyone, I am having an odd behavior with an SSIS package that transfers data from one database to another. It seems to me that with some of the larger databases the service just stops working (0% utilization) when it is almost done. I don't get errors and the job stays in the "executing" state.
This doesn't always happen, but it does once the package has been run many times over the course of the week. I was wondering if you could point me in a direction on how to find the source of these hangs. Right now I am leaning towards it being a memory issue (or some other hardware problem), but I would like to rule out software first.
I have an SSIS package that executes in about 1:20min from Visual Studio on my local machine. While executing, my machine is somewhat unresponsive.
When I deploy the package to the database server -- the very same database server that I am accessing from my local machine -- the package executes but eventually hangs. It appears to be running out of memory, and I usually have to kill the process to get the machine to respond. While it's hanging, the machine is unresponsive to all users. The hardware (including memory) is identical between my local development box and the server.
How should I troubleshoot this? I've tried deploying the package to MSDB and to the file system, and running it from dtexec and from dtexecui. This is very frustrating!
I have two data flow tasks: one which validates the import file, and one which processes the import file if the validation passed. The validation runs and pushes the three row types to three different recordset destinations. When I enter the processing data flow task, I have three parallel trees processing each recordset saved in the previous task. I'm using a script component to generate the rows, which are then sorted and merged with the production database to find existing records. Based on this, I split to an OLE DB command (running an UPDATE command) or an OLE DB destination (to simply insert the records).
In this particular case, all records are being updated and nothing is being inserted new. Two of the three trees will complete the sort but hang on the merge, split, and OLE DB command components. The other will do the same but also hang on the split.
In another case, I truncated each destination table before running the package and the package runs fine.
Are toes being stepped on in the data flow task and causing a deadlock?
Update: I removed the sort transformation and sorted the rows before pushing them to the recordsets and I still get the same results.
I have a number of packages that I moved from an old server. Each package was scheduled with a SQL Agent job. On the old server everything ran fine. All of the packages run fine from VS and from DTEXECUI, and I have tried one from the command line with DTEXEC and it worked.
When I run from the SQL Agent job, I don't get a failure, the package just hangs. I let one of the agent jobs sit for an hour with no progress. The package typically takes about 15 minutes to complete.
Below is the output from my package log up to the point that it hangs:
We are on SQL Server 2000 SP2. I have 3 DTS packages that run successfully every day. We need to change them, as the source-side tables are going to change pretty soon. When I go into the designer view, Enterprise Manager hangs when I do any of the following:
1. Click on Properties for the transformation task.
2. Click on Disconnected Edit.
3. Click on Properties of connection 1 (IBM DB2/400 Source); the properties window pops up; change the userid/pwd and click OK.
We have a problem with Visual Studio. It hangs when I use the "execute package" option. New packages run correctly, but the packages I've already built are not executing anymore... Any ideas to get things on track again?
When I push my SSIS packages up to my production server (which has a different data source than my development environment) and I try to open a package on the production server, it takes forever to validate all the steps of the SSIS package, because it's trying to validate against a data source that isn't there, so it just waits for each element it's validating to time out. This is exceptionally annoying.
Is there a way to turn off this validation 'feature'?
We're experiencing a problem where intermittently our SSIS packages will hang. There are no log errors or events in the event viewer. It will happen whether the package is executed from the SQL Job Agent or run from BIDS. When running from BIDS it appears to hang inside one of the data flows (several parallel pipes with sorts, merge joins, etc.). It appears to hang in multiple pipes within the data flow component. The problem is reproducible: we just kill it and re-run, and it appears to hang in the same places.
Now here's the odd thing: if we simply open and close some of the components in the pipeline after the place it hangs, a subsequent run will go further in the pipeline before hanging. If we open and close all the components after the point it initially hung, the data flow will run fine from there on out. When I say "open and close" I mean no changes are made; we simply double-click the component, like a merge join, then click 'close.'
To me this does not seem like a memory problem but likely something is wrong with the metadata, where opening a component and closing it somehow alters the metadata to "right it".
This seems to occur intermittently after we make modifications to the package. It's like if you make any mod, even unrelated to the data flow, you then have to go through and open and close every component in your package to ensure it will work. Again, no errors or warnings are fired.
Help! I am using a Script Transformation to output a new column as an image [DT_IMAGE] field to store a serialized object. The sample VB script code is shown below.
The package always runs fine on my development machine but halts on the other machine at AddBlobData after a certain number of rows have been processed. I am stuck here. Does anyone have any suggestions?
What I need is to read data from multiple tables in one database and write it into a single table in another database. In order to preserve all the column data, I use the input column fields to construct a new object, serialize it, and store the serialized data in the destination db table. (The object and the serialization function come from a C# DLL.)
Dim b As BusinessLicense = New BusinessLicense()
b.ApprovalDate = Row.approvaldate
b.BusinessId = Row.busid
b.BusinessName = Row.busname
b.NaicsCode = Row.naicscode
b.NaicsDescription = Row.naicsdescr
b.OwnerName = Row.ownername
b.Phone = Row.phone
b.Pkey = Row.pkey
b.RenewalDate = Row.renewaldate
b.StartDate = Row.startdate
b.Suite = Row.suite

Row.serializedobject.AddBlobData(Serializer.Serialize(b)) '----This is the blocking line
Row.infoType = BusinessLicense.TYPE
Both machines are XP with SP2, running standard SQL Server 2005 (9.00.1399.06).
If I have a *.dts file, and want to change the originating server database from SERVERA DB_A to SERVERB DB_B, is there an easy method to do this, besides editing the file in the GUI form?
I need help transferring a SQL database. I currently have my website database on a SQL 4.0.27 (vdeck) and want to transfer it to a new server (VPS) which has SQL 4.1.2
I have imported the database to my desktop computer, but it will not let me import it to the new server; I have to edit/reconfigure it.
Any suggestions, comments, advice, help would be greatly appreciated.
Hey guys, is it possible to add or edit columns once there is data in the tables? Not the data in the columns, but the columns themselves. For example, let's say I have a table with 4 columns and, for some reason or another, I want to add a 5th or 6th column after data has already been entered, like a few months down the line.
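In case an example helps, here is a minimal T-SQL sketch (table and column names are hypothetical) of adding a column to a table that already has rows in it:

-- Hypothetical table/column: existing rows simply get NULL in the new column
ALTER TABLE dbo.Orders ADD Notes varchar(255) NULL;

-- If the new column must be NOT NULL, give it a default so existing rows get a value
ALTER TABLE dbo.Orders ADD IsArchived bit NOT NULL CONSTRAINT DF_Orders_IsArchived DEFAULT (0);

Editing an existing column (e.g. widening a varchar) is done with ALTER TABLE ... ALTER COLUMN, though some changes are restricted once data is present.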
I have scheduled a T-SQL job that runs every morning using Enterprise Manager. Now I want to change the SELECT and UPDATE statements that this job runs, but I can't find anywhere to edit a job that has already been scheduled. Any help would be appreciated.
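If it's useful, a hedged T-SQL alternative to hunting through the GUI is to update the job step's command directly in msdb (the job name, step number, and new statement below are hypothetical):

EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'Morning T-SQL Job',  -- hypothetical job name
    @step_id  = 1,                     -- the step that holds the SELECT/UPDATE batch
    @command  = N'UPDATE dbo.Accounts SET Balance = Balance + 1 WHERE IsActive = 1;';

The next scheduled run picks up the new command.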
Can someone help me figure out why this produces an error? The error is: ERROR [22001] [Microsoft][ODBC SQL Server Driver][SQL Server] String or binary data would be truncated. ERROR [01000] [Microsoft][ODBC SQL Server Driver][SQL Server] The statement has been terminated. This is my code for editing a record:
Protected Sub Button_save2_Click1(ByVal sender As Object, ByVal e As System.EventArgs) Handles Button_save2.Click
    '||||| Create string connection
    Dim StrConn As String = "Dsn=MS_PKG01;UID=emiline;APP=Microsoft® Visual Studio® 2005;WSID=MSHNP200603;DATABASE=MS_PKG01;Trusted_Connection=Yes"
    '||||| Create connection object
    Dim MyConn As Odbc.OdbcConnection = New Odbc.OdbcConnection(StrConn)
    '||||| Open connection
    MyConn.Open()
    '|||| Create odbcCommand object
    Dim Update_record As New Odbc.OdbcCommand("UPDATE TM0001 SET TM0001.syain_name = ?, TM0001.syain_pass = ?, TM0001.office_id = ?, TM0001.birth_date = ?, TM0001.empl_date = ?, TM0001.user_iden = ? ", MyConn)
    Dim hireYear As String
    Dim hireMonth As String
    Dim hireDay As String
    Dim date_hire As String
    hireYear = DropDownList_hire_yr.Text
    hireMonth = DropDownList_hire_mo.Text
    hireDay = DropDownList_hire_day.Text
    date_hire = hireYear + "/" + hireMonth + "/" + hireDay
    '|||| Add command parameters
    Update_record.Parameters.Add("@P1", OdbcType.Char, 8).Value = TextBox_id.Text
    Update_record.Parameters.Add("@P2", OdbcType.Char, 20).Value = TextBox_name.Text
    Update_record.Parameters.Add("@P3", OdbcType.Char, 20).Value = TextBox_pswd.Text
    Update_record.Parameters.Add("@P3", OdbcType.Char, 40).Value = DropDownList_office.SelectedValue
    Update_record.Parameters.Add("@P3", OdbcType.Char, 2).Value = date_hire
    Update_record.Parameters.Add("@P3", OdbcType.Char, 10).Value = TextBox_bday.Text
    Update_record.Parameters.Add("@P3", OdbcType.Char, 1).Value = DropDownList_iden.SelectedValue
    '|||| Execute command
    Update_record.ExecuteNonQuery()
    '|||| Close connection
    MyConn.Close()
End Sub
End Class
I want to get (SELECT?) data from a SQL db, edit the data (+1), and then update (UPDATE?) the table with the edited value. How do I get the value from the db, edit it, and then update the field in the db? Does someone have an example to get me on the way?
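Not knowing your table, here is a minimal T-SQL sketch with hypothetical table and column names; the same logic applies whether you run it directly or from application code:

-- Read the value, add 1, write it back (hypothetical table dbo.Stats, column Counter)
DECLARE @val int;
SELECT @val = Counter FROM dbo.Stats WHERE Id = 1;   -- SELECT: read the current value
SET @val = @val + 1;                                 -- edit the value
UPDATE dbo.Stats SET Counter = @val WHERE Id = 1;    -- UPDATE: store the edited value

-- Or, if the edit is simple arithmetic, one statement does it in place
UPDATE dbo.Stats SET Counter = Counter + 1 WHERE Id = 1;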
Is there any way possible to edit the table design with data already entered in the table? If not, is there any way I can cut and paste the info back in? I've tried importing to Access and then back to SQL; however, when I tried to view my table design in ASP.NET Web Matrix, it gave me an error.
I've just created a new maintenance plan (sql2k sp2) and attempted to re-enter it to make a change: right-click, choose Properties (or double-click)... I see the dialog box 'flash' very briefly, but it's not making itself available for editing.
I've deleted and recreated the plan ... same result.
We have an application that used Ms-Access for its backend. We wanted to move up to MSDE so I wrote a conversion utility that creates the SQL database, copies all the data from the Access database, creates all the indexes and relationships.
Now, the application used a lot of SQL statements which would have to be rewritten for SQL Server, so we decided to create an MS Access database with linked tables to the SQL Server (ODBC). Don't tell me not to do that, because I have no control over it :).
The weird thing happening now is this: all the records that were moved from Access are fine, and new records can be added fine. However, when trying to edit/delete the new records, I get an error saying: Write Conflict. I can't change the record because it has been changed by another user since I started editing it.
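In case it helps the troubleshooting: a frequently cited cause of this exact Write Conflict with Access-linked SQL Server tables is the lack of a timestamp (rowversion) column, which Access uses to tell whether a row has changed. A hedged T-SQL sketch, with a hypothetical table name (relink the table in Access afterwards):

-- Hypothetical table: add a rowversion column so Access can detect changes reliably
ALTER TABLE dbo.Customers ADD RowVer timestamp;

Nullable bit columns are another commonly mentioned culprit, since Access treats NULL bits as changed data.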