I am hoping someone can help me with a problem I am having with SSIS. I am attempting to move certain SQL Agent jobs from our older SQL 2000 server to our new SQL 2005 server, and I am receiving the error: [Transfer Jobs Task] Error: Execution failed with the following error: "The specified '@server' is invalid (valid values are returned by sp_helpserver).".
Does my server name need to be the same if I am using SSIS to transfer jobs? Is there a way to update the server name during the execution process?
This seems like it should be a relatively easy task but has not been so far.
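For reference, a diagnostic sketch to run on the destination server (this assumes the transferred jobs still reference the old server name; the server names below are placeholders):

-- see which server names the 2005 instance considers valid
EXEC sp_helpserver;
-- if the local entry is missing or still carries the old name, re-register it
EXEC sp_dropserver 'OLDSERVER';
EXEC sp_addserver 'NEWSERVER', 'local';
-- the SQL Server service must be restarted before @@SERVERNAME reflects the change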
I am transferring DB2 data to SQL Server 2005 in SSIS. The connection I am using is the IBM OLE DB driver for DB2, which uses the ODBC connection on the machine. The issue is that while transferring data I get errors regarding Unicode vs. non-Unicode data. The DB2 data is showing as Unicode, but when I check the data type in the DB2 database it is a varchar. So when I transfer the data to a varchar in SQL Server 2005, it errors out with the Unicode vs. non-Unicode message. Why is the data from DB2 coming across as a Unicode string? Also, the transfer works fine in SQL Server 2000; why is it not working in SQL Server 2005? Please help!
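For reference, the workaround commonly suggested for this (it sidesteps the provider behaviour rather than explaining it) is to convert the column inside the data flow, before the destination, with a Data Conversion or Derived Column transformation. The line below is SSIS expression syntax; the column name, length and code page are placeholders:

(DT_STR, 50, 1252)[MyVarcharColumn]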
I want to create an automatic job on SQL 2005 to transfer logins between 2 servers (principal and mirror). If I use the Transfer Logins task I get the error:
"The database "x" cannot be opened. It is acting as a mirror database."
I guess this is because it is trying to assign the default databases for the logins?
I've also looked at the [sp_help_revlogin] procs, but those require manual intervention.
How does everyone else transfer logins in this situation?
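If the default-database guess is right, one hedged workaround is to point the affected logins at master on the destination before the task runs, so nothing ever has to open the mirror copy (the login name is a placeholder):

ALTER LOGIN [SomeLogin] WITH DEFAULT_DATABASE = master;

The sp_help_revlogin approach can also, in principle, be wrapped in a scheduled job step; the manual part is mainly moving its output between servers.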
I have an SSIS package that transfers a couple of tables from one database to another database on the same server. It works fine on most of the machines we tested with. However, on one of the customer machines it consistently fails with the error message: The Return value was unknown. The process exit code was -1073741795. The step failed.
This package runs as a scheduled job on the SQL Agent. When I ran SQL Profiler to see what is going on, I noticed that in the last step before the bulk insert, it gets the collation and the schema ID of the 2 tables. My guess is that it compares these values between the source and destination databases and makes sure everything is OK before copying.
On a machine where this works, it goes ahead with the next step, which is the bulk insert itself. Whereas on the machines where it doesn't work, it stops right after this step, i.e. it does not even bother to call the 'bulk insert' API. Which makes me think it is doing some kind of validation with these values and is getting something that is not expected.
If the collation or the schema were an issue, why throw a 'return value unknown' error? Can the error be more specific? Any other possible reasons for such behavior? Any clues?
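One small thing that may help the investigation: the exit code is easier to look up once it is in hex, and T-SQL can do the conversion directly. Reading it as a Windows NTSTATUS is an assumption, not something the log states:

-- the four bytes of the int are the hex code Windows reports
SELECT CONVERT(varbinary(4), -1073741795);  -- 0xC000001D

0xC000001D is the NTSTATUS for an illegal instruction, which would point at the process crashing rather than at a collation or schema validation failing cleanly.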
I set up a basic Transfer Database task (online, copy) to copy a DB. It works great except for the fact that it isn't transferring the PKs and FKs. It also looks like it did not transfer the views. Any idea why? Anything else the Transfer Database task doesn't actually transfer?
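A quick sanity check to run in both the source and the copy, to see exactly what arrived (plain SQL 2005 catalog views, nothing specific to the task):

-- counts of primary keys, foreign keys and views in the current database
SELECT type_desc, COUNT(*) AS cnt
FROM sys.objects
WHERE type IN ('PK', 'F', 'V')
GROUP BY type_desc;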
Hi, I have to transfer data from a JDEdwards source system to SQL Server 2005 using SSIS. Is it possible for me to connect to JDEdwards directly from the ODBC Connection Manager provided by SSIS? If not, what is the way to implement this?
Hi, I have a strange scheduling problem. I have a job: I test it through BIDS - works. I test it through SSMS - works. I run the job - works. I schedule it as a one-time job - works. I schedule it as a recurring job - works.
However (here is the really annoying part), it fails on the night run (11.20 PM). There are 6 other packages that run fine after it, at 11.30, 1 AM ... 4 AM, etc.
So last night I changed the package to run at a later time, and it worked, but the other package that ran second (which now ran first) failed!
It's almost as if the first package has to fail! When I look at the history and the detailed log, all it says is "The step failed".
Anyway, I can set up a workaround by starting a dummy job that is supposed to fail...
FYI, the jobs run under administrator and are protected by a password (EncryptSensitiveWithPassword).
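A hedged aside on that password: when a package saved with EncryptSensitiveWithPassword runs unattended, whatever launches it has to supply the password, e.g. on a dtexec command line like this (server and package names are placeholders):

dtexec /SQL "\MyPackage" /SERVER MYSERVER /DECRYPT MyPackagePassword

A failed decrypt is one generic-looking failure mode that is cheap to rule out.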
With a scheduled DTS job, if I needed to make a modification I just changed the DTS package and saved it, and the job would automatically run the latest version (so long as the name was the same).
When I migrate a package to SSMS and schedule it, if I need to modify the package do I need to modify it through BIDS (Visual Studio), then re-import it into SSMS, and then reschedule it?
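The rescheduling part at least can be avoided: if the modified .dtsx is pushed back over the stored copy under the same name, the existing job and schedule keep pointing at it, much like the DTS behaviour. A dtutil sketch (path, package and server names are placeholders):

dtutil /FILE "C:\packages\MyPackage.dtsx" /COPY SQL;MyPackage /DESTSERVER MYSERVER /QUIET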
Please can anybody help me with transferring existing SSIS packages, saved in a shared folder location, from development server 2ED to live server TWD1? Both have SQL Server 2005 running and Visual Studio 2005 installed. Currently about 25 SSIS packages are executed from the development server, transferring data onto live server TWD1; the ETL processes are called from the development server but executed on the live server. The problem is that when I call these packages from the shared folder from the live server, they crash. I need to change something to shift the whole package set to the live server and execute it there, instead of recreating all 25 processes from scratch. Also, I use 'optimize' for many tables and run in a single transaction, so how can I see the mappings of the source and destination tables?
Please let me know how I can achieve this. Thanks, George
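If the packages exist as .dtsx files in that shared folder, one hedged way to move all 25 in one go is a dtutil loop run from a command prompt on TWD1, storing them in msdb there (paths and names are placeholders, and this assumes the share is readable from TWD1):

for %f in ("\\2ED\packages\*.dtsx") do dtutil /FILE "%f" /COPY SQL;"%~nf" /DESTSERVER TWD1 /QUIET

The connection managers still need checking afterwards, since they may embed server names from the development side.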
I have just started to use the Integration Services catalog with SQL Server Agent. After I deploy a new version of the project, any SQL Server Agent job that uses any of the .dtsx packages still uses the old one. How do I fix this?
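A first check, assuming the project deployment model: the catalog records when each project version was deployed, so comparing that timestamp against the job's run time narrows down whether the deployment or the job reference is the stale part:

-- run against the SSISDB catalog
SELECT name, last_deployed_time, object_version_lsn
FROM SSISDB.catalog.projects;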
I have a package that has a file connection. We have set the connection in the package for development - we also have the connection in a config file.
We are calling the package from SQL jobs. In the step where we call the package, we try to set the value for the connection on the Data Sources tab. However, the package does NOT use the value we enter.
I then removed the connection from the config file; the package then uses what the connection was set to in development. It still will not use what we enter in SQL jobs.
Furthermore, when we run the package we lose the values that we put in the Data Sources tab!
My workaround has been to change the SQL job step type from an Integration Services package to a straight command line - this works.
The question is: can you set a package's connection strings from SQL jobs using the SSIS job type?
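For comparison, this is the command-line shape that works for us, which makes me think the SSIS job type should be able to do the same thing through its Data Sources tab (the path, connection manager name and file name are placeholders):

dtexec /FILE "C:\packages\MyPackage.dtsx" /CONNECTION "MyFileConnection";"C:\data\input.txt"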
On my laptop I've got SQL 2005 Developer Edition SP1 with 3 SSIS jobs scheduled through SQL Agent.
These 3 SSIS jobs are in fact maintenance plans which are also used on our production SQL Servers.
These take backups (full, transactional), clean up old backups, reorganize indexes, update stats ...
One of these jobs runs every hour to take transaction log backups. After installing SP2 this started failing.
When I opened the sources of that job in Visual Studio, it seemed the unit of time had changed for the cleanup tasks, e.g. 'clean up .bak files after 4 days' had changed into 'after 4 hours'. The backup task for transactional backups, which had the property set to run for all databases, had changed into 'specific databases' with no database selected. Even when I changed that back to 'All databases' and saved it, when I opened that backup task again it had reverted to 'specific databases' with no database selected.
I uninstalled SQL 2005 and reinstalled it, including SP1, and everything works well again.
I'm having an issue in SQL Server 2005 with jobs that execute SSIS packages. The jobs run fine for a week or so, then I'll come to find that four or five (of the ten or so jobs) are hung in "executing" status. They seem to hang indefinitely (some have been "executing" for hours with no end). The schedules of the hung jobs are all different, varying from every 10 minutes to nightly. The packages perform completely different tasks, as well. I can't seem to find any common thread with the jobs that get hung, other than that they are all executing SSIS packages.
I've tried manually stopping the jobs and restarting the Agent and SQL Server, but the jobs hang again on their next scheduled run. The only thing that fixes the issue is rebooting the box, and then the jobs hang again in a week or so. Could some sort of memory leak be consuming resources throughout the week and causing the jobs to eventually hang? I just rebooted the box and the sqlagent90.exe process is currently using about 7 MB of memory. I'll keep an eye on it. Any other suggestions?
I've thought of creating another job that stops jobs that are hung, but what's to say that this job won't get hung as well? Plus this seems like a band-aid fix...
I don't recall having these problems until installing SQL Server 2005 SP2. Could this be related? I've searched like crazy and still can't find a resolution to this. It's becoming a big PITA...
Anyway, any suggestions would be very much appreciated!
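One small aid while watching for a leak: the list of jobs the Agent currently believes are executing can be pulled on a schedule and logged somewhere, which at least timestamps when the hangs begin (msdb, SQL 2005):

-- returns only jobs whose current execution status is "executing"
EXEC msdb.dbo.sp_help_job @execution_status = 1;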
I would like to create an SSIS package in 2005 and run it in 2000. Is there any way to do this? Or does SQL Server 2000 have a precursor to SSIS? I am trying to create a job to automatically catch and kill orphaned processes. -Kyle
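For the underlying goal, the kill-orphans job doesn't strictly need SSIS (or its 2000-era precursor, DTS) at all; it can be plain T-SQL in a scheduled job on the 2000 server itself. A minimal sketch; the definition of "orphaned" here (sleeping and idle for over 24 hours) is a placeholder assumption:

DECLARE @spid smallint, @cmd varchar(20);
DECLARE c CURSOR FOR
    SELECT spid FROM master.dbo.sysprocesses
    WHERE status = 'sleeping'
      AND DATEDIFF(hour, last_batch, GETDATE()) > 24   -- placeholder "orphaned" rule
      AND spid > 50;                                    -- skip system spids
OPEN c;
FETCH NEXT FROM c INTO @spid;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'KILL ' + CAST(@spid AS varchar(10));
    EXEC(@cmd);   -- KILL does not accept a variable, so it goes through dynamic SQL
    FETCH NEXT FROM c INTO @spid;
END
CLOSE c;
DEALLOCATE c;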
I have created a simple package loading data from source to destination. In BIDS it works fine, but when I created a job through SQL Server Agent I got the error below.
Error: The Job was invoked by User . The last step to run was step SQL AGENT JOB.
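That text is only the job-level summary, not the real failure; the step-level history usually carries the actual SSIS error. One way to pull it (the job name is a placeholder):

EXEC msdb.dbo.sp_help_jobhistory @job_name = N'MyJob', @mode = N'FULL';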
Hello All, my database application has certain stored procedures which are scheduled to run at a predefined interval (say once every day). To implement this I have two options: the first is to use the Execute SQL Task of an SSIS package and schedule it using the Windows scheduler, and the other is to use SQL Server Agent jobs and schedule them there. The current implementation uses the SSIS packages, and I gather that the second approach, using SQL Server Agent jobs, is better. The only thing I want to know is whether it makes sense to invest time in converting my SSIS packages (20 in all) to SQL Server Agent jobs or not. Can somebody briefly explain, or give some pointers regarding, the performance benefits of Agent jobs over the SSIS packages?
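If the packages really only wrap stored procedure calls, the conversion per package is small: the Agent job step can run the procedure directly as T-SQL, with no SSIS runtime involved at all. A minimal sketch (all names are placeholders):

EXEC msdb.dbo.sp_add_job @job_name = N'Daily - usp_MyProc';
EXEC msdb.dbo.sp_add_jobstep
     @job_name = N'Daily - usp_MyProc',
     @step_name = N'run the proc',
     @subsystem = N'TSQL',
     @database_name = N'MyDatabase',
     @command = N'EXEC dbo.usp_MyProc;';
EXEC msdb.dbo.sp_add_jobserver @job_name = N'Daily - usp_MyProc';
-- msdb.dbo.sp_add_jobschedule would then attach the once-a-day schedule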
Just an FYI for those scheduling SSIS packages on 64-bit Itaniums. This came to light when using a 32-bit OLE DB driver on a 64-bit Itanium.
When manually executing the package on the server through SQL Management Studio, we got the expected results. However, when we scheduled the package as a job, it failed every time. The error was: "The AcquireConnection method call to the connection manager 'ConnectionName' failed with error code 0xC0202009".
We tried everything to fix this, but nothing worked. Finally we found that when manually running the package, SQL Management Studio was calling the 32-bit version of dtexec.exe, while the Agent calls the 64-bit version of dtexec.exe. The fix is to not use the "SQL Server Integration Services Package" type in the job step, but the "Operating system (CmdExec)" type. Then in the command box, call the 32-bit (x86) version of dtexec.exe.
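For anyone hunting for the command shape, the CmdExec step ends up looking roughly like this; the install path is the 2005 default and the package and server names are placeholders:

"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\dtexec.exe" /SQL "\MyPackage" /SERVER MYSERVER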
Environment: SQL Server 2005 Enterprise Edition x64, 3-server cluster. Two active servers with separate instances and one passive server. SQL Server was installed on the two active servers.
Problem: When I fail over either of my instances to the passive server in the cluster, my maintenance jobs fail to run and there are error messages in the Application event log: "SSIS Subsystem failed to load". I am guessing that all of the needed components are not installed on the passive server? Is this a close guess? If so, exactly what components are missing, and do you have to have another license to install them?
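A diagnostic that may confirm the guess: the Agent looks up its subsystem DLLs from msdb, so if a recorded path doesn't exist on the passive node, that alone reproduces the message. Checking costs nothing:

-- run in msdb; compare subsystem_dll paths against what is actually on disk on the passive node
SELECT subsystem, subsystem_dll
FROM msdb.dbo.syssubsystems;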
I'm trying to execute my package on a schedule in SQL Server Agent. I've already tested the package by running it manually in Integration Services, and it works: the table is created, the data from my flat file is inserted into the table correctly, and the result returns success.
The question is: why, when I execute my package using SQL Server Agent, does the Agent keep executing my package in an infinite loop until I stop the job? After I stop the job there is no error generated by SQL Server. Can you figure out why this happens? I've already tried upgrading to SP2 and setting the package protection level, and still don't get a good result. Thank you.
I have deployed to production a number of nested packages (parent packages that call child packages) to the SQL msdb via the Save As option, rather than building a deployment utility. These packages reference configuration files in a static location on the C: drive of the production server. In the development environment, when connection changes are made and I run the Reload with Upgrade option, the connection manager takes on the new server and user ID settings. However, out on the production side I get the following error from the SQL job log:
Cannot load the XML configuration file. The XML configuration file may be malformed or not valid.
As a result, the SQL job uses the default connection information, which references the development database rather than the production database. I researched the error but found no good solutions. Is there a way to ensure the configuration files are formed correctly and that the packages are correctly referencing them? We are trying to run the ETL updates via a SQL job.
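For the "malformed or not valid" part, the file has to parse as XML with roughly this shape. A minimal .dtsConfig sketch (the connection name, path and connection string are placeholders); a stray byte-order mark, a truncated copy, or an unescaped & in the connection string would be enough to trigger exactly that error:

<?xml version="1.0"?>
<DTSConfiguration>
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[MyConnection].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=PRODSERVER;Initial Catalog=MyDb;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>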
Hi, I created some databases on my PC at work (using Visual Web Developer 2005 and Microsoft SQL Server Express), and I now wish to move them onto my home machine. Can I do this by simply copying the .MDF files and then moving them into the App_Data folder of my new app? Do MDF files contain everything (schema, data, stored procedures etc.) or am I missing something? Cheers, Adam
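As far as I know the .MDF does carry the schema, data and stored procedures; server-level items such as logins live in master and don't travel with it. If a copied file refuses to attach because its log file is missing, this 2005-era form rebuilds the log (the path and database name are placeholders):

CREATE DATABASE MyDb
    ON (FILENAME = 'C:\MyApp\App_Data\MyDb.mdf')
    FOR ATTACH_REBUILD_LOG;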
Currently I need to transfer an existing database to another server. Unfortunately, after restoring all the database files, the user information is not available; it is stored in the master database.
I would like to know how I can transfer user logins, database access, and stored procedures to another server.
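The stored procedures travel with the database itself; it is the server-level logins that need separate handling. After restoring, the database users can be re-linked to logins created on the new server (the user and login names below are placeholders):

-- list database users whose login is missing on this server
EXEC sp_change_users_login 'Report';
-- re-map one orphaned user to an existing login
EXEC sp_change_users_login 'Update_One', 'MyUser', 'MyLogin';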
In SQL 6.5 there was an option to transfer the DB. I used this quite often to make a duplicate test DB of my production DB. Under 7.0 I haven't found it so easy to do this. The DTS wizard hasn't worked well for me; I can get it to transfer tables but not all of the objects. Any ideas? Thanks.
I have to transfer a database to a new machine. I am planning on doing a backup and restore. My question is, will the logins and roles be transferred along with the database?
This is the situation: I had a SQL 6.5 server with some databases. I still have access to the .dat files; however, SQL 6.5 will not start anymore. Is it possible to copy the .dat files to the new SQL 7 server and somehow get them imported into the new SQL 7 server? Any help would be greatly appreciated.
Thanks Charl (please also copy my email address if possible when you reply)
Hi everyone, has anyone come across an easy method of transferring DTS packages across two servers which exist on two different networks?
I haven't been able to use the method mentioned in the article "DTS How to...Transfer Packages between Servers..", since the two servers I'm trying to move TO and FROM are on two different networks which aren't visible to each other, so I can't register both from the same box.
I'd appreciate it if someone could point me to a means of doing this, or to any articles that explain how.
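One hedged, file-based route for exactly this situation: DTS package definitions live in msdb.dbo.sysdtspackages, so the table can be carried across as a file and re-inserted on the far side (this assumes SQL 2000 on both ends; server and file names are placeholders):

-- on the source: snapshot the package table so bcp has something simple to export
SELECT * INTO msdb.dbo.sysdtspackages_copy FROM msdb.dbo.sysdtspackages;

Then, from a command prompt:

bcp msdb.dbo.sysdtspackages_copy out packages.dat -N -S SOURCESERVER -T

Move the file by whatever means crosses the networks, bcp it in on the destination, and finish with an INSERT ... SELECT from the staging copy into the destination's msdb.dbo.sysdtspackages.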
We have a SQL 6.5 database that was expanded larger than was really necessary. I have created a smaller database and am wondering what the easiest way would be to transfer ALL of the data and objects from the existing database into the new one. Is the SQL Transfer Manager reliable? Are there any better ways?
How do I transfer DTS packages from one SQL 2K server to another? Since the number of packages is high, I don't want to edit them individually and do a 'Save As'. Thanks.
I appreciate this may appear to be a bit of a noob question, but bear with me! ;)
I'm having a problem when I transfer my locally developed MS SQL Server DB to my host's server.
I log into the host server and then import the tables and data from my local machine. All the tables and data get transferred all right, but the ID columns lose the primary key and identity settings, and any default values in the table structure also get lost.
This means I then have to go through each table on the host server and put things straight. A pain in the bum... I've tried creating the table structure using a SQL script, which keeps the ID column but not the default values.
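For what it's worth, the pattern that avoids the manual fix-ups is to script the schema first, with constraints and defaults included, and only then import the data. The generated DDL should contain the constraint clauses inline, something like this sketch (table and constraint names are placeholders):

CREATE TABLE dbo.Orders (
    OrderID   int IDENTITY(1,1) NOT NULL
              CONSTRAINT PK_Orders PRIMARY KEY,
    CreatedOn datetime NOT NULL
              CONSTRAINT DF_Orders_CreatedOn DEFAULT (GETDATE())
);

With the tables pre-created this way, the import only has to move rows, and "Enable identity insert" keeps the ID values intact.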
I have a couple of tables interlinked, and in one table there is a column named contact with info in it such as 'S Smith' and 'John Burn'. I need to split this up so it can go into a number of other columns in another table, such as initials, name, surname.
The best I've come up with so far isn't very good.
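A minimal split on the first space, assuming every value has the two-part shape of the examples; multi-word surnames or middle names would need extra rules, and the table and column names are placeholders:

SELECT contact,
       LEFT(contact, CHARINDEX(' ', contact + ' ') - 1)  AS first_part,  -- 'S' or 'John'
       LTRIM(SUBSTRING(contact, CHARINDEX(' ', contact + ' ') + 1, LEN(contact))) AS surname  -- 'Smith' or 'Burn'
FROM dbo.Contacts;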