Can A DTS Package Be Scripted Or Transferred To Another Server?
Jul 20, 2005
I have production SQL Server database that must be moved to a new
machine. There is a fairly complex DTS package on the original server
that is used to handle the weekly updates to the database.
Is there a way to export this DTS package in order to set it up on the
new machine as well?
I have transferred my program to a SQL Server 2000 server. While programming, it was automatically using the local ASPNET login, but after transferring it to the server it says that it is having problems logging into the SQL server using an anonymous login. How do I change the program so that it can use the sa login or some other assigned login?
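If you do switch to a SQL login, creating a dedicated one is safer than using sa. The server-side half might look roughly like this (SQL 2000 syntax); the login name, password, and database are placeholders:

-- Sketch: create a dedicated SQL login for the app and grant it database access.
-- All names and the password are placeholders.
EXEC sp_addlogin N'MyAppUser', N'StrongPassword1', N'MyAppDb'
USE MyAppDb
EXEC sp_grantdbaccess N'MyAppUser'
EXEC sp_addrolemember N'db_datareader', N'MyAppUser'
EXEC sp_addrolemember N'db_datawriter', N'MyAppUser'
-- The app's connection string then switches from integrated security to e.g.:
-- "Server=MyServer;Database=MyAppDb;User ID=MyAppUser;Password=StrongPassword1;"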
FYI: I've posted this on a couple of forums and haven't gotten any response. I hope someone here can help since this is way past due.
First I'll give a little background on our situation.
Log Shipping and Replication are out, so I am scripting a backup locally, an xcopy to a remote box, and then a restore.
In the early stages of this, I'm trying to do 3 databases. 2 of them work fine alone. It's when I add the 3rd one that I have a problem. I noticed that in the 2nd stored procedure that I probably need to take out the WITH REPLACE if I'm dropping it beforehand as well. I don't have time to test it on this box until later tonight. I don't think that's the issue because it was doing the same thing before I added the drop. I'm overwriting the .txt file so I don't have the exact error that it's giving. I believe it's something similar to "Server: Msg 11, Level 16, State 1, Line 0 General network error. Check your network documentation." I believe it also said [SQLSTATE 42000].
Now for the code. Props to Tara and the code she's put online.
Any help would be appreciated and I'll be glad to help answer questions related to what I've got.
CREATE PROC sp_backup_user_dbs3
AS
SET NOCOUNT ON

DECLARE @Now CHAR(14)           -- current date in the form of yyyymmddhhmmss
DECLARE @cmd SYSNAME            -- stores the dynamically created DOS command
DECLARE @Result INT             -- stores the result of the dir DOS command
DECLARE @RowCnt INT             -- stores @@ROWCOUNT
DECLARE @DBName SYSNAME
DECLARE @filename VARCHAR(200)  -- stores the path and file name of the BAK file
DECLARE @loglogical VARCHAR(1000)
DECLARE @datalogical VARCHAR(1000)
DECLARE @restoreData VARCHAR(255)
DECLARE @restoreLog VARCHAR(255)
DECLARE @backupFile VARCHAR(255)
DECLARE @physicalNameData VARCHAR(255)
DECLARE @physicalNameLog VARCHAR(255)
DECLARE @physicalNameDataStripped VARCHAR(255)
DECLARE @physicalNameLogStripped VARCHAR(255)
DECLARE @ExecStr NVARCHAR(4000)
DECLARE @strSQL VARCHAR(1000)
DECLARE @restoreToDataDir VARCHAR(255)
DECLARE @restoreToLogDir VARCHAR(255)
DECLARE @path VARCHAR(100)

SET @path = 'I:\backup\MoveTo14'
-- We need to delete all the old backup files from the I:\backup\MoveTo14 folder
-- Build the del command
SELECT @cmd = 'del ' + @path + '\*.BAK' + ' /Q /F'
--PRINT @cmd
EXEC master..xp_cmdshell @cmd, NO_OUTPUT
CREATE TABLE #whichdatabase ( dbname SYSNAME NOT NULL )
INSERT INTO #whichdatabase ( dbname )
SELECT [name]
FROM master.dbo.sysdatabases
WHERE [name] IN ( 'db1', 'db2' )
ORDER BY [name]

-- Get the database to be backed up
SELECT TOP 1 @DBName = dbname FROM #whichdatabase
SET @RowCnt = @@ROWCOUNT

-- Iterate through the temp table until no more databases need to be backed up
WHILE @RowCnt <> 0
BEGIN
    SELECT @filename = @path + '\' + @DBName + '.BAK'

    BACKUP LOG @DBName WITH TRUNCATE_ONLY

    -- Backup the database
    BACKUP DATABASE @DBName TO DISK = @filename

    DELETE FROM #whichdatabase WHERE dbname = @DBName

    -- Get the next database to be backed up
    SELECT TOP 1 @DBName = dbname FROM #whichdatabase
    SET @RowCnt = @@ROWCOUNT

    -- Let the system rest for 5 seconds before starting on the next backup
    WAITFOR DELAY '00:00:05'
END

DROP TABLE #whichdatabase
SET NOCOUNT OFF

SET @cmd = 'xcopy I:\backup\MoveTo14\*.BAK \\RemoteServer /C /Y'
EXEC master.dbo.xp_cmdshell @cmd

EXEC [RemoteServer].master..usp_restoreDbsFromDir2

RETURN 0
GO
3rd step (the code for usp_restoreDbsFromDir2 on the remote server):
-- Get files sorted by date
SET @cmd = 'dir ' + @restoreDir + '*.BAK /OD'

CREATE TABLE #Dir ( DirInfo VARCHAR(7000) ) -- Stores the dir results

CREATE TABLE #BackupFiles (
    BackupDate varchar(10),
    BackupFileName nvarchar(1000)
) -- Stores only the data we want from the dir

CREATE TABLE #RestoreFileListOnly (
    LogicalName nvarchar(128),
    PhysicalName nvarchar(260),
    Type char(1),
    FileGroupName nvarchar(128),
    [Size] numeric(20,0),
    [MaxSize] numeric(20,0)
)

INSERT INTO #Dir
EXEC master.dbo.xp_cmdshell @cmd

INSERT INTO #BackupFiles
SELECT SUBSTRING(DirInfo, 1, 10),
       SUBSTRING(DirInfo, LEN(DirInfo) - PATINDEX('% %', REVERSE(DirInfo)) + 2, LEN(DirInfo))
FROM #Dir
WHERE ISDATE(SUBSTRING(DirInfo, 1, 10)) = 1
  AND DirInfo NOT LIKE '%<DIR>%'

-- Get the newest file
SELECT TOP 1 @bkpFile = BackupFileName
FROM #BackupFiles
ORDER BY BackupDate DESC

SET @rowCnt = @@ROWCOUNT

-- Iterate through the table until no more databases need to be restored
WHILE @RowCnt <> 0
BEGIN
    SET @cmd = @restoreDir + @bkpFile

    INSERT INTO #RestoreFileListOnly
    EXEC('RESTORE FILELISTONLY FROM DISK = ''' + @cmd + '''')

    -- Get the dbname from the bkpFile name
    --SET @strSQL = CHARINDEX('_db_', @bkpFile)
    --SET @dbname = LEFT(@bkpFile, @strSQL - 1)

    SET @backupDisk = @restoreDir + @bkpFile
    SELECT @datalogical = LogicalName FROM #RestoreFileListOnly WHERE Type = 'D'
    SELECT @loglogical = LogicalName FROM #RestoreFileListOnly WHERE Type = 'L'
    SELECT @PhysicalDataPath = PhysicalName FROM #RestoreFileListOnly WHERE Type = 'D'
    SELECT @PhysicalLogPath = PhysicalName FROM #RestoreFileListOnly WHERE Type = 'L'

    SELECT @strSQL = 'alter database ' + @dbname + ' set offline with rollback immediate'
    -- e.g. alter database MyDatabase set offline with rollback immediate
    --PRINT @strSQL
    EXEC (@strSQL)

    SELECT @strSQL = 'DROP database ' + @dbname
    -- e.g. DROP database MyDatabase
    --PRINT @strSQL
    EXEC (@strSQL)
I am thinking about moving some backups from the local machine to a remote server. Right now I am using Ola Hallengren's script for backups. The script performs a checksum on the backup while it's being written to disk. I am going to move the files using the Robocopy utility. How do I check the data integrity once the backups are transferred to the remote server?
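Since the backups were written WITH CHECKSUM, one way to validate the copies on the remote box is RESTORE VERIFYONLY, which re-reads the whole file and revalidates the page checksums. A minimal sketch; the path is a placeholder:

-- Sketch: verify a copied backup file on the remote server. Because the backup
-- was taken WITH CHECKSUM, VERIFYONLY ... WITH CHECKSUM revalidates the page
-- checksums as well as the backup's own integrity data.
RESTORE VERIFYONLY
FROM DISK = N'E:\Backups\MyDb.bak'   -- placeholder path
WITH CHECKSUM;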
Does anyone have any great suggestions on how I can import an online XML file into an SQL 2005 table?
So far I tried to accomplish it with SSIS Data Flow Task where my source is XML Source (Data access mode: XML file location; XML location: URL, Use inline schema = True). This set up properly identified the columns to be imported.
I used the Copy Column data flow transformation task to load data into an OLE DB destination table that has the same structure.
When I run the task it does execute with no errors, however the table remains empty. It looks like I am failing to read the actual data.
Do you have any suggestions? I am willing to go around this approach with stored procs/COM/you name it; just make it work!
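As a fallback outside SSIS, SQL Server 2005 can shred an XML file directly with OPENROWSET(BULK ...) and the xml nodes() method. A rough sketch, assuming the URL has first been downloaded to a local file; the path, element names, and target columns are all placeholders:

-- Sketch: load an XML file into a table without SSIS (SQL Server 2005).
-- File path, element names, and target table/columns are placeholders.
DECLARE @x XML;
SELECT @x = CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK N'C:\feeds\data.xml', SINGLE_BLOB) AS src;

INSERT INTO dbo.MyTable (Col1, Col2)
SELECT n.value('(Col1)[1]', 'varchar(50)'),
       n.value('(Col2)[1]', 'int')
FROM @x.nodes('/Root/Row') AS t(n);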
Hi, I need to delete rows from my user tables dependent upon their nonexistence in another table:

delete student
where student_id not in (select student_id from tblStudent)

The reason is convoluted; the simplest explanation is that our operational system allows the change of business keys. This wreaks havoc in the data warehouse.
So, I'm looking for help on how I can delete rows from tables that have a column STUDENT_ID. I'd like the script to search for the tables, then perform the delete.
I don't know where information about user tables is stored, nor how to loop through the results to do the delete.
Any ideas are appreciated.
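A minimal sketch of one way to do it: the table names come from INFORMATION_SCHEMA, and a cursor builds one DELETE per table. tblStudent is assumed to be the surviving reference table; print the statements first before executing for real.

-- Sketch: build and run a DELETE for every user table with a STUDENT_ID column.
DECLARE @sql NVARCHAR(4000)
DECLARE @tbl SYSNAME

DECLARE tbl_cur CURSOR FOR
    SELECT c.TABLE_NAME
    FROM INFORMATION_SCHEMA.COLUMNS c
    JOIN INFORMATION_SCHEMA.TABLES t ON t.TABLE_NAME = c.TABLE_NAME
    WHERE c.COLUMN_NAME = 'STUDENT_ID'
      AND t.TABLE_TYPE = 'BASE TABLE'
      AND c.TABLE_NAME <> 'tblStudent'

OPEN tbl_cur
FETCH NEXT FROM tbl_cur INTO @tbl
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'DELETE FROM ' + QUOTENAME(@tbl) +
               N' WHERE student_id NOT IN (SELECT student_id FROM tblStudent)'
    --PRINT @sql   -- inspect the statements before running them for real
    EXEC (@sql)
    FETCH NEXT FROM tbl_cur INTO @tbl
END
CLOSE tbl_cur
DEALLOCATE tbl_cur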
I have SQL Server 2005 x64. I found that in the studio I can generate all sorts of scripts, including the creation of indexes. It works, however the script contains the CREATE TABLE statements as well. I want the scripts to recreate the indexes on another filegroup (testing a theory from a previous post). Am I missing a switch in the wizard to exclude CREATE TABLE? I did create the script, but then I manually deleted all the CREATE TABLE statements, which was a pain.
Thanks
Rob
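If the wizard won't cooperate, the index DDL can be generated straight from the 2005 catalog views. A rough sketch that handles only key columns (included columns and index options are ignored); [SECONDARY] is a placeholder for the target filegroup:

-- Sketch: generate bare CREATE INDEX statements, skipping the tables entirely.
SELECT 'CREATE ' + CASE WHEN i.is_unique = 1 THEN 'UNIQUE ' ELSE '' END
     + i.type_desc COLLATE DATABASE_DEFAULT + ' INDEX ' + QUOTENAME(i.name)
     + ' ON ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ' ('
     + STUFF((SELECT ', ' + QUOTENAME(c.name)
                   + CASE WHEN ic.is_descending_key = 1 THEN ' DESC' ELSE '' END
              FROM sys.index_columns ic
              JOIN sys.columns c
                ON c.object_id = ic.object_id AND c.column_id = ic.column_id
              WHERE ic.object_id = i.object_id
                AND ic.index_id = i.index_id
                AND ic.is_included_column = 0
              ORDER BY ic.key_ordinal
              FOR XML PATH('')), 1, 2, '')
     + ') ON [SECONDARY]'                -- placeholder target filegroup
FROM sys.indexes i
JOIN sys.tables t ON t.object_id = i.object_id
JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE i.type IN (1, 2)                   -- clustered and nonclustered
  AND i.is_primary_key = 0
  AND i.is_unique_constraint = 0;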
Hi all :)
My apologies if I posted in the wrong groups, but I just jumped into MS SQL waters, so any guidance will be appreciated.
What I'm trying to do is the following process:
[1] present operator with a web page (script)
[2] once filled with db name and username, script will create a .sql file
[3] osql.exe will be called with, I presume, -i file.sql and create a database
I have limited SQL knowledge, but I got the information from Enterprise Manager when I ran 'All Tasks -> Generate SQL Script' on how the .sql file should look.
I realized what the commands are that would create a fresh DB (I ran this for a newly created DB), and figured my .php script should create such a file.
It's fairly basic, and I'm almost sure all of you know how the outputted .sql file looks, but anyway here it is. The script is called with parameters 'six' as database name and 'magarac' as user name:

IF EXISTS (SELECT name FROM master.dbo.sysdatabases WHERE name = N'six')
    DROP DATABASE [six]
GO
CREATE DATABASE [six] ON (NAME = N'six_Data',
    FILENAME = N'E:\Databasepath\six_Data.MDF', SIZE = 1, MAXSIZE = 20, FILEGROWTH = 10%)
LOG ON (NAME = N'six_Log',
    FILENAME = N'E:\Databasepath\six_Log.LDF', SIZE = 1, MAXSIZE = 20, FILEGROWTH = 10%)
COLLATE SQL_Latin1_General_CP1_CI_AS
GO
exec sp_dboption N'six', N'autoclose', N'false'
GO
....
use [six]
GO
if not exists (select * from dbo.sysusers where name = N'guest' and hasdbaccess = 1)
    EXEC sp_grantdbaccess N'guest'
GO
if not exists (select * from dbo.sysusers where name = N'sinisam')
    EXEC sp_grantdbaccess N'magarac', N'magarac'
GO
exec sp_addrolemember N'db_owner', N'magarac'
GO

I managed to get an exact replica of the .sql file that Enterprise Manager created. It leads me to believe that this way of automated database creation is indeed possible.
Really sorry for making you go through all this text, but after I get a green light on this from you guys, I'll have a bit more problematic question.
Is there any reason why this should not be used, or would fail?
Thanks in advance :)
P.S. Just as a heads-up, the next part of my problem is automated creation of a new MS SQL server login to use with the new DB.
I am trying to figure out a method of transferring all objects from a SQL 6.5 database to a new SQL 7.0 server. Right now, the only method I can see is to upgrade my SQL 6.5 server to 7.0. I tried to utilize 7.0's DTS but I could not figure out how to transfer ALL objects into 7.0.
Does anyone know where to declare the target directory for scripted output from Enterprise Manager?
Every time I get EM to script tables I want the output not to go to C:\mssql7\binn but somewhere else. Trouble is I can't find where to set this variable. It won't work in a shortcut - I've tried that, and I can't find an ini file either.
Seems like a simple problem to fix if you just know where to look!
Hi, I have a web app demo that allows users to add info and change attributes within a SQL 2005 Express DB. I'd like to restore a clean copy of this database every couple of hours from a .bak file using a Windows scheduled task on the server. Has anyone got a .sql script for database restoration that I could use and call from a .cmd script file? Thanks.
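Something along these lines should do for the .sql half. The database name, paths, and logical file names are placeholders; check the real logical names with RESTORE FILELISTONLY against your .bak first.

-- restore_demo.sql: a sketch that forces users off and restores the clean copy.
USE master;
ALTER DATABASE DemoDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
RESTORE DATABASE DemoDb
FROM DISK = N'C:\Backups\DemoDb_clean.bak'
WITH REPLACE,
     MOVE N'DemoDb_Data' TO N'C:\Data\DemoDb.mdf',  -- logical names: check with
     MOVE N'DemoDb_Log'  TO N'C:\Data\DemoDb.ldf';  -- RESTORE FILELISTONLY
ALTER DATABASE DemoDb SET MULTI_USER;

The .cmd file the scheduled task calls could then contain just one line, e.g. sqlcmd -S .\SQLEXPRESS -E -i "C:\Scripts\restore_demo.sql" (instance name assumed).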
I backed up a database from one server and restored it on another server. The database came across with its data, but the security info did not get across.
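This sounds like the classic orphaned-users problem: the database users travel inside the .bak, but the server-level logins do not. A minimal sketch for re-linking them on the new server; the login/user names and password are placeholders:

-- Sketch: recreate the login on the new server, then re-link the orphaned user.
EXEC sp_addlogin N'AppLogin', N'SomePassword'
USE RestoredDb
-- Report orphaned users first:
EXEC sp_change_users_login 'Report'
-- Then re-link a specific user to its login:
EXEC sp_change_users_login 'Update_One', 'AppLogin', 'AppLogin'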
Apart from the IDENTITY property, what other properties or attributes are not transferred to the target schema?
I know that one can use NOT_FOR_REPLICATION for identities, but I am interested in a (complete?!) list of metadata objects that transactional replication *prefers* not to transfer across to the target by default.
We are currently running sql 2000 and are moving our database onto sql 2005 running on a different box.
We have managed to move the entire database with its users; however, the users' permissions on specific tables/views/stored procedures have not been transferred. Does anyone know a way of transferring user permissions rather than doing them all by hand?
The system is large (over 500 tables/views/stored procedures) and very active, so downtime is not an option.
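One workable approach is to generate the GRANT/DENY statements from the SQL 2000 source database's sysprotects and replay them on the 2005 copy. A rough sketch covering only plain object permissions; column-level permissions and WITH GRANT OPTION are ignored for brevity:

-- Sketch: run in the source (SQL 2000) database; run the output on the 2005 box.
SELECT CASE p.protecttype WHEN 206 THEN 'DENY ' ELSE 'GRANT ' END
     + CASE p.action
           WHEN 26  THEN 'REFERENCES'
           WHEN 193 THEN 'SELECT'
           WHEN 195 THEN 'INSERT'
           WHEN 196 THEN 'DELETE'
           WHEN 197 THEN 'UPDATE'
           WHEN 224 THEN 'EXECUTE'
       END
     + ' ON ' + QUOTENAME(USER_NAME(o.uid)) + '.' + QUOTENAME(o.name)
     + ' TO ' + QUOTENAME(USER_NAME(p.uid))
FROM sysprotects p
JOIN sysobjects o ON o.id = p.id
WHERE p.action IN (26, 193, 195, 196, 197, 224);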
I am working on a project that is creating a new application, my area of the project being the migration of data from the old application database, transforming it, and populating the new database.
The transformations to the old data are done in a staging database.
At the end of the process, the staging database ends up with a lot of the new application's tables, populated with the migrated legacy data.
We need to move these tables from the staging database to (initially) our test databases, but ultimately what will be the live database.
We have tried using the "Transfer SQL Server Objects Task" in SSIS, but have run into a problem: a lot of the database tables have default values for columns.
These default values are not brought over.
Example: tables contain a "GUID" field, which has a default value of newid().
Right-clicking the table and generating the CREATE script produces:
[GUID] [uniqueidentifier] ROWGUIDCOL NOT NULL CONSTRAINT [DF_tbCRM_Client_GUID] DEFAULT (newid()),
However, the Transfer objects task does not create this default of newid()
Examining the SQL generated by the Import / Export Wizard when investigating this shows that the wizard generates this column as
[GUID] uniqueidentifier NOT NULL
and the column default value is lost.
Is there something I should be setting somewhere to force SSIS to bring these column definitions over correctly?
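If no task setting pans out, the missing defaults can be scripted from the staging database after the transfer. A sketch using the SQL 2005 catalog views; run it against the source and execute the output on the target:

-- Sketch: regenerate the default constraints the transfer dropped.
SELECT 'ALTER TABLE ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
     + ' ADD CONSTRAINT ' + QUOTENAME(dc.name)
     + ' DEFAULT ' + dc.definition
     + ' FOR ' + QUOTENAME(c.name)
FROM sys.default_constraints dc
JOIN sys.tables t  ON t.object_id = dc.parent_object_id
JOIN sys.schemas s ON s.schema_id = t.schema_id
JOIN sys.columns c ON c.object_id = dc.parent_object_id
                  AND c.column_id = dc.parent_column_id
ORDER BY t.name;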
I created a package which passes some information (through parameters) to its child package.
I need to do some processing in the parent package based on the execution status of the child package, i.e.
if the child fails then do one operation, and if the child succeeds then do another.
To determine the execution status of the child package I am using two different precedence constraints: one constraint has the value "Success" and the other the value "Failure".
My problem is that when the child package is executed successfully, the constraint with value = "Success" works properly, but when the child fails, the constraint with value "Failure" does not work.
I'd like to know if there's a way to pass parent package parameters to a package executed by SQL Server Agent? It appears that sp_start_job doesn't have any variable that could accommodate this.
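Right, sp_start_job takes no package parameters. One common workaround is to bake the value into the job step itself using dtexec's /SET option, (re)defining the step before starting the job. A sketch with placeholder job, path, and variable names; note the value is fixed when the step is defined, not at sp_start_job time:

-- Sketch: a CmdExec job step that runs the package via dtexec and pushes a
-- value into a package variable. All names and the path are placeholders.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'RunChildPackage',
    @step_name = N'Run package with variable',
    @subsystem = N'CmdExec',
    @command   = N'dtexec /FILE "C:\Packages\Child.dtsx" /SET "\Package.Variables[User::MyVar].Properties[Value]";"SomeValue"';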
The master package has a configuration file, specifying the connect strings. The master package passes these connect strings to the child packages in a variable. Both the master package and child packages have connection managers, set up to use localhost. This is done deliberately to be able to test the packages on individual development PCs. We do not want to change anything inside the packages when deploying to test, and from test to production. All differences will be in the config files (which are pretty fixed, they very seldom change). That way we can be sure that we can deploy to production without any changes at all.
The package is run from the file system, through a job-schedule.
We experience the following when running on a not default sql-server instance (called dkms5253uedw)
Case 1: The master package starts by executing three SQL scripts (drop foreign keys, truncate tables, create foreign keys). This works fine.
The master package then executes the first child package. We then in the sysdtslog get:
Error - "cannot connect to database xxx". Info - "package is preparing to get connection string from parent ...".
The child package then executes OK, does all its work, and finishes. Because there has been an error, the master package then stops with an error.
Case 2: When we run exactly the same, but with the connection strings in the config file pointing to the default instance (dkms5253), then everything works fine.
Case 3: When we run exactly the same, again against the dkms5253uedw instance, but now with the exact same databases defined in the default instance, it also works perfectly.
Case 4: When we then stop the SQL Server on the default instance, the package faults again, this time with
Error - "timeout when connecting to database xxx". Info - "package is preparing to get connection string from parent ...".
And then it continues as in the first case.
From all this we conclude, that the child package tries to connect to the database before it knows the connection string it gets passed in the variable from the master package. It therefore tries to connect to the default instance, and this only works if the default instance is running and has the same databases defined. As far as we can see, the child package does no work against the default instance (no logging etc.).
We have tried delayed validation in the packages and in the connection managers, but with the same results (error).
So we are desperately hoping that someone can help us solve this problem.
I have packages stored in SQL store. I was letting users run the packages from a .net app that I made with
Microsoft.SqlServer.Dts.Runtime
Now I have noticed this causes the packages to run on the client PC's CPU, and the network traffic flows via the client PC; in my particular case this is slow.
From the docs and this forum I have found that you can run a package on the server's CPU through SQL Agent by letting packages be run in a SQL job; after that you can start the package from an application with sp_start_job.
But how do you set a User:: variable in a package if you have to start the package from a SQL Agent job?
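One pattern that avoids the dtexec command line entirely: write the value to a small control table just before calling sp_start_job, and have the package's first Execute SQL Task read it into the variable. A sketch; the table, job, and variable names are all hypothetical:

-- Sketch: hand a value to a job-launched package via a control table.
-- The package's first Execute SQL Task would run:
--   SELECT ParamValue FROM dbo.PackageParams WHERE ParamName = 'MyVar'
-- with the single-row result mapped to User::MyVar.
CREATE TABLE dbo.PackageParams
(
    ParamName  sysname PRIMARY KEY,
    ParamValue nvarchar(4000) NULL
);

UPDATE dbo.PackageParams SET ParamValue = N'SomeValue' WHERE ParamName = N'MyVar';
IF @@ROWCOUNT = 0
    INSERT INTO dbo.PackageParams (ParamName, ParamValue) VALUES (N'MyVar', N'SomeValue');

EXEC msdb.dbo.sp_start_job @job_name = N'RunMyPackage';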
I have two calls to stored procedures in an SSIS package that fail silently. They are simply not executed in production but work fine in test; nothing happens, and the SQL Server Agent reports that everything went just fine.
In test they have 1 server with db A and B. No issue here.
In prod they have 2 servers with dbs A and B. On server 1 the SQL Server Agent executes a job that includes an SSIS package that runs a couple of SPs on server 2. That user is db owner of db B on server 2, and yet nothing happens. The SPs are not executed.
If I run the job manually in prod then it works, but not when it is run with the SQL Server Agent account, which as said is even db owner.
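If it turns out to be the security context of the Agent job step, one thing worth trying is running the SSIS step under a proxy tied to an account you know works interactively (e.g. the one you use when running the job manually). A sketch with placeholder names:

-- Sketch: run the SSIS job step under a proxy account (SQL Server 2005).
-- Credential identity, secret, and all names are placeholders.
USE master;
CREATE CREDENTIAL SsisRunAs WITH IDENTITY = N'DOMAIN\KnownGoodUser',
                                 SECRET   = N'password-here';
USE msdb;
EXEC dbo.sp_add_proxy @proxy_name = N'SsisProxy',
                      @credential_name = N'SsisRunAs',
                      @enabled = 1;
EXEC dbo.sp_grant_proxy_to_subsystem @proxy_name = N'SsisProxy',
                                     @subsystem_id = 11;   -- 11 = SSIS
EXEC dbo.sp_update_jobstep @job_name = N'MyEtlJob',
                           @step_id = 1,
                           @proxy_name = N'SsisProxy';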
I am using SQL Express 2005 server and sql compact edition for my PDA. For synchronization, which SQL Server 2005 Compact Edition Server Tools installation package do I use?
Is it Sqlce30setupen.msi or sql2kensp4.msi or sql2kensp3a.msi? I know for SQL Server 2005 it's Sqlce30setupen.msi. Is it the same for SQL Express 2005?
SQL Server Agent is running under a domain account that is a member of Administrators and Domain Users amongst others, and the package is executed as the service account. Connecting to servers on the same domain works, and when I run it from the msdb package store in SSIS (SSIS runs under the Network Service account...) it will connect to the POP server as well. Permissions, fiddly proxies... the answer's out there somewhere.
Some of you guys seem to be gurus with the new Integration Services technology, so I hope someone can lend me some advice, as I haven't worked much with integration before.
Ok, here we go. What I want to do is select some data from my database and export the result to a flat file (really a .csv file with values delimited by semicolons). Sounds like a simple thing to do? Well, I'm sure it is when you know how to do it :) I know I could manage the same thing by writing a C# class that creates that .csv file, but the decision has been made to use Integration Services for these kinds of operations.
I created an SSIS project in Business Intelligence Development Studio, and created a package (I defined the task flow etc.). By choosing "Execute package" from the IDE I managed to create the flat file, and everything seemed sweet. However, when trying to execute the package (package.Execute();) from C# code, it only results in a failure. I have read on several sites that this has to do with my program lacking the rights to run the package from the client side. OK, fair enough. I need to create the package on the server, and use a SQL Server Agent job to execute the package through the agent.
Can anyone tell me how I need to do this? How can I ensure that the package is created on the sql server instead of locally on my development computer? When I create a new SSIS project the package is already made, and it is created locally on my PC.
I hope someone can give me some help. Even a little nudge would be appreciated ;)
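While you sort out the server-side Agent setup, note the export itself can also be done with plain bcp, which might tide you over. A sketch; the server, database, query, and path are placeholders, and xp_cmdshell must be enabled:

-- Sketch: export a query result to a semicolon-delimited file with bcp.
DECLARE @cmd VARCHAR(1000);
SET @cmd = 'bcp "SELECT Col1, Col2 FROM MyDb.dbo.MyTable" queryout '
         + '"C:\Exports\MyData.csv" -c -t; -S MyServer -T';
EXEC master.dbo.xp_cmdshell @cmd;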
I copied and added an existing package as a new package to a project and have been having trouble with settings reverting to those for the original package after I modify and save the changes for the new package. Sometimes happens with the save itself, other times it happens when I close and re-open the package. Most cases are with connections that revert back to the original file reference, but there are also control flow and data flow elements that keep reverting back to either settings from the original package or defaults that result in the re-opened package being in error. Not sure how to get around this issue short of developing the new package from scratch which I'd rather not do since it is fairly complex. Any help anyone can provide is appreciated. Thanks.
I have a For Each Loop that populates a SQL Server table from a set of flat files; I run the flat file import via a DTS package embedded in an Execute DTS 2000 Package Task. I want to pass the source file name fetched by the For Each Loop to a global variable in DTS. How can this be done?
I have an SSIS job; one of the last steps it performs is to execute a SQL 2000 DTS package. This has to be done as a SQL 2000 DTS package as it is performing rebuilds of SQL 2000 Analysis Services dimensions and cubes. We've found that when the DTS fails, the SSIS job happily completes, showing success; we would prefer to know it went wrong.
As far as I'm aware, SSIS merely starts the DTS off and doesn't care about its result. I've looked into turning on the logging for the Execute DTS 2000 Package task and thought that the ExecuteDTS80PackageTaskTaskResult would give me the answer I need... but it is merely written to the log, not available to an event handler. It also looks like it is not safe to put a SQL task in as the next item to go look at the SQL 2000 system tables for the DTS package's log, as the SSIS documentation warns that the DTS package can continue to run after the Execute DTS 2000 Package task has ended.
Ideally I want any error raised within the DTS package to cascade up to be an error in the SSIS job, so I can then handle it appropriately. I cannot find a way to do this. Is there a way?
If not, can anyone suggest how in the remainder of the SSIS tasks I can be sure that the DTS has completed before I start any other tasks that will check for the SQL 2000 log of its execution?
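As a stopgap, if "Log to SQL Server" is enabled for the DTS package on the 2000 box, a later Execute SQL Task (after a suitable delay) could poll msdb's DTS log and raise its own error. A sketch; the package name is a placeholder:

-- Sketch: inspect the most recent run of the DTS package in the SQL 2000 msdb
-- log and fail loudly if it errored. Assumes SQL Server logging is enabled.
DECLARE @err INT, @msg VARCHAR(1024);
SELECT TOP 1 @err = errorcode, @msg = errordescription
FROM msdb.dbo.sysdtspackagelog
WHERE name = 'MyDtsPackage'          -- placeholder package name
ORDER BY starttime DESC;

IF @err <> 0                          -- no rows found leaves @err NULL: no raise
    RAISERROR('DTS package failed: %s', 16, 1, @msg);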
I have developed an SSIS package for ETL purpose. I am invoking the SSIS package through .Net console application by referencing the ManagedDTS Assembly. I am able to execute the package in Sql Server 2005 Developer Edition and it runs fine till completion.
But when I try to execute the package in SQL Server 2005 Standard Edition, by invoking the package through the .Net console application, the status of the package is failure.
Can anyone help me overcome this problem?