The scenario:
9 db tables populated by 9 Excel Import Files via DTS.
Will I need to create a DTS package for each import? The columns are identical in all 9; the only differences are the destination table name and the source file name.
I've had to map over 80 columns using DTS and don't want to do it for each instance!
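A single package can usually be reused here: hold the source file name and destination table name in DTS global variables (fed to the transformation via a Dynamic Properties task, for example) and launch the package once per file with dtsrun, overriding the globals with /A. A minimal sketch, assuming a package named ImportPkg and string globals named SourceFile and DestTable (all hypothetical names):

-- Run the same package nine times, changing only the globals.
-- Type id 8 = string. Server, package, and variable names are placeholders.
EXEC master..xp_cmdshell
    'dtsrun /S MYSERVER /E /N ImportPkg /A SourceFile:8=C:\Imports\File1.xls /A DestTable:8=Table1';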
I have SQL Server 2000 running on a Windows 2000 server. I have created a DTS package, which is running fine. Now I want to transfer/restore/copy this DTS package to another SQL 2000 server (not another instance). How do I accomplish this? Can anybody guide me?
Is it possible to automatically or programmatically transfer or copy a DTS package from one server to another? If so, could anyone out here give some pointers on how?
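One hedged approach for local packages: DTS packages are stored as rows in msdb, so copying the rows moves the packages. A minimal sketch, assuming a linked server TARGETSVR pointing at the destination (the linked server and package name are placeholders):

-- Copies every saved version of the package into the other server's msdb.
INSERT INTO TARGETSVR.msdb.dbo.sysdtspackages
SELECT * FROM msdb.dbo.sysdtspackages
WHERE name = 'MyPackage';

Alternatively, a package can be opened in the DTS designer and saved to the other server with Save As, or saved to a structured storage file and opened on the target.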
Is there a way to strip out the annotations from a .dtsx package? I'm looking at quite a few, and I would like to automatically pull those out and store them somewhere, in either a table or a text file.
Is this possible or am I going to have to hand copy them out?
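A .dtsx file is just XML, so one hedged starting point is to load it server-side and shred it with XQuery. Note that annotation text lives in the designer-layout section of the file and its element name varies by SSIS version, so the path must be confirmed against your own packages; the file path below is a placeholder:

DECLARE @pkg xml;
SELECT @pkg = CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'C:\Packages\MyPackage.dtsx', SINGLE_BLOB) AS src;
-- Once the annotation element is identified, shred it with nodes()/value()
-- and INSERT the results into a table, or bcp them out to a text file.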
I am new to SQL Server and database concepts and have just started learning. I want to copy a database and a local package from an old SQL Server to a new server. Can anybody guide me through the steps? I am very new to this field, so if you could give me detailed steps I would very much appreciate it.
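For the database itself, the usual route is backup and restore. A minimal sketch; the paths and logical file names are assumptions, so check yours first with RESTORE FILELISTONLY:

-- On the old server:
BACKUP DATABASE MyDb TO DISK = 'C:\Backups\MyDb.bak';
-- Copy MyDb.bak to the new server, then:
RESTORE DATABASE MyDb FROM DISK = 'C:\Backups\MyDb.bak'
WITH MOVE 'MyDb_Data' TO 'C:\Data\MyDb.mdf',
     MOVE 'MyDb_Log'  TO 'C:\Data\MyDb_log.ldf';

The local package can then be opened in the DTS designer on the old server and saved to the new server with Save As.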
Here is my problem: I'm trying to copy a few tables from an ODBC database located on a PC on my LAN to a SQL Server on a remote server.
To do this I have created a DTS package on the SQL Server, which runs smoothly, copying the data as it should, but only as long as it copies the data from the ODBC database on my own computer, where I have Enterprise Manager installed. The moment I try to copy data from another PC on the network, where Enterprise Manager is not used to execute the DTS package, the task fails to run.
I'm quite at a loss here and would very much appreciate a nudge in the right direction, or perhaps some source or book I could read to solve the problem.
I'm trying to build a deployment package. I have a number of .dtsx packages in a project that share a connection config file. When I build, the error states: 'Could not copy file "whatever.dtsconfig" to the deployment utility output directory. ... The file already exists'
I'm trying to migrate a SQL 2000 database to SQL 2005. I'm using the Copy Database Wizard and can copy the database using the detach and attach method, but I would rather use the SQL Management Object method. I get the following error when trying to use this method:
Package "ShellPackage" failed.
This error occurs directly after the following step:
Event Name: OnInformation
Message: Transferring data to database RestoreTest from RestoreTest
Operator: <WANfirst.lastname>
Source Name: BP-BLM-TESTSQL2_BLM-JCAMPVSSQL2_Transfer Objects Task
Source ID: {E6765B9E-1B40-49ED-B0CE-F99252AA34B6}
Execution ID: {213272C4-37E9-4A1E-A5B9-A2F9A61348B3}
Start Time: 12/20/2005 3:05:58 PM
End Time: 12/20/2005 3:05:58 PM
Data Code: 0
The database is created successfully, but the data is not transferred. Also, the logins are created on the new server successfully. Has anyone seen this error or have any ideas on how to solve this problem? I would greatly appreciate any help!
On my workstation the option is there, but on a couple of other workstations that option is not available from the File menu. I tested doing the exact same thing and the option just isn't there. Anyone have any ideas?
Hi, I have an SSIS package that I run in parallel from the dtexec command prompt. The instances run completely isolated from one another. Sometimes, when I launch three instances of the package at the same time, one of the instances will fail. I have implemented detailed logging on the package to see exactly where it goes wrong.
To brief you on what the package does: in a nutshell, it copies tables between servers.
Due to the nature of the problem, the point of failure is completely random.
If I run just one instance, it always works fine. If there is more than one, there is a chance it might fail (but there is a good chance they will all run successfully).
My guess is that since these packages share resources (CPU, memory, and disk I/O), a shortage can sometimes cause one of the packages to fail. Is there any way to specify how long, for example, a SQL object transfer task will wait before raising an error?
Also, is there a way to release the memory buffer after each table copy? It seems that as the package loops over different tables, the copied data piles up in memory and is only released when the whole instance finishes, not after each table copy.
I create a new SSIS package, drag in an Execute DTS 2000 Package Task, select and embed a DTS package which consists of only one task (as above), and then change the source and destination details (server + user/pwd). Then, when I go to the Copy tab, I get the following error when I hit Select Objects to view the objects which the embedded DTS package should copy:
SQL-DMO error 21776: general error.
On further inspection, none of the objects selected for copy within the atomic/original DTS package remain selected for copy within the embedded DTS package.
I have googled to search for an answer to this one, but to no avail. Any ideas would be greatly welcomed.
I found a possible bug. If I open/create a new Integration Services project and then try to save a copy of the package to SQL Server, I find that the "Save Copy of Package As..." option is only available if I am in the package itself. If I click on (highlight) the package in the Solution Explorer and then click on the File menu, the "Save Copy of Package As..." option is not available.
I notice that when I copy an SSIS package 'A' to a new package 'B', the new package 'B' generates a "login failed for user" message in the data flow components. To copy, I use "Save Copy of Package 'A' As...".
Some config info:
- Package ProtectionLevel = EncryptSensitiveWithPassword
- Connections are Data Sources
- Connection strings with passwords are stored in a SQL Server table using Package Configurations
- I've verified Package 'A' is in fact using the config table (i.e., it is not using a password or user stored in the package)
- Data connections are all SQL Server Native OLE DB Client
- The account is a SQL Server account (not integrated security)
The original Package 'A' works flawlessly and I get success when I test the connections in Package 'B'.
But when executing package 'B' I get the following error: [Connection manager "MyConnection"] Error: An OLE DB error has occurred. Error code: 0x80040E4D. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E4D Description: "Login failed for user 'dwuser'.".
Does anyone know why this would occur and/or how to work around it? I saw another thread where a potential workaround is to create a new data flow task and copy all the data flow components to that task. That won't work well for us because the data flow is moderately complex and when you copy and paste it, SSIS completely re-orders the layout.
This is a typical data warehouse ETL setup where there is a master package that executes child packages (e.g. 'A', and 'B' mentioned above) that each perform the ETL for a specific dimension or fact table.
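One diagnostic worth trying: query the configuration table directly to confirm what Package 'B' pulls at run time. A minimal sketch, assuming the SSIS default table name [SSIS Configurations] and its default columns (adjust to your own configuration):

-- Shows which package paths (e.g. the MyConnection connection string)
-- are overridden by the configuration, and with what values.
SELECT ConfigurationFilter, PackagePath, ConfiguredValue, ConfiguredValueType
FROM dbo.[SSIS Configurations]
WHERE PackagePath LIKE '%MyConnection%';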
I am trying to copy a database from our company's external SQL Server (production) to our local SQL Server (development). The Copy Database Wizard fails on the step "Execute SQL Server Agent Job". Following is the error in the log file. Please advise.
InnerException-->An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)
According to the help for SSIS, one method of deploying an SSIS package to a SQL Server, http://msdn2.microsoft.com/en-us/library/ms137565.aspx, is to use the File...Save a Copy of <package file> as... menu option.
I don't have that menu option at all. And yes, the package is in focus. My save menu options are simply: Save Selected, Save <package file> As..., and Save All.
I am using Version 9.00.1399.00 of the SSIS Designer.
At one time I did have the Management Studio CTP installed. However, it was uninstalled before installing the tools from the Standard Edition (though it would seem not completely).
Your help would be greatly appreciated. Thanx much.
P.S. Almost forgot to mention: I am already aware of using the DTSInstall utility as a workaround. It should be noted, however, that despite enabling the "CreateDeploymentUtility" property, DTSInstall.exe is not copied to the bin\Deployment directory.
I set up DB mirroring between a primary (SQL1) and a mirror (SQL2), with no witness. I have a problem when I issue the command:
alter database DBmirrorTest Set Partner = N'TCP://SQL2.mycom.com:5022';
go
The error message is:
The remote copy of database "DBmirrorTest" has not been rolled forward to a point in time that is encompassed in the local copy of the database log.
I performed the steps below prior to the command. (Note that both servers' service accounts use the same domain account. The domain account I log in with to do the DB mirroring setup is a member of the local admin group.)
1. backup database DBmirrorTest on SQL1
2. backup database log
3. copy db and log backup files to SQL2
4. restore db with norecovery
5. restore log with norecovery
6. create endpoints on both SQL1 and SQL2
CREATE ENDPOINT [Mirroring]
STATE=STARTED
AS TCP (LISTENER_PORT = 5022, LISTENER_IP = ALL)
FOR DATA_MIRRORING (ROLE = PARTNER)
7. enable mirror on mirror server SQL2
:connect SQL2
alter database DBmirrorTest
Set Partner = N'TCP://SQL1.mycom.com:5022';
go
8. Enable mirror on primary server SQL1
:connect SQL1
alter database DBmirrorTest
Set Partner = N'TCP://SQL2.mycom.com:5022';
go
This is where I got the error.
The remote copy of database "DBmirrorTest" has not been rolled forward to a point in time that is encompassed in the local copy of the database log.
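For what it's worth, this error usually means the mirror is missing a log restore: a log backup taken after the full backup must be restored on SQL2 WITH NORECOVERY before the SET PARTNER commands are issued. A minimal sketch with placeholder paths:

-- On SQL1:
BACKUP DATABASE DBmirrorTest TO DISK = 'C:\Backup\DBmirrorTest.bak';
BACKUP LOG DBmirrorTest TO DISK = 'C:\Backup\DBmirrorTest_log.trn';
-- Copy both files to SQL2, then on SQL2:
RESTORE DATABASE DBmirrorTest FROM DISK = 'C:\Backup\DBmirrorTest.bak' WITH NORECOVERY;
RESTORE LOG DBmirrorTest FROM DISK = 'C:\Backup\DBmirrorTest_log.trn' WITH NORECOVERY;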
Hi! I did:
alter database mydb set single_user with rollback immediate;
exec sp_detach_db @dbname='mydb', @keepfulltextindexfile='true';
Then I tried to copy the files to a new location on other drives on the same server, but got: "Cannot copy <myfile>: Access is denied. Make sure the disk is not full or write-protected and that the file is not currently in use."
I also tried renaming the file, without success, and tried with the DB service stopped (not preferred), also without success.
How can I find out which process is locking the files? Best regards
If I have a given database (a model) and I want to copy it within the same database instance: is it OK to copy the .mdf and .ldf files and attach the files with a new database name in the same instance?
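Generally yes, as long as the source database is detached (or the service stopped) while the files are copied, since the files of an online database are locked. A minimal sketch for attaching the copies under a new name (paths and the new database name are placeholders; on SQL 2000 use sp_attach_db instead):

-- Attach the copied files as a new database in the same instance.
CREATE DATABASE ModelCopy
ON (FILENAME = 'C:\Data\ModelCopy.mdf'),
   (FILENAME = 'C:\Data\ModelCopy_log.ldf')
FOR ATTACH;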
I am attempting to use the copy wizard to copy databases from SQL Server 2005 to SQL Server 2008 R2 w/ FP1.
The copy fails with a login failure to SQL Server 2005. I have a user ID and password under Windows for both servers, and a user ID and password under SQL security with the required admin rights.
The 2005 server has two instances, 20 databases, two dozen maintenance plans, and over a hundred users. I really would like to use the utility so I don't have to recreate everything manually.
Before implementing memory-based bulk copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to understand some considerations:
- Performance: how does it compare with T-SQL's "BULK INSERT ..." and the bcp utility?
- SQL Server resource usage: what influence does a memory-based bulk copy have on server resources while it runs?
- Server-side behavior: when the server is busy, can IRowsetFastLoad::Commit(true) still insert right away, or is the update delayed?
- Row count: is there a limit on how many rows can be inserted with IRowsetFastLoad::InsertRow() before IRowsetFastLoad::Commit?
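For the performance comparison, this is the kind of T-SQL baseline being measured against; a hedged sketch with placeholder table and file names:

-- Minimally-logged load, comparable in spirit to IRowsetFastLoad with a table lock.
BULK INSERT dbo.TargetTable
FROM 'C:\Data\bulkdata.dat'
WITH (ROWS_PER_BATCH = 10000, TABLOCK);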
Hi, I have three questions about memory-based bulk copy.
1. What is the maximum number of IRowsetFastLoad::InsertRow() calls allowed before IRowsetFastLoad::Commit(true)? For example, how many rows can be inserted in the sample below (i.e., what is the maximum value of nCount)?
for (i = 0; i < nCount; i++) { pIFastLoad->InsertRow(hAccessor, (void*)&BulkData); }
2. In the above code sample, is there a method for inserting a prepared array all at once, directly (a BulkData array, rather than a for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the T-SQL bulk copy options below?
BULK INSERT database_name.schema_name.table_name FROM 'data_file' WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
-------------------------------------------------------
My solution is like this. Is it correct?
// CoCreateInstance(...);  // create the data source
// Create session
I copied and added an existing package as a new package in a project, and have been having trouble with settings reverting to those of the original package after I modify and save changes to the new package. Sometimes it happens with the save itself; other times it happens when I close and re-open the package. Most cases are connections that revert back to the original file reference, but there are also control flow and data flow elements that keep reverting to either settings from the original package or defaults, leaving the re-opened package in error. I'm not sure how to get around this issue short of developing the new package from scratch, which I'd rather not do since it is fairly complex. Any help anyone can provide is appreciated. Thanks.
I have a For Each Loop that populates a SQL Server table from a set of flat files; I run the flat file import via a DTS package embedded in an Execute DTS 2000 Package Task. I want to take the source file name fetched by the For Each Loop and assign it to a global variable in DTS. How can this be done?
I have an SSIS job; one of the last steps it performs is to execute a SQL 2000 DTS package. This has to be done as a SQL 2000 DTS package, as it is performing rebuilds of SQL 2000 Analysis Services dimensions and cubes. We've found that when the DTS package fails, the SSIS job happily completes and shows success; we would prefer to know it went wrong.
As far as I'm aware, SSIS merely starts the DTS package off and doesn't care about its result. I've looked into turning on logging for the Execute DTS 2000 Package Task and thought that ExecuteDTS80PackageTaskTaskResult would give me the answer I need, but it is merely written to the log and is not available to an event handler. It also looks like it is not safe to add a SQL task as the next item to check the SQL 2000 system tables for the DTS package's log, as the SSIS documentation warns that the DTS package can continue to run after the Execute DTS 2000 Package Task has ended.
Ideally I want any error raised within the DTS package to cascade up to be an error in the SSIS job, I can then handle it appropriately. I cannot find a way to do this. Is there a way?
If not, can anyone suggest how in the remainder of the SSIS tasks I can be sure that the DTS has completed before I start any other tasks that will check for the SQL 2000 log of its execution?
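One hedged option: if the DTS package is configured to log to SQL Server, its outcome lands in msdb on the SQL 2000 server, so a downstream Execute SQL Task can poll for the latest run. The package name is a placeholder and the column names should be verified against your msdb:

-- A populated endtime means the run finished; a nonzero errorcode means it failed.
SELECT TOP 1 starttime, endtime, errorcode, errordescription
FROM msdb.dbo.sysdtspackagelog
WHERE name = 'MyDtsPackage'
ORDER BY starttime DESC;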
I created a package which passes some information (through parameters) to its child package.
I need to do some processing in the parent package based on the execution status of the child package, i.e.
if the child fails, perform one operation; if the child succeeds, perform another.
To determine the execution status of the child package I am using two different precedence constraints: one constraint has the value "Success" and the other has the value "Failure".
My problem is that when the child package executes successfully, the constraint with the value "Success" works properly, but when the child fails, the constraint with the value "Failure" does not work.
I have developed an SSIS package for ETL purposes. I am invoking the SSIS package from a .NET console application by referencing the ManagedDTS assembly. I am able to execute the package on SQL Server 2005 Developer Edition and it runs fine to completion.
But when I try to execute the package on SQL Server 2005 Standard Edition, by invoking it through the .NET console application, the status of the package is failure.
Can anyone help me overcome this problem?
While Creating a script task in Control Flow, I am getting "Package Validation Error". Here is the complete message:
Error at Validate File and Load Data: The task is configured to pre-compile the script, but binary code is not found. Please visit the IDE in Script Task Editor by clicking Design Script button to cause binary code to be generated. (Microsoft.DataTransformationServices.VsIntegration)
As mentioned in the message, I opened the script IDE and added the code I need. When I close the VSA IDE, the package designer displays the same error message.
The worst part of the whole story is that if I close the package designer and reopen it, I find that all the code I wrote in the script task has been deleted by the package designer. This is not at all acceptable, as I saved the package and still lost all my work. I had to redo all the coding from scratch for that task.
Please respond if anyone faced similar problem.
Thanks in advance!
Anand
PS: If anyone from Microsoft is reading this, please see what you guys are coding there. Due to the buggy software you deliver, I am losing my credibility.
I am in the process of moving from a 32-bit SQL Server 2005 Enterprise (9.0.3054) to a 64-bit SQL Server 2005 Enterprise (9.0.3054, with 4 CPUs and 8 GB of memory on Windows 2003 SP2), and the process has been very frustrating, to say the least. I am having a problem with packages that I created on my 64-bit SQL Server. I am importing a few tables from the 32-bit SQL Server into the 64-bit SQL Server using Task --> Import to create the package.
Sometimes when I am creating a package I get the following error in a message box:
SQL Server Import and Export Wizard
The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered. The wizard cannot continue and it will terminate.
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (System.Windows.Forms)
Other times when I run a package that has run successfully before I get the following error:
Faulting application dtexecui.exe, version 9.0.3042.0, stamp 45cd726d, faulting module unknown, version 0.0.0.0, stamp 00000000, debug? 0, fault address 0x025d23f0.
The package appears to hang when running. By this I mean that the Package Execution Progress shows progress up to a point and then just stops. (The package normally takes about 17 seconds to run.) CPU usage is at 1% and the package cannot be stopped.
I have deleted and re-created the package several times and I have also re-installed the service pack on the SQL Server (9.0.3054) but that did not help.
I would like to standardize SSIS development so that developers all start with the same basic template. I have set it up so it is an available template ( http://support.microsoft.com/kb/908018 ) but I would like it to be the default when a new project or package is created. Is this an option?