I'm having serious problems with the SSIS IDE on projects that contain more than 5 packages, especially when the packages call each other with an Execute Package Task configured with a file connection. Particularly annoying are the 20+ "Document contains one or more extremely long lines of text..." messages that pop up during loading/validation. For my project of around 30 packages it takes me about 10 minutes to click through all these pop-ups, on top of the long loading time.
Anyone got any tips on this specifically, or on improving IDE performance in general? As it is now, the product is a REAL pain to work with on large projects.
We've been having a performance problem with a package since an error occurred. The original error came from the Job Manager, which was unable to start a thread. Our understanding is that this is a memory-related issue, and we'll deal with that.
What is really odd is what happened after that. The package executes from a SQL Agent job that also includes 3 other packages. Each package is stored on the file system. Execution time for the affected package went from less than a minute to over 5 minutes; the other packages continued to execute normally.
Checking the logs, there is a large time gap between the start of the SQL Agent step and the first pre-validation message that accounts for 4 minutes, as if there is some issue in loading the package. The issue is ongoing. Does anyone know what happens between the start of the SQL Agent job and the first pre-validate message? Is this some type of caching issue?
SQL Agent step start: 4:05:27
First pre-validate message: 4:09:24
Package execution start: 4:09:57
End of execution: 4:10:59
I have a general question concerning the performance impact of massively parallel data imports in one SSIS package.
We have a database on a SQL Server 2005 SP1 server (2 Xeons at 3.8 GHz, 4 GB of RAM) for a reporting web app which is updated every day with data from the last year (or last 3 years). The data is extracted from several different databases on multiple machines at different locations. Right now there are imports/transformations from 7 companies at 3 locations. The table has ~80 columns and about 2 million rows. I built an SSIS package with one company's import and added the others by copy-and-pasting all the tasks in the package and changing connection parameters and values. Soon there will be 6 more companies to import from, and possibly about 20 some day.
Now, when these 7 imports run in parallel, there are 3 simultaneous imports from the same source server. Sometimes one of these imports seems to hang. I cannot reproduce it; when I run the package 2 or 3 times, it's gone. So I put some of the imports in sequence to reduce the number of parallel tasks to 4, and then the problem disappears. The "MaxConcurrentExecutables" value is set to 6, and "RetainSameConnection" is set to TRUE.
My questions, regarding stability and performance, are:
1.) Is it better to do those imports in separate packages, and if so, can I schedule multiple packages to execute in parallel with the SQL Server Agent?
2.) Or should they be combined in one package, running (partly) in parallel?
3.) What is an appropriate value for "MaxConcurrentExecutables", and what options do I have to speed up those imports?
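For what it's worth, MaxConcurrentExecutables defaults to -1, which means "number of logical processors plus 2"; any positive number caps how many executables run at once. A minimal sketch of setting it programmatically with the SSIS runtime API (the package path and the value 4 are made up for illustration):

    using Microsoft.SqlServer.Dts.Runtime;

    class TuneConcurrency
    {
        static void Main()
        {
            Application app = new Application();
            Package pkg = app.LoadPackage(@"C:\packages\Imports.dtsx", null);  // hypothetical path
            pkg.MaxConcurrentExecutables = 4;  // -1 (default) = logical processors + 2
            app.SaveToXml(@"C:\packages\Imports.dtsx", pkg, null);
        }
    }

The same property is also editable on the package's property sheet in the designer, which may be quicker for a single package.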
I have several independent DTS packages that I would like to schedule and run as one job stream (SQL 2000). I can schedule them individually, but I would prefer to have each one be a step of one big scheduled job.
I have a scheduled job that errors out on the first step when I attempt to run it. There are 3 packages that I created, and I can run them all manually just fine.
I get this error when I try to run the job:
An OLE DB error has occurred. Error code: 0x80040E4D. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E4D Description: "Communication link failure".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80... The package execution fa... The step failed.,00:00:01,0,0,,,,0
I have a rather dumb question: it appears that we can have more than one package (.dtsx) in a solution. If I have multiple packages within the same solution, how do I invoke them from the main package in the solution? Thanks in advance.
We have a folder in SSIS called ETL with about 25 SSIS packages in it. Now we need to update it. I tried to delete the folder, but you get the message that the folder is not empty. So I renamed the folder to ETL_old, and my deployment works fine.
But now I want to get rid of all the old folders. Deleting a folder didn't work, and selecting multiple packages doesn't work.
To delete a single one you select the package, right-click, select Delete, and click Yes. But then I would have to do that 25 times, and I was unable to set up a keyboard shortcut for the action.
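A minimal sketch of clearing such a folder programmatically with the SSIS runtime API; the server name and Windows authentication (null user/password) are assumptions, and the usual caution applies before deleting anything:

    using Microsoft.SqlServer.Dts.Runtime;

    class PurgeOldFolder
    {
        static void Main()
        {
            Application app = new Application();
            // enumerate everything under the old folder in MSDB
            foreach (PackageInfo info in app.GetPackageInfos(@"\ETL_old", "MYSERVER", null, null))
            {
                if (info.Flags == DTSPackageInfoFlags.Package)
                    app.RemoveFromSqlServer(@"\ETL_old\" + info.Name, "MYSERVER", null, null);
            }
            // once empty, the folder itself can go
            app.RemoveFolderFromSqlServer(@"\ETL_old", "MYSERVER", null, null);
        }
    }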
Hi guys and gals, thanks in advance for any help or input. I have multiple DTS packages that run each night on a server. The packages import data from SQL tables on another server. All of a sudden, the jobs have started failing. There is nothing in the SQL logs. The DTS package logs (which I save to the server) report an unspecified DTS error. Any suggestions? Even better, any suggestions on what/where to look for the cause of these errors? When I re-run the jobs in the morning when I get in, they complete without any problems. If you need more information, just let me know. Thanks. billy
I would like to use one configuration file for multiple SSIS packages. The idea is to save the connection string of my production server with the initial database field excluded, and set the initial database per package. I am able to read the configuration file into 2 different packages without issue, but I cannot see how to pull the connection string into an ODBC database connection object in order to set the initial database. Is there some way I am not thinking of to do this, either through the designer or programmatically?
Thanks for any help you all can provide! I am so n00b to SSIS.
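One programmatic angle, sketched under the assumption that the shared config supplies a server-only connection string; the package path, connection manager name ("ProdServer"), database name, and key name are all made up, and the key is "Database" for ODBC-style strings versus "Initial Catalog" for OLE DB:

    using System.Data.Common;
    using Microsoft.SqlServer.Dts.Runtime;

    class SetInitialDatabase
    {
        static void Main()
        {
            Application app = new Application();
            Package pkg = app.LoadPackage(@"C:\pkgs\LoadSales.dtsx", null);

            ConnectionManager cm = pkg.Connections["ProdServer"];
            // parse the server-only string from the shared config, then add the database
            DbConnectionStringBuilder csb = new DbConnectionStringBuilder();
            csb.ConnectionString = cm.ConnectionString;
            csb["Database"] = "SalesDb";
            cm.ConnectionString = csb.ConnectionString;

            app.SaveToXml(@"C:\pkgs\LoadSales.dtsx", pkg, null);
        }
    }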
I built 3 different packages and I want them to be executed in order. The first one exports some tables to another database, and the other two packages depend on the database built by the first package.
Basically it seems that while you can indeed share a config file, it doesn't aggregate sets of, say, different connection managers into a superset: you choose to reuse, but it actually overwrites. Any ideas, anyone?
I am trying to execute around 3 SSIS packages using the Execute Package Task by having all 3 in one SSIS package, and I am getting the error: Error: Error 0xC001000A while preparing to load the package. The specified package could not be loaded from the SQL Server database.
Then I did the following:
1. Right-click on MSDB
2. Import Package
Inside the Import Package window:
Package location: SQL Server
Server: servername
Authentication: Windows Authentication
Package path: \\server\C$\folder1\SSIS Packages\folder2
Import package as: Package name: package.dtsx
Below is the error I am getting:
TITLE: Import Package
Cannot find folder "\\server\C$\folder1\SSIS Packages\folder2".
Can I use the Execute Package Task for this purpose? Thanks!!
I have created several DTS packages that are working fine, tested and verified. I need to run them with a single click, either by batch file or some sort of command.
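For SQL Server 2000 DTS, a batch file along these lines should do it; the server and package names are placeholders, and /E uses Windows authentication (use /U and /P for SQL logins):

    @echo off
    rem dtsrun ships with SQL Server 2000; the packages run one after another
    dtsrun /S MYSERVER /E /N "Package One"
    dtsrun /S MYSERVER /E /N "Package Two"
    dtsrun /S MYSERVER /E /N "Package Three"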
I have several packages that all use the same DTSConfig file to map the OLE DB connections. This works fine in the development environment; however, when I run a rebuild with CreateDeploymentUtility set to true, I get the error
Error 75 System.ApplicationException: Could not copy file "C:\Dev_TFS\Towergate\Towergate.Bdw\Towergate.Bdw.Ssis.Transform\PackagePath.dtsConfig" to the deployment utility output directory "C:\Dev_TFS\Towergate\Towergate.Bdw\Towergate.Bdw.Ssis.Transform\bin\Deployment". ---> System.IO.IOException: The file 'C:\Dev_TFS\Towergate\Towergate.Bdw\Towergate.Bdw.Ssis.Transform\bin\Deployment\PackagePath.dtsConfig' already exists.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.File.InternalCopy(String sourceFileName, String destFileName, Boolean overwrite)
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.CopyFiles(ICollection fileNames, String outputPath)
   --- End of inner exception stack trace ---
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.CopyFiles(ICollection fileNames, String outputPath)
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.CreateDeploymentUtility(IOutputWindow outputWindow)
The DTSConfig file is included in the project under Miscellaneous in the development studio, as all developers need access to the same config file.
The net result is that the MSI deployment package doesn't get built, so we cannot do an automated delivery of the data.
Is there a workaround or a setting that I can use to remove the error?
I have a question regarding Raw Files. I am breaking a large package into more modular components for better processing and debugging.
The process will start with a preparatory data flow that creates a raw file (or files). This raw file will then be used as the source in possibly 6 data flows and/or packages.
My question is whether one raw file can be read concurrently by the multiple jobs, and how this would affect processing. I'm assuming this would slow processing down.
My other option is to multicast the writing of the raw file into 5 other versions of the file, all identical except for the filename. Obviously this would use more disk space, but that is not a concern, as we have lots of it; our concern is speedy processing.
If you have experience with raw files, please let me know how you approached this issue. As always, blogs and specific examples are great!
My current environment has multiple packages stored in SQL Server (MSDB). When working on a set of packages, I want to bring them into my local development area. "Add Existing Package" only allows you to pull one package at a time. Does anyone have the secret to selecting multiples?
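One way around the one-at-a-time dialog is to script dtutil, which can copy packages from MSDB to files; the server, folder, and package names here are made up:

    dtutil /SourceServer MYSERVER /SQL "ETL\Package1" /COPY FILE;"C:\dev\Package1.dtsx"
    dtutil /SourceServer MYSERVER /SQL "ETL\Package2" /COPY FILE;"C:\dev\Package2.dtsx"

The copied .dtsx files can then be added to the project in one go.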
I have 200+ packages that need to be flexible in how they are run. For example, one end user may choose to run packages 1, 2, 3 and the next end user may choose to run packages 2, 3, 7, etc. Prior to running a group of packages, I set an "instance id" inside the group so I can tie them all together in the log file: I know that packages 1, 2, 3 were all run as a group, and that's distinct from packages 2, 3, 7 that were run in a different group.
Initially I embarked on a scenario where I had a queue table that loaded up the packages to be run, and a little C# app that read the queue, generated the "instance id", and ran all the packages (either through dtexec.exe or Microsoft.SqlServer.Dts.Runtime). But now I wonder if a master package that uses the Execute Package Task is the way to go. My 200+ packages are all independent and run based on a single config file, and it seems as though going the parent-package route will destroy some of that independence, because I'll then be relying on parent package variables.
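For comparison, the queue-runner approach needs very little runtime code. A minimal sketch; the package paths, the variable name "User::InstanceId", and the GUID-based instance id are all made up for illustration:

    using System;
    using Microsoft.SqlServer.Dts.Runtime;

    class QueueRunner
    {
        static void Main()
        {
            Application app = new Application();
            string instanceId = Guid.NewGuid().ToString();  // ties the group together in the log

            foreach (string path in new[] { @"C:\pkgs\Package1.dtsx", @"C:\pkgs\Package2.dtsx" })
            {
                Package pkg = app.LoadPackage(path, null);
                pkg.Variables["User::InstanceId"].Value = instanceId;
                DTSExecResult result = pkg.Execute();
                Console.WriteLine("{0}: {1}", path, result);
            }
        }
    }

Because the runner sets an ordinary package variable before Execute(), the packages keep their independence; no parent package variables are involved.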
I've read that a single config file can be used across multiple packages, but I am unsure whether that applies to what I'm trying to achieve. As an example, I have 2 packages, PackageA and PackageB. Each package has a connection manager defined, ManagerA and ManagerB respectively, and I want a single package configuration file to contain the connection string for each manager.
I can define the package configuration of PackageA with no problems. I choose XML file, define the filename and add the connection string to the config.
I then open PackageB's package configurations and point them at the existing config file. IS tells me the file already exists and allows me to choose the "Reuse Existing" option. I then supply a configuration name and the wizard finishes without letting me expose any properties. At this point, the configuration file still contains the config details from PackageA.
If I then choose to edit the package configuration in PackageB, I get a warning saying the config values in the config file (relating to PackageA) cannot be found, and the wizard continues. The PackageB properties which I subsequently add overwrite the existing PackageA properties.
So it would appear that what I'm trying to do cannot be done, and I need a separate configuration file for each package?
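One workaround that is often suggested: let each package generate its own config file, then hand-merge the Configuration elements into a single file; a package loading the merged file should only complain about the entries it doesn't own. A sketch of what the merged file would look like (the connection strings are placeholders):

    <?xml version="1.0"?>
    <DTSConfiguration>
      <Configuration ConfiguredType="Property" Path="\Package.Connections[ManagerA].Properties[ConnectionString]" ValueType="String">
        <ConfiguredValue>Data Source=SERVER_A;Initial Catalog=DbA;Provider=SQLNCLI.1;Integrated Security=SSPI;</ConfiguredValue>
      </Configuration>
      <Configuration ConfiguredType="Property" Path="\Package.Connections[ManagerB].Properties[ConnectionString]" ValueType="String">
        <ConfiguredValue>Data Source=SERVER_B;Initial Catalog=DbB;Provider=SQLNCLI.1;Integrated Security=SSPI;</ConfiguredValue>
      </Configuration>
    </DTSConfiguration>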
130 dtsx packages use 4 shared connections. 3 of those connections are stored in an SSIS Configuration table in an operational database. The fourth connection is in a shared data source and points to the operational database so the packages can grab the other three.
Problem:
It's time for deployment, and all of those connections must change to the production servers. The 3 are no issue: just change the ConfiguredValue in the SSIS Configuration table on the production box to point to the other production servers. However, the fourth one... I had made the assumption that when you change a shared data source, the change filters down through all the packages. We all know what assumptions do to you... So I need a way to change all 130 connections (and to be able to change ALL packages quickly and simply for other projects in the future).
Solution:
It has been suggested that we use another package to run through all of the packages and change the connection with a script task. I can live with this (and, more importantly, so can our DBAs, who have to deploy).
I have one snippet of code to ADD a connection using a variable holding the connection string, but we don't want to add one, just change an existing one.
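A minimal sketch of the change-in-place version; the folder, the connection manager name ("SharedDataSource"), and the connection string are made up, and this assumes the packages live as .dtsx files rather than in MSDB:

    using System.IO;
    using Microsoft.SqlServer.Dts.Runtime;

    class RepointConnections
    {
        static void Main()
        {
            Application app = new Application();
            string newConnStr = "Data Source=PRODSERVER;Initial Catalog=Operational;Provider=SQLNCLI.1;Integrated Security=SSPI;";

            foreach (string path in Directory.GetFiles(@"C:\deploy\packages", "*.dtsx"))
            {
                Package pkg = app.LoadPackage(path, null);
                // change the existing manager rather than adding a new one
                pkg.Connections["SharedDataSource"].ConnectionString = newConnStr;
                app.SaveToXml(path, pkg, null);
            }
        }
    }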
Has anyone else done this? Or had a similar problem and found a way to fix it?
We are likely to have many projects in which the connections MUST change at deployment, and the idea of going into every package to make the change is sad at best... We would be more likely to move back to a competitor's product that has a connection repository than to continue with SSIS.
I am currently migrating from Oracle to MS SQL Server 2005 using SSIS. Since the new schema being used on the SQL Server is very different, we have created separate packages for each target table.
We have several different sets of data stored as separate Oracle schemas, and I would now like to point the OLE DB Source objects in my packages at a different schema (table/column names remain the same). While I can go through all the packages and pick the same-named table from the other schema in the drop-down list, doing so is an extremely slow process.
Is it possible to use configuration files to set a schema to be used for a number of OLE DB Sources over a number of packages?
At the moment I am doing a find-and-replace over the .dtsx files, but I can't believe I have to resort to this for something that I imagine many people would want to do with SSIS.
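If the find-and-replace stays, it can at least be automated. A crude sketch, assuming the schema reference appears quoted with a trailing dot in the generated source queries (folder and schema names are made up; take a backup first):

    using System.IO;

    class SwapSchema
    {
        static void Main()
        {
            foreach (string path in Directory.GetFiles(@"C:\migration\packages", "*.dtsx"))
            {
                // .dtsx files are plain XML, so a text replace works
                string xml = File.ReadAllText(path);
                File.WriteAllText(path, xml.Replace("\"OLD_SCHEMA\".", "\"NEW_SCHEMA\"."));
            }
        }
    }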
I am working on a project where we have many SSIS packages, and we want to minimize the number of config files in use. What I was thinking of is one master config file holding the server/database info, to be used by all the packages. The thing is, multiple packages are executed from within a master package, and I was wondering if this will be an issue. Is there anything else that I will have to keep in mind? Any help is appreciated.
This seems like a no-brainer, but it's driving me nuts. I want one XML file for the entire solution. There are multiple packages in the solution, with different connections in their connection managers. The packages share some connection names, and some are unique to a package.
Example:
LoadData.dtsx has a source database connection named SourceDB_OLEDB and an OLE DB connection DataWarehouse_OLEDB.
LoadDataMart.dtsx uses the same name for the DataWarehouse_OLEDB connection and has another OLE DB connection, DataMart_OLEDB.
I want one XML config file that has all the connection strings, but the problem is that LoadDataMart.dtsx throws an error:
Error 1 Error loading LoadDataMart.dtsx: The connection "SourceDB_OLEDB" is not found. This error is thrown by the Connections collection when the specific connection element is not found. c:\ssis\LoadDataMart.dtsx 1 1
I'm pretty new to SSIS, but I've managed to cobble together a number of individual packages to refresh SQL tables from a third-party database.
Now what I'd like to do is have a single package that I can use to invoke each of the individual ones. Since it will run on a quad, I'd like to invoke them so that they run in parallel.
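Inside SSIS, the usual pattern is a master package with one Execute Package Task per child and no precedence constraints between the tasks, so they all start together. If driving it from outside SSIS is acceptable instead, a rough sketch that launches one dtexec per package (the file names are made up):

    using System.Collections.Generic;
    using System.Diagnostics;

    class ParallelRunner
    {
        static void Main()
        {
            string[] packages = { @"C:\pkgs\RefreshA.dtsx", @"C:\pkgs\RefreshB.dtsx", @"C:\pkgs\RefreshC.dtsx" };
            var running = new List<Process>();

            // one dtexec process per package, running side by side
            foreach (string pkg in packages)
                running.Add(Process.Start("dtexec", "/F \"" + pkg + "\""));

            foreach (Process p in running)
                p.WaitForExit();  // exit code 0 means the package succeeded
        }
    }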
I have 5-6 tables and want to make sure whether I'm using any of them in my packages, of which I have 100. What is the shortest way to search for these tables across the packages?
Based on advice from this board, I am setting up our newly converted DTSX packages (SQL Server 2000 DTS to 2005 SSIS package conversion) to use package configurations, to make it easier to run them in multiple environments (dev, testing, production, etc.). This seems to be working just fine, but I now have another question.
We have approximately 12 DTSX packages which need to be executed in a specific sequential order. I was wondering what the best / most widely accepted method is for running them as an automated process. If you double-click each DTSX file individually, the Execute Package Utility pops up and you can run the package. I'd like to know the best way for us to run each package, one after the other (each one waiting until the previous one completes before it begins), until they have all been processed. It would be nice if it did not rely on SQL Server being installed on the machine on which it is run...
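A plain batch file with dtexec gives exactly the one-after-another behaviour. One caveat: dtexec ships with the Integration Services runtime, so the machine does need at least the SSIS components installed, even without a full SQL Server instance. The file names below are made up:

    @echo off
    dtexec /F "C:\pkgs\Step01.dtsx"
    if errorlevel 1 goto failed
    dtexec /F "C:\pkgs\Step02.dtsx"
    if errorlevel 1 goto failed
    rem ...and so on for the remaining packages
    echo All packages completed.
    goto :eof
    :failed
    echo A package failed; stopping the stream.
    exit /b 1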
Need some help... When we tried to run multiple packages one after the other from a Windows service, the first one succeeds but the later ones throw the error below: "The script threw an exception: The element cannot be found in a collection. This error happens when you try to retrieve an element from a collection on a container during execution of the package and the element is not there. A deadlock was detected while trying to lock variables "System::PackageName, User::BusinessDate, User::Environment, User::PortfolioName" for read access. A lock could not be acquired after 16 attempts and timed out."
Later, we tried to create separate AppDomains for each package and execute via a console application, but ended up with the error below (the expressions were defined in an OnError event): "The result of the expression "@[User::ReportErrorFrom]" on property "FromLine" cannot be written to the property. The expression was evaluated, but cannot be set on the property. The result of the expression ""Error At :" + @[System::SourceName] + "" + "Error Description : " + @[System::ErrorDescription] + "" " on property "MessageSource" cannot be written to the property. The expression was evaluated, but cannot be set on the property."
At last, we tried to spawn a separate process (System.Diagnostics.Process) for each package. This seems to work, but it takes a very long time: a package that normally takes 2 minutes is taking 60 minutes.
We also tried creating an SSIS package that executes the multiple packages, but only the first package gets executed, and the second one throws the error below (here the variable it is trying to lock belongs to the first package): "Failed to lock variable "UniqueInstrumentsQuery1" for read access with error 0xC0010001 "The variable cannot be found. This occurs when an attempt is made to retrieve a variable from the Variables collection on a container during execution of the package, and the variable is not there. The variable name may have changed or the variable is not being created."."
Please help us with some workaround for this. Thanking you in advance,
I have a MASTER package that runs multiple packages via the Execute Package Task in SSIS. When I run this MASTER package from a SQL Server Agent job in SSMS, I get the error below.
Please give me the steps for running this, and let me know what information I am missing to run the master package.
ERROR: Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B "Key not valid for use in specified state". You may not be authorized to access this information.
This error occurs when there is a cryptographic error. Verify that the correct key is available.
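For context, this message typically appears when a package saved with ProtectionLevel = EncryptSensitiveWithUserKey is executed under a different account, such as the SQL Agent service account. Common fixes are switching to DontSaveSensitive and supplying passwords via a configuration, or EncryptSensitiveWithPassword and passing the password at run time, along these lines (the path and password are placeholders):

    dtexec /F "C:\pkgs\Master.dtsx" /De "packagePassword"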
We are building a data load application where parameters are stored in a table, and there are multiple packages for each load. There is an IsChecked column; only when it is 1 should the child package execute. I created a master package in which an Execute SQL Task stores the result in a variable, and based on the result the child package should execute. In the Execute SQL Task I selected "Full result set" as the result set, and I am getting the error below.
[Execute SQL Task] Error: Executing the query "SELECT isnull(ID ,0) AS ID FROM DataLoadParameter..." failed with the following error: "The type of the value (DBNull) being assigned to variable "User::LoadValue" differs from the current variable type (Int32). Variables may not change type during execution. Variable types are strict, except for variables of type Object.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
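For what it's worth, a "Full result set" can only be assigned to a variable of type Object; an Int32 variable fails exactly like this, and a query that returns nothing hands back DBNull. A sketch of unpacking such an Object variable inside a script task, assuming a hypothetical Object-typed variable User::LoadResult holds the result set:

    using System;
    using System.Data;
    using System.Data.OleDb;

    // inside the script task's entry point, where Dts is available:
    DataTable dt = new DataTable();
    new OleDbDataAdapter().Fill(dt, Dts.Variables["User::LoadResult"].Value);  // unpack the ADO recordset
    int id = dt.Rows.Count > 0 ? Convert.ToInt32(dt.Rows[0]["ID"]) : 0;        // guard the empty case

Alternatively, keeping the Int32 variable and switching the ResultSet property to "Single row" avoids the Object indirection altogether.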