We are planning a shared SQL Server 2005 environment where users can create databases/applications for themselves and/or their departments. Given that there can be multiple SQL Server instances on a box, can each instance limit a user's disk space? Thanks for any enlightenment.
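To illustrate, the kind of cap I'm hoping each instance could impose looks like a per-database MAXSIZE (a sketch; all names and sizes here are made up):

    -- Sketch: cap a user's database at creation time.
    CREATE DATABASE UserDeptDB
    ON PRIMARY
    (
        NAME = UserDeptDB_Data,
        FILENAME = 'D:\Data\UserDeptDB.mdf',
        SIZE = 50MB,
        MAXSIZE = 500MB,      -- hard upper limit on the data file
        FILEGROWTH = 10MB
    )
    LOG ON
    (
        NAME = UserDeptDB_Log,
        FILENAME = 'D:\Logs\UserDeptDB.ldf',
        SIZE = 10MB,
        MAXSIZE = 100MB,
        FILEGROWTH = 5MB
    );

That caps a database rather than a user, but if each user gets their own database it amounts to the same thing; an existing database can be capped the same way with ALTER DATABASE ... MODIFY FILE (NAME = ..., MAXSIZE = 500MB).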
For those of you who, like me, have had a hard time figuring out the protection level for an SSIS package when deploying it via SQL Server Agent, here is a piece of advice:
Firstly, the protection level is set by default to "EncryptSensitiveWithUserKey".
The encryption actually takes place only if the package contains sensitive values, such as passwords.
From my experience, both the "EncryptSensitiveWithUserKey" and "EncryptSensitiveWithPassword" options have turned out to be unreliable when deploying through SQL Server Agent (even when using a proxy account with all privileges).
This seems to be because of issues when the user who created the package is different from the user who deploys it (which is really ridiculous).
So I used the ProtectionLevel "DontSaveSensitive", which means nothing in the package gets encrypted and your sensitive information is simply left blank. You then have to supply your passwords etc. through a configuration XML file, using SSIS "Package Configurations" in the menu.
This has been the most reliable way of solving the whole problem with encryption.
Bear in mind that you should put the XML file in a secure location that no one else has access to.
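For reference, a generated configuration file boils down to something like this minimal sketch (the connection name, Path, and connection string here are made up; the Package Configurations wizard produces the real values):

    <?xml version="1.0"?>
    <DTSConfiguration>
      <Configuration ConfiguredType="Property"
                     Path="\Package.Connections[MyOleDbConn].Properties[ConnectionString]"
                     ValueType="String">
        <ConfiguredValue>Data Source=MyServer;Initial Catalog=MyDb;User ID=etl_user;Password=secret;Provider=SQLNCLI.1;</ConfiguredValue>
      </Configuration>
    </DTSConfiguration>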
Lately, I have been experimenting with SSIS and I created a generic custom error logging component that saves all offending data on data flow component failure. However...
Instead of re-directing rows at the data flow level and handling/logging the data at that level, is it possible to catch all of this information at the package level and handle/process it there?
My project currently has tasks that each have their own individual event handlers that get called OnError (to set up event messages). I also have a package-level event handler that performs a generic task (sending events to the Windows Event Viewer). In the package-level event handler there is a Script Task that, based on a Boolean variable, directs flow down "Success" or "Failure" paths to different tasks. When I fail one task in the main control flow, the task-level event handler runs, then the package-level event handler runs, and then the package-level handler runs again for some unknown reason. The second time it runs, it picks up the value of the variable from the Variables window, even though I change this value at runtime to a value from a database. I can't understand why it would run a second time, and if it did, why it would have the design-time value from the Variables window and not the value set in memory. It's as if the event handler runs with the value from memory, then runs again after pulling the value back out of the Variables window, replacing the database value.
Maybe the package itself is failing altogether and then re-running the package-level event handler?
Can anyone explain to me why my SSIS packages will not work when the DontSaveSensitive protection level is selected? My package configurations are set to the SQL Server configuration type, and I have a table in a database that contains all the sensitive information (passwords and such). If I select "EncryptSensitiveWithUserKey" everything works, but then I am the only one able to execute the packages (not good; I need others to be able to execute them as well).
The error I'm getting tells me that the connection is not configured correctly or I may not have the right permissions on the connection.
My guess is that DontSaveSensitive drops the passwords, but when I edit the data source and re-enter the password, it still does not work. Also, the database table that contains the sensitive data is not affected; all its data remains.
What the fudge is going on here? I'm a newbie at this, can someone help me out?
While executing the package, the following error message is received:
Error: 2006-07-28 15:12:36.60 Code: 0xC00470FE Source: Data Flow Task DTS.Pipeline Description: The product level is insufficient for component "Data Conversion" (202). End Error
and at the end:
DTExec: The package execution returned DTSER_FAILURE (1).
The same error appears when the package is executed from Integration Services -> Stored Packages -> (package name) -> right-click, Run Package.
But the same package executes perfectly from Visual Studio, where it was developed.
SQL Server 2000 SP4. I built a large DTS package that grabs a number of tables from an Oracle DB, does some scrubbing and date verification, and loads to a SQL Server DB. Most of the tables are full refresh and a few are incremental.

Main DW: DwSQL
Staging Area: DwLoadAreaSQL

The DW is about 60 gigs. The staging area is about 80 gigs. This is all good. However, the log file for the staging area is 50 gigs, and I'm trying to find ways to not require such a large log file. I tried adding a few "BACKUP LOG DwLoadAreaSQL WITH TRUNCATE_ONLY" statements in the DTS package but figured out that because it's one DTS package, it's all one transaction. I've thought about breaking it up into multiple DTS packages and truncating the log between running them, but was hoping to avoid this. To be clear, I know how to shrink DBs and log files; that's not the issue. Any ideas? Thanks.
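For reference, what I tried between steps, and the alternative I'm weighing (SQL 2000 syntax, database name from above):

    -- What I tried between DTS steps (has no effect while one big
    -- transaction is still open):
    BACKUP LOG DwLoadAreaSQL WITH TRUNCATE_ONLY;

    -- Alternative: keep the staging database in SIMPLE recovery so the
    -- log truncates on checkpoint between transactions.
    ALTER DATABASE DwLoadAreaSQL SET RECOVERY SIMPLE;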
1) How can I keep my package from running more than 1 instance at a time?
I tried changing "MAXCONCURRENT" to "1" in my DTEXEC command in a batch file; however, this doesn't limit the number of instances. (If I run the batch file twice, one right after the other, I get 2 instances running simultaneously.)
2) What "executable files" is this definition referring to?
MAXCONCURRENT is defined as:
"Specifies the number of executable files that the package can run concurrently. The value specified must be either a non-negative integer, or -1. A value of -1 means that SSIS will allow a maximum number of concurrently running executables that is equal to the total number of processors on the computer executing the package, plus two."
Hello all, I have an SSIS package that imports data from a DB2 database. I am using SSRS (BIDS/VS2005). I want to be able to show the last time a particular package ran in my report, for example in the page footer. I have found Globals!ExecutionTime for grabbing the time the report was run.
I want to do the same thing, except I want the date/time of the last import into the database. Is there some easy way to grab that from SSIS, or will I need to fall back on the DB creation time? (The database is dropped and recreated, for now anyway, in the SSIS package.)
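What I'm leaning toward (a sketch, with a made-up audit table): have the package record its own finish time, then read it in the report footer:

    -- One-time setup (hypothetical table name)
    CREATE TABLE dbo.EtlAudit
    (
        PackageName  varchar(100) NOT NULL,
        LastRunTime  datetime     NOT NULL
    );

    -- Last step of the SSIS package (Execute SQL Task)
    UPDATE dbo.EtlAudit
    SET    LastRunTime = GETDATE()
    WHERE  PackageName = 'DB2Import';
    IF @@ROWCOUNT = 0
        INSERT dbo.EtlAudit (PackageName, LastRunTime)
        VALUES ('DB2Import', GETDATE());

    -- Report dataset for the footer
    SELECT LastRunTime FROM dbo.EtlAudit WHERE PackageName = 'DB2Import';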
I have a DTS package in SQL Server 2000 that imports some data from a remote server (SQL 2000) to a local server (SQL 2000). This is working fine; it takes at most 1 minute to execute.
Now I have created the same package in SQL Server 2005 (SSIS), where the source server is SQL Server 2000 and the destination server is 2005, and I built the SSIS package on that server with the same logic I had in SQL 2000. But here it takes almost 10 minutes to execute.
Whereas the same package in SQL Server 2000 takes at most 1 minute. Why is this happening? Is there some configuration needed to execute the SSIS package?
I have SSIS projects that take a long time to open when their packages contain a large number of data flows. Is there a way to turn off validation of metadata when a package opens? Or turn off validation during execution on the SSIS service (after it was previously validated in dev)? Or, in general, to control when validation takes place?
In one package (1 of 5) I have 43 data flows (each a single source-to-target mapping) in 4 sequence containers, and it takes approximately 2-3 seconds per source-to-target mapping and sequence container to validate, which translates to 1½ to 2½ minutes to open. When the project with all 100+ tables for the data warehouse goes through validation, I can make coffee in the time it takes to open. I have to delete the *.suo file (or verify all packages are closed in the designer and save the project file), and when I open the project, I have to jump immediately to SSIS -> Work Offline to stop it validating the metadata so I can work in a timely fashion. DelayValidation=TRUE does not help much.
Running in debug mode causes packages that were not open and validated to go through validation, even though I am not running those packages. What I want is to validate once during design and run forever.
Even if I re-open a package that I just closed in the designer, one that had already gone through validation, it goes through the validation process again.
It would be great if there could be an on-demand option off the menu bar to allow one to control when validation can take place for a project, or a more granular validation option for a specific data flow or container.
I'm considering writing an SSIS package for a new project that requires downloading a large chunk of data and transforming it into diverse databases, such as MS SQL or Oracle, depending on the client's database.
For clients that have SQL Server installed at their end, I have no issues deploying this package on their server and running it in their licensed instance.
What should be the case for others that have an Oracle database? Wouldn't installing the SQL 2005 client tools install the necessary run-time services for running SSIS packages? What I understood from the MSDN library (http://msdn2.microsoft.com/en-us/library/ms403355.aspx) is that there is no run-time support for running SSIS packages (unlike DTS run-time support) in a production environment!
Would that mean it requires, at minimum, a SQL Server Standard edition (as Integration Services ships out of the box from Standard Edition onwards) installed at the production site to run this package?
If so, the client won't be willing (which is fair, too) to buy a new license just to run this package. Are there any workarounds/suggestions for this case?
If not, can somebody please point me to the right location where I can download the run-time support for running SSIS packages?
I have a small requirement around SSIS error logging. Presently in my SSIS package I use a File Connection Manager to create a log file. The problem: every time my package executes, the error log messages get appended to my log file at the OS level (say D:\error_messg.log). As a result, the file keeps growing with every execution and is eating my disk space.
My requirement for this error logging mechanism: at any time, the log file should not exceed 20 MB. Alternatively, can we remove log events older than, say, a week or 2 days? Just something to ensure the log file does not eventually fill up the disk.
How can we do this? Any suggestions are greatly appreciated.
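One idea I'm toying with is a small Script Task at the start of the package that archives the file once it passes the cap (a C# sketch; the path and the 20 MB limit are from my example above):

    using System.IO;

    static void RotateLog()
    {
        const string logPath = @"D:\error_messg.log";
        const long maxBytes = 20L * 1024 * 1024; // 20 MB cap

        // Roll the log before the package starts writing to it.
        FileInfo log = new FileInfo(logPath);
        if (log.Exists && log.Length > maxBytes)
        {
            string archive = logPath + ".old";
            if (File.Exists(archive))
                File.Delete(archive);   // keep only one archived copy
            File.Move(logPath, archive);
        }
    }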
The issue is in the data flow for loading and setting the Fact table dimension keys (the dimensions are all loaded fine). After 16 rather pedestrian Lookup Transformations, I have an escalating problem adding additional Lookup transforms to the Data Flow. The problem is not in execution; the problem is adding more transforms in design mode.
    Lookup #   Fields in data flow   Time to validate that lookup
    <17        47                    sub-second
    17         48                    2 sec
    18         49                    4 sec
    19         50                    8 sec
    20         51                    16 sec
    21         52                    32 sec
    22         53                    64 sec
While I'm intrigued by the mathematical progression that is forming here, the issue is that I have at least 6 more Lookups to add. I hope you can see my dilemma.
I have gotten to the point where it takes a little over 4 minutes each to validate a Lookup transform and its associated Derived Column transform and Union All transform (12 minutes total). Not only does this add many idle minutes to each design step, BUT it breaks the debugger, as it pre-validates the ENTIRE data flow before it ever switches into debugging mode.
Some notes:
1. It doesn't matter what order the Lookup transforms occur in; the timings are exactly the same.
2. I tried many data flow execution optimizations, but they don't improve the validation times (or even get a chance to improve the execution times!)
I realize this may be somewhat of a unique problem.
Hey, I have a few jobs that call SSIS packages. If I run an SSIS package directly, it runs fine, but if I run the job that calls the package, it fails. Can someone help me troubleshoot this? None of my jobs that call an SSIS package work; all of them fail.
We have a reasonably large (several TB) database that was recently migrated from 2008 to a new box running 2014. Before giving it back to the users, we forgot to change the compatibility level of the DB to bring it up to date with its new environment.
We want to do some testing with backup compression, so we want to change the compatibility level, but we are unsure whether making the change on such a large database would cause slowness or downtime for our users.
Does the process of changing the compatibility level simply allow options that are not available in the older version or does it make structural changes to the database that would cause the users to notice slowness or downtime?
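For context, the change itself is a single statement (our database name substituted with a placeholder):

    -- As I understand it, this is a metadata-only switch; it does not
    -- rewrite any data pages, though cached plans for the database get
    -- recompiled afterward.
    ALTER DATABASE OurBigDb SET COMPATIBILITY_LEVEL = 120;  -- 120 = SQL Server 2014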
I am using checkpoints in my packages, but I am not able to restart a package from exactly where it failed. The scenario: I have 100 rows at the source system, 95 records were loaded into the target, and due to a data formatting issue the package failed on the 96th record. When I re-execute the package, surprisingly it starts from the 1st record (i.e., the start of the Data Flow task).
How can I make it run from exactly where it failed (the 96th record)? Is that possible using checkpoints, or is there a workaround approach? Please respond to this post; it would be very helpful for me.
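The workaround I'm thinking of trying (a sketch with made-up table names and key): make the source query itself skip rows that have already reached the target, instead of relying on checkpoints:

    -- Reload only the rows that have not yet been loaded into the target.
    SELECT s.*
    FROM   SourceDb.dbo.SrcTable AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   TargetDb.dbo.TgtTable AS t
                       WHERE  t.Id = s.Id);   -- Id is a hypothetical key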
Hello, I have a package that contains a connection manager to a DB2 source. The password is configured within the connection manager. The configuration string is included in the package configurations (SQL Server). The package is saved in VSS, and locally on my hard drive.
When a colleague attempts to open the package in Visual Studio (from VSS), he gets a message similar to the following:
Error 1 Error loading 'Geac_RK502.dtsx': Failed to remove package protection with error 0x8009000B "Key not valid for use in specified state.". This occurs in the CPackage::LoadFromXML method. C:\Documents and Settings\cdunn\My Documents\Visual Studio 2005\Projects\Geac_RK502\Geac_RK502\Geac_RK502.dtsx 1 1
I'm quite sure the package protection level was set to encrypt sensitive data with the user key. After he ran into this problem, I tried again to open the package from my computer (the machine the package was created on) and now I get the same message. If I attempt to open the package anyway, I first get a message that there were errors in the package while it was being loaded and that the package might be corrupt. After that message, I get one stating that the document contains one or more extremely long lines of text and asking whether I still want to open the file. If I click Yes, the package opens read-only with the following message:
Microsoft Visual Studio is unable to load this document. Failed to remove package protection with error 0x80090008. "Key not valid for use in specified state". This occurs in the CPackage::LoadFromXML method.
I'm looking into more information about package protection. What can I do to avoid this problem, and what protection level should I be using so that my colleague can open the package? How can I correct the problem with this particular package, and have the package open?
I am trying to load the previous day's data at 3 AM via an SSIS job.
The Date variable is initialized as DATEADD("dd", -1, GETDATE()) in the For Loop.
Now, since this job runs at 3 AM and I set the variable as GETDATE() - 1, the result set excludes the data from 12 AM to 3 AM, because Date gets set to YYYY-MM-DD 03:00:00.000. I need it to be set to YYYY-MM-DD 00:00:00.000.
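From what I can tell, the fix would be to strip the time portion before subtracting the day, by casting through DT_DBDATE (which holds only a date) and back, in the SSIS expression:

    DATEADD("dd", -1, (DT_DATE)(DT_DBDATE)GETDATE())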
I read other threads about this problem, and one solution was to install SP1 on the workstation. Our server is running 9.00.2047 (SP1) and my workstation SSMS version is 9.00.2047.00, so I'm assuming that's SP1, yet I am still getting this error.
To make matters worse, I have jobs set up to run these packages ON the server, but they aren't working either; I get '...error authenticating proxy...Logon failure: unknown user name or bad password'. I'm trying to use my Windows login because it has sysadmin rights. I created credentials and assigned them to a proxy under SSIS Package Execution; in the SQL Agent log I get the error '...SQLServer Error: 22046, Encryption error using CryptProtectData.'.
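For reference, the proxy setup I did was roughly the following (all names are placeholders):

    -- Credential mapped to the Windows account
    CREATE CREDENTIAL SsisRunCredential
        WITH IDENTITY = N'DOMAIN\MyWindowsLogin', SECRET = N'********';

    -- Proxy for the SSIS Package Execution subsystem
    EXEC msdb.dbo.sp_add_proxy
        @proxy_name      = N'SsisRunProxy',
        @credential_name = N'SsisRunCredential';

    EXEC msdb.dbo.sp_grant_proxy_to_subsystem
        @proxy_name     = N'SsisRunProxy',
        @subsystem_name = N'SSIS';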
I have a package to which I've applied a package-level OnError event handler. The OnError event handler includes a Script Task (that builds up a string of errorCode, errorDescription, MachineName, etc.) and a Web Service Task that calls a web service to send an email including the built-up string from the Script Task in the body. This has worked fine in one package, but for some reason in a second package the package-level OnError event handler seems to be completely ignored. I'm causing various package objects to fail, but the OnError handler never fires. I know the obvious answer is to find what's different between the two packages, but I can't see any difference (in relation to package-level OnError event handling).
Has anyone else come across this? Any suggestions?
I am trying to reference a package-level variable in a script component (in the code) and am unable to do so successfully. I have it listed in ReadOnlyVariables in the custom properties of the script component; however, I am unable to reference it in the code.
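Here is a stripped-down version of what I'm attempting (C# syntax as in SSIS 2008; in 2005 the script is VB.NET but the object model is the same; "MyVar" stands in for my package-level variable listed in ReadOnlyVariables):

    // In a Script Component (data flow), variables named in
    // ReadOnlyVariables surface as typed properties of this.Variables.
    public override void PreExecute()
    {
        base.PreExecute();
        string value = Variables.MyVar;   // MyVar = placeholder variable name
    }

(I gather the Script Task syntax, Dts.Variables["User::MyVar"].Value, does not apply inside a Script Component, but I may be wrong.)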
I have created an Integration Services package that takes a table in a database and transfers it to a flat file. This package has successfully run through Visual Studio 2005 as a .dtsx package and given the output I expected.
However, now I am trying to execute the package (as XML) using C#, and I am receiving this error:
Error in Microsoft.SqlServer.Dts.Runtime.TaskHost/DTS.Pipeline : The product level is insufficient for component "Flat File Destination" (31).
I do not understand how a working package would have this kind of error.
Considering that it runs when I do not use C# code to execute it, I must have SSIS properly installed and the proper versions (or it would never execute). I have SP1 for both SQL Server 2005 and Visual Studio 2005 installed.
Other packages that I have created using C# code also have the same problem.
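For reference, my execution code follows the standard runtime pattern, essentially this (a sketch; the package path is a placeholder):

    // Requires a reference to Microsoft.SqlServer.ManagedDTS.dll
    using Microsoft.SqlServer.Dts.Runtime;

    class PackageRunner
    {
        static void Main()
        {
            Application app = new Application();

            // Load the package from its .dtsx file (placeholder path)
            Package pkg = app.LoadPackage(@"C:\Packages\TableToFlatFile.dtsx", null);

            DTSExecResult result = pkg.Execute();

            // Dump any errors the package raised
            foreach (DtsError err in pkg.Errors)
                System.Console.WriteLine(err.Description);
        }
    }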
There is an Execute Package task in the first package that calls the execution of the second package.
I'm continuously receiving the error: "Failed to decrypt protected XML node "PackagePassword" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available."
When we run the first package via a job, the job runs successfully while logging the above error.
The protection level of the second package is set to "EncryptSensitiveWithUserKey".
I have been struggling trying to read and/or write package level variables from within my custom task. I'd like to be able to get and set values from within the Execute method of my custom task. I have searched this forum and the books online and can't seem to find the answer. I thought maybe I could use an expression on my task (mapping the package variable to a custom task public property) but that doesn't seem to be working for me. I also would have thought I could use the VariableDispenser object from within my task but the collection is empty. I have 3 package level variables configured and can't seem to find a way to access them (with intentions of getting/setting). Could someone point me to a good doc or provide an example that may accomplish this? Thanks!
(I'm using package level variables as a means of passing simple information between tasks that are not using a DB, if there is a better way I'm open to suggestions.)
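In case it clarifies what I'm after, here is the pattern I've been attempting inside Execute ("User::Counter" is a placeholder for one of my three package-level variables):

    using Microsoft.SqlServer.Dts.Runtime;

    public override DTSExecResult Execute(Connections connections,
        VariableDispenser variableDispenser,
        IDTSComponentEvents componentEvents, IDTSLogging log,
        object transaction)
    {
        // The dispenser hands out only variables you explicitly lock;
        // it appears "empty" until LockForRead/LockForWrite is called.
        Variables vars = null;
        variableDispenser.LockForWrite("User::Counter"); // placeholder name
        variableDispenser.GetVariables(ref vars);
        try
        {
            int counter = (int)vars["User::Counter"].Value;
            vars["User::Counter"].Value = counter + 1;
        }
        finally
        {
            vars.Unlock(); // always release the locks
        }
        return DTSExecResult.Success;
    }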
I am developing a table driven ETL system, where I store large PL/SQL queries in a varchar(max) field.  I read the field into an object variable at the package level using a SQL task.  Next I have a C# script task that creates a connection to our Oracle database using the Oracle provider.  I am trying to set the Command.Text property to the PL/SQL statement I've stored in the package level object variable, but I'm having difficulty getting the Pl/SQL text back out of the object.
I've tried Object.Value.ToString(), but it just returns a generic "system__text". How can I proceed?
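What I'm experimenting with now (a sketch; User::PlSqlText is a placeholder for my object variable): with the Execute SQL Task's ResultSet set to "Full result set", the object variable holds an ADO recordset, which a data adapter can pour into a DataTable:

    using System.Data;
    using System.Data.OleDb;

    // Inside the C# script task: unwrap the recordset stored in the
    // package-level object variable.
    DataTable table = new DataTable();
    OleDbDataAdapter adapter = new OleDbDataAdapter();
    adapter.Fill(table, Dts.Variables["User::PlSqlText"].Value);

    // First column of the first row holds the PL/SQL text.
    string plsqlText = table.Rows[0][0].ToString();

(The simpler route, if it works, may be to set the ResultSet to "Single row" and map the column straight into a String variable instead of an Object.)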
I'm running an evaluation edition of SQL Server 2005 Standard. An "Insufficient product level" error is thrown during the validation phase of an OLE DB Command data flow task. Is this task type not licensed in SQL Server 2005 Standard? The component runs a very simple SQL UPDATE statement against a one-row table in SQL Server 2005.
If it works from BIDS, should it not work from dtexec.exe on the same box?
Does dtexec run under the security context of the logged in user?
I am receiving the following error in SQL Server Agent when I try to run an SSIS package: "The task "Create Excel File" cannot run on this edition of Integration Services. It requires a higher level edition." It then goes on to tell me: "The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount." I have tried raising the error count to allow for the "errors", but it still fails. The job succeeds in Visual Studio, but not when scheduled in SQL Server Management Studio. Any suggestions?
I have an SSIS job; one of the last steps it performs is to execute a SQL 2000 DTS package. This has to be done as a SQL 2000 DTS package, as it performs rebuilds of SQL 2000 Analysis Services dimensions and cubes. We've found that when the DTS package fails, the SSIS job happily completes showing success; we would prefer to know it went wrong.
As far as I'm aware, SSIS merely starts the DTS package off and doesn't care about its result. I've looked into turning on logging for the Execute DTS 2000 Package task and thought that ExecuteDTS80PackageTaskTaskResult would give me the answer I need, but it is merely written to the log and is not available to an event handler. It also looks unsafe to put a SQL task next in line to inspect the SQL 2000 system tables for the DTS package's log, as the SSIS documentation warns that the DTS package can continue to run after the Execute DTS 2000 Package task has ended.
Ideally I want any error raised within the DTS package to cascade up to be an error in the SSIS job, I can then handle it appropriately. I cannot find a way to do this. Is there a way?
If not, can anyone suggest how, in the remainder of the SSIS tasks, I can be sure the DTS package has completed before I start any other tasks that check the SQL 2000 log of its execution?
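One idea I'm evaluating: run the DTS package through dtsrun in an Execute Process Task instead, since dtsrun returns a non-zero exit code on failure, and the Execute Process Task can be configured to fail on that (FailTaskIfReturnCodeIsNotSuccessValue = True, SuccessValue = 0). The command line would be roughly (server and package names are placeholders):

    dtsrun /S MyServer /E /N "RebuildOlapCubes"

I have not yet confirmed whether dtsrun waits for the package to finish in all cases, though.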