Hi, we need an SSIS package that performs routine purging of the growing data in some of the tables used to support notification services. While running SQL in a regular job would seem to suffice, an SSIS package would be more in line with the other processes for the alerts in terms of manageability. The package should follow these guidelines: it deletes records from a given number of tables, based on a specified date column and a specified number of days for each table, or other conditions.
SystemAlertQueue: 30 days old, based on the SubmitTimestamp column.
SystemAlertChron: 30 days old, based on the EventTimestamp column.
SystemAlertNotifChron: 30 days old, based on the NotifTimestamp column.
WMICheck and related tables (based on WMICheckID; see the WMIAlerts database diagram in SQL Server): 15 days old, based on the BeginCheckDate column.
Each table's deletion routine should be distinguishable in the package. Does anybody know how to do this? Please help me. Regards, Deepu M.I
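A rough T-SQL sketch of the purge steps (the dbo schema is an assumption, and any WMICheck child tables would need their own DELETE statements, joined on WMICheckID, before the parent delete runs); putting each statement in its own Execute SQL Task keeps every table's routine distinguishable in the package:

DELETE FROM dbo.SystemAlertQueue
WHERE SubmitTimestamp < DATEADD(day, -30, GETDATE());

DELETE FROM dbo.SystemAlertChron
WHERE EventTimestamp < DATEADD(day, -30, GETDATE());

DELETE FROM dbo.SystemAlertNotifChron
WHERE NotifTimestamp < DATEADD(day, -30, GETDATE());

-- delete rows from the WMICheck child tables first (joined on WMICheckID,
-- per the WMIAlerts database diagram), then the parent rows
DELETE FROM dbo.WMICheck
WHERE BeginCheckDate < DATEADD(day, -15, GETDATE());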
I have an SSIS package that moves a file from one folder (Download) to another folder (Working), where it will be processed and passed to a (Processed) folder. The (Working) folder is created at run time and deleted after the process finishes. I run this package using SQL Server Agent (I created a SQL job). My problem is that the package fails to move the file from Download to Working, although it can move it to other folders (say I skip Working and move it directly to the already-created folder "Processed").
I traced the problem and found the error "Access is denied" when running the package without the Agent (by double-clicking it). I granted the necessary permissions on all levels of folders to the user XX, which I made the SQL Server Agent service account as well as the job owner. With this, the package executes successfully (again by double-clicking it), but with the Agent it FAILS.
Why can't the Agent move the file to the run-time-created folder (Working)?
We have found that it is common for Visual Studio 2005 to crash when editing or running SSIS packages -- from CTP versions through beta versions and including the release version.
Of course we kept hoping that newer releases would become more stable, or at least more robust -- and now I'm hoping there will be a service pack, which might make it more robust?
Has anyone built a T-SQL script that performs the same function as a DTS import package? I have done everything needed to create the table for the data to go into; I just need to know of any procedures to create the connection to the Access DB to import from.
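A hedged sketch of doing the import in pure T-SQL with OPENROWSET and the Jet provider; the file path, table, and column names are placeholders, and on SQL Server 2005 the 'Ad Hoc Distributed Queries' option has to be enabled with sp_configure first:

-- pull rows straight from the Access file into the prepared table
INSERT INTO dbo.MyImportTable (Col1, Col2)
SELECT Col1, Col2
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\Data\Source.mdb'; 'Admin'; '',
                SourceTable);

A linked server pointing at the .mdb file works the same way if the import has to run regularly.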
Hi, did anyone successfully set up a local package to first FTP a db.bak and second perform an automated DB restore? I need to perform an automated task which FTPs the nightly backup file to another server and then restores it onto a database, leaving the database in read-only mode for additional transaction log restores during the day. Can someone help and provide the procedure for how to do that? Thanks in advance.
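For the restore half, a minimal sketch assuming the .bak has already been FTP'd to the target server (the database name and paths are placeholders); STANDBY leaves the database read-only yet still able to accept log restores during the day:

-- restore the nightly full backup, keeping the database in standby
RESTORE DATABASE ReportDb
FROM DISK = N'D:\Backups\db.bak'
WITH REPLACE,
     STANDBY = N'D:\Backups\ReportDb_undo.dat';

-- during the day, apply each transaction log backup the same way
RESTORE LOG ReportDb
FROM DISK = N'D:\Backups\db_log_0900.trn'
WITH STANDBY = N'D:\Backups\ReportDb_undo.dat';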
As part of my SSIS package, a list of sites is created that need to be created on a remote machine, let's say 1000 sites. I need to pass this list to a web service so that the web service sitting on that machine creates these sites for me. My SSIS package does not run frequently, so I can sacrifice a little time to get better functionality.
I need to move the sites that are not created (for any reason) by the web service to one table and the successfully created sites to another table, so I need to get a confirmation for each site from the web service.
Which option is better?
1) Calling the web service for every single record (site), getting the confirmation, and then moving the records accordingly based on the confirmation. I know this might be very time-consuming, but as I said, my SSIS package might only run every six months.
OR
2) Sending records to the web service in a batch and getting the result. I don't know how to do this, though.
I am new to SSIS. I am interested in using SSIS to import an Excel spreadsheet into a SQL Server database. My biggest concern is how to handle/manage errors that might occur during the import process. Can anyone give me any guidance on this? I could write some C# code to do the import and to create a custom .txt file listing the errors that occur on import, but using C# code to do the import seems like reinventing the wheel, so to speak.
I have a scenario in which I am getting a row from an XLS file, and that row contains data for multiple tables; what happens depends on whether each table already has that data or not.
EXAMPLE: Search a table in the database on the basis of a column in the XLS row.
STEP #1: If TABLE A has that row then do nothing and move to the next step, else insert a new row into TABLE A and move to the next step.
STEP #2: If TABLE B has that row then do nothing and move to the next step, else insert a new row into TABLE B and move to the next step.
STEP #3: If TABLE C has that row then do nothing and move to the next step, else insert a new row into TABLE C and move to the next step.
STEP #4: If TABLE D has that row then do nothing and move to the next step, else insert a new row into TABLE D and move to the next step.
STEP #5: If TABLE E has that row then do nothing, else insert a new row into TABLE E.
Currently I am trying to achieve this by using a Multicast and feeding its outputs into five OLE DB Destinations, but sometimes one step executes earlier than the previous one, the integrity constraint is violated, and I get an error.
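One way to avoid the ordering problem is to drop the Multicast and run the checks in dependency order from a single OLE DB Command (or an Execute SQL Task per row); a rough sketch with placeholder table, column, and parameter names (an OLE DB Command would use ? parameters rather than the variables shown):

IF NOT EXISTS (SELECT 1 FROM dbo.TableA WHERE BusinessKey = @BusinessKey)
    INSERT INTO dbo.TableA (BusinessKey, ColA) VALUES (@BusinessKey, @ColA);

IF NOT EXISTS (SELECT 1 FROM dbo.TableB WHERE BusinessKey = @BusinessKey)
    INSERT INTO dbo.TableB (BusinessKey, ColB) VALUES (@BusinessKey, @ColB);

-- ...repeat in the same order for TableC, TableD, and TableE so every
-- foreign key is satisfied before the dependent insert runs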
Hey, I have a few jobs which call SSIS packages. If I run an SSIS package by itself, it runs fine, but if I try to run the job which calls this package, it fails. Can someone help me troubleshoot this issue? None of my jobs that call an SSIS package work; all of them fail.
And there is an Execute Package task in the first package that calls the execution of the second package.
I am continuously receiving the error "Failed to decrypt protected XML node "PackagePassword" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available."
When we run the first package through the job, the job runs successfully but logs the above error.
The protection level of the second package is set to "EncryptSensitiveWithUserKey".
Hi! I have a job that creates tranlog backups every 15 minutes and makes a full database backup at midnight. I want to purge the old log backups after the full backup; how can I do that?
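One hedged option is to add a step after the midnight full backup that calls xp_delete_file, the undocumented procedure the Maintenance Cleanup task uses; the folder path is a placeholder and the 24-hour cutoff is just an example:

DECLARE @cutoff datetime;
SET @cutoff = DATEADD(hour, -24, GETDATE());   -- keep the last 24 hours of log backups

-- arguments: 0 = backup files, folder, extension (no dot), cutoff date, 0 = no subfolders
EXEC master.dbo.xp_delete_file 0, N'D:\Backups\TranLogs\', N'trn', @cutoff, 0;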
I have a question on shrinking tempdb. Currently I am using MS SQL Server 2005. I have software that uses a SQL database, but I created a separate database to store my data. tempdb is not used directly at all.
Problem: whenever I export data into the database I created on the MS SQL server, tempdb also grows. I noticed that the files grew so large that they crashed the server. I have 50 GB of free space on the drive where MS SQL Server is installed.
Question: is there any solution to shrink tempdb or to freeze the growth of its size?
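A minimal sketch for shrinking tempdb in place; the logical file names are usually tempdev and templog, but check sys.database_files first, and the target sizes in MB are only examples. Restarting the SQL Server service also resets tempdb to its configured initial size:

USE tempdb;
GO
SELECT name, size / 128 AS size_mb FROM sys.database_files;  -- confirm the logical names
DBCC SHRINKFILE (tempdev, 1024);   -- shrink the data file to about 1 GB
DBCC SHRINKFILE (templog, 512);    -- shrink the log file to about 512 MB
GO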
On one of our DR servers we have configured custom log shipping. In the folder where the .TRN files are copied, there is a script to purge the files which are older than one day. Following is the code for it.
@Echo Off
if exist filelist del filelist
date /t > rundate
for /f "tokens=2* delims= " %%i in (rundate) do set rundate=%%i
for %%i IN (*.trn) do echo >> filelist %%i %%~ti
for /f "tokens=1,2* delims= " %%i in (filelist) do if not %rundate% equ %%j del %%i
:pause
:Exit
Instead of removing the files older than 1 day, I need to keep 3 days of transaction logs.
Being a novice, I don't have much idea how to accomplish it. Can anybody help me with this?
My transmission queue has lots of messages that will never, ever be delivered, because the transmission_status is "The session keys for this conversation could not be created or accessed. The database master key is required for this operation."
How can I purge the transmission queue to get rid of this junk?
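Assuming those conversations really are dead, a sketch that ends every conversation still sitting in sys.transmission_queue; WITH CLEANUP discards the messages without notifying the other endpoint, so it is only for conversations you have given up on. Creating or restoring the missing database master key would address the root cause instead:

DECLARE @handle uniqueidentifier;

DECLARE conv CURSOR FAST_FORWARD FOR
    SELECT DISTINCT conversation_handle
    FROM sys.transmission_queue;

OPEN conv;
FETCH NEXT FROM conv INTO @handle;
WHILE @@FETCH_STATUS = 0
BEGIN
    END CONVERSATION @handle WITH CLEANUP;   -- drop the stuck messages for this conversation
    FETCH NEXT FROM conv INTO @handle;
END
CLOSE conv;
DEALLOCATE conv;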
I have an SSIS job; one of the last steps it performs is to execute a SQL 2000 DTS package. This has to be done as a SQL 2000 DTS package because it performs rebuilds of SQL 2000 Analysis Services dimensions and cubes. We've found that when the DTS package fails, the SSIS job happily completes and shows success; we would prefer to know it went wrong.
As far as I'm aware, SSIS merely starts the DTS package off and doesn't care about its result. I've taken a look into turning on logging for the Execute DTS 2000 Package task and thought that ExecuteDTS80PackageTaskTaskResult would give me the answer I need, but it is merely written to the log and is not available to an event handler. It also looks like it is not safe to put a SQL task in as the next item to go and look at the SQL 2000 system tables for the DTS package's log, as the SSIS documentation warns that the DTS package can continue to run after the Execute DTS 2000 Package task has ended.
Ideally I want any error raised within the DTS package to cascade up to be an error in the SSIS job, which I can then handle appropriately. I cannot find a way to do this. Is there a way?
If not, can anyone suggest how, in the remainder of the SSIS tasks, I can be sure that the DTS package has completed before I start any other tasks that check the SQL 2000 log of its execution?
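One workaround sketch: after the Execute DTS 2000 Package task, add an Execute SQL Task against the SQL 2000 instance that polls msdb for the latest log row of that package and raises an error (or keeps waiting) based on it. The package name below is hypothetical, and logging has to be enabled inside the DTS package itself for rows to appear:

SELECT TOP 1 name, starttime, endtime, errorcode, errordescription
FROM msdb.dbo.sysdtspackagelog
WHERE name = N'MyAnalysisServicesRebuild'   -- hypothetical DTS package name
ORDER BY starttime DESC;
-- endtime IS NULL  -> the DTS package is still running, so wait and poll again
-- errorcode <> 0   -> the run failed, so raise an error to fail the SSIS job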
I have developed an SSIS package for ETL purposes. I am invoking the SSIS package through a .NET console application by referencing the ManagedDTS assembly. I am able to execute the package on SQL Server 2005 Developer Edition and it runs fine through to completion.
But when I try to execute the package on SQL Server 2005 Standard Edition, by invoking it through the .NET console application, the status of the package is failure.
Can anyone help me with how to overcome this problem?
I am in the process of moving from a 32-bit SQL Server 2005 Enterprise (9.0.3054) to a 64-bit SQL Server 2005 Enterprise (9.0.3054, with 4 CPUs and 8 GB of memory on Windows 2003 SP2), and the process has been very frustrating to say the least. I am having a problem with packages that I created on my 64-bit SQL Server. I am importing a few tables from the 32-bit SQL Server into the 64-bit SQL Server using Tasks --> Import Data to create the package.
Sometimes when I am creating a package I get the following error in a message box:
SQL Server Import and Export Wizard
The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered. The wizard cannot continue and it will terminate.
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (System.Windows.Forms)
Other times when I run a package that has run successfully before I get the following error:
Faulting application dtexecui.exe, version 9.0.3042.0, stamp 45cd726d, faulting module unknown, version 0.0.0.0, stamp 00000000, debug? 0, fault address 0x025d23f0.
The package appears to hang when running. By this I mean that the Package Execution Progress shows progress up to a point then it just stops. (The package takes about 17 seconds to run normally) CPU usage is at 1% and the package cannot be stopped.
I have deleted and re-created the package several times and I have also re-installed the service pack on the SQL Server (9.0.3054) but that did not help.
I would like to standardize SSIS development so that developers all start with the same basic template. I have set it up so it is an available template ( http://support.microsoft.com/kb/908018 ) but I would like it to be the default when a new project or package is created. Is this an option?
I would like to fetch the data flow component name while the package is executing, but the system variable [System::SourceName] only fetches the names of the control flow tasks. Is there a way to capture the data flow component names?
The master package has a configuration file specifying the connect strings. The master package passes these connect strings to the child packages in a variable. Both the master package and the child packages have connection managers set up to use localhost. This is done deliberately so that we are able to test the packages on individual development PCs. We do not want to change anything inside the packages when deploying to test, and from test to production. All differences will be in the config files (which are pretty fixed; they very seldom change). That way we can be sure that we can deploy to production without any changes at all.
The package is run from the file system, through a job-schedule.
We experience the following when running on a non-default SQL Server instance (called dkms5253uedw):
Case 1: The master package starts by executing three SQL scripts (drop foreign keys, truncate tables, create foreign keys). This works fine.
The master package then executes the first child package. In the sysdtslog we then get:
Error - "cannot connect to database xxx" Info - "package is preparing to get connection string from parent ..."
The child package then executes OK, does all its work, and finishes. Because there has been an error, the master package then stops with an error.
Case 2: When we run exactly the same thing, but with the connection strings in the config file pointing to the default instance (dkms5253), everything works fine.
Case 3: When we run exactly the same thing, again against the dkms5253uedw instance, but now with the exact same databases defined in the default instance, it also works perfectly.
Case 4: When we then stop the SQL Server on the default instance, the package fails again, this time with
Error - "timeout when connect to database xxx" Info - "package is preparing to get connection string from parent ..."
And then it continues as in the first case.
From all this we conclude, that the child package tries to connect to the database before it knows the connection string it gets passed in the variable from the master package. It therefore tries to connect to the default instance, and this only works if the default instance is running and has the same databases defined. As far as we can see, the child package does no work against the default instance (no logging etc.).
We have tried delayed validation in the packages and in the connection managers, but with the same results (error).
So we are desperately hoping that someone can help us solve this problem.
I am interested in passing a value from a child package variable to the parent package that calls it in SSIS.
I am able to call the Child package using the execute package task and use Configurations to pass values from the parent variable to the child, but I am not able to pass the value from the child to the parent.
I have a variable called datasetId in both the parent and the child. It gets computed in the child and needs to be passed to the parent...
A deployed report that has an SSIS package as its source does not work when indirect package configuration is used in the ETL package. It seems that the ETL package, when called/executed from Report Manager, does not recognize the environment variable used to pick up the dtsConfig file.
The report works when direct package configuration is used with the same dtsConfig file.
What could be the reason? Is there any solution for this? This will make our build/deployment to QA and Prod very difficult.
I now have two SSIS packages, "TESTING" and "LOADING". The "TESTING" package has an Execute Package task that calls the "LOADING" package. When I want to execute the TESTING package, how can I set up the connection string so that I can edit the password of the database connected to by the "LOADING" package?
When connecting to a SQL Server (v7 in this case), the log file shows that old and deleted databases are still being opened as part of the process. Further, these DB names are now reserved and cannot be reused, even though I've deleted them.
Is there any way I can purge all traces of these deleted db names?
You might have guessed, I'm a newbie, which is why there are a few dozen deleted DBs.
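As a quick sanity check (hedged, SQL Server 7 era syntax), you can list what the server still thinks exists and drop anything left over; the database name below is hypothetical:

-- list every database the server still knows about
SELECT name FROM master.dbo.sysdatabases ORDER BY name;

-- if an old database is still listed, drop it properly rather than deleting its files
DROP DATABASE OldTestDb;   -- hypothetical leftover database name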
I'm using SQL 2000 with SP4 (build 2187). I have created a DB maintenance plan with the wizard, where purging of anything older than 1 day is set.
However, this feature does not seem to be working, even though I tried two things: 1) deleting the scheduled job and recreating it - not successful; 2) deleting the maintenance plan - still not successful.
How do I purge data off of an MSDE database? I only want to keep 6 months of data in the database. Right now I have data going back to 2004, and I get errors about every 10 seconds; "Primary File Group is Full" is the error I am getting.
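A hedged purge sketch with placeholder table and column names; deleting in batches keeps the transaction log small, and note that MSDE caps each database at 2 GB, which is usually what the "Primary File Group is Full" error is really telling you:

SET ROWCOUNT 5000;   -- MSDE/SQL 2000 style batching; on 2005+ use DELETE TOP (5000)
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.EventHistory          -- hypothetical table name
    WHERE LogDate < DATEADD(month, -6, GETDATE());
    IF @@ROWCOUNT = 0 BREAK;
END
SET ROWCOUNT 0;

-- reclaim the freed space afterwards if needed
DBCC SHRINKDATABASE (MyMsdeDb);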
Regarding SQL Server data, I am looking to implement the best data-archive and purge policy. Normally we do SQL backups and keep the history for some period, for example 8 weeks, so we can go back and restore to any point in time up to 8 months in the past, and we also do tape backups.
The question is: where can I get a good article or documentation on how to best design such a policy, so that I am covered for point-in-time recovery of the database (which is the SQL backups) and for point-in-time recovery in the far past, say 3 years ago, using the tape backups, while making sure that I don't duplicate the same efforts?
I have two SSIS packages in a project, one calling the other. The parent package works fine on my local machine. After they were deployed to production, I scheduled jobs to run the packages on the SQL Server. The child package works fine if I run it alone, but the parent package cannot find its child package if I run the parent package. As far as I checked, all the XML config files and the connection string pointing to the child package were set correctly. It seems the parent package did not use the XML config file. Can someone help me? Thanks in advance.
I have successfully created an SSIS package which executes a DTS 2000 package, and there is no problem executing the task. But I failed to schedule this package, and I was not successful in setting up the logging. When running the package on the command line:
dtexec file "C:Documents and SettingslyangMy DocumentsVisual Studio 2005ProjectsTraingDTSTraingDTSDTSTraining.dtsx"
Error: 2008-03-24 08:03:24.36 Code: 0xC0012024 Source: Execute DTS 2000 Package Task Description: The task "Execute DTS 2000 Package Task" cannot run on this edition of Integration Services. It requires a higher level edition. End Error
Warning: 2008-03-24 08:03:24.38 Code: 0x80019002 Source: DTSTraining Description: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors. End Warning
DTExec: The package execution returned DTSER_FAILURE (1).
I am making use of the DtUtil tool to deploy my package to SQL Server. Following is my configuration: 32-bit machine and 32-bit named instance of Yukon.
I have some package variables which need to be set in the code.
Previously I did it as follows:
Set the package variables in the code. For example:
Here the package was successfully deployed, and when I open those packages using BIDS, I am able to see that the variables are set to the values as done in the code.
Because of one problem I am not using the SaveToSQLServer method, so I switched to the DTUtil tool. Now I am doing it this way:
Set the package variables as before. Deploy the package to SQL Server using DTUtil tool.
Now here is the problem: the package is successfully deployed, but the variables are not set to the values that I specified in the code.
I also tried the DTExec utility to set the package variable; even that doesn't work. Can anyone help me out? Is there any alternate method to set package variables?