Porting an existing SQL 2k DTS job over to a SQL 2k5 SQL Server running SSIS.
Background:
The job loads data into an empty work table and performs some work before clearing out the work table.
This job runs every minute.
Question:
If the job happens to take longer than a minute, does SSIS create a second instance of the job?
Or perhaps it does what DTS did and reschedules the job for the next iteration?
Concern:
I need to know because there would be key constraint violations if another instance of the job started before the working table was cleared out.
If you have a data file, and you only want CERTAIN rows to pass to the destination (i.e. a table),
and you are using a script task to accomplish this,
is this a synchronous or asynchronous transformation?
And how do you assign the values to the output? Do you have to create output columns, or not?
I am very confused right now. I can't seem to find a decent answer to what is a very basic question, either in my SSIS book or in the documentation. Perhaps it is so basic that the question doesn't seem valid? I don't know. But I just don't understand this at all.
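For what it's worth, here is a minimal sketch of the synchronous approach; the output name "Output 0", the filter condition, and the column name Amount are made up for illustration. The idea is to set the output's ExclusionGroup property to a non-zero value (leaving it synchronous with the input), so that only the rows you explicitly direct to that output pass through:

    Public Class ScriptMain
        Inherits UserComponent

        Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
            ' Because the output has a non-zero ExclusionGroup, a row only
            ' reaches the destination if we direct it there; the rest are dropped.
            If Not Row.Amount_IsNull AndAlso Row.Amount > 0 Then
                Row.DirectRowToOutput0()
            End If
        End Sub

    End Class

Because the output stays synchronous, the input columns flow through to it automatically; you only add output columns for genuinely new values you want to compute.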
I'm creating a script component that reads from an OLEDB source and writes to an OLEDB destination.
For every input row, I need to output several rows. I tried using the Row.DirectRowToOutput0() method inside a loop in the Input0_ProcessInputRow routine, but that's not working. Should I be using AddRow() instead? If I use AddRow(), does this mean it needs to be an asynchronous transformation?
I remember seeing a blog entry (Jamie's?) that did almost exactly what I wanted, but I can't find it now.
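In case it helps, the usual pattern for one-row-in, several-rows-out is indeed an asynchronous output: set the output's SynchronousInputID to None and call Output0Buffer.AddRow() once per row you want to emit. DirectRowToOutput0() only routes the single current row of a synchronous output, which is why calling it in a loop achieves nothing. A rough sketch, with made-up columns CustomerKey (input and output) and SequenceNo (output):

    Public Class ScriptMain
        Inherits UserComponent

        Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
            ' Emit three output rows for every input row.
            For i As Integer = 1 To 3
                Output0Buffer.AddRow()
                Output0Buffer.CustomerKey = Row.CustomerKey
                Output0Buffer.SequenceNo = i
            Next
        End Sub

    End Class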
We are seeing a high number of hadr_sync_wait wait types on our server during peak times after setting up an AlwaysOn Availability Group. We have set the availability mode to synchronous commit and the failover mode to automatic. Can we change these settings to asynchronous commit and manual failover whenever we need to, and change them back to synchronous commit during off-peak times? Are there any drawbacks to these changes?
Can someone explain how the PR test harness is supposed to function? When I run it, with and without debugging and by using dtexec, other processes can start and stop before the awaited response from the console (the date and number of days to process). If you just look at the test harness and the LoadGroupFullDaily, the increment date and SQL audits will finish prematurely, before the console data is even entered. Is there a property that I am missing?
I'd like to know if there's a way to control the execution of ETL packages such that: different packages, or at least packages that don't access the same table or database, run asynchronously with respect to each other (e.g., two different packages run at the same time); and if a package is called for execution more than once by different requests, they are forced to run synchronously, one after the other. If this is possible, what resources would it require? Is this possible on, say, a dual- or quad-processor machine?
MSDN states the following on: Readable Secondary Replicas (AlwaysOn Availability Groups) for SQL Server 2014:
Limitations and Restrictions:
Change tracking and change data capture are not supported on secondary databases that belong to a readable secondary replica:
Change tracking is explicitly disabled on secondary databases. Change data capture can be enabled on a secondary database, but this is not supported.
This confuses me: you cannot track the changes, yet you can enable CDC?
The scenario I am trying to achieve is to use SSIS CDC components on an asynchronous secondary replica. Is this possible? If not what would be other viable approaches?
Currently I have two SSIS jobs on my machine. The problem I'm having is that only one of the jobs executes successfully; the other one fails with an incorrect user login. Both jobs use the same configuration database, and all the packages in both jobs have the protection level set to "DontSaveSensitive". Both jobs have been deployed in exactly the same manner, yet only one succeeds and the other fails.
Hi, I need to transfer the data in table A on a 2005 instance to table B, which has the same structure as table A, on a 2000 instance. There are 200,000 records in table A. If I use <insert B select * from linkedserver.....>, it takes only 30 seconds. I created an SSIS package to do this, but it is very slow; after it ran for 10 minutes I had to stop it, and I found that it transfers about 100 records every second. Then I swapped the source server and destination server, that is, transferring the same data from the 2000 instance to the 2005 instance. It took only 50 seconds. Why? How can I make the package that transfers data from the 2005 instance to the 2000 instance run fast?
My team is using SQL Server 2005 SSIS to develop a package for Consolidated Views. This works fine on their machines, both in Visual Studio and when run as a script from a command prompt on their desktops. However, when they move this package to the ETL box, the package runs fine in the Visual Studio environment but fails when run from the script with the errors:
Description: Unable to load the package as XML because of package does not have a valid XML format. A specific XML parser error will be posted. End Error
Description: Failed to open package file "C:\SM_BI_EXTRACTS_SHARED\Shared\Integrated Views\Integrated Views\CV_Product.dtsx" due to error 0x800C0006 "The system cannot locate the object specified.". This happens when loading a package and the file cannot be opened or loaded correctly into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format. End Error
-- Could not load package "C:\SM_BI_EXTRACTS_SHARED\Shared\Integrated Views\Integrated Views\CV_Product.dtsx" because of error 0xC0011002.
Description: Failed to open package file "C:\SM_BI_EXTRACTS_SHARED\Shared\Integrated Views\Integrated Views\CV_Product.dtsx" due to error 0x800C0006 "The system cannot locate the object specified.". This happens when loading a package and the file cannot be opened or loaded correctly into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format. Source: {519D338B-8874-40A8-A385-0D53E868A57C} Started: 12:57:45 PM Finished: 12:57:45 PM Elapsed: 0.016 seconds
Can someone help us understand the problem? We are new to SSIS. We also want to know exactly what needs to be on the Server that will be running our SSIS ETL programs (that is, what components of SSIS -- SQL Server 2K5 is already loaded).
We are using SQL 2005 on Dell Servers running Win 2003, with the latest service packs. Microsoft SQL Server Integration Services Designer Version 9.00.2047.00
Hey, I have a few jobs which call SSIS packages. If I run an SSIS package directly, it runs fine, but if I try to run the job which calls that package, it fails. Can someone help me troubleshoot this issue? None of my jobs that call an SSIS package work; all of them fail.
I have an SSIS package created from a SQL 2000 DTS package using the Migration Wizard. The package imports data from a MySQL database to a SQL 2005 64-bit database running on 64-bit Windows Server 2003. The package runs fine when executed from SQL Server Management Studio, but when I schedule it as a job it fails with:
Executed as user: [the domain admin account]. The package execution failed. The step failed.
I've tried a lot of different ways to make this work including creating a new SSIS package. Again, the package ran fine except when it was scheduled as a job.
I am using SQL 2005. I have created an SSIS package that basically executes another SSIS package (as part of a larger package). It runs fine in SSBIDS but will not run if I save it and schedule it using SQL Agent. I should mention I am using a domain/admin account with SQL Agent, so I don't think that is the problem.
When I execute the job in SSBIDS, the Execute Package Utility window pops up, at which point I click on the Execute button, the job runs successfully, and then I click on the Close button.
I suspect it is not running via SQL Agent because of the user intervention required to complete the task (i.e. clicking on Execute as described above). Is this correct? If so, is there a way to override the requirement for any user intervention? Or could it be something else?
I've worked for a while to get an SSIS package that connects to Oracle to successfully run via DTExecUI from a 64-bit SQL Server. I completely understand that DTExecUI runs in 32-bit mode, but I need to initiate the package from a stored proc, so I need to be able to utilize the 32-bit DTExec from either an xp_cmdshell call or an sp_start_job call (thanks to MatthewRouche for the suggestion). Both Ora92 and 10g were installed on the SQL Server.
Here's what I've tried unsuccessfully:
1. SQL Job - Just calling the package from an SSIS step. I wouldn't expect this to run because it is the 64-bit DTExec. 2. SQL Job - A job that calls the 32-bit version of dtexec from CmdExec. Here are two examples I have tried; the first is because I came across several postings saying that there is a problem with the "(x86)" in the path for the Oracle driver.
\\server_name\E$\ProgramFilesLegacy\MicrosoftSQLServerX86\90\DTS\Binn\DTExec.exe /DTS "PackageName" /SERVER "SERVER_NAME" /SET \Package.Variables[PLT].Value;"8H" /CONFIGFILE "\\SERVER\ConfigFile.dtsConfig" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING EW
\\server_name\E$\Program Files\Microsoft SQL Server (x86)\90\DTS\Binn\DTExec.exe /DTS "PackageName" /SERVER "SERVER_NAME" /SET \Package.Variables[PLT].Value;"8H" /CONFIGFILE "\\SERVER\ConfigFile.dtsConfig" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING EW
Both of these give me the error below:
System.Exception: System.Data.OracleClient requires Oracle client software version 8.1.7 or greater.
I'm assuming that it is running in 32 bit mode because this was at the beginning of the job history log:
Executed as user: "XXX". ....3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved
3. I then tried to call the package from a batch file on a file server that had Integration Services installed. I got the same error as above.
I'm obviously missing something if I can get it to work in DTExecUI but not DTExec 32-bit.
Any ideas? Below are some postings I have already visited several times:
Windows 64 bit and Oracle http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=358365&SiteID=1
Connecting to Oracle on 64-bit (x64) machine http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=159581&SiteID=1
One paragraph that stands out on this posting is:
Find the location of the application that is generating the error. Check the path to this location and see if it contains any parentheses. If so, you must relocate the application to a directory without any parentheses in the path. This didn't work for me, unless I was doing it wrong.
Error retrieving numeric data from Oracle 10g on 64 bit Itanium SSIS server http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1183296&SiteID=1
Connection Problems to Oracle Database http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1307460&SiteID=1
If this posting were complete, it looks like it would be exactly what I was looking for.
When I run a package I created in the development studio it runs fine, but if I create a job and run it I get the error "The AcquireConnection method call to the connection manager "ODS" failed with error code 0xC0202009".
I have the package set up to use an XML config file, and this works fine for all the other packages, but this one will not work.
A colleague of mine is having some issues. I hope someone can help.
Hi all,
This is my first call for technical assistance so go easy on me.
I'm having a problem in one of my SSIS packages. In brief, the process involves copying the rows from an untyped table to a typed table. There are about 45,000 rows in this table, and during the copy ("OLE DB source" to "OLE DB destination") the process appears to hang for about 5 minutes, eventually crashing with "The buffer manager could not get a temporary file name. The call to GetTempFileName failed."
After several attempts, using all the trace info I could muster, this is the order of events, with some interesting numbers and facts:
1. The SSIS package goes swiftly through "Validation", "Prepare for Execution" and "Execution" in less than 1 second, with "Execute phase is beginning" being the last message on the "Progress" tab.
2. Using performance counters, I note that over the next 5 minutes the value for "BLOB bytes read" slowly rises, and then after a couple of minutes so does "BLOB files in use", the latter reaching a figure of 65534.
3. When this figure is reached, SSIS starts creating thousands of zero-size files with the name DTS####.tmp (where #### is hex, e.g. DTSB4C1.TMP) in the TEMP folder (C:\Documents and Settings\<username>\Local Settings\Temp in my case).
4. When I started running this package there were 130 files in my TEMP folder; as soon as the combined total of files in TEMP reaches 65664 (i.e. 65534 + 130), SSIS starts producing the errors list, which includes the one I listed above, and eventually it clears the TEMP folder down to the original 130 files.
5. My conclusion (thus far) is that SSIS creates all these thousands of tmp files but in my case hits some kind of maximum (either a folder limit, or it runs out of the 65,536 possible four-digit hex combinations for the file names) and then crashes.
6. The only thread I found on the internet suggested setting up an environment variable "BLOBTempStoragePath" and assigning a value of "C:\Temp1;C:\Temp2;C:\Temp3;C:\Temp4" so that SSIS can span a number of "temporary" folders instead of the one default folder contained in the "TEMP" environment variable.
7. Setting the above environment variable in Windows 2000 did not work for me (I tried it as both a user variable and a system environment variable). So those are the facts so far; ANY assistance will be hugely appreciated. I have no idea why all these temporary files are being generated. I have created SSIS packages handling data sets 10 times bigger than this one without these problems, so I don't think it's size related.
I have a very complex stored procedure called by a job that is scheduled to run every night. Its execution sometimes takes 1 or 2 hours and sometimes 7 hours or more.
So, if it has been running for more than 4 hours I stop the job and run the procedure from a query window, and it never takes more than 2 hours.
Can anyone help me identify the problem? I want to run it from the job and not have to worry about it.
Some more information: - It is SQL 2000 Enterprise with SP4 in a cluster (it happens the same way on any node). - The SQL Server and SQL Agent services run using a domain account that has full administrative access. - When I connect to a query window I also use a Windows account.
- There are no locks, and no processes blocking or being blocked, while the job is running. - Using Task Manager, the processor activity is OK, no more than 30% on any processor.
I have created an SSIS package which, as its second step, picks up a .dbf table and does a Data Flow Task to put the .dbf table into a SQL Server 2005 table (which was just truncated in step 1).
The connection manager that works on the .dbf table is Native OLE DB\Microsoft Jet 4.0 OLE DB Provider.
The entire SSIS package runs perfectly when I run it from my development environment.
I then assigned this Package to a SQL Server 2005 Job. The entire package is the sole step of the job.
But when I run the SQL Job, the Job fails immediately on Step 2.
Here is the error message in View History of the SQL Job:
Executed as user: DBASE01\SQLExec. ...on 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 11:48:37 AM Progress: 2007-06-14 11:48:37.91 Source: DBF to ScanUS_Process Validating: 0% complete End Progress Progress: 2007-06-14 11:48:37.91 Source: DBF to ScanUS_Process Validating: 33% complete End Progress Error: 2007-06-14 11:48:37.91 Code: 0xC0202009 Source: DBF to ScanUS_Process ScanUS DBF Output File [1] Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E37. End Error Error: 2007-06-14 11:48:37.91 Code: 0xC02020E8 Source: DBF to ScanUS_Process ScanUS DBF Output File [1] Description: Opening a rowset for "scanus_process" failed. Check that the object exists in the database. End Error Error: 2007-06-14 11:48:37.95 Code: 0xC004706B Source: DBF to ScanUS_Process DTS.Pipeline Description: "component "ScanUS... The package execution fa... The step failed.
I am using SSIS packages for data transfer. When I run the package on a virtual server it takes more time than when it is run on a PC. After analysing, I found that the package, when run on the virtual server, spends around 50 seconds or so just in startup. Could anyone give me a little more detail as to why it runs slowly?
I'm at a bit of a loss. I've written a package that has about a dozen data flow tasks and each one loads data from text files into tables in a db. It's possible that some of the fields in the text files may have data that would need to be truncated upon insert into the db.
In the data flows, I've set the fields in the data sources that I know this could happen to so that they ignore truncation errors. When I run this within BI Studio, everything works fine. I then loaded the package into the SSIS store on the db server, and if I attempt to run it from the SSIS store, or if a job executes it, the package appears to fail because of these truncation errors that I've told it to ignore.
I have a problem where I have an SSIS package (SQL Server 2005) that won't run properly from SQL Server Agent, but it runs fine when kicked off manually from Integration Services -> Run Package or when run in debug from Visual Studio.
The first step in the package checks for the existence of a file via a script task. The script looks like this...
Public Sub Main()
    ' Build the full path to the import file from package variables.
    Dim ImportFile As String = CStr(Dts.Variables("BaseDirectory").Value) & _
        CStr(Dts.Variables("ImportDirectory").Value) & _
        CStr(Dts.Variables("ImportFile").Value)

    ' Succeed only if the file exists (Dir returns "" when it does not).
    If Dir(ImportFile) = "" Then
        Dts.TaskResult = Dts.Results.Failure
    Else
        Dts.TaskResult = Dts.Results.Success
    End If
End Sub
This script runs fine and the file is seen as expected when I run the package manually. But as a step in a SQL Server Agent job, it doesn't see the file.
The SQL Server Agent service is set to start up / log on as a Local System Account. I've also tried setting up a credential / proxy (using an account that I know can see and even move / rename the file) to run the job as but that didn't seem to help.
The package is being run from SQL Server (stored in MSDB) and is set to rely on SQL Server for sensitive information, so I don't think that's an issue; other packages are set up like this in terms of sensitive data and run fine.
Any ideas why my script can't "see" the file I'm looking at when it's kicked off by SQL Server agent? I've looked and looked...I can't seem to figure this out. I would really appreciate any help you might be able to offer up.
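One small thing that might help narrow it down: a variant of the script that logs the path it actually resolved, and the account it runs under, lets you compare the Agent run with the interactive run (mapped drive letters, for example, don't exist for a service logged on as Local System). This is only a diagnostic sketch using the same three variables as the original; the "CheckFile" sub-component name is arbitrary:

    Imports System.IO

    Public Sub Main()
        Dim ImportFile As String = CStr(Dts.Variables("BaseDirectory").Value) & _
            CStr(Dts.Variables("ImportDirectory").Value) & _
            CStr(Dts.Variables("ImportFile").Value)

        ' Record the resolved path and the executing account in the package log.
        Dim fireAgain As Boolean = True
        Dts.Events.FireInformation(0, "CheckFile", _
            "Checking '" & ImportFile & "' as " & Environment.UserName, _
            String.Empty, 0, fireAgain)

        If File.Exists(ImportFile) Then
            Dts.TaskResult = Dts.Results.Success
        Else
            Dts.TaskResult = Dts.Results.Failure
        End If
    End Sub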
Hi, I am trying to schedule a job to copy an MDB data file from a Unix server to a Windows 2003 server (Accfp1_data2_server). I have created a file-copy SSIS package and tested it in the SSIS Visual Studio environment, where it runs OK. The package was created while logged in as a domain administrator.
I then created a job to run this package (which is stored in a folder) using the credential of the same domain administrator, who has full access privileges to both of these servers. However, the job fails whenever it is run, manually or scheduled. The error message displayed is given below --------------------------------------------------------------- Message Executed as user: FORTIES\ABCITYG. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 14:26:07 Error: 2007-09-13 14:26:12.56 Code: 0xC001401E Source: CommunityContact - Copy MS Access Database Connection manager "CONTACT.mdb On Accfp1_data2_server" Description: The file name "\\Accfp1_data2_server\DATA2\Arts&rec\Apps\Contacts\CONTACT.mdb" specified in the connection was not valid. End Error Error: 2007-09-13 14:26:12.56 Code: 0xC001401D Source: CommunityContact - Copy MS Access Database Description: Connection "CONTACT.mdb On Accfp1_data2_server" failed validation. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 14:26:07 Finished: 14:26:12 Elapsed: 5.297 seconds. The package execution failed. The step failed. ---------------------------------------------------------------
Please note that the job runs without a problem when I change the source file to a Windows 2000 server share. How bizarre. I hope this is not one of Microsoft's tricks?
Visual Studio runs out of memory when trying to use an SSIS package. I am trying to create and run an SSIS package that validates and imports some large XML files (>200 MB). Validation fails because Visual Studio cannot open large files without running out of memory.
The SSIS package throws this error when I run the package..at the validation task.
Error: 0xC002F304 at Validate bio_fixed, XML Task: An error occurred with the following error message: "Exception of type 'System.OutOfMemoryException' was thrown.".
How do I increase the amount of RAM that Visual Studio can use? I have plenty of RAM on my workstation (>3 GB), but VS chokes on files of maybe around 100 MB.
I've created an SSIS package that calls the Access DLL and fires off Access 2003 reports, saves them as PDFs and emails them out.
Now this works fine when I run it manually, but when I schedule it and fire off a job I get a very vague error: "exception has been thrown by the target of an invocation".
I have copied the Access DLL to the GAC and to the .NET Framework v2.0.50727 folder, but still no luck.
I'm using the Bullzip PDF printer, and those DLLs are also in the GAC.
I am copying a simple table from a Sql Server 2005 database to an *.sdf mobile database.
I am brand new to SSIS and I am probably doing something wrong. But after executing the SSIS package, all the rows and all the fields are NULL in the destination database. I put a data grid viewer between the OLE DB source and the SQL Server Compact Edition destination, and I can see the real data, which is obviously not ALL NULL.
Does anyone have a clue as to why it would be doing this?
I have a simple query which creates tables, taking the database name as a parameter from a parameter table.
SP1 --> creates the databases and calls SP2 --> which creates the tables. I can run it fine via SSMS, but when I run it using SSIS it fails with the error below. The issue gets more interesting in that it fails randomly on some database creations, while others create just fine.
Note** I am not passing any database with the name '20'.
Exception handler error :
ERROR :: 615 :: Could not find database ID 20, name '20'. The database may be offline. Wait a few minutes and try again. ---------------------------------------------------------------------------------------------------- SPID: 111 Origin: SQL Stored Procedure (SP1) ---------------------------------------------------------------------------------------------------- Could not find database ID 20, name '20'. The database may be offline. Wait a few minutes and try again. ----------------------------------------------------------------------------------------------------
Error in SSIS
[Execute SQL Task] Error: Executing the query "EXEC SP1" failed with the following error: "Error severity levels greater than 18 can only be specified by members of the sysadmin role, using the WITH LOG option.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly. I have sysadmin permission.
I have an SSIS package that runs fine through the command prompt, but when I try to run it from a SQL Server Agent job CmdExec step it bombs out. Please help, this has me stumped... The SSIS package uses an XML connection string so certain key settings such as connection strings and email info can be changed easily. Currently this is all on the same machine; I have not moved it beyond where I am developing.
On the command line I am using the following command...
Microsoft (R) SQL Server Execute Package Utility Version 9.00.1399.06 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 6:59:40 PM Progress: 2006-08-16 18:59:41.29 Source: Data Flow Task Validating: 0% complete End Progress Progress: 2006-08-16 18:59:41.29 Source: Data Flow Task Validating: 33% complete End Progress Progress: 2006-08-16 18:59:41.71 Source: Data Flow Task Validating: 66% complete End Progress Progress: 2006-08-16 18:59:41.73 Source: Data Flow Task Validating: 100% complete End Progress Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 0% complete End Progress Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 33% complete End Progress Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 66% complete End Progress Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 100% complete End Progress Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 0% complete End Progress Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 33% complete End Progress Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 66% complete End Progress Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 100% complete End Progress Progress: 2006-08-16 18:59:41.81 Source: Data Flow Task Pre-Execute: 0% complete End Progress Progress: 2006-08-16 18:59:41.84 Source: Data Flow Task Pre-Execute: 33% complete End Progress Progress: 2006-08-16 18:59:41.90 Source: Data Flow Task Pre-Execute: 66% complete End Progress Progress: 2006-08-16 18:59:41.90 Source: Data Flow Task Pre-Execute: 100% complete End Progress Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 0% complete End Progress Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 33% complete End Progress Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 66% complete End Progress Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 100% complete End Progress Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Cleanup: 0% complete End Progress Progress: 2006-08-16 18:59:41.93 Source: Data Flow Task Cleanup: 33% complete End Progress Progress: 2006-08-16 18:59:41.93 Source: Data Flow Task Cleanup: 66% complete End Progress Progress: 2006-08-16 18:59:41.93 Source: Data Flow Task Cleanup: 100% complete End Progress Progress: 2006-08-16 18:59:41.95 Source: Send Mail Task The SendMail task is initiated.: 0% complete End Progress Progress: 2006-08-16 18:59:42.09 Source: Send Mail Task The SendMail task is completed.: 100% complete End Progress DTExec: The package execution returned DTSER_SUCCESS (0). Started: 6:59:40 PM Finished: 6:59:42 PM Elapsed: 1.984 seconds
When I try to use the same command within SQL Server Agent Job using a CmdExec step I get the following error...
Description: The package is attempting to configure from the XML file "S:\connections\contacts.dtsConfig". End Info Warning: 2006-08-16 18:40:03.15 Code: 0x80012012 Source: contactsPackage Description: The configuration file name "S:\connections\contacts.dtsConfig" is not valid. Check the configuration file name. End Warning Warning: 2006-08-16 18:40:03.15 Code: 0x80012059 Source: contactsPackage Description: Failed to load at least one of the configuration entries for the package. Check configurations entries and previous warnings to see descriptions of which configuration failed. End Warning Info: 2006-08-16 18:40:03.20 ... Process Exit Code 1. The step failed.
SQL Server 2012, VS 2010 SQL Server Data Tools. FTP Connection Manager - port 21, chunk size 1 KB, passive mode = false, saved plain-text password. It downloads 1 CSV file to a local directory on the SQL Server box. This always works when run from the package, and always gives a timeout error when scheduled.
Message: Microsoft (R) SQL Server Execute Package Utility Version 11.0.2100.60 for 64-bit Started: 11:04:59 AM Error: 2014-03-27 11:05:31.12 Code: 0xC001602A [code]...
SQL Server Agent is set to log on as my domain account, and in the package history it says "logged on as" my account.
I have a stored procedure (which I will tag on at the end for those interested) which is taking at least 15 minutes to run when executed, but completes in 1 minute when the T-SQL statement is run in Query Analyser. Why is this?
I suspect that it may be connected to table indexing, but why then is this bypassed when QA is used?
Any advice appreciated.
Derek
************************************************** *********************** IF EXISTS (SELECT * FROM sysobjects WHERE id = object_id(N'dbo.sp_ValidateAIGL') AND OBJECTPROPERTY(id, N'IsProcedure') = 1) DROP PROCEDURE dbo.sp_ValidateAIGL GO
-- *** Global Non Program Specific Data Errors *** -- CHECK - that there are records in the DEB_IGL_PAYROLL_OUTPUT file.....none and the routine failed... IF NOT EXISTS(SELECT * FROM tbl_OUT_Payroll WHERE IGLProgramID = @IGLProgramID) INSERT INTO #IGLErrors SELECT NULL, 100, 'No records were processed by the IGL run!'
SELECT * FROM #IGLErrors
-- CHECK - search for any records where the employee's EXPENSE_CODE is NULL INSERT INTO #IGLErrors SELECT DISTINCT NULLIF(EmpNo, ''), 2, 'Employee "' + COALESCE(NULLIF(RTRIM(EmpNo), ''), '<Missing Employee>') + '" (Organisation Unit - ' + COALESCE(RTRIM(OrgUnitCode), '<No Organisation Unit>') + ') does not have a EXPENSE_CODE Code.' FROM tbl_OUT_Payroll WHERE NULLIF(ExpenseCode, '') IS NULL AND IGLProgramID = @IGLProgramID
SELECT * FROM #IGLErrors -- CHECK - check that the BALANCE of DEBITs match the balance of CREDITs IF (SELECT SUM(Cash) FROM tbl_OUT_Payroll WHERE IsCredit = 1 AND IGLProgramID = @IGLProgramID) <> (SELECT SUM(Cash) FROM tbl_OUT_Payroll WHERE IsCredit = 0 AND IGLProgramID = @IGLProgramID) INSERT INTO #IGLErrors SELECT NULL, 3, 'The total cash value for DEBIT elements does not match the total cash for CREDIT elements.'
SELECT * FROM #IGLErrors -- *** Program 1 and 2 errors *** IF @IGLProgramID IN (1, 2) BEGIN -- CHECK - search for any records where the employee's COST_CENTRE is NULL INSERT INTO #IGLErrors SELECT DISTINCT NULLIF(EmpNo, ''), 1, 'Employee "' + NULLIF(RTRIM(EmpNo), '') + '" (Organisation Unit = ' + RTRIM(OrgUnitCode) + ') does not have a COST_CENTRE Code.' FROM tbl_OUT_Payroll WHERE NULLIF(CostCenter, '') IS NULL AND IGLProgramID = @IGLProgramID
SELECT * FROM #IGLErrors
-- Check for EMPLOYEEs that were not transferred to the PAYROLL output (usually caused by missing ORG_UNITs or not picked up in -- the DEB_VIEW_APPOINTEE view...) INSERT INTO #IGLErrors SELECT DISTINCT EMP_NO, 11, 'Employee "' + RTRIM(EMP_NO) + '" was excluded from the summary. Check their Organisation Unit codes!' FROM PSELive.dbo.COSTING_OUTPUT WHERE NOT EMP_NO IN (SELECT DISTINCT EmpNo FROM tbl_OUT_Payroll WHERE IGLProgramID = @IGLProgramID) AND PERIOD_NO = @PeriodNo AND TAX_YEAR = @TaxYear
SELECT * FROM #IGLErrors
-- Check that there are no ELEMENTS in the COSTING_OUTPUT table that don't exist in the tbl_IGLElements table INSERT INTO #IGLErrors SELECT DISTINCT ELEMENT, 12, 'Element "' + RTRIM(ELEMENT) + '" does not exist in the IGL Interface Elements table!' FROM PSELive.dbo.COSTING_OUTPUT WHERE ELEMENT NOT IN ( SELECT DISTINCT Element FROM tbl_IGLElements ) AND PERIOD_NO = @PeriodNo
SELECT * FROM #IGLErrors
END
-- *** Add an error to indicate the number of errors *** IF EXISTS (SELECT * FROM #IGLErrors) INSERT INTO #IGLErrors SELECT 0, 0, 'Warning, there are ' + CAST(Count(*) AS VarChar(5)) + ' recorded errors!' FROM #IGLErrors
-- Transfer the records to the ErrorsLog table ready for the user to view... DELETE FROM tbl_SYSErrorsLog INSERT INTO tbl_SYSErrorsLog (IGLProgramID, OutputLogID, KeyField, ErrorID, Description) SELECT @ProgramLogID, @IGLPeriodID, KeyField, ErrorID, Description FROM #IGLErrors ORDER BY ErrorID
DROP TABLE #IGLErrors
SELECT * FROM tbl_SYSErrorsLog ORDER BY ErrorID
--SET NOCOUNT OFF
GO GRANT EXECUTE ON dbo.sp_ValidateAIGL TO Public GO
I am in the process of moving from a 32-bit SQL Server 2005 Enterprise (9.0.3054) to a 64-bit SQL Server 2005 Enterprise (9.0.3054, with 4 CPUs and 8 GB of memory on Win 2003 SP2), and the process has been very frustrating, to say the least. I am having a problem with packages that I created on my 64-bit SQL Server. I am importing a few tables from the 32-bit SQL Server into the 64-bit SQL Server using Task --> Import to create the package.
Sometimes when I am creating a package I get the following error in a message box:
SQL Server Import and Export Wizard
The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered. The wizard cannot continue and it will terminate.
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (System.Windows.Forms)
Other times when I run a package that has run successfully before I get the following error:
Faulting application dtexecui.exe, version 9.0.3042.0, stamp 45cd726d, faulting module unknown, version 0.0.0.0, stamp 00000000, debug? 0, fault address 0x025d23f0.
The package appears to hang when running. By this I mean that the Package Execution Progress shows progress up to a point and then just stops. (The package takes about 17 seconds to run normally.) CPU usage is at 1% and the package cannot be stopped.
I have deleted and re-created the package several times and I have also re-installed the service pack on the SQL Server (9.0.3054) but that did not help.
The following statement is from Microsoft documentation:
If you use the ExclusionGroup property to specify that rows should only go to one or another of a group of outputs, as in the Conditional Split transformation, you must call the DirectRow method to select the appropriate destination for each row. When you have an error output, you must call DirectErrorRow to send rows with problems to the error output instead of the default output.
I have a question about this because I have never used the "ExclusionGroup" property. For example, I have a script component where I specify 4 separate outputs, because I am sending different groups of rows to each output. I accomplish this programmatically using a lot of conditionals, and it works fine.
I did not have to use the "ExclusionGroup" property to do this. So I'm not sure why I would ever need it, or need to use DirectRow. I'm trying to understand this better, because I feel like I'm not understanding DirectRow, or how and when to use it.
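If it helps, my understanding is that ExclusionGroup only matters when the outputs are synchronous with the input. With the default ExclusionGroup of 0, every row automatically flows to every synchronous output; giving the outputs a shared non-zero ExclusionGroup is what makes DirectRowTo<Output> the thing that decides where each row goes. If your four outputs are asynchronous (SynchronousInputID = None) and you call AddRow on each, ExclusionGroup never comes into play, which would explain why you have not needed it. A hedged sketch of the synchronous version, with made-up output names ActiveRows/InactiveRows and a made-up Status column:

    Public Class ScriptMain
        Inherits UserComponent

        Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
            ' Both outputs are synchronous with the input and share the same
            ' non-zero ExclusionGroup, so each row goes only where it is directed.
            If Row.Status = "A" Then
                Row.DirectRowToActiveRows()
            Else
                Row.DirectRowToInactiveRows()
            End If
        End Sub

    End Class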
I'm new to SSB, so please bear with me. Our application requirements are: 1) Web app gathers user input from a web UI. 2) Web app calls a stored procedure, passing in the user input gathered in step (1). 3) Procedure issues queries to multiple data sources (SQL Server 2005 db's) derived from the user input. 4) Procedure waits for replies from these multiple data sources, governed by a timeout in case a data source is unavailable or slow to respond.
5) Procedure aggregates the data returned in step (4) from multiple data sources into an XML document. 6) Procedure returns the XML document as an output parameter.
This is different than the usual SSB asynchronous application paradigm. In particular, I'm wondering:
How can I setup a synchronous dialog, where the procedure that issues the SEND waits for a reply from the target service? I don't want the initiator procedure to end after SENDing, but rather wait for the replies to those messages so it can aggregate the data from the reply message bodies.
Hi, I need to know whether I am on the last row or not in my script component. If I am, I would alter a column in that row to indicate that I am processing the last row. Is there a way to do this? I tried with ProcessInput, but by the time EndOfRowset() tells me that the last row has been processed, I can no longer alter the row in the buffer.
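One workaround I believe works (a rough, untested sketch) is to make the output asynchronous and hold each row back until you know whether another one follows, so the final row can be written with the flag set. It overrides Input0_ProcessInput in the style of the MSDN asynchronous script component example; the single Payload column and the IsLastRow output column are made up for illustration:

    Public Class ScriptMain
        Inherits UserComponent

        ' Values held back from the previous row.
        Private hasPending As Boolean = False
        Private pendingPayload As String

        Public Overrides Sub Input0_ProcessInput(ByVal Buffer As Input0Buffer)
            While Buffer.NextRow()
                ' A new row arrived, so the held-back row is not the last one.
                If hasPending Then
                    WriteRow(pendingPayload, False)
                End If
                pendingPayload = Buffer.Payload
                hasPending = True
            End While

            If Buffer.EndOfRowset() Then
                ' No more rows will arrive: the held-back row is the last one.
                If hasPending Then
                    WriteRow(pendingPayload, True)
                End If
                Output0Buffer.SetEndOfRowset()
            End If
        End Sub

        Private Sub WriteRow(ByVal payload As String, ByVal isLast As Boolean)
            Output0Buffer.AddRow()
            Output0Buffer.Payload = payload
            Output0Buffer.IsLastRow = isLast
        End Sub

    End Class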