Currently I have two SSIS jobs on my machine. The problem I'm having is that only one of the jobs executes successfully; the other fails with an incorrect user login error. Both jobs use the same configuration database and all the packages in both jobs have the protection level set to "DontSaveSensitive". Both jobs have been deployed in exactly the same manner, yet only one succeeds and the other fails.
Porting an existing SQL 2k DTS job over to a SQL 2k5 SQL Server running SSIS.
Background: The job loads data into an empty work table and performs some work before clearing out the work table. This job runs every minute.
Question: If the job happens to take longer than a minute, does SSIS create a second instance of the job? Or perhaps it does what DTS did and reschedules the job for the next iteration?
Concern: I need to know because there would be key constraint violations if another instance of the job started before the working table was cleared out.
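For what it's worth, SQL Server Agent will not start a second concurrent instance of a job that is still running; a schedule that fires mid-run is skipped, much like DTS. If a belt-and-braces guard inside the job itself is wanted anyway, a hedged sketch using sp_getapplock (available from SQL 2005 onwards; the lock resource name is illustrative) could look like this as the first statement of the job step:

DECLARE @rc int
EXEC @rc = sp_getapplock
    @Resource    = 'WorkTableLoad',   -- illustrative lock name
    @LockMode    = 'Exclusive',
    @LockOwner   = 'Session',
    @LockTimeout = 0                  -- fail immediately instead of waiting

IF @rc < 0
BEGIN
    -- A previous run still holds the lock; bail out without touching the work table.
    RAISERROR('Previous load still running; skipping this iteration.', 16, 1)
    RETURN
END

-- ... load the work table, do the work, clear the work table ...

EXEC sp_releaseapplock @Resource = 'WorkTableLoad', @LockOwner = 'Session'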
Hi, I need to transfer the data in table A on a 2005 instance to table B, which has the same structure as table A, on a 2000 instance. There are 200,000 records in table A. If I use <insert B select * from linkedserver.....>, it takes only 30 seconds. I created an SSIS package to do this, but it is very slow; after it ran for 10 minutes I had to stop it, and I found it was transferring about 100 records per second. Then I swapped the source server and destination server, that is, transferring the same data from the 2000 instance to the 2005 instance, and it took only 50 seconds. Why? How can I make the package that transfers data from the 2005 instance to the 2000 instance run fast?
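Since the set-based linked-server insert above is already fast, one hedged workaround is to keep that statement but fire it from the package in an Execute SQL Task instead of pushing the rows through a data flow, which may be falling back to row-by-row inserts against the remote 2000 box. A sketch, with the four-part name as a placeholder for the actual linked server and database:

-- Run against the SQL 2000 (destination) instance from an Execute SQL Task;
-- <linkedserver2005> and <database> are placeholders for the real names.
INSERT INTO dbo.B
SELECT * FROM <linkedserver2005>.<database>.dbo.A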
My team is using SQL Server 2005 SSIS to develop a package for Consolidated Views. This works fine on my machine in Visual Studio and when they run it as a script from the command prompt on their desktops. However, when they move this package to the ETL box, the package runs fine in the Visual Studio environment but fails when run from the script with the errors:
Description: Unable to load the package as XML because of package does not have a valid XML format. A specific XML parser error will be posted. End Error
Description: Failed to open package file "C:\SM_BI_EXTRACTS_SHARED\Shared\Integrated Views\Integrated Views\CV_Product.dtsx" due to error 0x800C0006 "The system cannot locate the object specified.". This happens when loading a package and the file cannot be opened or loaded correctly into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format. End Error
-- Could not load package "C:\SM_BI_EXTRACTS_SHARED\Shared\Integrated Views\Integrated Views\CV_Product.dtsx" because of error 0xC0011002.
Description: Failed to open package file "C:\SM_BI_EXTRACTS_SHARED\Shared\Integrated Views\Integrated Views\CV_Product.dtsx" due to error 0x800C0006 "The system cannot locate the object specified.". This happens when loading a package and the file cannot be opened or loaded correctly into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format. Source: {519D338B-8874-40A8-A385-0D53E868A57C} Started: 12:57:45 PM Finished: 12:57:45 PM Elapsed: 0.016 seconds
Can someone help us understand the problem? We are new to SSIS. We also want to know exactly what needs to be on the Server that will be running our SSIS ETL programs (that is, what components of SSIS -- SQL Server 2K5 is already loaded).
We are using SQL 2005 on Dell Servers running Win 2003, with the latest service packs. Microsoft SQL Server Integration Services Designer Version 9.00.2047.00
Hey, I have a few jobs which call SSIS packages. If I run an SSIS package directly, it runs fine, but if I try to run the job which calls the package, it fails. None of my jobs that call an SSIS package work; all of them fail. Can someone help me troubleshoot this issue?
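A common cause of "runs fine by hand, fails from the job" is that the job step runs under the SQL Server Agent service account rather than the account used interactively. One hedged fix is to create a credential and proxy and run the step as that proxy; all names and the password below are illustrative:

USE master;
CREATE CREDENTIAL SSISRunCred
    WITH IDENTITY = 'DOMAIN\ssis_runner',   -- a Windows account with the needed access
         SECRET   = 'StrongPassword!';

USE msdb;
EXEC dbo.sp_add_proxy
    @proxy_name      = N'SSISProxy',
    @credential_name = N'SSISRunCred';

EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name   = N'SSISProxy',
    @subsystem_id = 11;   -- 11 = the SSIS package execution subsystem in SQL 2005

Then pick SSISProxy in the "Run as" drop-down of the job step (non-sysadmin job owners additionally need sp_grant_login_to_proxy).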
I have an SSIS package created from a SQL 2000 DTS package using the Migration Wizard. The package imports data from a MySQL database to a SQL 2005 64-bit database running on 64-bit Windows Server 2003. The package runs fine when executed from SQL Server Management Studio, but when I schedule it as a job it fails with:
Executed as user: [the domain admin account]. The package execution failed. The step failed.
I've tried a lot of different ways to make this work including creating a new SSIS package. Again, the package ran fine except when it was scheduled as a job.
I am using SQL 2005. I have created an SSIS package that basically executes another SSIS package (as part of a larger package). It runs fine in BIDS but will not run if I save it and schedule it using SQL Agent. I should mention I am using a domain/admin account with SQL Agent, so I don't think that is the problem.
When I execute the job in BIDS, the Execute Package Utility window pops up, at which point I click the Execute button; the job runs successfully and then I click the Close button.
I suspect it is not running via SQL Agent because of the user intervention required to complete the task (i.e. clicking Execute as described above). Is this correct? If so, is there a way to remove the need for any user intervention? Or could it be something else?
I've worked for a while to get an SSIS package that connects to Oracle to successfully run via DTExecUI from a 64-bit SQL Server. I completely understand that DTExecUI runs in 32-bit mode, but I need to initiate the package from a stored proc, so I need to be able to utilize the 32-bit DTExec from either an xp_cmdshell call or an sp_start_job (thanks to MatthewRouche for the suggestion). Both Ora92 and 10g were installed on the SQL Server.
Here's what I've tried unsuccessfully:
1. SQL Job - just calling the package from an SSIS step. I wouldn't expect this to run because it uses the 64-bit DTExec.
2. SQL Job - a job that calls the 32-bit version of dtexec from a CmdExec step. Here are two examples I have tried; the first is because I came across several postings saying that there is a problem with the "(x86)" in the path for the Oracle driver.
\\server_name\E$\ProgramFilesLegacy\MicrosoftSQLServer\X86\90\DTS\Binn\DTExec.exe /DTS "PackageName" /SERVER "SERVER_NAME" /SET Package.Variables[PLT].Value;"8H" /CONFIGFILE "\\SERVER\ConfigFile.dtsConfig" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING EW
\\server_name\E$\Program Files\Microsoft SQL Server (x86)\90\DTS\Binn\DTExec.exe /DTS "PackageName" /SERVER "SERVER_NAME" /SET Package.Variables[PLT].Value;"8H" /CONFIGFILE "\\SERVER\ConfigFile.dtsConfig" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING EW
Both of these give me the error below:
System.Exception: System.Data.OracleClient requires Oracle client software version 8.1.7 or greater.
I'm assuming that it is running in 32 bit mode because this was at the beginning of the job history log:
Executed as user: "XXX". ....3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved
3. I then tried to call the package from a batch file on a file server that had Integration Services installed. I got the same error as above.
I'm obviously missing something if I can get it to work in DTExecUI but not DTExec 32-bit.
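If the goal is just to launch the 32-bit dtexec from a stored proc, a hedged sketch via xp_cmdshell (assuming xp_cmdshell is enabled and the default 32-bit install path; the package and server names are the placeholders from above) is:

DECLARE @cmd varchar(2000)
-- Quote the whole executable path so cmd.exe tolerates the spaces and parentheses.
SET @cmd = '"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe"'
         + ' /DTS "PackageName" /SERVER "SERVER_NAME"'
EXEC master..xp_cmdshell @cmd

Note this only addresses launching the right bitness; if the Oracle client bug with parentheses in the calling executable's path applies, the copy-to-a-parenthesis-free-folder workaround from the postings below would still be needed.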
Any ideas? Below are some postings I have already visited several times:
Windows 64 bit and Oracle http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=358365&SiteID=1
Connecting to Oracle on 64-bit (x64) machine http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=159581&SiteID=1
One paragraph that stands out on this posting is:
Find the location of the application that is generating the error. Check the path to this location and see if it contains any parentheses. If so, you must relocate the application to a directory without any parentheses in the path. This didn't work for me, unless I was doing it wrong.
Error retrieving numeric data from Oracle 10g on 64 bit Itanium SSIS server http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1183296&SiteID=1
Connection Problems to Oracle Database http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1307460&SiteID=1
If this posting were complete, it looks like it would be what I was looking for.
When I run a package I created in the development studio it runs fine, but if I create a job and run it I get the error "The AcquireConnection method call to the connection manager "ODS" failed with error code 0xC0202009".
I have the package set up to use an XML config file, and this works fine for all the other packages, but this one will not work.
A colleague of mine is having some issues. I hope someone can help.
Hi all,
This is my first call for technical assistance so go easy on me.
I'm having a problem in one of my SSIS packages. In brief, the process involves copying the rows from an untyped table to a typed table. There are about 45,000 rows in this table, and during the copy ("OLE DB source" to "OLE DB destination") the process appears to hang for about 5 minutes, eventually crashing with "The buffer manager could not get a temporary file name. The call to GetTempFileName failed."
After several attempts, using all the trace info I could muster, this is the order of events, with some interesting numbers and facts:
1. The SSIS package goes swiftly through "Validation", "Prepare for Execution" and "Execution" in less than 1 second, with "Execute phase is beginning" being the last message on the "Progress" tab.
2. Using performance counters, I note that over the next 5 minutes the value for "BLOB bytes read" slowly rises, and after a couple of minutes so does "BLOB files in use", the latter reaching a figure of 65534.
3. When this figure is reached, SSIS starts creating thousands of zero-size files named DTS####.tmp (where #### is hex, e.g. DTSB4C1.TMP) in the TEMP folder (C:\Documents and Settings\<username>\Local Settings\Temp in my case).
4. When I started running this package there were 130 files in my TEMP folder; as soon as the combined total of files in TEMP reaches 65664 (i.e. 65534 + 130), SSIS starts producing the errors list, which includes the one I listed above, and eventually it clears the TEMP folder down to the original 130 files.
5. My conclusion (thus far) is that SSIS creates all these thousands of tmp files but in my case hits some kind of maximum (either a folder limit, or it runs out of hex combinations for the file names) and then crashes.
6. The only thread I found on the internet suggested setting up an environment variable "BLOBTempStoragePath" and assigning a value of "C:\Temp1;C:\Temp2;C:\Temp3;C:\Temp4" so that SSIS can span a number of "temporary" folders instead of the one default folder named in the "TEMP" environment variable.
7. Setting the above environment variable in Windows 2000 did not work for me (I tried it as both a user variable and a system environment variable).
So here are the facts so far - ANY assistance will be hugely appreciated. I have no idea why all these temporary files are being generated; I have created SSIS packages handling data sets 10 times bigger than this one without these problems, so I don't think it's size related.
I have a very complex Stored Procedure called by a Job that is scheduled to run every night. Its execution sometimes takes 1 or 2 hours and sometimes 7 hours or more.
So, if it has been running for more than 4 hours I stop the Job and run the procedure from a Query Window, and it never takes more than 2 hours.
Can anyone help me identify the problem? I want to run it from the Job and not have to worry about it.
Some more information: - It is SQL 2000 Enterprise with SP4 in a Cluster (it happens the same way on any node). - The SQL Server and SQL Agent services run using a Domain Account that has full Administrative access. - When I connect to a Query Window I also use a Windows Account.
- There are no locks or processes blocking or being blocked while the job is running. - In the Task Manager the processor activity looks OK, no more than 30% on any processor.
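One classic cause of "fast in a query window, slow from the job" is that Agent sessions and the query window connect with different SET options, so each context compiles and caches its own plan, and they can be very different plans. A hedged way to compare and align them:

-- Run this both in the query window and as the first command of the job step
-- (capturing the step output), then compare the two lists.
DBCC USEROPTIONS

-- If an option such as ARITHABORT differs, set it explicitly in the job step
-- before calling the procedure (procedure name here is illustrative):
SET ARITHABORT ON
EXEC dbo.MyNightlyProc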
I have created an SSIS package which, as its 2nd step, picks up a .dbf table and does a Data Flow Task of putting the .dbf table into a SQL Server 2005 table (which was just truncated in Step 1).
The Connection Manager that works on the .dbf table is Native OLE DB\Microsoft Jet 4.0 OLE DB Provider.
The entire SSIS package runs perfectly when I run it from my development environment.
I then assigned this Package to a SQL Server 2005 Job. The entire package is the sole step of the job.
But when I run the SQL Job, the Job fails immediately on Step 2.
Here is the error message in View History of the SQL Job:
Executed as user: DBASE01\SQLExec. ...on 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 11:48:37 AM Progress: 2007-06-14 11:48:37.91 Source: DBF to ScanUS_Process Validating: 0% complete End Progress Progress: 2007-06-14 11:48:37.91 Source: DBF to ScanUS_Process Validating: 33% complete End Progress Error: 2007-06-14 11:48:37.91 Code: 0xC0202009 Source: DBF to ScanUS_Process ScanUS DBF Output File [1] Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E37. End Error Error: 2007-06-14 11:48:37.91 Code: 0xC02020E8 Source: DBF to ScanUS_Process ScanUS DBF Output File [1] Description: Opening a rowset for "scanus_process" failed. Check that the object exists in the database. End Error Error: 2007-06-14 11:48:37.95 Code: 0xC004706B Source: DBF to ScanUS_Process DTS.Pipeline Description: "component "ScanUS... The package execution fa... The step failed.
I am using SSIS packages for data transfer. When I run a package on a virtual server it takes more time than when run on a PC. After analysing, I found that the package, when run on the virtual server, spends around 50 seconds or so in startup. Could anyone give me a little more detail as to why it runs slow?
I'm at a bit of a loss. I've written a package that has about a dozen data flow tasks and each one loads data from text files into tables in a db. It's possible that some of the fields in the text files may have data that would need to be truncated upon insert into the db.
In the data flows, I've set the fields in the data sources where I know this could happen to ignore truncation errors. When I run this within BI Studio, everything works fine. I then loaded the package into the SSIS store on the db server, and if I attempt to run it from the SSIS store, or if a job executes it, the package appears to fail because of these truncation errors that I've told it to ignore.
I have a problem where I have an SSIS package (SQL Server 2005) that won't run properly from SQL Server Agent, but it runs fine when kicked off manually from Integration Services -> Run Package or when run in debug from Visual Studio.
The first step in the package checks for the existence of a file via a script task. The script looks like this...
Public Sub Main()

    ' Build the full path to the import file from the package variables.
    Dim ImportFile As String = CStr(Dts.Variables("BaseDirectory").Value) + _
                               CStr(Dts.Variables("ImportDirectory").Value) + _
                               CStr(Dts.Variables("ImportFile").Value)

    ' Dir() returns an empty string when the file does not exist.
    If Dir(ImportFile) = "" Then
        Dts.TaskResult = Dts.Results.Failure
    Else
        Dts.TaskResult = Dts.Results.Success
    End If

End Sub
This script runs fine and the file is seen as expected when I run the package manually. But as a step in a SQL Server Agent job, it doesn't see the file.
The SQL Server Agent service is set to start up / log on as a Local System Account. I've also tried setting up a credential / proxy (using an account that I know can see and even move / rename the file) to run the job as but that didn't seem to help.
The package is being run from SQL Server (stored in MSDB) and is set to rely on SQL Server for sensitive information, so I don't think that's an issue; other packages are set up like this in terms of sensitive data and run fine.
Any ideas why my script can't "see" the file I'm looking at when it's kicked off by SQL Server agent? I've looked and looked...I can't seem to figure this out. I would really appreciate any help you might be able to offer up.
Hi, I am trying to schedule a job to copy an MDB data file from a Unix server to a Windows 2003 server (Accfp1_data2_server). I have created a file-copy SSIS package and tested it in the SSIS Visual Studio environment, where it runs OK. The package was created while logged in as a domain administrator.
I then created a job to run this package (which is stored in a folder) using the credential of the same domain administrator, who has full access privileges on both of these servers. However, the job fails whenever it is run, manually or scheduled. The error message displayed is given below. --------------------------------------------------------------- Message Executed as user: FORTIES\ABCITYG. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 14:26:07 Error: 2007-09-13 14:26:12.56 Code: 0xC001401E Source: CommunityContact - Copy MS Access Database Connection manager "CONTACT.mdb On Accfp1_data2_server" Description: The file name "\\Accfp1_data2_server\DATA2\Arts&rec\Apps\Contacts\CONTACT.mdb" specified in the connection was not valid. End Error Error: 2007-09-13 14:26:12.56 Code: 0xC001401D Source: CommunityContact - Copy MS Access Database Description: Connection "CONTACT.mdb On Accfp1_data2_server" failed validation. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 14:26:07 Finished: 14:26:12 Elapsed: 5.297 seconds. The package execution failed. The step failed. ---------------------------------------------------------------
Please note that the job runs without problems when I change the source file to a Windows 2000 server share. How bizarre! I hope this is not some Microsoft trick.
Visual Studio runs out of memory when trying to use an SSIS package. I am trying to create and run an SSIS package that validates and imports some large XML files (>200MB). Validation fails because Visual Studio cannot open large files without running out of memory.
The SSIS package throws this error when I run it, at the validation task:
Error: 0xC002F304 at Validate bio_fixed, XML Task: An error occurred with the following error message: "Exception of type 'System.OutOfMemoryException' was thrown.".
How do I increase the amount of RAM that Visual Studio can use? I have plenty of RAM on my workstation (>3GB), but VS chokes on files of maybe 100MB.
I've created an SSIS package that calls the Access DLL and fires off Access 2003 reports, saves them as PDFs and emails them off.
Now this works fine when I run it manually, but when I schedule it and fire off a job, I get a very vague error: "exception has been thrown by the target of an invocation".
I have copied the Access DLL to the GAC and to the .NET Framework v2.0.50727 folder, but still no luck.
I'm using the Bullzip PDF printer and those DLLs are also in the GAC.
I am copying a simple table from a Sql Server 2005 database to an *.sdf mobile database.
I am brand new to SSIS and I am probably doing something wrong. After executing the SSIS package, all the rows and all the fields are NULL in the destination database. I put a grid data viewer between the OLE DB Source and the SQL Server Compact Edition destination, and I can see the real data, which is obviously not all NULL.
Does anyone have a clue as to why it would be doing this?
I have a simple query which creates tables by taking the database name as a parameter from a parameter table.
SP1 creates the databases and calls SP2, which creates the tables. I can run it fine via SSMS, but when I run it using SSIS it fails with the error below. The issue gets more interesting: it fails randomly on some database creations while others create just fine.
Note: I am not passing any database named '20'.
Exception handler error:
ERROR :: 615 :: Could not find database ID 20, name '20'. The database may be offline. Wait a few minutes and try again. ---------------------------------------------------------------------------------------------------- SPID: 111 Origin: SQL Stored Procedure (SP1) ---------------------------------------------------------------------------------------------------- Could not find database ID 20, name '20'. The database may be offline. Wait a few minutes and try again. ----------------------------------------------------------------------------------------------------
Error in SSIS
[Execute SQL Task] Error: Executing the query "EXEC SP1" failed with the following error: "Error severity levels greater than 18 can only be specified by members of the sysadmin role, using the WITH LOG option.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
I have sysadmin permission.
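The quoted message is literal: severities 19-25 can only be raised by a sysadmin and only WITH LOG, and the check applies to the login of the session actually running the statement, i.e. the one the SSIS connection manager uses, not the account used in SSMS. A hedged sketch of the two ways out, assuming the RAISERROR lives in SP1's error handler (@dbid and @dbname are illustrative variables):

-- Option 1: re-raise at severity 16, which needs neither sysadmin nor WITH LOG.
RAISERROR('Could not find database ID %d, name ''%s''.', 16, 1, @dbid, @dbname)

-- Option 2: keep the high severity but add WITH LOG, and make sure the
-- SSIS connection logs in as a sysadmin.
RAISERROR('Could not find database ID %d, name ''%s''.', 19, 1, @dbid, @dbname) WITH LOG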
I have an SSIS package that runs fine through the command prompt, although when I try to run it from a SQL Server Agent Job CmdExec step it bombs out. Please help; this has me stumped. The SSIS package uses an XML connection string so certain key settings such as connection strings and email info can be changed easily. Currently this is all on the same machine; I have not moved it beyond where I am developing.
On the command line the package runs successfully, with the following output...
Microsoft (R) SQL Server Execute Package Utility Version 9.00.1399.06 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 6:59:40 PM Progress: 2006-08-16 18:59:41.29 Source: Data Flow Task Validating: 0% complete End Progress Progress: 2006-08-16 18:59:41.29 Source: Data Flow Task Validating: 33% complete End Progress Progress: 2006-08-16 18:59:41.71 Source: Data Flow Task Validating: 66% complete End Progress Progress: 2006-08-16 18:59:41.73 Source: Data Flow Task Validating: 100% complete End Progress Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 0% complete End Progress Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 33% complete End Progress Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 66% complete End Progress Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 100% complete End Progress Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 0% complete End Progress Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 33% complete End Progress Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 66% complete End Progress Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 100% complete End Progress Progress: 2006-08-16 18:59:41.81 Source: Data Flow Task Pre-Execute: 0% complete End Progress Progress: 2006-08-16 18:59:41.84 Source: Data Flow Task Pre-Execute: 33% complete End Progress Progress: 2006-08-16 18:59:41.90 Source: Data Flow Task Pre-Execute: 66% complete End Progress Progress: 2006-08-16 18:59:41.90 Source: Data Flow Task Pre-Execute: 100% complete End Progress Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 0% complete End Progress Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 33% complete End Progress Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 66% complete End Progress Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 100% complete End Progress Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Cleanup: 0% complete End Progress Progress: 2006-08-16 18:59:41.93 Source: Data Flow Task Cleanup: 33% complete End Progress Progress: 2006-08-16 18:59:41.93 Source: Data Flow Task Cleanup: 66% complete End Progress Progress: 2006-08-16 18:59:41.93 Source: Data Flow Task Cleanup: 100% complete End Progress Progress: 2006-08-16 18:59:41.95 Source: Send Mail Task The SendMail task is initiated.: 0% complete End Progress Progress: 2006-08-16 18:59:42.09 Source: Send Mail Task The SendMail task is completed.: 100% complete End Progress DTExec: The package execution returned DTSER_SUCCESS (0). Started: 6:59:40 PM Finished: 6:59:42 PM Elapsed: 1.984 seconds
When I try to use the same command within SQL Server Agent Job using a CmdExec step I get the following error...
Description: The package is attempting to configure from the XML file "S:\connections\contacts.dtsConfig". End Info Warning: 2006-08-16 18:40:03.15 Code: 0x80012012 Source: contactsPackage Description: The configuration file name "S:\connections\contacts.dtsConfig" is not valid. Check the configuration file name. End Warning Warning: 2006-08-16 18:40:03.15 Code: 0x80012059 Source: contactsPackage Description: Failed to load at least one of the configuration entries for the package. Check configuration entries and previous warnings to see descriptions of which configuration failed. End Warning Info: 2006-08-16 18:40:03.20 ... Process Exit Code 1. The step failed.
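One detail worth checking: "S:" looks like a mapped drive, and drive mappings belong to an interactive logon session, so the Agent service typically cannot see them even when the identical command works in your own cmd window. A hedged fix is to reference everything by UNC path in the CmdExec step; the server, share and job names below are illustrative:

EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'RunContactsPackage',               -- illustrative job name
    @step_name = N'Run package with UNC config path',
    @subsystem = N'CmdExec',
    @command   = N'dtexec /FILE "\\fileserver\packages\contacts.dtsx" /CONFIGFILE "\\fileserver\connections\contacts.dtsConfig"';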
SQL Server 2012, VS 2010 SQL Server Data Tools. FTP Connection Manager - port 21, chunk size 1KB, passive mode = false, saved plain-text password. It downloads 1 CSV file to a local directory on the SQL Server box. This always works when run from the package, and always gives a timeout error when scheduled.
Message: Microsoft (R) SQL Server Execute Package Utility Version 11.0.2100.60 for 64-bit Started: 11:04:59 AM Error: 2014-03-27 11:05:31.12 Code: 0xC001602A ...
SQL Server Agent is set to log on as my domain account, and in the package history it says "logged on as" my account.
I have a stored procedure (which I will tag on at the end for those interested) which takes at least 15 minutes to run when executed, but completes in 1 minute when the T-SQL statement is run in Query Analyser. Why is this?
I suspect that it may be connected to table indexing, but why then is this bypassed when QA is used?
Any advice appreciated.
Derek
*************************************************************************
IF EXISTS (SELECT * FROM sysobjects
           WHERE id = object_id(N'dbo.sp_ValidateAIGL')
             AND OBJECTPROPERTY(id, N'IsProcedure') = 1)
    DROP PROCEDURE dbo.sp_ValidateAIGL
GO

-- *** Global Non Program Specific Data Errors ***
-- CHECK - that there are records in the DEB_IGL_PAYROLL_OUTPUT file... none and the routine failed...
IF NOT EXISTS (SELECT * FROM tbl_OUT_Payroll WHERE IGLProgramID = @IGLProgramID)
    INSERT INTO #IGLErrors
    SELECT NULL, 100, 'No records were processed by the IGL run!'

SELECT * FROM #IGLErrors

-- CHECK - search for any records where the employee's EXPENSE_CODE is NULL
INSERT INTO #IGLErrors
SELECT DISTINCT NULLIF(EmpNo, ''), 2,
       'Employee "' + COALESCE(NULLIF(RTRIM(EmpNo), ''), '<Missing Employee>')
       + '" (Organisation Unit - ' + COALESCE(RTRIM(OrgUnitCode), '<No Organisation Unit>')
       + ') does not have a EXPENSE_CODE Code.'
FROM tbl_OUT_Payroll
WHERE NULLIF(ExpenseCode, '') IS NULL
  AND IGLProgramID = @IGLProgramID

SELECT * FROM #IGLErrors

-- CHECK - check that the BALANCE of DEBITs matches the balance of CREDITs
IF (SELECT SUM(Cash) FROM tbl_OUT_Payroll WHERE IsCredit = 1 AND IGLProgramID = @IGLProgramID)
   <> (SELECT SUM(Cash) FROM tbl_OUT_Payroll WHERE IsCredit = 0 AND IGLProgramID = @IGLProgramID)
    INSERT INTO #IGLErrors
    SELECT NULL, 3, 'The total cash value for DEBIT elements does not match the total cash for CREDIT elements.'

SELECT * FROM #IGLErrors

-- *** Program 1 and 2 errors ***
IF @IGLProgramID IN (1, 2)
BEGIN
    -- CHECK - search for any records where the employee's COST_CENTRE is NULL
    INSERT INTO #IGLErrors
    SELECT DISTINCT NULLIF(EmpNo, ''), 1,
           'Employee "' + NULLIF(RTRIM(EmpNo), '')
           + '" (Organisation Unit = ' + RTRIM(OrgUnitCode)
           + ') does not have a COST_CENTRE Code.'
    FROM tbl_OUT_Payroll
    WHERE NULLIF(CostCenter, '') IS NULL
      AND IGLProgramID = @IGLProgramID

    SELECT * FROM #IGLErrors

    -- Check for EMPLOYEEs that were not transferred to the PAYROLL output (usually caused by
    -- missing ORG_UNITs or not picked up in the DEB_VIEW_APPOINTEE view...)
    INSERT INTO #IGLErrors
    SELECT DISTINCT EMP_NO, 11,
           'Employee "' + RTRIM(EMP_NO) + '" was excluded from the summary. Check their Organisation Unit codes!'
    FROM PSELive.dbo.COSTING_OUTPUT
    WHERE NOT EMP_NO IN (SELECT DISTINCT EmpNo FROM tbl_OUT_Payroll WHERE IGLProgramID = @IGLProgramID)
      AND PERIOD_NO = @PeriodNo
      AND TAX_YEAR = @TaxYear

    SELECT * FROM #IGLErrors

    -- Check that there are no ELEMENTS in the COSTING_OUTPUT table that don't exist in the tbl_IGLElements table
    INSERT INTO #IGLErrors
    SELECT DISTINCT ELEMENT, 12,
           'Element "' + RTRIM(ELEMENT) + '" does not exist in the IGL Interface Elements table!'
    FROM PSELive.dbo.COSTING_OUTPUT
    WHERE ELEMENT NOT IN (SELECT DISTINCT Element FROM tbl_IGLElements)
      AND PERIOD_NO = @PeriodNo

    SELECT * FROM #IGLErrors
END

-- *** Add an error to indicate the number of errors ***
IF EXISTS (SELECT * FROM #IGLErrors)
    INSERT INTO #IGLErrors
    SELECT 0, 0, 'Warning, there are ' + CAST(COUNT(*) AS varchar(5)) + ' recorded errors!'
    FROM #IGLErrors

-- Transfer the records to the ErrorsLog table ready for the user to view...
DELETE FROM tbl_SYSErrorsLog
INSERT INTO tbl_SYSErrorsLog (IGLProgramID, OutputLogID, KeyField, ErrorID, Description)
SELECT @ProgramLogID, @IGLPeriodID, KeyField, ErrorID, Description
FROM #IGLErrors
ORDER BY ErrorID

DROP TABLE #IGLErrors

SELECT * FROM tbl_SYSErrorsLog ORDER BY ErrorID

--SET NOCOUNT OFF
GO
GRANT EXECUTE ON dbo.sp_ValidateAIGL TO Public
GO
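For the symptom above (fast as an ad hoc batch, slow inside the procedure), parameter sniffing is the usual suspect alongside SET-option differences: the cached plan was compiled for whatever parameter values the first caller passed, which may suit a different data distribution. Two hedged workarounds, with illustrative procedure and parameter names:

-- Workaround 1: copy parameters into locals so the optimizer cannot sniff them.
CREATE PROCEDURE dbo.MyProc @IGLProgramID int
AS
DECLARE @prog int
SET @prog = @IGLProgramID
SELECT * FROM tbl_OUT_Payroll WHERE IGLProgramID = @prog
GO

-- Workaround 2: force a fresh plan on every call (supported in SQL 2000).
CREATE PROCEDURE dbo.MyProc2 @IGLProgramID int
WITH RECOMPILE
AS
SELECT * FROM tbl_OUT_Payroll WHERE IGLProgramID = @IGLProgramID
GO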
I am in the process of moving from a 32-bit SQL Server 2005 Enterprise (9.0.3054) to a 64-bit SQL Server 2005 Enterprise (9.0.3054 with 4 CPUs and 8GB of memory on Win 2003 SP2), and the process has been very frustrating to say the least. I am having a problem with packages that I created on my 64-bit SQL Server. I am importing a few tables from the 32-bit SQL Server into the 64-bit SQL Server using Task --> Import to create the package.
Sometimes when I am creating a package I get the following error in a message box:
SQL Server Import and Export Wizard
The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered. The wizard cannot continue and it will terminate.
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (System.Windows.Forms)
Other times when I run a package that has run successfully before I get the following error:
Faulting application dtexecui.exe, version 9.0.3042.0, stamp 45cd726d, faulting module unknown, version 0.0.0.0, stamp 00000000, debug? 0, fault address 0x025d23f0.
The package appears to hang when running. By this I mean that the Package Execution Progress shows progress up to a point then it just stops. (The package takes about 17 seconds to run normally) CPU usage is at 1% and the package cannot be stopped.
I have deleted and re-created the package several times and I have also re-installed the service pack on the SQL Server (9.0.3054) but that did not help.
I have a vbs script to try to prove that I can perform vbs scripting in either a job step or a DTS package. The script is:
dim rs, sql, adoconn, adocommand, dataconnstring

Const adOpenForwardOnly = 0   ' ADO named constants are not defined in a standalone .vbs file

set adoconn = CreateObject("ADODB.Connection")
set rs = CreateObject("ADODB.Recordset")
set adocommand = CreateObject("ADODB.Command")
adoconn.ConnectionString = "Provider=SQLOLEDB;Server=myserver;Database=pubs;Uid=myuser;Pwd=mypass;"

adoconn.Open
sql = "select * from import"
rs.Open sql, adoconn, adOpenForwardOnly

' insert one row into zz for every row found in import
while rs.EOF = false
    sql = "insert into zz default values"
    adocommand.ActiveConnection = adoconn
    adocommand.CommandText = sql
    adocommand.Execute
    rs.MoveNext
wend

rs.Close
adoconn.Close
set adoconn = nothing
However, when I run this from Windows Explorer it works fine, but when I try to run it as an ActiveX script I get the error "ActiveX scripting: Function not found".
As a CmdExec step in a job, I use the line c:\inetpub\wwwroot\vbs\vbstest1.vbs. It failed with the error: The process could not be created for step 1 of job 0x119BDBD264AD9B4597A9302786F0E250 (reason: %1 is not a valid Win32 application). The step failed.
What is wrong with the vbs script? Or do I need to invoke it a different way?
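Two things stand out, for what it's worth. CmdExec launches executables, and a .vbs file is not one, which is exactly what "%1 is not a valid Win32 application" complains about, so the step should invoke the script host rather than the script file itself. And a DTS ActiveX Script task looks for a named entry-point function (by default Function Main, returning DTSTaskExecResult_Success), which would explain "Function not found" when the script is just top-level statements. A hedged sketch of the CmdExec fix, with an illustrative job name:

EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'RunVbsTest',   -- illustrative job name
    @step_name = N'Run vbstest1.vbs via the script host',
    @subsystem = N'CmdExec',
    @command   = N'cscript.exe //nologo c:\inetpub\wwwroot\vbs\vbstest1.vbs';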
I have jobs (DTS packages) for several different tasks that I would like to run sequentially, rather than trying to estimate how long each will take and scheduling them individually.
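One hedged approach is to fold them into a single job whose steps run back-to-back: each step's success action is "go to the next step", so package B starts only when package A finishes. The job name and dtsrun arguments below are illustrative:

EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'NightlyLoads', @step_id = 1, @step_name = N'Package A',
    @subsystem = N'CmdExec',
    @command = N'dtsrun /S myserver /N PackageA /E',   -- /E = trusted connection
    @on_success_action = 3;                            -- 3 = go to the next step

EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'NightlyLoads', @step_id = 2, @step_name = N'Package B',
    @subsystem = N'CmdExec',
    @command = N'dtsrun /S myserver /N PackageB /E';
    -- the last step keeps the default "quit with success" action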
I'm trying to run a job which moves data from one machine to another. I can manually execute the package successfully, but the job always fails. I get the dreaded error message 18456, "login failed for user ''". I have double-checked the logins on both machines (the SQL Server Agent login AND the SQL Server authentication). Also, I can run a select statement (from server_1 to server_2) successfully from Query Analyzer, i.e.
Select * from <linked_server_name>.<database>.<database_owner>.<table_name>
However, if I run this query from server_2 to server_1, I get a failed login message similar to the one found in the job history. Since both servers accept NT logins, where is my problem? They are both set up as linked servers, and they both use the MSSQL OLE DB provider. Any suggestions will be appreciated! Thanks.
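When the job runs, the linked-server hop is made by the SQL Agent service account rather than your own NT login, and with self-mapping that often arrives at the remote server as an anonymous login, which matches the empty user name in the 18456 message. A hedged workaround is an explicit login mapping on the linked server; the remote SQL login and password here are illustrative:

EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'server_1',     -- the linked server definition that fails
    @useself     = 'false',
    @locallogin  = NULL,            -- NULL = applies to all local logins
    @rmtuser     = N'linked_login', -- illustrative SQL login on the remote server
    @rmtpassword = N'password'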
I have created a package that takes a Visual FoxPro .dbf and imports it into SQL 7. If I run the job, it works fine. If I schedule the job, it fails, stating that it can't find the .dbf - the same one that it just found when run manually. What gives?
My procedure compiles and runs. I'm running it as 'exec Statement'. It should prompt me for the start and end date, but it does not. Where have I gone wrong in my logic? I've declared all the variables needed. Please advise.
if exists (select * from dbo.sysobjects
           where id = object_id(N'[dbo].[Statement]')
             and OBJECTPROPERTY(id, N'IsProcedure') = 1)
    drop procedure [dbo].[Statement]
GO

SET QUOTED_IDENTIFIER ON
GO
SET ANSI_NULLS ON
GO

Create Procedure Statement
As
Begin

    declare @Startdate As datetime
    declare @Enddate As datetime

    declare @Customer_No As char(15)
    declare @loanno As char(15)

    declare @transaction_date As datetime
    declare @transaction_type As char(3)
    declare @reference As varchar(20)
    declare @notes As varchar(255)
    declare @transaction_amount As decimal(9,2)
    declare @transaction_description As varchar(50)

    declare @debit_amount As decimal(9,2)
    declare @credit_amount As decimal(9,2)
    declare @counter As int
    declare @balance As decimal(9,2)
    declare @user_changed As char(8)

    -- @Startdate, @Enddate and @Customer_No are declared but never assigned,
    -- so the cursors below filter on NULL; see the parameterised version
    -- sketched after this procedure.

    declare c2 CURSOR FOR
        SELECT loan_no
        FROM loan
        where customer_no = @Customer_No
        ORDER BY loan_no

    OPEN c2
    FETCH NEXT FROM c2 INTO @loanno

    -- Check @@FETCH_STATUS to see if there are any more rows to fetch.
    WHILE @@FETCH_STATUS = 0
    BEGIN

        declare c1 CURSOR FOR
            SELECT transaction_record.transaction_date,   -- loan_no dropped from the list:
                   transaction_record.transaction_type,   -- the FETCH below reads six columns
                   transaction_record.reference_no,
                   transaction_record.notes,
                   transaction_record.transaction_amount,
                   transaction_type.[description]
            FROM transaction_record
            inner join transaction_type
                on transaction_type.transaction_type = transaction_record.transaction_type
            where loan_no = @loanno
              and transaction_date between @Startdate and @Enddate
              and transaction_amount <> 0
            ORDER BY transaction_date

        set @counter = 0   -- reset the first-row flag for each loan

        OPEN c1
        FETCH NEXT FROM c1 INTO @transaction_date, @transaction_type, @reference,
                                @notes, @transaction_amount, @transaction_description

        -- Check @@FETCH_STATUS to see if there are any more rows to fetch.
        WHILE @@FETCH_STATUS = 0
        BEGIN
            If (@transaction_amount < 0)
            Begin
                set @credit_amount = @transaction_amount
                set @debit_amount = 0
            End
            Else
            Begin
                set @debit_amount = @transaction_amount
                set @credit_amount = 0
            End

            If (@counter = 0)
            Begin
                set @balance = @transaction_amount
            End
            Else
            Begin
                set @balance = @balance + @transaction_amount
            End

            set @counter = @counter + 1   -- advance the first-row flag

            FETCH NEXT FROM c1 INTO @transaction_date, @transaction_type, @reference,
                                    @notes, @transaction_amount, @transaction_description
        End
        CLOSE c1
        DEALLOCATE c1

        FETCH NEXT FROM c2 INTO @loanno
    End
    CLOSE c2
    DEALLOCATE c2
END;
Go
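A stored procedure never prompts for values; anything that should vary per run has to be declared as a parameter and supplied on the EXEC line (Query Analyzer only "prompts" when filling in templates, not when executing procedures). A hedged sketch of the change, keeping the body above and with illustrative call values:

Create Procedure Statement
    @Startdate   datetime,
    @Enddate     datetime,
    @Customer_No char(15)
As
Begin
    -- body as above, minus the declares for these three variables
End
Go

-- called like this:
exec Statement '2007-01-01', '2007-12-31', 'CUST0001'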
Good afternoon, I'm new to functions on the SQL server.
I'm trying to create a dynamic query that selects the column passed to the function from a certain table.
My table is called selected_Date, and it has StartDate and EndDate columns.
When the user selects, for example, "StartDate", I pass this as a variable to the function which runs the query, but I always get back the passed string as a result.
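That result is expected: a parameter is data, not an identifier, so selecting it just returns the literal string. Short of dynamic SQL (which is not allowed inside functions), a CASE expression that maps the name onto the real column does the job; the function name below is illustrative:

CREATE FUNCTION dbo.fn_SelectedDate (@ColumnName varchar(10))
RETURNS TABLE
AS
RETURN (
    SELECT CASE @ColumnName
               WHEN 'StartDate' THEN StartDate
               WHEN 'EndDate'   THEN EndDate
           END AS SelectedDate
    FROM dbo.selected_Date
)
GO

-- usage:
SELECT * FROM dbo.fn_SelectedDate('StartDate')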
I have SQL Server 2000, and our web application is in WebObjects.
I built a trigger on a table that indicates if certain fields in a record have been changed since the last time a report was run from the application.
This trigger runs fine through the Query Analyzer, and runs fine with direct input through Enterprise Manager. However, when the WebObjects application tries to update the table, an error is thrown.
Is anyone familiar with a reason why an application would throw an error on an update, when the DB tools do not? If we disable the trigger, the application has no problem updating the table.
Here is the relevant portion of the trigger:
create trigger t_press_run_change on dbo.press_run_line_item
for insert, update, delete
as
set nocount on   -- suppress the trigger's extra "rows affected" messages, which some client libraries treat as errors

update press_run_line_item
set is_changed = 1
from deleted d
join press_run_line_item p on p.press_run_line_item_id = d.press_run_line_item_id
where (p.is_changed = 0 or p.is_changed is null)
  and ((isNull(p.print_quantity, 0) <> isNull(d.print_quantity, 0))
    or (isNull(p.spoilage_pct, 0) <> isNull(d.spoilage_pct, 0))
    or (isNull(p.ad_dimension_id, 0) <> isNull(d.ad_dimension_id, 0))
    or (isNull(p.quarter_fold_id, 0) <> isNull(d.quarter_fold_id, 0))
    or (isNull(p.max_qty_per_shipment, '') <> isNull(d.max_qty_per_shipment, ''))
    or (isNull(p.packaging_max_height, '') <> isNull(d.packaging_max_height, ''))
    or (isNull(p.packaging_max_weight, '') <> isNull(d.packaging_max_weight, ''))
    or (isNull(p.packaging_skids, '') <> isNull(d.packaging_skids, ''))
    or (isNull(p.packaging_turns, '') <> isNull(d.packaging_turns, ''))
    or (isNull(p.packaging_cartons, '') <> isNull(d.packaging_cartons, ''))
    or (isNull(p.preprint_delivery_time, '') <> isNull(d.preprint_delivery_time, ''))
    or (isNull(p.contact_id, 0) <> isNull(d.contact_id, 0))
    or (isNull(p.address_company, '') <> isNull(d.address_company, ''))
    or (isNull(p.address1, '') <> isNull(d.address1, ''))
    or (isNull(p.address2, '') <> isNull(d.address2, ''))
    or (isNull(p.address_city, '') <> isNull(d.address_city, ''))
    or (isNull(p.address_state, '') <> isNull(d.address_state, ''))
    or (isNull(p.address_zip, '') <> isNull(d.address_zip, ''))
    or (isNull(p.address_Country_id, '') <> isNull(d.address_country_id, ''))
    or (isNull(p.client_printer_id, '') <> isNull(d.client_printer_id, '')))
Some users complain that their computers run slow around noon; later it is fine. Can anybody tell me what the problem is? We have antivirus software installed. I am also running database backups on the server. Many thanks.
Hello everyone. I am pretty new to SQL Server, but I have been reading a lot lately. One of the things that I have gone over extensively lately is backups (I didn't understand it til recently.)
I have a question: right now, I need to be able to backup my SQL database to a file server; both the DB and TLogs. I don't have room for them on the local drive right now, so this is my last option.
Here is the catch. The SQL server is not part of our domain. The file server box IS part of our domain.
In order to backup across the network, I need to change the account that SQL and the agent runs under.
Would this work:
Create a new account on the SQL box and make it part of the Administrators group. Make the SQL Server and SQL Agent services run under that account.
On the file server, create a local account. Make it the same (username and password) as the one created on the SQL box. This should allow me to back up my DB to the share.
Would that work?
Lastly, by changing the account SQL runs under, does that change anything in the way that SQL runs? Does it affect the way users authenticate to SQL? (Right now, they authenticate using SQL authentication.)
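For what it's worth, the mirrored-local-accounts trick does work for workgroup (non-domain) machines: Windows matches on username and password, so the service can authenticate to the share. And changing the service account does not change how users authenticate to SQL, since SQL logins live inside the server; it mainly changes which Windows resources the service itself can reach. Once the services are switched over, the backup itself is just the ordinary statement pointed at a UNC path; the database, server and share names below are illustrative:

-- Full backup plus log backup straight to the file server share.
BACKUP DATABASE MyDB
    TO DISK = N'\\fileserver\sqlbackup\MyDB.bak'

BACKUP LOG MyDB
    TO DISK = N'\\fileserver\sqlbackup\MyDB_log.trn'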