I'm trying to update a table which has a trigger on it. When I run the update, it fires the trigger, which saves in another table the values of the modified fields and the type of operation that the user performed.
This other table is for auditing the changes to the data.
When I try to run the update, SQL gives me this error:
Process 67 unlocking unowned resource: TAB: 10:334884510
I don't know what it means. Can anybody help me figure out how to correct this?
Hi there, one of my databases is getting the following error:
Error 1203 Caller of lock manager is incorrectly trying to unlock an unlocked object. spid=%d locktype=%d dbid=%d lockid=%Id.
I ran DBCC statements after I got the server up and running and they didn't come up with any errors. Everything ran OK until the following week, when the same problem happened again. Originally we had problems with this server and we disabled write-caching. Does anyone have any idea what could be causing the problem and how to fix it? Thanks in advance, Fin
I am running a script on a schedule and I get this error quite often:
Unspecified error occurred on SQL Server. Connection may have been terminated by the server. [SQLSTATE HY000] (Error 0) Process ID ## attempted to unlock a resource it does not own: OBJECT: ## . Retry the transaction, because this error may be caused by a timing condition. If the problem persists, contact the database administrator. [SQLSTATE HY000] (Error 1203)
If I keep rerunning the transaction, it will eventually succeed. I am running SQL Server 2005 with SP1 installed. Does anyone know what this means and what would cause it to fail sometimes and succeed other times?
Our SQL Server 2000 box is getting perflib errors when we get a decent amount of people using an application that I wrote: a web-based call queue system. To accomplish a queue-type system on a button push, I wrote a query like this...

BEGIN TRANSACTION;
SELECT TOP 1 ...fields here...
FROM table WITH (XLOCK, READPAST)
(2 joins)
WHERE numerous where clauses
ORDER BY 2 order bys

Now our SQL Server starts timing out, and then this shows up in the app log:

Error: 1203, Severity: 20, State: 1
Process ID 62 attempting to unlock unowned resource PAG: 6:1:126407.
For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

The reason I am doing XLOCK is to make a record not viewable to two people: if they both click the button on the web form that runs the above query within the same minute, they should get different records. So, to avoid this error, which I assume is due to my XLOCK, should I rethink my query?
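As an aside on the locking approach, here is a minimal sketch of the usual queue-claim pattern, assuming a hypothetical dbo.WorkQueue table: UPDLOCK + READPAST tends to be safer than XLOCK here, because UPDLOCK stops two sessions claiming the same row while READPAST lets the second session skip locked rows instead of blocking, and ROWLOCK discourages page-level locks like the PAG resource named in the error above.

DECLARE @id int;

BEGIN TRANSACTION;

SELECT TOP 1 @id = QueueId
FROM dbo.WorkQueue WITH (UPDLOCK, READPAST, ROWLOCK)
WHERE Status = N'pending'
ORDER BY QueuedAt;             -- oldest first

UPDATE dbo.WorkQueue
SET Status = N'claimed'        -- mark it so it stays invisible after commit
WHERE QueueId = @id;

COMMIT TRANSACTION;

Two sessions running this at once get two different rows (or the second gets nothing) rather than fighting over the same one.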
We have a customized package which uses two middle layers before it hits the database. Yesterday the SQL Server rebooted all of a sudden and I got this error:
Process ID 96 attempting to unlock unowned resource KEY: 7:167671645:4 (8d021bb2cdf2)..
Error: 1203, Severity: 20, State: 1
Error 1203, Severity Level 20
Message Text: Process ID %d attempting to unlock unowned resource %.*ls.
Explanation: This error occurs when Microsoft® SQL Server™ is engaged in some activity other than normal post-processing cleanup and it finds that a particular page it is attempting to unlock is already unlocked. The underlying cause for this error may be related to structural problems within the affected database. SQL Server manages the acquisition and release of pages to maintain concurrency control in the multi-user environment. This mechanism is maintained through the use of various internal lock structures that identify the page and the type of lock present. Locks are acquired for processing of affected pages and released when the processing is completed.
Action: Execute DBCC CHECKDB against the database to which the object belongs. If DBCC CHECKDB reports no errors, attempt to reestablish the connection and execute the command.
Important: If executing DBCC CHECKDB with one of the repair clauses does not correct the index problem, or if you are unsure what effect DBCC CHECKDB with a repair clause has on your data, contact your primary support provider.
Can anyone help with this? DBCC CHECKDB returned without errors.
On our data warehouse server we are regularly getting a 1203 error, causing the SQL Server to hang. We get this message in the error log: Failed Assertion = 'm_activeSdesList.Head () == NULL'. In the Knowledge Base I found a bug description that is very similar to our problem.
Article: Q240853 FIX: Lock Escalation With Parallel Query May Cause 1203 Error And Server Shutdown
*** part of the article ***
SYMPTOMS
If a lock escalation occurs while running a parallel query, it is possible to encounter error message 1203 as follows:
spid7 Process 7 unlocking unowned resource: KEY: 13:117575457:2 (35010560ebcd)
spid7 Process 7 unlocking unowned resource: KEY: 13:117575457:2 (35010560ebcd)
spid7 Error: 1203, Severity: 20, State: 1
spid7 Process ID 7 attempting to unlock unowned resource KEY: 13:117575457:2 (35010560ebcd).
spid7 Error: 1203, Severity: 20, State: 1
spid7 Process ID 7 attempting to unlock unowned resource KEY: 13:117575457:2 (35010560ebcd).
The error message included in the error log probably mentions the same lock resource in several of the error messages.
Once the error is printed, an assertion message similar to the following is also printed: 1999-08-09 13:15:26.79 kernel SQL Server Assertion: File: <proc.c>, line=1866 Failed Assertion = 'm_activeSdesList.Head () == NULL'. After a dump of the stack for all threads, the server initiates a shutdown of the SQL Server process. ... *** end ***
You can find the complete article on: http://support.microsoft.com/support/kb/articles/q240/8/53.asp?LN=EN-US&SD=gn&FR=0
We can't use the workaround, because that would shut out parallelism, which is necessary for the project.
There is a fix, but in the article Microsoft says: "A supported fix that corrects this problem is now available from Microsoft, but it has not been fully regression tested and should be applied only to systems experiencing this specific problem." You understand, this is not one of my favorite types of fixes...
Has anyone already installed the fix mentioned? Did you have any problems with it, or did it cause some other trouble?
Thx, Kurt De Cauwsemaecker Database Administrator Telepolis Antwerpen
Hi everyone.... I'm trying to execute this update statement... It takes an eternity... any ideas on how to rewrite or speed it up?
It's a several-step process... below is everything that I run, one step at a time. The final update statement is what takes so long. It should only affect about 2,600 rows out of a potential 9,000. That's why I'm confused about the response time.
select d.olddevicename, de.device, d.newdevicename
into #temp9
from dns d, devices de
where de.device = d.olddevicename

update #temp9
set device = newdevicename
where olddevicename = device

update devices
set device = #temp9.device
from #temp9, devices
where #temp9.device in (select #temp9.device
                        from #temp9, devices
                        where #temp9.olddevicename = devices.device)
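For what it's worth, the three steps look collapsible into one set-based update; a hedged sketch, assuming the intent is that every devices.device matching a dns.olddevicename becomes the corresponding dns.newdevicename:

UPDATE de
SET de.device = d.newdevicename
FROM devices AS de
JOIN dns AS d
    ON d.olddevicename = de.device;

If dns can carry several rows per olddevicename, deduplicate first; the temp-table version silently picks an arbitrary match in that case. Indexes on devices.device and dns.olddevicename are what will actually fix the response time.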
Is it possible for me to run an update statement at a certain time, say midnight for example?
I'm trying to update a bit field in my table (which acts as a checkbox in my Access front end), but only if three date fields are before today's date. The dates in question are in two other tables.
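SQL Server Agent can do this. A sketch follows, where the job name, database, tables, and the WHERE condition are all placeholder assumptions; the schedule runs the step daily at midnight:

USE msdb;

EXEC dbo.sp_add_job @job_name = N'NightlyFlagUpdate';

EXEC dbo.sp_add_jobstep
    @job_name      = N'NightlyFlagUpdate',
    @step_name     = N'Run the update',
    @subsystem     = N'TSQL',
    @database_name = N'MyDatabase',
    @command       = N'UPDATE t SET t.MyFlag = 1
                       FROM dbo.MyTable AS t
                       JOIN dbo.Dates1 AS d1 ON d1.KeyId = t.KeyId
                       JOIN dbo.Dates2 AS d2 ON d2.KeyId = t.KeyId
                       WHERE d1.Date1 < GETDATE()
                         AND d1.Date2 < GETDATE()
                         AND d2.Date3 < GETDATE();';

EXEC dbo.sp_add_jobschedule
    @job_name          = N'NightlyFlagUpdate',
    @name              = N'Midnight',
    @freq_type         = 4,    -- daily
    @freq_interval     = 1,    -- every day
    @active_start_time = 0;    -- 00:00:00 (HHMMSS)

EXEC dbo.sp_add_jobserver @job_name = N'NightlyFlagUpdate';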
I'm having what you might call an optimisation issue - but I'm also not sure if my approach to this problem is right. I've spent the whole day trying various methods but none seem to be performing as admirably as I'd hoped.
Basically, here's the scenario:
1. Log files are being inserted into a SQL table via Log Parser 2.2.
2. I have a lookup table of IP address ranges to countries and other geographic data.
3. Once the log row has been inserted from Log Parser, I want to update the same log table with data from the lookup table.
The main problem seems to be the updating (the initial insert from Log Parser is lightning quick).
I initially wrote a trigger on AFTER INSERT on the log table that converted the actual user's IP address into IPNumber format using a simple algorithm, then pumped the IPNumber into a quick select query which retrieved the geolocation data. Then, in the same trigger, there was an update statement which basically said:
update dbo.Logs set [Country] = @Country where [IpNumber] = @IpNumber and [Country] is null -- note: the original "= Null" comparison never matches with ANSI_NULLS on; testing for NULL needs IS NULL
Therein lies the problem. The update statement slows everything down to almost a standstill and also seems to obtain a lock on the table.
Critical factors:
1. The log table must remain readable.
2. The query must execute in seconds -- not over half an hour :)
I've experimented with various combinations of indexing, so far to no avail.
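If it were me, I'd consider decoupling the lookup from the insert entirely, as a hedged sketch under assumed names (dbo.GeoLookup with StartIpNumber/EndIpNumber columns): drop the trigger, let Log Parser insert rows untouched, and run a small set-based update on a short schedule that stamps whatever still has no country. Small batches keep the locks short-lived, so the log table stays readable:

UPDATE TOP (5000) L
SET Country = G.Country
FROM dbo.Logs AS L
JOIN dbo.GeoLookup AS G
    ON L.IpNumber BETWEEN G.StartIpNumber AND G.EndIpNumber
WHERE L.Country IS NULL;

Repeat until it touches zero rows. For speed, the lookup table wants an index leading on StartIpNumber, and dbo.Logs wants one on (Country, IpNumber) or at least on IpNumber.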
The following basic UPDATE SQL statement has been running for 16 hours and counting. I need to get this done ASAP.
UPDATE Recipients
SET UndeliverableTime = getdate()
FROM Recipients
INNER JOIN Domains ON (Recipients.DomainID = Domains.ID)
INNER JOIN Undeliverables ON (
    Recipients.UserName + '@' + Domains.Domain = Undeliverables.EmailAddress)
Is there any way I can see how far this has gone and how long it will take to finish? Will this take another hour to finish or another week?
Both tables (Recipients and Undeliverables) have approximately 80 million records
I did a nearly identical operation with another table that had only 7 million records and it took 10.5 hours. I hope this doesn't scale linearly to 115 hours.
I am tempted to cancel, retune, and rerun, but that may trigger a really expensive rollback operation that could take days. Any ideas?
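SQL Server doesn't expose a percent-complete figure for an UPDATE, but on 2005 you can watch, from a second connection, how much transaction log the statement has generated; sampling it a few minutes apart at least tells you whether it is still moving and roughly how fast. The session_id below is a placeholder for the spid running the UPDATE:

SELECT s.session_id,
       t.database_transaction_log_record_count AS log_records,
       t.database_transaction_log_bytes_used   AS log_bytes
FROM sys.dm_tran_session_transactions AS s
JOIN sys.dm_tran_database_transactions AS t
    ON t.transaction_id = s.transaction_id
WHERE s.session_id = 52;   -- hypothetical spid of the long-running UPDATE

Rollback, for what it's worth, generally has to undo every log record written so far, which is why cancelling late can cost nearly as much as finishing.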
Does an UPDATE statement lock the entire table or just the rows that will be affected by the UPDATE?
I ask because -
Can I run UPDATE statements in parallel on the same table, on the same column? The need for doing this is that the table is a large fact table. I plan to execute the same UPDATE statements on different time sections of the data to expedite the processing.
If the UPDATE statement locks the entire table, then I cannot run the UPDATEs in parallel. If the UPDATE statement just locks the rows that will be affected, then maybe I can, because the rows affected will be different for each UPDATE.
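For what it's worth, a sketch of the time-sliced approach with hypothetical names; whether the two sessions truly run in parallel mostly comes down to lock escalation, since once a single statement holds roughly 5,000 row or page locks SQL Server tries to escalate to a table lock, at which point the other session blocks:

-- session 1:
UPDATE dbo.FactTable
SET Measure = Measure * 1.1
WHERE PeriodDate >= '20070101' AND PeriodDate < '20070401';

-- session 2, at the same time:
UPDATE dbo.FactTable
SET Measure = Measure * 1.1
WHERE PeriodDate >= '20070401' AND PeriodDate < '20070701';

Keeping each slice small (batching inside a loop) makes escalation less likely; distinct rows alone are not a guarantee, since escalation is per statement, not per row set.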
I am using FOR UPDATE triggers to audit a table that has 67 fields. My problem is that this slows down the system significantly. I have narrowed the problem down to the amount of code (lines of code) that needs to be compiled after the trigger has been fired. There are about 67 IF UPDATE(fieldName) checks inside the trigger, each containing a not very complex SELECT statement followed by an insert into the audit table. When I leave only a few IFs in the trigger and comment out the rest of the code, performance increases dramatically. It seems like it is checking every single UPDATE() statement. Assuming that the slowdown was due to doing a select for every update, I tried doing two separate selects at the beginning, from Deleted and Inserted, assigning the column values to specific variables, and instead of doing if Update(fieldName) I did:

if @DelFieldName <> @InsFieldName
begin
    INSERT INTO Audit
    SELECT what I need
end

This did not improve performance. If you have any ideas on how to get around this issue, please let me know. Below is an example of what my triggers look like.

------------------------------------
-- Trigger 1 -- this was my original design
CREATE TRIGGER trigger1 ON TableName
FOR UPDATE
AS
if update(field1)
begin
    insert into Audit
    SELECT what I need
end
if update(field2)
begin
    insert into Audit
    SELECT what I need
end
-- ...... repeated about 65 more times
if update(field67)
begin
    insert into Audit
    SELECT what I need
end
---------------------------------------------------------------------------
-- Trigger 2 -- this is what I tried, but it did not improve performance
CREATE TRIGGER trigger2 ON TableName
FOR UPDATE
AS
declare @DelField1 varchar(255)   -- the original declared bare varchar, which defaults to one character
declare @DelField2 varchar(255)
-- ....
declare @DelField67 varchar(255)

select
    @DelField1 = Field1,
    @DelField2 = Field2,
    -- ....
    @DelField67 = Field67
from Deleted

declare @InsField1 varchar(255)
declare @InsField2 varchar(255)
-- ....
declare @InsField67 varchar(255)

select
    @InsField1 = Field1,
    @InsField2 = Field2,
    -- ....
    @InsField67 = Field67
from Inserted

-- I do not do if Update() but instead compare variables
if @DelField1 <> @InsField1
begin
    insert into Audit
    SELECT what I need
end
if @DelField2 <> @InsField2
begin
    insert into Audit
    SELECT what I need
end
-- ............
if @DelField67 <> @InsField67
begin
    insert into Audit
    SELECT what I need
end
----------------------------------------------

If you have any idea how to optimize this, please let me know. Any input is greatly appreciated. I do not have a problem with the triggers doing what they are supposed to; my concern is that they are very slow. The reason I gave you two examples is that I suspect it has something to do with the enormous amount of code inside the trigger; both examples perform about the same whether I use the two huge selects from Inserted and Deleted or not.

Thanks,
Gent
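On SQL Server 2005 or later, a hedged set-based alternative (Audit's layout, the KeyId column, and the varchar(255) casts are my assumptions): diff Inserted against Deleted once, in a single statement, writing one audit row per changed column. This removes the 67 separately compiled and executed blocks:

CREATE TRIGGER trigger3 ON TableName
FOR UPDATE
AS
SET NOCOUNT ON;

INSERT INTO Audit (KeyId, FieldName, OldValue, NewValue)
SELECT i.KeyId, v.FieldName, v.OldValue, v.NewValue
FROM Inserted AS i
JOIN Deleted  AS d ON d.KeyId = i.KeyId
CROSS APPLY (
    SELECT 'Field1', CAST(d.Field1 AS varchar(255)), CAST(i.Field1 AS varchar(255))
    UNION ALL
    SELECT 'Field2', CAST(d.Field2 AS varchar(255)), CAST(i.Field2 AS varchar(255))
    -- ... one UNION ALL branch per audited column ...
    UNION ALL
    SELECT 'Field67', CAST(d.Field67 AS varchar(255)), CAST(i.Field67 AS varchar(255))
) AS v (FieldName, OldValue, NewValue)
WHERE ISNULL(v.OldValue, '') <> ISNULL(v.NewValue, '');

The ISNULL trick treats NULL and the empty string alike, so adjust the comparison if that distinction matters to the audit.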
I'm trying to write a script that would only update a column if it exists.
This is what I tried first:
IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS
           WHERE TABLE_NAME = 'Enrollment' AND COLUMN_NAME = 'nosuchfield')
BEGIN
    UPDATE dbo.Enrollment SET nosuchfield = '666'
END
And got the following error:
Server: Msg 207, Level 16, State 1, Line 1 Invalid column name 'nosuchfield'.
I'm curious why MS-SQL would do syntax checking in this case. I've used this type of check with ALTER TABLE ADD COLUMN commands before and it worked perfectly fine.
The only way I can think of to get around this is with:
IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS
           WHERE TABLE_NAME = 'Enrollment' AND COLUMN_NAME = 'nosuchfield')
BEGIN
    DECLARE @sql nvarchar(100)
    SET @sql = N'UPDATE dbo.Enrollment SET nosuchfield = ''666'''
    EXECUTE sp_executesql @sql
END
which looks a bit awkward. Is there a better way to accomplish this?
My DataAdapter.Update() method is running this query against my database, and all the parameters and the formatting look correct to me. I was wondering if anyone could identify obvious errors.... Thanks!
I am running a parent SSIS package (running sp2, 9.0.3042) that calls several child packages.
On our development server, we now cannot run this because we get 1 or more of these errors:
"Error 0x80004003 while preparing to load the package. Invalid pointer . " "Error 0xC0011008 while preparing to load the package. Error loading from XML. No further detailed error information can be specified for this problem because no Events object was passed where detailed error information can be stored. . "
It is not occurring on the same packages; it varies every time it is run.
I can run every one of the child packages individually, using the same login ID that the parent is executed under.
The parent package works fine on my local machine and other servers running the same version of SSIS. Just not on this server.
I have come up against an issue where I want to update data in a table using a bulk/SET update to get the result shown in the code below, with output in the column titled "Arrear Amt".
Please use this test data.
CREATE TABLE ##vOD_Calc
(
    Seq_No INT,
    Contract_id INT,
    Rental_id INT,
    Actual_OD INT,
    Logic_OD INT,
    Due_dte DATETIME,
[Code] .....
The logic required is that once the sum of column [ArrearAmt] of the current row and all previous rows becomes greater than $100, column [ChArrrearAmt] should show that summed-up value; otherwise column [ChArrrearAmt] should show the same value as column [ArrearAmt].
Once column [ChArrrearAmt] reaches the threshold of $100, the same cycle should start again. I.e., in the above example rental #1 had $37.17 < $100, then rental #1 + rental #2 was also < $100, and at rental #3 the sum of rentals #1, #2 and #3 becomes $111.51, which is greater than $100, so it is updated in column [ChArrrearAmt]. The same cycle starts over from rental #4 onwards; the summation of [ArrearAmt] now begins from rental #4 onwards, not from the start.
Below is the loop-based SQL script which handles the above situation; however, in bulk it is a total deterioration of performance if thousands of rows are to be processed, i.e. with a contract having multiple rentals.
The issue here is that I have to use the previously updated value of column [ChArrrearAmt] to take the decision for the next row; with a bulk update, since the row is not yet updated with the latest amount, the decision on the next row also gives a wrong result.
This is the code with which I have managed to update the column [ChArrrearAmt]; however, it is a loop-based solution and a performance killer.
INSERT INTO ##vOD_Calc_loop
(
    Rows_count,
    contract_id
)
SELECT COUNT(*), T.Contract_id
FROM ##vOD_Calc T
GROUP BY T.Contract_id
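For what it's worth, here is a loop-free sketch that computes the resetting running total with a recursive CTE (works on SQL 2008). [ArrearAmt], [ChArrrearAmt] and the $100 threshold come from the description above; the assumption that Rental_id is unique within a contract, and the money casts, are mine:

WITH Ordered AS (
    SELECT Contract_id, Rental_id, ArrearAmt,
           ROW_NUMBER() OVER (PARTITION BY Contract_id ORDER BY Seq_No) AS rn
    FROM ##vOD_Calc
),
Running AS (
    -- anchor: the first rental of each contract starts its own running sum
    SELECT Contract_id, Rental_id, ArrearAmt, rn,
           CAST(ArrearAmt AS money) AS RunSum
    FROM Ordered
    WHERE rn = 1

    UNION ALL

    -- recursive step: keep accumulating, but reset once the previous
    -- sum crossed the $100 threshold
    SELECT o.Contract_id, o.Rental_id, o.ArrearAmt, o.rn,
           CAST(CASE WHEN r.RunSum > 100 THEN 0 ELSE r.RunSum END
                + o.ArrearAmt AS money)
    FROM Ordered AS o
    JOIN Running AS r
        ON r.Contract_id = o.Contract_id
       AND o.rn = r.rn + 1
)
UPDATE t
SET ChArrrearAmt = CASE WHEN r.RunSum > 100 THEN r.RunSum ELSE r.ArrearAmt END
FROM ##vOD_Calc AS t
JOIN Running AS r
    ON r.Contract_id = t.Contract_id
   AND r.Rental_id = t.Rental_id
OPTION (MAXRECURSION 0);   -- allow contracts with more than 100 rentals

Against the example above this yields 37.17, then the intermediate amounts, then 111.51 at rental #3, and restarts the accumulation at rental #4, without any per-row looping.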
SQL Ver: 2008 (not R2)

Problem: The following code returns correct results when moving the variable declarations and update statement outside a stored procedure, but fails to return a value other than zero for the "COMPANY TOTAL" records. The "DEPT TOTAL" result works fine both in and outside the SP.

This may have to do with handling NULL values, since I was getting a warning message earlier about a value being eliminated by an aggregate function involving a NULL. I only got this message when running inside the SP, not when running standalone. I wrapped the values inside the SUM functions with an ISNULL, and now return a zero rather than NULL for the "COMPANY TOTAL" records when running inside the SP. All variable values are correct when running.
SQL CODE:

DECLARE @WIPMonthCurrent date = (SELECT TOP 1 WIPMonth FROM budxcWIPMonths WHERE ActiveWIPPeriod = 'Y')

select @WIPMonthCurrent as WIPMonthCurrent

[code]....
I am syncing for the first time (Merge Replication using SQL 2005 and Mobile) and this error keeps popping up on the emulator. I am trying out the tutorial. :) Boss is on my butt.
Does anyone have any ideas or a solution to this...your help is WELL APPRECIATED ...
I need to search my database for stored procedures in which an UPDATE statement's WHERE clause uses a column other than the table's primary key.
For example, if the employee table has empId as its primary key and an UPDATE query uses empName in the WHERE clause to update an employee record, then that SP should be listed. There could be hundreds of tables, each with its own primary key, and thousands of SPs in the database. How can I find the SPs where the WHERE clause uses some column other than the primary key?
Any other hint or query to identify such table-locking queries would help too; so far I have only found a few queries that are not using the primary key in the WHERE clause.
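There is no exact way to get this from metadata alone, since SQL Server does not record which WHERE columns an UPDATE uses, but as a rough starting point here is a sketch assuming SQL 2005+, with 'employee' and 'empId' standing in for each table and its key: text-search the procedure definitions for procs that update the table but never mention its key.

SELECT OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
       OBJECT_NAME(m.object_id)        AS proc_name
FROM sys.sql_modules AS m
JOIN sys.objects     AS o ON o.object_id = m.object_id
WHERE o.type = 'P'                              -- stored procedures only
  AND m.definition LIKE '%UPDATE%employee%'     -- touches the table
  AND m.definition NOT LIKE '%empId%';          -- never mentions its key

This only flags candidates for manual review; a real answer needs each UPDATE's WHERE clause parsed, and you would loop this over the primary-key list from INFORMATION_SCHEMA.KEY_COLUMN_USAGE.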
I have a small problem with the ADO 2.6 - 2.8 Command object.
I created a stored procedure performing some tasks within a transaction.
Within the transaction I'm collecting the @@ERROR values, and at the end (after the transaction closes) generating the correct error string, which will be sent back to the caller.
In the VB6 app, I'm using a Command object with the next line:
Set MyRecordset = MyCommand.Execute
In all cases and options, when I run the SP the command performs the tasks and returns a closed recordset (with the correct answer) that I can't read.
The only solution that has worked until now: I need to call MyRecordset.Open, which will requery the SP again and generate an error (user-defined error).
It happens only when I run Command.Execute with an SP that performs multiple tasks and returns a result recordset.
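One common cause, offered as an assumption since the procedure body isn't shown: each DML statement inside the proc returns an "N rows affected" count, and ADO surfaces every count as a closed, empty recordset in front of the real one. The two usual fixes are suppressing the counts in the proc, or walking the results on the client with Set MyRecordset = MyRecordset.NextRecordset. On the T-SQL side (names hypothetical):

CREATE PROCEDURE dbo.MyProc
AS
SET NOCOUNT ON;   -- keep INSERT/UPDATE row counts from becoming empty recordsets

BEGIN TRANSACTION;
    UPDATE dbo.SomeTable SET SomeCol = SomeCol;   -- placeholder for the real work
COMMIT TRANSACTION;

SELECT 'ok' AS Result;   -- the single recordset the caller should see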
I'm getting an error when I try to import a flat file package from SQL; when I created it in Visual Studio it ran fine. I'm getting this error on SQL when I try to run the job:

Executed as user: MAINOFFICE\ASPNET. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 11:30:01 PM
Error: 2007-09-11 23:30:06.56 Code: 0xC001401E Source: Import_GasBoyData Connection manager "RawTrans" Description: The file name "\\Mt111gasboyPC\TopKATFiles\rawtrans.dat" specified in the connection was not valid. End Error
Error: 2007-09-11 23:30:06.56 Code: 0xC001401D Source: Import_GasBoyData Description: Connection "RawTrans" failed validation. End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 11:30:01 PM Finished: 11:30:07 PM Elapsed: 6.063 seconds.
The package execution failed. The step failed.

Thanks, Erick
Hi, I am trying to run a DTS package from an SP. The SQL code that I am using is as follows:
exec master..xp_cmdshell 'dtsrun /S 142.102.27.207 /U sa /P sa /N DTS_TEST1'
But running this I am getting the following error:

DTSRun: Loading...
DTSRun: Executing...
DTSRun OnStart: DTSStep_DTSDataPumpTask_1
DTSRun OnError: DTSStep_DTSDataPumpTask_1, Error = -2147467259 (80004005)
Error string: Error opening datafile: The system cannot find the path specified.
Error source: Microsoft Data Transformation Services Flat File Rowset Provider
Help file: DTSFFile.hlp
Help context: 0
Error Detail Records:
Error: 3 (3); Provider Error: 3 (3)
Error string: Error opening datafile: The system cannot find the path specified.
Error source: Microsoft Data Transformation Services Flat File Rowset Provider
I have this SSIS package that works when I run it in development in Visual Studio. But when I imported it and scheduled the job, it gives me the following error:

Executed as user: MAINOFFICE\ASPNET. ... 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 2:38:39 PM
Error: 2007-11-05 14:38:41.02 Code: 0xC0202009 Source: Export_WHSCANP Connection manager "10.5.1.10.DRDB400.aspnet1" Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "IBMDA400 Session" Hresult: 0x80004005 Description: "CWBSY0002 - Password for user ASPNET on server 10.5.1.10 is not correct". End Error
Error: 2007-11-05 14:38:41.04 Code: 0xC020801C Source: Data Flow Task OLE DB Destination [801] Description: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "10.5.1.10.DRDB400.aspnet1" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnec... The package execution fa... The step failed.
This is trying to get a table from the SQL server and copy it to an IBM AS400 server. Any ideas? Thanks, Erick
I downloaded the code example "Running Stored Procedures with ASP.NET" (http://www.dotnetjunkies.com/Tutorial/9AE62C44-3841-4687-B906-2F6D4A5A4622.dcik) and I'm having trouble filtering the sp's that are populating the drop down box. Here's my code: (all I did was add the one parameter)
Dim ds As New DataSet
Dim conn As New SqlConnection( _
    "Data source=" & DatabaseServer.Text & _
    ";User id=" & Userid.Text & _
    ";Password=" & Password.Text & _
    ";Initial catalog=" & Database.Text)
Dim cmd As New SqlCommand("sp_stored_procedures", conn)
Dim adpt As New SqlDataAdapter(cmd)
Try
    cmd.Parameters.Add("@sp_name", SqlDbType.NVarChar, 390).Value = CType("my%", String)
So if I comment out the cmd.Parameters.Add... line it all works fine and I get every sp, but if I uncomment it I get "Incorrect syntax near 'sp_stored_procedures'" every time, no matter what I put in the string.
I have a feeling I'm doing something stupid :) Anyone have any ideas? The query works fine in query analyzer with the same arguments that I'm trying in the code.
I am getting an error while running XP_CMDSHELL 'NET USER /DOMAIN'. But when I run the same command as xp_cmdshell 'net user' it works fine, giving me the local users that are available on my local PC. But now I want the domain users on my network, which is not working with xp_cmdshell.
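One thing worth checking, as a guess rather than a confirmed diagnosis: xp_cmdshell runs its command under the SQL Server service account (or the configured proxy account for non-sysadmin callers), not under your own login, and a local-only service account has no domain identity for NET USER /DOMAIN to use. You can see the effective account like this:

EXEC master..xp_cmdshell 'echo %USERDOMAIN%\%USERNAME%';

If that prints the machine name rather than your domain, running the SQL Server service (or the proxy account) as a domain account is the usual fix.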
I have a DTS package that runs and executes with no problem when you start it manually. But when I schedule the DTS package, the Dynamic Properties task fails with the following message:

Executed as user: CSEDBSTST001\sqljobmgr.
DTSRun: Loading...
DTSRun: Executing...
DTSRun OnStart: DTSStep_DTSActiveScriptTask_1
DTSRun OnFinish: DTSStep_DTSActiveScriptTask_1
DTSRun OnStart: DTSStep_DTSDynamicPropertiesTask_1
DTSRun OnError: DTSStep_DTSDynamicPropertiesTask_1, Error = -2147220492 (800403F4)
Error string: Could not open file for reading.
Error source: Dynamic Properties Task
Help file:
Help context: 0
Error Detail Records:
Error: 0 (0); Provider Error: 0 (0)
Error string: Could not open file for reading.
Error source: Dynamic Properties Task
Help file:
Help context: 0
DTSRun OnFinish: DTSStep_DTSDynamicPropertiesTask_1
DTSRun: Package execution complete. Process Exit Code 1. The step failed.

The task reads an .ini file to set up many properties throughout the DTS package.
I'm getting the following error while running the bcp utility:
Cannot insert null into column_name in table
something like this.
The value for that column is null in the input .csv file.
Instead of throwing an error like this, I need the bad records to go into a .BAD file and the remaining records to get uploaded into the database.
In my case the entire data upload fails if even one bad record is in the .csv file.
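bcp has documented switches for exactly this: -m raises how many errors are tolerated before the load aborts (the default is 10), and -e writes the rows it could not load to an error file. A sketch, with the server, database, table, and file paths as placeholder assumptions:

bcp MyDb.dbo.MyTable in C:\data\input.csv -c -t, -S MYSERVER -T -m 1000 -e C:\data\input.BAD

Rows with conversion or format problems land in the .BAD file while the rest load; it is worth testing whether your specific NULL-into-NOT-NULL rejection is routed there too, since server-side errors are not all handled identically.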
When I run sp_helpdb against the master (or any other DB for that matter) I get the following error:
Server: Msg 515, Level 16, State 2, Procedure sp_helpdb, Line 53
Cannot insert the value NULL into column 'owner', table 'tempdb.dbo.#spdbdesc_ 0010001A6EF'; column does not allow nulls. INSERT fails.
The statement has been terminated.
I have about 20 DBs on a SQL Server 2000 Standard Edition Instance with sp3a .
There have been no changes made to the server or the SQL Server instance in quite some time. Can anyone tell me how I should proceed?
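A frequent cause of this particular Msg 515 from sp_helpdb, offered as an assumption to verify: one database's owner is NULL (for example, the login that owned it was dropped), and sp_helpdb then tries to insert that NULL into the 'owner' column of its temp table. Something like this finds and fixes the culprit:

-- find databases whose owner no longer resolves to a login
SELECT name, SUSER_SNAME(sid) AS owner
FROM master.dbo.sysdatabases
WHERE SUSER_SNAME(sid) IS NULL;

-- then, inside each affected database, assign a valid owner
EXEC sp_changedbowner 'sa';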
SQL Server 7. When I ran a query in Query Analyzer I got the error below. Once I closed and reopened Query Analyzer and ran the same query, it worked fine.
Please see the error I got the first time:
[Microsoft][ODBC SQL Server Driver]Unknown token received from SQL Server
[Microsoft][ODBC SQL Server Driver]Protocol error in TDS stream
[Microsoft][ODBC SQL Server Driver]Protocol error in TDS stream
[Microsoft][ODBC SQL Server Driver]Protocol error in TDS stream
[Microsoft][ODBC SQL Server Driver]Protocol error in TDS stream
[Microsoft][ODBC SQL Server Driver]Protocol error in TDS stream
[Microsoft][ODBC SQL Server Driver]Protocol error in TDS stream
[Microsoft][ODBC SQL Server Driver]Protocol error in TDS stream
[Microsoft][ODBC SQL Server Driver]TDS buffer length too large
[Microsoft][ODBC SQL Server Driver]TDS buffer length too large
[Microsoft][ODBC SQL Server Driver]TDS buffer length too large
[Microsoft][ODBC SQL Server Driver]TDS buffer length too large
[Microsoft][ODBC SQL Server Driver]Unknown token received from SQL Server
[Microsoft][ODBC SQL Server Driver]Unknown token received from SQL Server
[Microsoft][ODBC SQL Server Driver]Unknown token received from SQL Server
[Microsoft][ODBC SQL Server Driver]Unknown token received from SQL Server
[Microsoft][ODBC SQL Server Driver]Unknown token received from SQL Server
[Microsoft][ODBC SQL Server Driver]TDS buffer length too large
[Microsoft][ODBC SQL Server Driver]TDS buffer length too large
[Microsoft][ODBC SQL Server Driver]TDS buffer length too large
I am getting the following error when running a command in Query Analyzer:

Msg 50001, Level 1, State 50001
xpsql.cpp: Error 5 from CreateProcess on line 675

Here is the command that I am running that generates this error message:

xp_cmdshell "@ECHO test message > c:\temp\newtemp.txt"

The command inside the xp_cmdshell command runs successfully from a Windows command prompt. Simpler commands also fail with the same error message. For example:

xp_cmdshell 'dir'

Can anyone suggest a solution to this problem? I assume this problem is due to a permissions/security issue. I have given Everyone Full Control over c:\temp.

Windows 2000
SQL Server 2000

Thanks for any help.
I have a DTS package which deletes some rows in a table and then inserts new ones from a Dbase file.
When I execute the package it works perfectly, but when I schedule it as a job it gives this error message:
Executed as user: LOMMELDOM\Administrator.
...DTSStep_DTSExecuteSQLTask_2
DTSRun OnFinish: DTSStep_DTSExecuteSQLTask_2
DTSRun OnStart: Copy Data from GRES to [ZNA].[dbo].[GRES] Step
DTSRun OnError: Copy Data from GRES to [ZNA].[dbo].[GRES] Step, Error = -2147221164 (80040154)
Error string: Class not registered
Error source: Microsoft Data Transformation Services (DTS) Package
Help file: sqldts80.hlp
Help context: 700
Error Detail Records:
Error: -2147221164 (80040154); Provider Error: 0 (0)
Error string: Class not registered
Error source: Microsoft Data Transformation Services (DTS) Package
Help file: sqldts80.hlp
Help context: 700
Error: -2147221164 (80040154); Provider Error: 0 (0)
Error string: Class not registered
Error source: Microsoft Data Transformation Services (DTS) Package
Help file: sqldts80.hlp
Help context: 4700
Error: -2147221164 (...
Process Exit Code 5. The step failed.
The deleting of the rows works but the import from the dBase file doesn't. I thought it was a security issue, so I made sure the Job Agent has all the rights it needs to access the directory of the dBase file.