Hiya,
Got a little question for future reference. I have a cursor that does a grant for roles, users, etc. for all tables in a DB. I can manually run it in query analyzer and it does fine. The issue is that if I try to create it as a SQL Agent job and just paste the script into the command box it returns the error:
Line 1: Incorrect syntax near 'xxx yyy'. [SQLSTATE 42000] (Error 170) Associated statement is not prepared [SQLSTATE HY007] (Error 0) Line 1: Incorrect syntax near 'xxx yyy'. [SQLSTATE 42000] (Error 170). The step failed.
The Cursor is written as follows:
declare @cStatement varchar(255)
declare G_cursor CURSOR for
    select 'grant select on [' + convert(varchar(64), name) + '] to "xxx yyy"'
    from sysobjects
    where (type = 'U' or type = 'V') and uid = 1
set nocount on
OPEN G_cursor
FETCH NEXT FROM G_cursor INTO @cStatement
WHILE (@@FETCH_STATUS <> -1)
begin
    EXEC (@cStatement)
    FETCH NEXT FROM G_cursor INTO @cStatement
end
CLOSE G_cursor
DEALLOCATE G_cursor
I figure this has something to do with the QUOTED_IDENTIFIER option or something simple like that, but I can't put my finger on it. In Query Analyzer this will error out if I don't have double quotes around the DB role "xxx yyy" because of the space in the role name.
I corrected the error by recreating the role name without a space, but I have some other places I'd like to be able to use this where I won't have the luxury of recreating the role if it has a space in the name.
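For the cases where the role name can't be recreated without the space, one workaround (a minimal sketch, assuming the same sysobjects-based loop) is to build the GRANT with bracket delimiters via QUOTENAME() instead of double quotes, so the generated statement no longer depends on the QUOTED_IDENTIFIER setting of the Agent session:

declare @cStatement varchar(255)
declare G_cursor CURSOR for
    -- QUOTENAME wraps the table name and the role name in [ ], so no double quotes are needed
    select 'grant select on ' + QUOTENAME(name) + ' to ' + QUOTENAME('xxx yyy')
    from sysobjects
    where (type = 'U' or type = 'V') and uid = 1
set nocount on
OPEN G_cursor
FETCH NEXT FROM G_cursor INTO @cStatement
WHILE (@@FETCH_STATUS <> -1)
begin
    EXEC (@cStatement)
    FETCH NEXT FROM G_cursor INTO @cStatement
end
CLOSE G_cursor
DEALLOCATE G_cursor

Alternatively, adding SET QUOTED_IDENTIFIER ON as the first line of the job step should let the original double-quoted version run unchanged.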
STATIC: Defines a cursor that makes a temporary copy of the data to be used by the cursor. All requests to the cursor are answered from this temporary table in tempdb; therefore, modifications made to base tables are not reflected in the data returned by fetches made to this cursor, and this cursor does not allow modifications.
It says that modifications are not allowed in a static cursor. I have a question regarding that:
Static Cursor
declare ll cursor global static for select name, salary from ag
open ll
fetch from ll
while @@FETCH_STATUS = 0
    fetch from ll
update ag set salary = 200 where 1 = 1
close ll
deallocate ll
In "AG" table, "SALARY" was 100 for all the entries. When I run the Cursor, it showed the salary value as "100" correctly.After the cursor was closed, I run the query select * from AG.But the result had updated to salary 200 as given in the cursor. file says modifications is not allowed in the static cursor.But I am able to update the data using static cursor.
Hello, I have a test database with a table A containing 10,000 rows and a table B containing 100,000 rows. Rows in B are "children" of rows in A: each row in A has 10 related rows in B (i.e. B has a foreign key to A). Using ODBC I am executing the following loop 10,000 times, expressed below in pseudo-code:
"select * from A order by a_pk option (fast 1)"
"fetch from A result set"
"select * from B where fk_to_a = 'xxx' order by b_pk option (fast 1)"
"fetch from B result set" repeated 10 times
In the above pseudo-code 'xxx' is the primary key of the current A row. NOTE: it is not a mistake that we are repeatedly doing the A query and retrieving only the first row. When the queries use fast-forward-only cursors this takes about 2.5 minutes. When the queries use dynamic cursors this takes about 1 hour. Does anyone know why the dynamic cursor is killing performance? Because of the SQL Server ODBC driver it is not possible to have nested/multiple fast-forward-only cursors, hence I need to explore other alternatives. I can only assume that a different query plan is getting constructed for the dynamic cursor case versus the fast-forward-only cursor, but I have no way of finding out what that query plan is. All help appreciated. Kevin
I'm trying to implement an sp_MSforeachsp; however, when I call sp_MSforeach_worker I get the following errors. Can you please explain this problem to me so I can overcome the issue?
Msg 16958, Level 16, State 3, Procedure sp_MSforeach_worker, Line 31
Could not complete cursor operation because the set options have changed since the cursor was declared.
Msg 16958, Level 16, State 3, Procedure sp_MSforeach_worker, Line 32
Could not complete cursor operation because the set options have changed since the cursor was declared.
Msg 16917, Level 16, State 1, Procedure sp_MSforeach_worker, Line 153
Cursor is not open.
here is the stored procedure:
Alter PROCEDURE [dbo].[sp_MSforeachsp]
@command1 nvarchar(2000)
, @replacechar nchar(1) = N'?'
, @command2 nvarchar(2000) = null
, @command3 nvarchar(2000) = null
, @whereand nvarchar(2000) = null
, @precommand nvarchar(2000) = null
, @postcommand nvarchar(2000) = null
AS
/* This procedure belongs in the "master" database so it is accessible to all databases */
/* This proc returns one or more rows for each stored procedure */
/* @precommand and @postcommand may be used to force a single result set via a temp table. */
declare @retval int
if (@precommand is not null) EXECUTE(@precommand)
/* Create the select */
EXECUTE(N'declare hCForEachTable cursor global for
Part 1
DECLARE @DBNAME nvarchar(100), @SQLCMD nvarchar(2000)
DECLARE DBCur CURSOR FOR SELECT U_OB_DB FROM [@OB_TB04_COMPDATA]
OPEN DBCur
FETCH NEXT FROM DBCur INTO @DBNAME
WHILE @@FETCH_STATUS = 0
BEGIN
    SELECT @SQLCMD = 'SELECT T0.CARDCODE, T0.U_OB_TID AS TRANSID, T0.DOCNUM AS INV_NO, '
        + 'T0.DOCDATE AS INV_DATE, T0.DOCTOTAL AS INV_AMT, T0.U_OB_DONO AS DONO '
        + 'FROM ' + @DBNAME + '.dbo.OINV T0 WHERE T0.U_OB_TID IS NOT NULL'
    EXEC(@SQLCMD)
    PRINT @SQLCMD
    FETCH NEXT FROM DBCur INTO @DBNAME
END
CLOSE DBCur
DEALLOCATE DBCur
Part 2
SELECT T4.U_OB_PCOMP AS PARENTCOMP, T0.CARDCODE, T0.CARDNAME, ISNULL(T0.U_OB_TID,'') AS TRANSID,
       T0.DOCNUM AS SONO, T0.DOCDATE AS SODATE, SUM(T1.QUANTITY) AS SOQTY,
       T0.DOCTOTAL - T0.TOTALEXPNS AS SO_AMT, T3.DOCNUM AS DONO, T3.DOCDATE AS DO_DATE,
       SUM(T2.QUANTITY) AS DOQTY, T3.DOCTOTAL - T3.TOTALEXPNS AS DO_AMT
INTO #MAIN
FROM ORDR T0
JOIN RDR1 T1 ON T0.DOCENTRY = T1.DOCENTRY
LEFT JOIN DLN1 T2 ON T1.DOCENTRY = T2.BASEENTRY AND T1.LINENUM = T2.BASELINE AND T2.BASETYPE = T0.OBJTYPE
LEFT JOIN ODLN T3 ON T2.DOCENTRY = T3.DOCENTRY
LEFT JOIN OCRD T4 ON T0.CARDCODE = T4.CARDCODE
WHERE ISNULL(T0.U_OB_TID,0) <> 0
GROUP BY T4.U_OB_PCOMP, T0.CARDCODE, T0.CARDNAME, T0.U_OB_TID, T0.DOCNUM, T0.DOCDATE,
         T3.DOCNUM, T3.DOCDATE, T0.DOCTOTAL, T3.DOCTOTAL, T3.TOTALEXPNS, T0.TOTALEXPNS
My question is: how do I join Part 1 and Part 2? Is that possible?
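One common way to combine them (a sketch only; the temp table name, data types, and the assumption that TRANSID and CARDCODE are the shared columns are mine, not from the original post) is to capture the Part 1 output into a temp table with INSERT ... EXEC inside the loop, and then join that temp table to #MAIN after Part 2 has run:

-- Part 1: collect the per-database invoice rows into a temp table instead of only EXECing them
CREATE TABLE #PART1
(
    CARDCODE nvarchar(20),
    TRANSID  nvarchar(30),
    INV_NO   int,
    INV_DATE datetime,
    INV_AMT  numeric(19, 6),
    DONO     nvarchar(30)
)

-- inside the DBCur loop, replace EXEC(@SQLCMD) with:
INSERT INTO #PART1 (CARDCODE, TRANSID, INV_NO, INV_DATE, INV_AMT, DONO)
EXEC(@SQLCMD)

-- after Part 2 has filled #MAIN, join the two temp tables
SELECT M.*, P.INV_NO, P.INV_DATE, P.INV_AMT
FROM #MAIN M
LEFT JOIN #PART1 P
    ON P.TRANSID = M.TRANSID
   AND P.CARDCODE = M.CARDCODE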
I'm new to cursors, and I'm not sure what's wrong with this code; it runs forever, and when I stop it I get cursor-open errors.
declare Q cursor for select systudentid from satrans
declare @id int
open Q
fetch next from Q into @id
while @@fetch_status = 0
begin
    declare c cursor for
        Select b.ssn, SaTrans.SyStudentID, satrans.date, satrans.type, SaTrans.SyCampusID,
               Amount = Case SaTrans.Type
                            When 'P' Then SaTrans.Amount * -1
                            When 'C' Then SaTrans.Amount * -1
                            Else SaTrans.Amount
                        END
        From SaTrans, systudent b
        where satrans.systudentid = b.systudentid
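For reference, the usual shape of a nested-cursor loop (a rough sketch, and the assumption that the inner query should be filtered by the current outer student id is mine) opens, fetches, closes, and deallocates the inner cursor on every pass of the outer loop, and advances the outer fetch before END; leaving any of those out is a common cause of never-ending loops and "cursor already open" errors:

declare @id int
declare @ssn varchar(11)    -- hypothetical variable for the inner fetch

declare Q cursor for select systudentid from satrans
open Q
fetch next from Q into @id
while @@fetch_status = 0
begin
    declare c cursor for
        select b.ssn
        from systudent b
        where b.systudentid = @id       -- filter the inner set by the current outer row

    open c
    fetch next from c into @ssn
    while @@fetch_status = 0
    begin
        -- ... per-row work goes here ...
        fetch next from c into @ssn
    end
    close c
    deallocate c                        -- without this, the next DECLARE c fails

    fetch next from Q into @id          -- without this, the outer loop never ends
end
close Q
deallocate Q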
I'm having a difficult time here trying to figure out what to do. I need a way to scroll through a recordset and display the results, with both forward and backward movement, on a web page (PHP using ADO/COM). I know that if I use a client-side cursor, all the records get shoved to the client every time that stored procedure is executed; if this database grows big, won't that be an issue? I know that I can set up a server-side cursor that will only send the record I need to the front end, but I've been reading around and a lot of people have been saying never to use a server-side cursor because of performance issues. So I guess I'm weighing network performance needs with the client-side cursor vs. server performance with the server-side cursor. I am really confused: which one should I use? -Jim
Hi! I scheduled a DTS import from MySQL. Whenever I run it manually (right-click on the DTS package) it runs through without any problems, but firing it by a schedule doesn't work. Just to exclude any issues regarding users/roles, I created a DTS package to export data to an Excel sheet on my desktop; the manual export as well as the scheduled export works fine. My Application Log shows the following error message:
Event Type: Warning
Event Source: SQLSERVERAGENT
Event Category: Job Engine
Event ID: 208
Date: 6/8/2005
Time: 10:05:02 AM
User: N/A
Computer: *****
Description: SQL Server Scheduled Job 'ImportFromMySQL' (0xC89612CE034F6642BD585B048DBC0F06) - Status: Failed - Invoked on: 2005-06-08 10:05:02 - Message: The job failed. The Job was invoked by Schedule 22 (ImportFromMySQL). The last step to run was step 1 (ImportFromMySQL).
Anybody know what's wrong?
I have designed a DTS Package and it can be run successfully from Enterprise Manager. However, when I schedule the DTS package to run as a job then it fails with an error message of "Error string: The system cannot find the file specified.".
Anyone any idea as to why the job cannot find the DTS package?
Can it be something to do with the SQL Server 7 and 2000 tools? The 2000 client tools have recently been installed onto my PC, and the package and job ran fine when I had designed everything in SQL Server 7.
Hi, I am trying to do an automatic backup of my database and for some reason it does not do it. I have it set to back up daily at 4:00 pm. Please let me know if you know why it is not backing up on its own. The Server Manager is always on, and a manual backup is not a problem for me, only the automatic one. Thanks very much.
I have a number of DTS packages which when run manually complete successfully however, when run as scheduled tasks they always fail. Can anyone offer any advice?
I'm thinking of using the SQL Agent Job Scheduler as part of a larger application and I'm wondering if anyone knows of a limit on how many schedules or jobs that can exist on a SQL Server at one time.
Hello, I have many DTS packages scheduled as jobs. The job always fails when executed, but it runs fine if I execute the DTS package directly. The following is the error message:
Error: -2147217887 (80040E21); Provider Error: 0 (0) Error string: Errors occurred Error source: Microsoft OLE DB Provider for SQL Server Help file: Help context: 0 DTSRun OnFinish: DTSStep_DTSExecuteSQLTask_1 DTSRun: Package execution complete. Process Exit Code 1. The step failed.
Any help would be welcome. Thank you.
When I create a DTS package to import data from Visual FoxPro, it will work if I run it immediately, but when I schedule it to run at a specific time it fails. Any ideas why?
I am trying to figure out a way, looking at the tables in msdb (sysjobhistory), to tell, when a scheduled job is running and not yet completed, how long it has been running. I have to look for all the jobs with run_status = 4 (in progress), but what I found is that no history is written until the job is finished or cancelled. Can anyone help me with this?
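On SQL Server 2005 and later, msdb.dbo.sysjobactivity records a start time as soon as the job starts, so a rough sketch of finding how long a still-running job has been going (assuming you only want jobs with no stop time yet) could look like this; on earlier versions the undocumented master.dbo.xp_sqlagent_enum_jobs extended procedure is the usual fallback.

-- jobs that have started but not yet finished, and how long they have been running
-- (strictly, you would also filter on the most recent session_id in msdb.dbo.syssessions)
SELECT  j.name,
        a.start_execution_date,
        DATEDIFF(minute, a.start_execution_date, GETDATE()) AS minutes_running
FROM    msdb.dbo.sysjobactivity a
JOIN    msdb.dbo.sysjobs j ON j.job_id = a.job_id
WHERE   a.start_execution_date IS NOT NULL
  AND   a.stop_execution_date IS NULL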
Hi, we run a nightly job process that depends on the data entered from the frontend. Since yesterday we have been entering a lot of data, so the job that started last night at 10pm (09/05/00) is still running now, and its next scheduled time is 10pm today (09/06/00). If the same job is still running from yesterday when 10pm comes around today, will it start again per the schedule, or will it skip the run because the previous instance is still running?
I have a scheduled job that runs once a day at 10 pm. My problem is that if the job fails, I want it to run again 10 minutes later, and keep retrying until it completes successfully.
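SQL Agent has something close to this built in at the step level: sp_update_jobstep (or the Advanced tab of the step in Enterprise Manager/Management Studio) takes @retry_attempts and @retry_interval parameters. A sketch of asking the failing step to be retried every 10 minutes, assuming hypothetical job and step identifiers, would be:

EXEC msdb.dbo.sp_update_jobstep
    @job_name        = N'MyNightlyJob',    -- hypothetical job name
    @step_id         = 1,
    @retry_attempts  = 6,                  -- try up to 6 more times...
    @retry_interval  = 10                  -- ...waiting 10 minutes between attempts

Note that the retries are capped by @retry_attempts rather than repeating indefinitely until success.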
In SQL 6.5, when a scheduled job failed, you could see the error message in the history. In SQL 7.0, it simply tells you which was the last step to run. Is there a place which will report the actual error message generated by the task?
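The per-step text is still stored: msdb.dbo.sysjobhistory keeps a message column for every step execution (and Enterprise Manager's job history dialog should show it when step details are displayed). A quick sketch for pulling the failed steps' messages, assuming a hypothetical job name, is:

SELECT  j.name, h.step_id, h.step_name, h.run_date, h.run_time, h.message
FROM    msdb.dbo.sysjobhistory h
JOIN    msdb.dbo.sysjobs j ON j.job_id = h.job_id
WHERE   j.name = N'MyJob'        -- hypothetical job name
  AND   h.run_status = 0         -- 0 = failed
ORDER BY h.instance_id DESC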
I have created a DTS package on a developement server that connects to our Exchange server and downloads customer service e-mail and inserts them into a table. This is done using a VB script.
When I right click on the job and execute it, it runs fine and we can see the mails after they show up in the table.
However, when I schedule the job to run at 15 minute intervals it fails with a vb runtime error. When I copy the job to my personal machine and schedule it, it runs just fine and again we see the mails in the table.
The server has the latest version of the scrrun.dll as well as IE 5.0.
I have messed around with this for 3 days now and have gotten MSFT involved as well.
Anyone seen this before? Any ideas or help will be greatly appreciated.
I am running a scheduled DTS package which transfers all the rows in a production table to another server every 30 minutes; each time it truncates the table on the second server before the transfer. It has been running fine for several days. Will there be any problem with this kind of backup strategy? Do I have to clear any history logs frequently? Could any other problem happen? Can anyone suggest any precautions? No downtime is allowed, and replication is also not acceptable to the client. Thanks.
Why is it that I can run my DTS package locally, however when I try to run it as a job it always fails? I do realize that the sqladmin account is used to run the job, and it has all the permissions needed.
Any suggestions?
Also, if I kick the job off from my local system it states that it cannot find the batch file that I am trying to run. I'm running it on the server, however it treats it as if I'm running it locally?
I have just taken away a tremendous amount of rights from our developers, but would like some of the developers to still have rights to manage the scheduled jobs. Short of making them a system administrator, I cannot seem to find a predefined role that will do this. Is there one? And if not, what system stored procedures or extended procedures would you need to give them rights to, to view and run the scheduled jobs in Enterprise Manager?
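Two directions that may help, sketched below with hypothetical login and job names: SQL Server 2005 and later ship fixed msdb roles for exactly this (SQLAgentUserRole, SQLAgentReaderRole, SQLAgentOperatorRole), while on 7.0/2000 there is no equivalent role and the usual workaround is making the developer the owner of the job, which should let them view and start that job without sysadmin.

-- SQL Server 2005 and later: grant job-management rights through the msdb roles
USE msdb
EXEC sp_addrolemember N'SQLAgentOperatorRole', N'DOMAIN\Developer'   -- hypothetical msdb user

-- SQL Server 7.0/2000: let a developer manage a specific job by making them its owner
EXEC msdb.dbo.sp_update_job
    @job_name         = N'NightlyLoad',          -- hypothetical job name
    @owner_login_name = N'DOMAIN\Developer'      -- hypothetical login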
If I have 2 scheduled tasks set for the same time (perhaps accidentally), will the SQL Executive start 1 and queue the other one until the first is complete and then run the 2nd task? Or will they both be started simultaneously?
I have the code below, which works to rebuild indexes on a large DB. I can run it from QA, but not as a job or as a stored procedure called from a job; either way I get the same error:
Executed as user: ADCsqlexec. Retrieving Table List for DB Development [SQLSTATE 01000] (Message 0) ReIndexing Table Development..cms_appointments [SQLSTATE 01000] (Message 0) DBCC execution completed. If DBCC printed error messages, contact your system administrator. [SQLSTATE 01000] (Message 2528) Updating Statistics on Table Development..cms_appointments [SQLSTATE 01000] (Message 0) Could not complete cursor operation because the set options have changed since the cursor was declared. [SQLSTATE 42000] (Error 16958). The step failed.
/* Create DB list */
DECLARE @DBName sysname, @TableName sysname, @FQTableName sysname

DECLARE DBCursor CURSOR FOR
    SELECT name FROM master..sysdatabases WHERE name = 'development'
OPEN DBCursor
FETCH NEXT FROM DBCursor INTO @DBName

/* Create database loop */
WHILE @@FETCH_STATUS = 0
BEGIN
    /* Retrieve table list */
    PRINT 'Retrieving Table List for DB ' + @DBName
    EXEC ('SELECT name AS TableName INTO ##TableNames FROM [' + @DBName + ']..sysobjects WHERE type = ''U''')

    /* Open table list */
    DECLARE TableCursor CURSOR FOR SELECT TableName FROM ##TableNames
    OPEN TableCursor
    FETCH NEXT FROM TableCursor INTO @TableName

    /* Create table loop */
    WHILE @@FETCH_STATUS = 0
    BEGIN
        /* Add DB name to table name */
        SELECT @FQTableName = QUOTENAME(RTRIM(@DBName)) + '..' + QUOTENAME(RTRIM(@TableName))
        SELECT @TableName   = RTRIM(@DBName) + '..' + RTRIM(@TableName)
        /* fix from MS */
        SET ARITHABORT ON
        SET QUOTED_IDENTIFIER ON
        -- DBCC CHECKTABLE(mytable)
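The "set options have changed since the cursor was declared" error (16958) usually comes from the Agent session starting with different SET options than Query Analyzer (for example, QUOTED_IDENTIFIER is typically OFF in a T-SQL job step, while QA turns it on) and from the script flipping ARITHABORT and QUOTED_IDENTIFIER inside the loop, after the cursors are already open. A sketch of the usual fix is to set the options once, at the very top of the job step, before any cursor is declared:

-- set these once, before DECLARE ... CURSOR, instead of inside the table loop
SET ARITHABORT ON
SET QUOTED_IDENTIFIER ON
SET ANSI_NULLS ON
SET ANSI_PADDING ON
SET ANSI_WARNINGS ON
SET CONCAT_NULL_YIELDS_NULL ON

DECLARE DBCursor CURSOR FOR
    SELECT name FROM master..sysdatabases WHERE name = 'development'
-- ... rest of the script unchanged ...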
I have been running the following production job successfully for a long time. It now fails, and the Task History Last Error Message displays 'No Message'. The log file (C:\MSSQL\LOG\Maint_TombV50.txt) shows it ran successfully, with a return code of 0.
I'm new to SQL so this should be an easy question. All I want to do is create a recurring task. I want to copy data from one SQL Server to another. I've created a package to accomplish this, and when I execute it manually it works. What doesn't work is scheduling it to execute at regular intervals. As a test I set it up to execute every minute of every day (this will change to once a day once I prove it works), and for some reason it never executes. Any ideas? By the way, I created the package using the DTS Export Wizard.
To all: if I have a scheduled task that is owned by 'sa', how can I assign permissions to allow another user, even the database dbo, to register the SQL Server and view the scheduled tasks?
I find some scheduled jobs are switching from an 'enabled' state to a 'disabled' state, apparently for no good reason. The job itself still shows as being 'enabled'; however, the associated schedule becomes 'disabled'. Does anyone know why this would be? Is the problem associated with a particular service pack or anything?
I created a scheduled task on SQL Server 6.5 which dumps the system DBs.
The problem I have is that the scheduled task did not run, and no error message was returned. I have tried to force it to run in different schedule modes; nothing happened. However, I can dump the system DBs through SEM Backup/Restore, which runs OK!
Anybody have an idea why the scheduled task does not run?
I am trying to set up a DTS to transfer logging data from one server to another. The record may already exist at the destination causing a primary key violation. I do not want this error to cause the entire DTS to fail.
When I execute the DTS package I created by right-clicking and selecting "Execute Package", it shows me 2 errors. Although there are 2 errors, the rows that do not have a primary key violation are successfully transferred to the destination database. Here are the 2 errors I see:
Error 1: Error at Destination for Row number 97. Errors encountered so far in this task: 97. The statement has been terminated. Violation of PRIMARY KEY constraint 'PK_event'. Cannot insert duplicate key object 'event'.
Error 2: Error at Destination for Row number 198. Errors encountered so far in this task: 198. The statement has been terminated. Violation of PRIMARY KEY constraint 'PK_eventDetail'. Cannot insert duplicate key object 'eventDetail'.
These errors make sense, there were 97 duplicate lines in the event table and 198 duplicates in the eventDetail table.
This is the behavior I want. New rows are copied to the destination database.
When I schedule the DTS as a job in Enterprise Manager, things change. When the DTS is executed as a job (as opposed to me right-clicking and selecting "Execute Package"), the job reports a failure and none of the new rows are transferred to the destination database.
Why does the DTS transfer the rows that do not violate the Primary Key constraint when I manually execute it and not when it is executed as a job?
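If the goal is simply "copy the new rows and ignore the ones already there", another option that avoids the error entirely (a sketch only, assuming the source rows are reachable from the destination via a staging table or linked server, and using hypothetical column names rather than the real event/eventDetail definitions) is to insert only rows whose keys are not already present:

-- insert only event rows whose primary key is not already at the destination;
-- the same pattern applies to eventDetail
INSERT INTO dbo.event (event_id, event_time, event_text)        -- hypothetical column list
SELECT s.event_id, s.event_time, s.event_text
FROM   staging.dbo.event AS s                                    -- hypothetical staging copy of the source rows
WHERE  NOT EXISTS (SELECT 1 FROM dbo.event AS d WHERE d.event_id = s.event_id)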
I've scheduled a job to run on a certain schedule, but the Last Run Status date comes back very oddly, a couple of years out of synch; the other scheduled jobs report back just fine.
Anyone seen this behavior?
Edward R Hunter, Data Application Designer comScore Networks, Inc.