On the production server, I have two tasks that have been in the status "performing completion action" for the last couple of hours. Usually these tasks take seconds to finish.
Do you know how I can kill or cancel these tasks? I have tried to stop both jobs a couple of times, but nothing happens.
There are no locks or blocks shown in the Current Activity window.
I'm running 7.0 SP2 on Windows 2000 with 1 GB of RAM and a 933 MHz CPU. The server had been very stable with no problems until I moved a 2 GB table into the database. Query performance is excellent; even table scans take less than 2 minutes. The problem is that once a table scan is performed on that table (I can't index for every possible query), the query finishes but Enterprise Manager freezes on the server and users can no longer connect. I've set SQL Server to use only 650 MB of RAM and left the rest free; the problem also existed when the memory was controlled entirely by SQL Server.
My cache hit ratio was 97% and cache flushes 0.0 (unfortunately these can't be checked while the problem exists because the box is frozen). I may have a concurrency issue, but I'm not sure how to confirm it. I don't want to just throw memory at the problem because I'm not sure that would fix it.
My overnight backups hang occasionally. I say "occasionally" because there is no consistency to when the routine will hang. It happens on the first step, which reindexes. The only user database on the server is about 12 GB. The server is dedicated to SQL Server, so nothing else runs on it. It has two 800 MHz processors and 2 GB of RAM, of which 1.5 GB is allocated to SQL Server.
I can't trace the error because it happens before the report file is written to. There are a few tables with a million-plus records that take up almost 1 GB per table.
It was my understanding that if users were in the database they would notice a big performance loss, but that the backups would still run. Could it be dying because of resource contention? I'm at a loss and could really use some input. I can't afford to go a whole weekend without a backup.
Is there a way for me to programmatically check who is in the database prior to the job kicking off (something like the sketch below)?
Any ideas as to what would cause an inconsistent hang of a job are greatly appreciated.
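One way to do the "who is in the database" check, as a minimal sketch: assuming the database is named YourDB (a placeholder) and the job's first step can run a bit of T-SQL before the reindex starts, sysprocesses on SQL Server 7.0/2000 shows the connected sessions.

-- Sketch: fail the job step if anyone else is connected to the target database.
-- 'YourDB' is a placeholder name.
IF EXISTS (SELECT 1
           FROM master.dbo.sysprocesses
           WHERE dbid = DB_ID('YourDB')
             AND spid <> @@SPID)
BEGIN
    RAISERROR ('Users are still connected to YourDB; skipping reindex.', 16, 1)
    RETURN
END

Raising an error with severity 16 makes the job step fail, so the job's on-failure action can skip or reschedule the reindex instead of hanging behind the open connections.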
I have a strange problem which I never encountered before. I have a data flow with one source and multiple destinations; I am also using multicast and derived column transformations. However, when I run the package (either from the command line or the designer), the data flow task hangs when it comes to inserting data into the target tables, and it keeps waiting forever. Initially I thought it was my data set, but even with very few records the problem persists. I have also tried removing the table lock from the destination and increasing the rows per batch to 10000 and the commit size to 10000, but nothing changes. I am using the fast load option on my destination. When I run sp_who2 in Management Studio I see that the status of all the processes is sleeping and the command is "AWAITING COMMAND". Any help appreciated.
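One diagnostic worth running while the data flow is hanging (a sketch only, works on SQL Server 2000 and 2005): check whether one of the package's own connections is blocking another, for example a fast-load table lock held against the same table that another path of the multicast is trying to write.

-- List sessions that are blocked and which session is blocking them.
SELECT spid,
       blocked       AS blocking_spid,
       lastwaittype,
       waitresource,
       cmd,
       DB_NAME(dbid) AS database_name
FROM master.dbo.sysprocesses
WHERE blocked <> 0

If the blocking spid turns out to be another connection opened by the same package, the destinations are waiting on each other's locks against the same table, which would explain why the rows never commit.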
In my control flow, I have a container which contains an Execute SQL Task, and then upon success, a Data Flow Task. The SQL Task truncates my datamart table. In the data flow task, I execute a stored procedure (through a variable) that populates that same datamart table. I can execute the stored procedure's select statement in Management Studio with no problems in about ten seconds. However, in the SSIS package, the SQL task completes successfully, and then it hangs indefinitely on the data flow step. In the Data Flow tab, none of the boxes are even turning yellow. Why won't it complete? When I move the Exec SQL Task to another container, the package executes fine, but it should be in the Load Phase container.
Hi, we run a nightly job process that depends on the data entered from the front end. Since yesterday we have been entering a lot of data, so the job that started last night at 10pm (09/05/00) is still running now, and its next scheduled time is 10pm today (09/06/00). If the same job is still running at 10pm today, will it start again as per the schedule, or will it not run because the same instance has been running since yesterday?
Does anyone have a script, or know how to find out, whether a scheduled job completed or not? I schedule a job to back up all databases on the server and e-mail me when it completes, but one time the job got stuck and never completed. Is there a way I can schedule a job that checks whether the other job completed?
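As a rough sketch (the job name is a placeholder), the outcome of each completed run is written to msdb..sysjobhistory, so a second scheduled job could look up the most recent result and alert you.

-- Outcome of the last run of a given job (SQL Server 7.0/2000 msdb tables).
-- run_status: 0 = failed, 1 = succeeded, 2 = retry, 3 = canceled.
SELECT TOP 1 h.run_date, h.run_time, h.run_status, h.message
FROM msdb.dbo.sysjobhistory h
JOIN msdb.dbo.sysjobs j ON j.job_id = h.job_id
WHERE j.name = 'Backup all databases'   -- placeholder job name
  AND h.step_id = 0                     -- step 0 is the job outcome row
ORDER BY h.instance_id DESC

Note that a job that is stuck has not written its outcome row yet, so "no new row since the last scheduled run" is itself the signal that something went wrong.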
I have set up a scheduled backup process which is supposed to run every night at 21:00:00, and a separate job for the transaction log every 15 minutes.
The scheduled job won't run at all. The job is enabled, the schedule is enabled, and the SQL Server service is running under the local admin account. So why doesn't the job run? There is no event log entry, no job history, and no job output. Why won't it run?
When I change the sa password on my server, the existing scheduled SQL jobs no longer work. If I run the DTS packages by themselves, rather than from the scheduled job, they work fine. I am using sa as the owner of the jobs, and my SQL Server Agent service uses a domain account. If I rebuild the jobs after the sa password is changed, they work fine. However, I don't want to rebuild 100 jobs every time I change my sa password. How can I fix or get around this problem without rebuilding my jobs?
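For context (this is an assumption about how the jobs were built, not something stated above): when a DTS package is scheduled from Enterprise Manager, the job step is a command of the form DTSRun /~Z0x..., and that encrypted string embeds the credentials in use at the time, including the sa password, which is why the jobs break when the password changes. A hedged workaround is to have the job step call DTSRun with explicit switches and a trusted connection, so no password is embedded. Names below are placeholders.

-- Find job steps whose command is an encrypted DTSRun string (the ones that go stale).
SELECT j.name AS job_name, s.step_id, s.command
FROM msdb.dbo.sysjobsteps s
JOIN msdb.dbo.sysjobs j ON j.job_id = s.job_id
WHERE s.command LIKE 'DTSRun /~Z%'

-- A job step command that does not embed credentials (placeholder server and package names):
-- DTSRun /S MyServer /N "MyPackage" /E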
Previously I populated my DataMart and then processed my cubes via two separate jobs scheduled one after the other, with about 30 minutes of idle time in between (just in case).
I have now tried to combine both of the above into a single job with two steps, one after the other.
However, when I was done, I right-clicked and tried to run the newly created job. Instead of what I was hoping for (that the job would start, perform the first step, then the second), it popped up a dialog box asking (I think) which step should be executed. :(
Is there a solution so that when the job is scheduled to run, it starts performing the steps in order rather than waiting for someone to select which step to perform?
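For what it's worth, the "pick a step" prompt only appears when a multi-step job is started interactively; a scheduled run always begins at the job's designated start step and then follows each step's on-success action. Starting it from T-SQL behaves the same way, as in this small sketch (the job name is a placeholder):

-- Start the job at its configured start step; steps then run in order via their on-success actions.
EXEC msdb.dbo.sp_start_job @job_name = N'Populate DataMart and process cubes'

Just make sure step 1's on-success action is "Go to the next step" and step 2's is "Quit the job reporting success".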
I have a DTS package consisting of approximately 50 steps which pulls data from an AS/400 and populates a SQL 7.0 database. The package works flawlessly when executed manually, but it won't run when scheduled.
The DTS package starts but hangs when it tries to connect to the AS/400; this is where the problem lies. The preceding steps work fine, but they execute against the SQL 7.0 database. I have Client Access installed and configured on the server. I also checked and made sure the SQL Server Agent service was started and running. I just can't figure out why it works fine manually but not when scheduled.
I have a very peculiar problem. I have a few CmdExec jobs on SQL Server 7.0 which have been scheduled to execute at a specific time. The jobs execute properly, but the problem is that they don't stop after completion; i.e., after finishing, they still show the message "Executing the job" rather than "Not Running". This happens only with the CmdExec jobs; the few T-SQL jobs I have work properly. I even tried giving a duration (start and stop time), but to no avail. Has anyone encountered this problem? If you have a solution, please pass it on to me. I am in a very desperate situation.
I'm new to SQL Server, so this should be an easy question. All I want to do is create a recurring task. I want to copy data from one SQL Server to another. I've created a package to accomplish this, and when I execute it manually it works. What doesn't work is scheduling it to execute at regular intervals. As a test I set it up to execute every minute of every day (this will change to once a day when I prove it works), and for some reason it never executes. Any ideas? By the way, I created the package using the DTS Export Wizard.
I created a scheduled task on SQL Server 6.5 which dumps the system databases.
The problem I have is that the scheduled task did not run and no error message was returned. I have tried to force it to run in different schedule modes; nothing happened. However, I can dump the system databases through the SEM Backup/Restore dialog, which runs fine.
Does anybody have an idea why the scheduled task does not run?
Running NT 4.0 SP6a and SQL 2000. I have a DTS job that works just fine if I go to design view and then execute it. But if I schedule it, the following error appears. My question is... why? I also modified the path from the one below (denapp02cmc$cd01.txt) to a local path on the server, and still came up with the same problem. TIA!
DTSRun: Loading... DTSRun: Executing... DTSRun OnStart: Copy Data from Results to denapp02cmc$cd01.txt Step DTSRun OnError: Copy Data from Results to denapp02cmc$cd01.txt Step, Error = -2147467259 (80004005) Error string: Error opening datafile: Access is denied. Error source: Microsoft Data Transformation Services Flat File Rowset Provider Help file: DTSFFile.hlp Help context: 0 Error Detail Records: Error: 5 (5); Provider Error: 5 (5) Error string: Error opening datafile: Access is denied.
I need to use the FTP task in DTS (or any other way), but the problem is that the source file name is not constant. It changes every day, with a date appended to the file name, like PO_2002_6_12_PM.zip. How do I create a task to grab this file daily as a scheduled task? Any help is appreciated. Thanks. Di.
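One hedged approach (all names and paths below are placeholders): build the day's file name at run time and feed it to the FTP step. A minimal T-SQL sketch of just the name-building part:

-- Build today's file name in the PO_yyyy_m_d_PM.zip pattern (no leading zeros, matching the example).
DECLARE @fname varchar(100)
SET @fname = 'PO_'
           + CAST(DATEPART(yyyy, GETDATE()) AS varchar(4)) + '_'
           + CAST(DATEPART(mm,   GETDATE()) AS varchar(2)) + '_'
           + CAST(DATEPART(dd,   GETDATE()) AS varchar(2)) + '_PM.zip'
SELECT @fname
-- The resulting name could then be written into an ftp command script and run via xp_cmdshell,
-- or assigned to the FTP task's source file property from an ActiveX Script step in the package.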
I think I have a basic question. Can someone tell me the command I can use to have a stored procedure run and generate output? This is for SQL 6.5.
I would like to schedule the sp_help_revdatabase stored procedure to run periodically and output the results to a file. I think this is possible, but I haven't been able to come up with the syntax...
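A sketch of one way to do this on 6.5 (server name and output path are placeholders): schedule a CmdExec task whose command runs isql with a trusted connection and redirects the results to a file.

REM CmdExec task command: run the procedure and write the output to a file
isql -S MYSERVER -E -Q "exec sp_help_revdatabase" -o C:\MSSQL\LOG\revdatabase.txt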
How do you specify a log file for scheduled tasks? I have consistency checks done at night, and when one fails I cannot find out why, because no log file is created. I checked for a log file in /MSSQL/LOGS/*.log. The maintenance wizard creates a log file, but how does a regular T-SQL scheduled task create one?
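If this is SQL Server 7.0 or later (an assumption; 6.5 has an equivalent output-file field in the task's Edit dialog), each job step has an output file setting, reachable from the step's Advanced tab or from T-SQL, roughly like this (job name, step number and path are placeholders):

-- Send the T-SQL step's output (including DBCC messages) to a file.
EXEC msdb.dbo.sp_update_jobstep
     @job_name = N'Nightly consistency checks',
     @step_id = 1,
     @output_file_name = N'C:\MSSQL7\LOG\consistency_checks.txt'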
I've written a scheduled job for importing an Excel file into a SQL table. When I try to run the job from SQL Server Agent, I get the status 'Not running', and the last run status says 'Failed'. I've tested the command before in Query Analyzer and it works. Can someone tell me why this isn't working as a scheduled job?
I have a job that runs a SQL script and creates a flat file. I use an FTP task to send the file to a specific FTP site. The job works beautifully when it is executed in the development environment.
After I build the package, deploy it on the SQL Server, and set it up to run as a job, it will not work on the SQL Server. The error appears to be in the FTP connection. I get two separate errors:
'Unable to connect to FTP Server using FTP Connection Manager' and
'An Error Occured in the requested FTP operation. Detailed error description: The Password was not allowed'
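This pattern (works in the designer, "password was not allowed" when run as a job) is often down to the package protection level: with the default EncryptSensitiveWithUserKey, the FTP password can only be decrypted by the account that saved the package, so the Agent's account ends up with a blank password. One hedged workaround is to supply the password on the dtexec command line of the job step; the package path, connection manager name, and password below are placeholders, and the property path format is an assumption to verify against your package.

REM Job step (Operating system / CmdExec) command; "FTP Connection Manager" is whatever your FTP connection is named
dtexec /F "C:\Packages\MyFtpPackage.dtsx" /Set "\Package.Connections[FTP Connection Manager].Properties[ServerPassword]";MyFtpPassword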
I have SQL Server 7 and have used Enterprise Manager to schedule local packages. The package I'm trying to schedule runs a SQL script on a nightly basis. The problem I'm having is that I need to be able to install the scheduled script on a customer's box using an installer. I have access to execute command-line programs in the installer. Does anyone know if you can schedule local packages (using SQL scripts) from the command line? Or if a third-party application can do this? Any help or direction would be greatly appreciated. I've tried to use sp_add_job and sp_add_jobschedule, but haven't been able to get them to work.
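As a sketch of the sp_add_job route (all names, times, and the command are placeholders), the piece that is easiest to miss is sp_add_jobserver: without it the job is never attached to a server, so it silently never runs. Something like this could be executed from isql or osql in the installer:

-- Create the job.
EXEC msdb.dbo.sp_add_job @job_name = N'Nightly script'
-- Add a T-SQL step that runs the script.
EXEC msdb.dbo.sp_add_jobstep
     @job_name = N'Nightly script',
     @step_name = N'Run script',
     @subsystem = N'TSQL',
     @command = N'EXEC MyDatabase.dbo.MyNightlyProc',
     @database_name = N'MyDatabase'
-- Schedule it daily at 02:00.
EXEC msdb.dbo.sp_add_jobschedule
     @job_name = N'Nightly script',
     @name = N'Daily 2am',
     @freq_type = 4,            -- daily
     @freq_interval = 1,
     @active_start_time = 020000
-- Attach the job to the local server (the step that is easy to forget).
EXEC msdb.dbo.sp_add_jobserver @job_name = N'Nightly script', @server_name = N'(local)'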
Thanks, it worked on one server, but not the other. So I'm going to recreate that job again. Do you know where it would keep the results of the dbcc checkdb when running it in a scheduled task?
------------ Ray Miao at 9/23/99 8:07:13 AM
Recreate job.
------------ Laura Cappon at 9/22/99 5:22:43 PM
I have set up a DBCC CHECKDB scheduled task for Sunday nights at 11:00pm. Monday morning I checked it, and the job never started running, nor did it fail. I looked in the error log and found no errors around that time. I set this job up the same way on 6 different servers; 4 ran fine and the other 2 didn't run at all.
So I set it up again to run Monday night, and those 2 both ran for over 24 hours (on a 2 GB database). I killed them by stopping SQLExecutive. Does anyone know why this would happen and how to get it to work correctly?
I have a scheduled task that has been running for several weeks now. When I try to stop it from the scheduled task manager, all appears to go well, but the task continues to show under the running tasks tab. Is there any other way to force this task to stop?
I am not sure if this is the correct place to post this question. I am making a simple bill-pay system: people set up a schedule to pay a bill, it is saved into the database, and when the time comes the money is transferred automatically. I am thinking I could do this in a stored procedure. Here is the interface: From Account, To Payee, Amount, Schedule Date, a "Save the scheduled task" button, and a "View scheduled tasks" button.
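As a rough sketch of one way to model this (all table, column, and procedure names are made up for illustration): store the scheduled payments in a table, and have a SQL Server Agent job periodically call a procedure that processes whatever is due.

-- Table the "Save the scheduled task" button would insert into.
CREATE TABLE ScheduledPayment (
    PaymentId    int IDENTITY(1,1) PRIMARY KEY,
    FromAccount  int      NOT NULL,
    ToPayee      int      NOT NULL,
    Amount       money    NOT NULL,
    ScheduleDate datetime NOT NULL,
    Processed    bit      NOT NULL DEFAULT 0
)
GO
-- Procedure a scheduled job would run (say, every few minutes) to handle whatever is due.
CREATE PROCEDURE dbo.ProcessDuePayments
AS
BEGIN
    BEGIN TRANSACTION
    -- The actual debit/credit against the account tables would go here.
    UPDATE ScheduledPayment
    SET Processed = 1
    WHERE Processed = 0
      AND ScheduleDate <= GETDATE()
    COMMIT TRANSACTION
END
GO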
I just wonder if it is possible to run a scheduled task from a SQL Server 6.5 client. Whenever I try to run a scheduled task from a SQL Server client I get the error message "The SQLExecutive service is not currently running on server 'POFDS1099'. This prevents task 'ACORS2 FS90 People Pull' from being run", but when I go to the server itself, the service is running. Can anybody please tell me what the possible cause could be, or whether 6.5 simply doesn't support running scheduled tasks from a SQL Server client? Thanks. By the way, I also have the SQL 2000 client running on my local machine.
I searched the forum's threads on this, and while there were many results, none have helped so far.
I am running a DTS package that has an ActiveX Script Task using VBScript. The script uses CreateObject() to create a FileSystemObject to copy an .MDB before importing the tables into SQL Server. I want to copy it because of Access' notoriety for corrupting, and this much data being pumped out of Access could force me to Compact & Repair; I would rather do that on a copy.
Function Main()
    Dim FSO
    Set FSO = CreateObject("Scripting.FileSystemObject")
    ' ... the FSO.CopyFile of the .MDB happens further down in the full script
    Main = DTSTaskExecResult_Success
End Function
The DTS package runs when I execute it from Enterprise Manager, of course. It fails if scheduled, of course. :(
I have set the owner of the scheduled task to my domain account, which is also in the Administrators group on the physical server hosting the SQL Server installation (Windows Server 2003). I also did the supposedly unnecessary step of granting my domain account specific rights on the destination folder, which is also shared.
My sqlagent.exe service runs as SYSTEM on the server, so the SQLAgent should have no problem copying a file from one folder on the server to another.
The Scheduled Task fails with the common error:
DTSRun: Loading... DTSRun: Executing... DTSRun OnStart: DTSStep_DTSActiveScriptTask_1 DTSRun OnError: DTSStep_DTSActiveScriptTask_1, Error = -2147220482 (800403FE) Error string: Error Code: 0 Error Source= Microsoft VBScript runtime error Error Description: Permission denied Error on Line 12 Error source: Microsoft Data Transformation Services (DTS) Package Help file: sqldts80.hlp Help context: 4500 Error Detail Records: Error: -2147220482 (800403FE); Provider Error: 0 (0) Error string: Error Code: 0 Error Source= Microsoft VBScript runtime error Error Description: Permission denied Error on Line 12 Error source: Microsoft Data Transformation Services (DTS) Package Help file: sqldts80.hlp Help context: 4500 DTSRun OnFinish: DTSStep_DTSActiveScriptTask_1 DTSRun: Package execution complete.
I checked this MS KB article (http://support.microsoft.com/kb/q298725/), but the instructions after opening DCOMcnfg.exe don't match what is shown on Windows Server 2003 (i.e., there is no "Default Security" tab to click).
I have a strange problem that I think deals with security on SQL 2005. I have a scheduled task that runs on a Windows 2000 machine. It calls a VB script which creates a connection to SQL Server. We migrated a database from SQL 2000 to 2005, which is on a different box. I changed the connection in the VB script to use the new SQL Server. The original connection to SQL 2000 used the 'sa' account coded into the connection string, which we don't want to use on the new server, so I changed the connection string in the script to use the login information below.
Const strConnection = "Provider=SQLOLEDB;Data Source=SQLServer;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=database;"
I created a domain user and gave it dbo rights on the new database on SQL 2005, as well as administrative rights on the local machine and the network. The task runs fine for a while and then it will fail to start. I have looked in the event log as well as the SQL log and have not found anything else that ran when my task failed. Once it has failed, if I manually run the VB script on the 2000 machine, it runs just fine, but the schedule won't work. If I change the name of the user that is running the scheduled task, it will begin working again. I have run Profiler on SQL 2005 and watched the scheduled task log in as the correct user and update the database. There is no pattern to when the scheduled task will stop running. This has been happening for a few days now.
This script and scheduled task worked fine for over a year on the machine when it logged into SQL 2000, and nothing else has changed, which makes me think it is related to the SQL 2005 server. Any ideas?
I have an SSIS package with an FTP task to download an Excel file and populate a table using an Excel connection manager and a SQL Server destination, and it always fails with the following error when scheduled:
The job failed. The Job was invoked by User sa. The last step to run was step 1 (FTP-DM-CRN_ALLOCATION_COMMENTS).
Executed as user: WEB-INTSQLSYSTEM. The package execution failed. The step failed.
The box on which SQL Server is installed is in a workgroup on the domain, and the SQL Server service is started by the Local System account on the box.
I am thinking this has to do with Windows security, based on all the information I have read about these kinds of error messages. Any input on resolving this would be much appreciated.
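One avenue worth trying (a sketch only; the credential, proxy, account, and password below are placeholders): run the SSIS job step under a SQL Agent proxy tied to a Windows account that has rights to the FTP destination folder and to the SQL Server destination, instead of under Local System.

-- Store the Windows account the package should run as.
CREATE CREDENTIAL SsisRunAs WITH IDENTITY = N'MYDOMAIN\ssis_runner', SECRET = N'StrongPasswordHere'
GO
USE msdb
GO
-- Create a SQL Agent proxy from the credential and allow it to run SSIS job steps.
EXEC dbo.sp_add_proxy @proxy_name = N'SSIS proxy', @credential_name = N'SsisRunAs', @enabled = 1
EXEC dbo.sp_grant_proxy_to_subsystem @proxy_name = N'SSIS proxy', @subsystem_id = 11  -- 11 = SSIS package execution
-- Then pick 'SSIS proxy' in the "Run as" box on the job step.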
Ok, I thought this one would be easy. I have a stored proc, master.dbo.restore_database_foo, on database server B. Database server A backs up database foo on a daily basis as a scheduled task. What I wanted to do was, at the end of the scheduled task, call the stored proc on B and restore the database.
If I go into Query Analyzer, log into server A, and exec b.master.dbo.restore_database_foo, it works. But if I take the same command and make it part of the scheduled task, it fails. The error is:
OLE DB provider 'SQLOLEDB' reported an error. [SQLSTATE 42000] (Error 7399) [SQLSTATE 01000] (Error 7312). The step failed.
To me this seems like a permissions issue, but nothing I've tried seems to have helped. Suggestions?
-- Greg D. Moore, President, Green Mountain Software. Personal: http://stratton.greenms.com
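Since the four-part call goes through a linked server, one thing worth checking (a hedged sketch; the linked server name and account names are placeholders): when the step runs under SQL Agent it runs as the Agent service account, and the default self-mapping for that login may not work against B, which can surface as the 7399/7312 pair. An explicit login mapping for the account the job runs under sometimes clears it:

-- On server A: map the login the job step runs under to a specific login on linked server B.
EXEC sp_addlinkedsrvlogin
     @rmtsrvname  = N'B',                        -- linked server name as defined on A
     @useself     = 'false',
     @locallogin  = N'MYDOMAIN\SqlAgentAccount', -- account the scheduled task runs as
     @rmtuser     = N'restore_login',            -- a SQL login on B allowed to run the proc
     @rmtpassword = N'StrongPasswordHere'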