Changing SA Password Causes Scheduled Tasks To Fail.
Feb 8, 2000
I recently changed the SA password on my SQL Server 6.5 installation and discovered that this caused the scheduled CmdExec tasks (defined in SQL Executive) to fail with:
"Process Exit Code 1. Microsoft (R) SQLMaint Utility, Version 6.50.240 Copyright (C) Microsoft Corporation, 1995 - 1996 [Microsoft SQL-DMO] Error 4002: [SQL Server] Login failed"
Changing the password back to the original resolves the problem, but is a less than satisfactory solution.
Any advice on how I can change the SA password and keep the CmdExec tasks running would be greatly appreciated.
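In 6.5, scheduled CmdExec maintenance tasks typically call sqlmaint.exe with the SA password embedded as a -P switch on the command line, so the stored command keeps authenticating with the old password after a change. A minimal sketch of how to find the affected tasks, assuming they live in msdb..systasks as usual:

    -- List CmdExec tasks whose command line calls sqlmaint; the -P value
    -- in the command column is what needs updating after a password change.
    SELECT id, name, command
    FROM msdb..systasks
    WHERE command LIKE '%sqlmaint%'

Each matching task's command line can then be edited in Enterprise Manager's task scheduling window to carry the new password.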
If I have two scheduled tasks set for the same time (perhaps accidentally), will SQL Executive start one and queue the other until the first completes, or will both be started simultaneously?
I have been running the following production job successfully for a long time. It now fails, and the Task History Last Error Message displays 'No Message'. The log file (C:\MSSQL\LOG\Maint_TombV50.txt) shows it ran successfully, with a return code of 0.
To all: if I have a scheduled task that is owned by 'sa', how can I assign permissions to allow another user, even the database's dbo, to register the SQL Server and view the scheduled tasks?
Hi, I am using Windows 2000 and created around 15 scheduled tasks. My scheduled tasks are supposed to run every morning. I have to re-enter the password on the first scheduled task every morning; otherwise none of the scheduled tasks runs. As soon as I put the password into that first task, the other tasks work fine.
In my production environment, all of a sudden, my backups via EM scheduled tasks fail, yet when I examine the error, it says that the DUMP/LOAD was successful. I receive the following error: 'Could not insert a backup or restore history/detail record in msdb.dbo.sysbackuphistory or sysrestorehistory. This may indicate a problem with the msdb...' I expanded the MSDB database and ran sp_purgehistory (no params), and it still fails. Is this related to the log? Please help ASAP. Thank you in advance.
I am trying to set up a DTS package to transfer logging data from one server to another. A record may already exist at the destination, causing a primary key violation. I do not want this error to cause the entire DTS package to fail.
When I execute the DTS package I created by right-clicking and selecting "Execute Package", it shows me two errors. Although there are two errors, the rows that do not have a primary key violation are successfully transferred to the destination database. Here are the two errors I see:
Error 1: Error at Destination for Row number 97. Errors encountered so far in this task: 97. The statement has been terminated. Violation of PRIMARY KEY constraint 'PK_event'. Cannot insert duplicate key in object 'event'.
Error 2: Error at Destination for Row number 198. Errors encountered so far in this task: 198. The statement has been terminated. Violation of PRIMARY KEY constraint 'PK_eventDetail'. Cannot insert duplicate key in object 'eventDetail'.
These errors make sense: there were 97 duplicate rows in the event table and 198 duplicates in the eventDetail table.
This is the behavior I want: new rows are copied to the destination database.
When I schedule the DTS package as a job in Enterprise Manager, things change. When the package is executed as a job (as opposed to me right-clicking and selecting "Execute Package"), the job reports a failure and none of the new rows are transferred to the destination database.
Why does the package transfer the rows that do not violate the primary key constraint when I execute it manually, but not when it is executed as a job?
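One hedged workaround, rather than relying on the designer's more forgiving error handling, is to make the copy step tolerant of duplicates itself. A sketch using a T-SQL step that inserts only rows missing at the destination; the database, table, and column names here are placeholders for the real event schema:

    -- Insert only rows whose key is not already present at the destination,
    -- so duplicate source rows can never raise a PRIMARY KEY violation.
    INSERT INTO destdb.dbo.event (eventId, eventTime, eventData)
    SELECT s.eventId, s.eventTime, s.eventData
    FROM sourcedb.dbo.event AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM destdb.dbo.event AS d
                      WHERE d.eventId = s.eventId)

Raising the transform task's maximum error count above the expected number of duplicates may also keep the job from reporting failure, but filtering the duplicates out avoids depending on that threshold.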
1) Is there a way to call a second task 'upon completion' of the first one, or would I need to place the code of the second task in the same task? (This is v6.5.) The main purpose is to avoid conflicts, and so that if one fails, the other may still succeed. I generally like to keep different tasks separate, but one is dependent upon the completion of the other.
2) I want to restore a database upon completion of the backup of another database. They are on separate servers, but I know I can use remote stored procedures.
Any advice, help, code, scripts, etc. would be appreciated. Thank you.
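A possible approach in 6.5, sketched below, is to make the last step of the first task start the second one with msdb..sp_runtask; the task and server names are placeholders, and the cross-server call assumes 'OtherServer' is configured as a remote server:

    -- Final statement of the first task: start the follow-on task by name.
    EXEC msdb..sp_runtask 'SecondTask'

    -- Cross-server variant via a remote stored procedure call, e.g. to kick
    -- off a restore task on the other server once the backup task finishes.
    EXEC OtherServer.msdb.dbo.sp_runtask 'RestoreDbTask'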
Apologies if this has already been covered, but I can't find any information.
What is the easiest way to move a batch of scheduled tasks from one server/network to another? Can you script them up somehow and then reapply the script to the new location?
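There's no single built-in command for this in 6.5, but since the task definitions live in msdb..systasks, one rough approach (a sketch; only a few of sp_addtask's many parameters are shown, and all names are placeholders) is to read the definitions on the source server and replay them on the target with sp_addtask:

    -- On the source server: capture the task definitions.
    SELECT name, subsystem, command
    FROM msdb..systasks

    -- On the target server: recreate each task from the captured values.
    EXEC msdb..sp_addtask @name = 'NightlyLoad',
        @subsystem = 'TSQL',
        @command = 'EXEC mydb..usp_nightly_load'

Schedule settings would need to be carried across as well, either through additional sp_addtask parameters or by re-entering them in Enterprise Manager.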
I'm running SQL Server 6.5 and have scheduled a number of tasks to run overnight. This has generally been successful; even when the tasks have failed, it's been because of a lack of data in a feeder database system.
My problem is that when I try to run the tasks manually, I get a message saying that they can't be run because SQL Executive is not running on the server. Now, this ain't true. The little gizmos show green in the server manager, and if I right click on SQL Executive, 'Start' is disabled and 'Stop' is enabled.
Am I missing something here? Does anyone have any clues or suggestions?
I'm going crazy with EM's scheduled tasks. I've set up a couple of backups and weekly maintenance plans from EM. After a couple of days I viewed the scheduled tasks from the EM toolbar and found that none of the tasks has executed as I scheduled them. The task history does not show any history at all. Clicking refresh does not do any good either. However, if I right-click on the database and choose Restore, I see all the backups that I have scheduled. Is this a bug?
We are running SQL 6.5 sp5a. There have been a number of instances recently where some scheduled tasks don't complete. I can't even cancel them. The only way to stop them is to stop and start the Executive service.
Anyone come across this before and know what the problem is?
Is anyone aware of a way of adding scheduled tasks to MSDE?
Obviously in SQL Server you'd use Enterprise Manager to do that, but I don't believe it is possible to get Enterprise Manager separately from SQL Server.
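MSDE does ship with the SQL Server Agent service, so jobs can be created without Enterprise Manager by running the msdb job procedures through osql. A minimal sketch, with the job name, command, and backup path as placeholders:

    -- Create a job with one T-SQL step, schedule it daily at 03:00,
    -- and attach it to the local server so the Agent will run it.
    EXEC msdb.dbo.sp_add_job @job_name = N'NightlyBackup'

    EXEC msdb.dbo.sp_add_jobstep @job_name = N'NightlyBackup',
        @step_name = N'Backup mydb',
        @subsystem = N'TSQL',
        @command = N'BACKUP DATABASE mydb TO DISK = ''C:\backup\mydb.bak'''

    EXEC msdb.dbo.sp_add_jobschedule @job_name = N'NightlyBackup',
        @name = N'Daily 3am',
        @freq_type = 4,             -- 4 = daily
        @freq_interval = 1,         -- every 1 day
        @active_start_time = 30000  -- 03:00:00 as HHMMSS

    EXEC msdb.dbo.sp_add_jobserver @job_name = N'NightlyBackup',
        @server_name = N'(local)'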
What are people's opinions on using SSIS as a "central repository" and replacement for all scheduled tasks? For example, we have a bunch of servers on which we have installed services we have written. Currently we have scheduled tasks on each machine to stop and start the services. One of my colleagues is using SSIS to run a system which runs tasks on multiple machines, by remotely running programs on other machines via scheduled tasks, and then collects the data and puts it into a database.
He's now pitching the idea that we remove the scheduled tasks on each machine and start and stop our services via SSIS so that it's centralized. In addition, we can also check for holidays in our database before starting services. Since it doesn't seem like SSIS was meant for this type of use, I'm wary of using the tool to do something it wasn't intended for.
Any opinions? I'm also worried that the learning curve for everyone is going to be too high.
I have 2 DTS packages that import data from an Access database that have recently started failing. They run fine when executed manually in the DTS designer, but fail when run as a job with an "unspecified error." They were running fine until we installed Outlook and started to add Outlook mail to SQL.
Originally, Administrator was the owner, and that is how the jobs were run. We changed the SQL Agent to start under SQLAdmin, and I changed the owner of the jobs to SQLAdmin. This works for all jobs but these 2. I thought maybe SQLAdmin could not get to the Access database, but it can. I spell out the full path to the database; I don't use any mapped drives. I recreated the jobs logged in as SQLAdmin, and they still do not work as jobs.
Any ideas are much appreciated!! Thanks in advance!! Karen
I googled this and I got wonderful resul... no, no I didn't. Could someone please provide a link or a step-by-step guide on how to schedule the running of SPs that will then send the results to an email address.
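On SQL 2005, one approach, sketched here assuming Database Mail is already set up with a working profile (the profile, recipient, and procedure names are placeholders), is a job step that runs the procedure through sp_send_dbmail:

    -- Run a stored procedure and mail its result set as an attachment.
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = N'MyProfile',
        @recipients = N'someone@example.com',
        @subject = N'Daily report',
        @query = N'EXEC mydb.dbo.usp_daily_report',
        @attach_query_result_as_file = 1

Putting that statement in a T-SQL job step scheduled for the desired time covers both the running and the emailing.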
I tried installing SQL 2005 Dev Edition on my Win 2000 Professional machine. That failed. But now I cannot send out emails or even open a simple Email task in DTS. This is something that worked perfectly fine before I attempted the SQL 2005 installation. It seems that the SQL 2005 installation somehow messed up the MAPI profile. The exact error message that I get when trying to execute or open an Email task in DTS is: "Cannot load MAPI interface layer for DTS. Please make sure that semmap90.dll is installed."
I have a strange problem with a scheduled task failing with the following:
"Unable to send completion notification email to operator with email name '' for task 2780, 'Scheduled Update'"
This job is the same across several servers and the job runs on the other servers. This is a bit frustrating... I cannot seem to find the difference that is causing the problem.
The funny thing is that I am not using SQLMail or anything else to notify anyone, regardless of whether the job succeeds or fails.
I have some "phantom" tasks running: specifically, DB dumps to a NULL pipe device, and they are not scheduled! They do not show up in the SYSTASKS table. Where can I look to possibly find these unscheduled dumps? Thank you.
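One way to catch the dumps in the act, sketched here on the assumption that they show up in sysprocesses with a DUMP command type while running, is to watch the active connections:

    -- Show active connections issuing DUMP commands, with the login and
    -- workstation they came from, to trace the unscheduled dumps.
    SELECT spid, status, loginame, hostname, cmd
    FROM master..sysprocesses
    WHERE cmd LIKE 'DUMP%'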
I scheduled a DBCC CHECKDB command in the SQL scheduled task program at 3:00am. It ran successfully, but since it ran as a scheduled task, I don't know where to find the results. Can anyone help?
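One option, assuming the check runs as a CmdExec task (the server name, credentials, and paths below are placeholders), is to run it through isql and redirect the output to a file:

    isql -S MYSERVER -U sa -P MyPassword -Q "DBCC CHECKDB(mydb)" -o C:\MSSQL\LOG\checkdb_mydb.txt

The -o file then holds the full DBCC output from each scheduled run.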
The import from Flat File Source fails: Error 0xc02020a1: Data Flow Task 1: Data conversion failed.
The data conversion for column "ArticleName" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". (SQL Server Import and Export Wizard)
I have changed the size of the column "ArticleName" (varchar) to max, but the error comes up again.
The data I want to import came in multiple flat files. All of them imported properly, but this one is a problem.
Would any expert here please give me some guidance about which Data Mining tasks can be automated and scheduled via Integration Services packages? Also, if we automate the tasks, can we also automatically save the results of the tasks somewhere? For example, if we automate assessing the accuracy of a mining model, we will want to know the model's accuracy later, so we need to save all the results of the automated actions. Is it possible to achieve this?
Thanks a lot in advance for any guidance and help with this.
Best practice talks about the use of naming conventions in SSIS packages.
Is there a way to rename the default names of the tasks in the Data Flow and Control Flow once and for all, so that each time I drag a new Data Flow Task in, the default name will be DFT and so on?
It is our yearly procedure to change the sa password. This time we did it for the first time on SQL Server 2005. After changing the sa password, all jobs fail with the following error:
Login failed for user 'sa'. [CLIENT: <local machine>]
Error: 18456, Severity: 14, State: 8.
When I check the properties of a job, I can see (in the General tab) that all jobs are running through Windows authentication.
Any newly created jobs run successfully. What am I doing wrong?
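State 8 on error 18456 generally indicates a password mismatch, so something is still logging in as sa with the old password. For reference, the 2005 form of the password change itself is below (the new password is a placeholder); the catch is that any job step, DTS/SSIS package, or connection string that stored the old sa password keeps presenting it and must be updated separately, which is consistent with old jobs failing while newly created ones succeed:

    ALTER LOGIN sa WITH PASSWORD = N'NewStrongPassword'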
After observing brute-force attacks against the SA login on our myhosting VPS instance, I changed the SA login name. Now my two new backup jobs are failing. The Agent service logs in as NT Service\SQLSERVERAGENT. Nothing changed there (so far as I know, and I'm the only one who should be on the VPS). The job owner was SA, and after changing the SA account, that was reflected in the SSMS GUI: job owner 'newsaname'. I'm sure I can just rebuild the maintenance plans, but I'd like to know what happened.
Also, I would like to learn more about the brute-force attacks and how to determine what port they are coming in on. I see an IP address associated with them. Does that mean they are coming in on 1433 or 1434?
SQL 2012 Standard, Windows Server 2012 Standard VPS
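For reference, both the rename and the job-owner change can be expressed in T-SQL (the new login and job names are placeholders); as for the attacks, password guessing against SQL logins normally arrives over the standard TCP port 1433 on a default instance, while UDP 1434 is only the browser service:

    -- Rename the SA login, then re-point a job's owner at the new name.
    ALTER LOGIN sa WITH NAME = newsaname

    EXEC msdb.dbo.sp_update_job @job_name = N'DailyBackup',
        @owner_login_name = N'newsaname'

Maintenance-plan jobs that referenced the old owner or stored the old login internally may still need to be recreated.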
I am using SQL Server 7, all service packs. We recently changed the NT Administrator password because of staffing changes. After doing so, I have 2 scheduled DTS jobs that are failing with an "unspecified error" -2147008507 (80074005). All other scheduled DTS packages succeed (about 20 other jobs, including 1 DTS).
I have not found this error documented. I am assuming this has to do with the password change, but I have not been able to resolve it. The DTS packages run fine by hand (manually). I recreated the scheduled jobs several different ways, but they still fail.
Any information on this is most appreciated! Thank you!!
The full message is:
    ... DTSRun: Executing...
    DTSRun OnStart: Drop table [Intranet].[dbo].[tbl_phone_depts] Step
    DTSRun OnFinish: Drop table [Intranet].[dbo].[tbl_phone_depts] Step
    DTSRun OnStart: Create Table [Intranet].[dbo].[tbl_phone_depts] Step
    DTSRun OnFinish: Create Table [Intranet].[dbo].[tbl_phone_depts] Step
    DTSRun OnStart: Copy Data from Departments to [Intranet].[dbo].[tbl_phone_depts] Step
    DTSRun OnError: Copy Data from Departments to [Intranet].[dbo].[tbl_phone_depts] Step, Error = -2147008507 (80074005)
    Error string: Unspecified error
    Error source: Microsoft Data Transformation Services (DTS) Package
    Help file: sqldts.hlp
    Help context: 700
    Error Detail Records:
    Error: -2147008507 (80074005); Provider Error: 0 (0)
    Error string: Unspecified error
    Error source: Microsoft Data Transformation Services (DTS) Package
    Help file: sqldts.hlp
    Help context: 700
    Error: -2147467259 (80004005...
    Process Exit Code 1. The step failed.
Whenever I make a breaking change to a custom SSIS component/task and update the assembly version, it seems to break my packages beyond repair, telling me it can't load the task:
Error loading Package1.dtsx: Error loading a task. The contact information for the task is "". This happens when loading a task fails.
All of the properties of said task now show:
Could not get value for property 'c-155-designer-name'. Specified cast is not valid.
Typically, a "breaking" change when it comes to code just means that you need to update your components to adhere to the new contract of the updated signatures. But with SSIS, it seems the only solution is to completely remove the component, re-add the new version, and re-enter all of the property values/expressions. If I have a package containing 10 instances of a task that only had one property removed, for example, this results in a very time-consuming process of fixing my package.
So my questions:
1) Am I doing something wrong in my versioning/deployment that is causing my packages to unnecessarily break?
2) If this is just "by design" and the way it's meant to behave, what is the best practice for making breaking changes to custom tasks/components used by many packages? Should I just never change the assembly version, even when it is a breaking change (this seems to be less disastrous)?
3) As a last resort, if I'm stuck with having to fix the broken tasks, is there a better way to fix them rather than having to completely remove them, re-add them, and re-set all of their properties/expressions?
Hello. I am using SQL Server Management Studio (SQL 2005) and created a daily backup job. Inside the job, I have an "Operating system" step to copy the backed-up files to another directory. However, the job keeps failing with the error "Executed as user ... Access is denied." Given this error, how can I change the user used to execute the job? Thanks.
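In SQL 2005, one standard approach (sketched here; the account, password, and names are placeholders) is to run the "Operating system" step under a proxy backed by a Windows account that has rights on the target directory:

    -- Create a credential for an account with access to the copy target,
    -- wrap it in a SQL Agent proxy, and allow CmdExec steps to use it.
    CREATE CREDENTIAL BackupCopyCred
        WITH IDENTITY = N'MYDOMAIN\backupcopy',
             SECRET = N'ItsPassword'

    EXEC msdb.dbo.sp_add_proxy @proxy_name = N'BackupCopyProxy',
        @credential_name = N'BackupCopyCred'

    EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name = N'BackupCopyProxy',
        @subsystem_name = N'CmdExec'

The job step's "Run as" dropdown can then select the new proxy instead of the Agent service account.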
We find that if we deploy the OLAP database with a different name on the test server, then regardless of how we change the connection string provided to the SSIS package that processes the cube, the package fails to connect to the database. To clarify:
In development the OLAP database is called MyOlapDB and the source database is called MySqlDB. Both are on the same machine. When the application is built and released for test, the test team installs the databases on a replica of the production environment (i.e. web app on one machine, OLAP DB on another, and SQL database on yet another). They also, quite rightly, implement the new test databases so they incorporate the build version number. So MyOlapDB123 and MySqlDB123 are both from build 123.
This is when the problems start. Regardless of how the connection string is specified in the job that processes the cube, the SSIS integration package fails with the error:
[Analysis Services Execute DDL Task] Error: Errors in the metadata manager. Either the database with the ID of 'MyOlapDB' does not exist in the server with the ID of 'OurTestServer', or the user does not have permissions to access the object.
We have tried config files and job properties, but neither works. Also, simply attempting to run the package using DTEXECUI does not work either.
Looking inside the XML of the package, we clearly see the ConnectionManager object, which has the original connection string:
Data Source=localhost;Initial Catalog=MyOlapDB;Provider=MSOLAP.3;Integrated Security=SSPI;Impersonation Level=Impersonate;
However, editing the initial catalog here still does not solve the problem. Searching the XML for the string MyOlapDB reveals the OLAP database name in two other places, both within the object data of the two Analysis Services Execute DDL tasks.
Anyone know how to solve this problem without having to hack the XML of the package?
I'm running SQL 6.5 with standard security. We're running NT 4.0. I am a member of the NT Admin group, which, of course, has sa privileges in SQL Server. I changed the sa password (via Enterprise Manager on my client machine), and it worked. I tested the password change by connecting with ISQL/w, both on my client machine and at the server machine.
However, when I attempted to connect to the server via Enterprise Manager (SEM) on my client machine, my login attempt failed. But at the server machine, I connected just fine.
Somehow, even though we're using standard security, I'm getting locked out of SEM because of the new password. Changing the sa password back to what it had been resolved the problem, but the old password was only meant to be temporary.