In SQL 7.0, jobs that have been scheduled start correctly, but jobs will not start when requested manually. All services are running. The only way to fix this problem has been to reboot the server. Does anybody have any ideas about what might be causing this?
Hey all. I've got a DTS package that's scheduled to run after business hours on the last day of the month. This package copies some tables from an offsite SQL Server, then runs through a series of SQL statements, and finally exports an Excel file with the results.
My problem is that the DTS will run if I manually start it, but the scheduled job always fails. Of course, the error I get is that the job failed at step one, and I have no other info.
I'm not a heavy DBA (more on the client app side of things), so I'm unsure how I can debug this. Any help would be greatly appreciated!
Why is it that I can run my DTS package locally, but when I try to run it as a job it always fails? I do realize that the sqladmin account is used to run the job, and it has all the permissions needed.
Any suggestions?
Also, if I kick the job off from my local system, it states that it cannot find the batch file that I am trying to run. I'm running it on the server, but it treats it as if I'm running it locally.
I have 2 DTS packages that import data from an Access database that have recently started failing. They run fine manually in DTS, but not when run as a job. They get an "unspecified error." These were running fine until we installed Outlook and started to add Outlook mail to SQL.
Originally, Administrator was the owner and that is how the jobs were run. We changed SQL Agent to start under SQLAdmin, and I changed the owner of the jobs to SQLAdmin. This works for all jobs but these 2. I thought maybe SQLAdmin could not get to the Access database, but it can. I spell out the full path to the database; I don't use any drive mappings. I recreated the jobs logged in as SQLAdmin, and they still do not work as jobs.
Any ideas are much appreciated!! Thanks in advance!! Karen
Hi all - I'm having problems getting a package to run successfully to completion when I schedule it in SQL Server or as a batch file on the Windows scheduler. If I run the package interactively, or interactively run the batch file that contains the DTSRUN command, it runs to completion. Both packages start with a call to a batch file that FTPs files from a remote server, and then they continue on by executing additional DTS packages within the running package. The owner of all the packages involved is the same user that I am logged in as when running the packages interactively, and is the same user that the SQL Agent and Windows scheduler jobs run under. The FTP step of each package does complete successfully, but then I cannot trace where the package hangs. The package never fails; rather, it just continues in an Executing/Running state. This is getting extremely frustrating. Any insight into this problem would be greatly appreciated.
I have a set of about 6 DTS packages that started failing recently. They all write to a network share using a UNC path. The failure coincides with a change to the account SQL Server and SQL Agent run under. I know security context is very important, and I have verified that the new service account they run under is a domain account that can execute the DTS packages and has full permissions to the network share.
But when the Agent job tries to run, it fails every time. Two of us have been looking at this and can't find the issue.
NOTE: I also made the service account a server administrator (fixed server role) to make sure it wasn't a SQL permissions issue.
I have SQL Server 7 and have used Enterprise Manager to schedule local packages. The package that I'm trying to create runs a SQL script on a nightly basis. The problem I'm having is that I need to be able to install the scheduled script on a customer box using an installer. I have access to execute command-line programs in the installer. Does anyone know if you can schedule local packages (using SQL scripts) from the command line, or if a 3rd-party application can do this? Any help or direction would be greatly appreciated. I've tried to use sp_add_job and sp_add_jobschedule, but haven't been able to get them to work.
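In case it helps with the sp_add_job attempt, the pieces that are easy to miss are sp_add_jobstep and sp_add_jobserver; without them the job exists but has no step to run and no server to run on. Below is a rough T-SQL sketch with made-up names and times, which an installer could feed to osql or isql from the command line:

    USE msdb
    GO
    -- Create the job shell.
    EXEC dbo.sp_add_job @job_name = N'Nightly SQL script'                 -- hypothetical job name

    -- The step that actually runs the script.
    EXEC dbo.sp_add_jobstep
        @job_name      = N'Nightly SQL script',
        @step_name     = N'Run script',
        @subsystem     = N'TSQL',
        @database_name = N'CustomerDb',                                   -- placeholder database
        @command       = N'EXEC dbo.NightlyMaintenance'                   -- placeholder script/procedure

    -- Without this, the local SQL Server Agent never picks the job up.
    EXEC dbo.sp_add_jobserver @job_name = N'Nightly SQL script'

    -- Nightly at 01:00.
    EXEC dbo.sp_add_jobschedule
        @job_name  = N'Nightly SQL script',
        @name      = N'Nightly',
        @freq_type = 4, @freq_interval = 1,
        @active_start_time = 010000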
I have a DTS package that consists of approximately 50 steps, which pulls data from an AS/400 and populates a SQL 7.0 database. The package works flawlessly when executed manually but won't run when scheduled.
The DTS package starts but hangs when it tries to connect to the AS/400; this is where the problem lies. The preceding steps work fine, but they execute against the SQL 7.0 database. I have Client Access installed on the server and configured. I checked and made sure the SQL Server Agent service was started and running as well. I just can't figure out why it works fine manually, but not when scheduled.
Below is a migration plan that I've compiled to migrate SQL 2000 DTS packages to SSIS 2005. Once these DTS packages have been migrated, I will need to create a job and schedule them in SQL 2005.
I would appreciate any feedback or questions on this migration plan.
Migrating DTS 2000 packages to SSIS 2005:
1. Save the current production DTS package as a structured storage file. We do not have a UDL file; we set the data connections within each DTS package.
2. Go to SQL 2005 - Management > Legacy > Data Transformation Services - right-click and open the previously saved structured storage file.
3. Modify the DTS data credentials to reflect the SQL 2005 connection data. Modify any SQL 2000 MAPI settings to utilize the new SQL 2005 Database Mail. Save the package on SQL 2005.
4. After the modified DTS package has been updated and saved on SQL 2005, save this file as a structured storage file.
5. Go to BIDS. Create a new SSIS project. Right-click on SSIS Packages and select Migrate DTS 2000 Package. This will migrate the DTS 2000 package with the updated SQL 2005 data credentials.
6. Click on the package properties - ProtectionLevel - and change it to DontSaveSensitive.
7. Right-click and select Package Configurations..., then choose to store the data credentials in XML format.
8. Right-click on the Execute DTS 2000 Package task, select Edit..., and click Load DTS2000 package internally. This will embed the DTS 2000 package inside the new SSIS package. Test the package. Continue if successful.
9. Use the SSIS deployment functionality to move the package over to SQL 2005 Integration Services. Right-click on the package and select Run Package; if successful, create a job and schedule it to run under the SQL 2005 Agent.
10. When creating the job under SQL Agent, change the Owner of the job to reflect the owner of the new SSIS package. Schedule the job.
I'm just coming up to speed on SSIS 2005, so this is what I've been able to piece together up to this point, and I'm looking for some industry advice/feedback on whether or not this is a good migration plan. I need to provide a migration plan to management by 2/18. Thanks
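For steps 9 and 10, a rough T-SQL sketch of creating and scheduling the Agent job for the migrated package is below. The job name, owner, package path, server name, and schedule are all placeholders, so treat it as a starting point rather than the exact commands:

    USE msdb
    GO
    -- Create the job; the owner should match the owner of the migrated SSIS package (step 10).
    EXEC dbo.sp_add_job
        @job_name         = N'Run migrated DTS package',        -- hypothetical job name
        @owner_login_name = N'DOMAIN\SSISOwner'                  -- placeholder owner

    -- A single step that runs the package through the SSIS subsystem.
    EXEC dbo.sp_add_jobstep
        @job_name  = N'Run migrated DTS package',
        @step_name = N'Execute SSIS package',
        @subsystem = N'SSIS',
        @command   = N'/SQL "\MigratedPackages\MyPackage" /SERVER "SQL2005SERVER"'   -- placeholder path/server

    -- Register the job with the local server and give it a nightly schedule.
    EXEC dbo.sp_add_jobserver @job_name = N'Run migrated DTS package'

    EXEC dbo.sp_add_jobschedule
        @job_name  = N'Run migrated DTS package',
        @name      = N'Nightly',
        @freq_type = 4, @freq_interval = 1,
        @active_start_time = 220000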
I have a SQL scheduled job in SQL 2005 SP2 that executes a few different SSIS packages. When I update the SSIS packages, SQL Agent runs the job and shows success; however, the underlying packages are not executed. The packages are located on the server, but we are using UNC paths in the jobs since much of the time they are started manually from a remote workstation, and the jobs seem to look at the initiating workstation's file system rather than that of the SQL Server. Is this a known issue?
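For what it's worth, when a step runs under SQL Agent the command is resolved on the server by the Agent service account, not by the workstation that started the job, so a UNC path in the step command is usually the safer form. A minimal sketch of such a step, with a placeholder job name, share, and package:

    -- The SQL Agent service account (not the workstation user) must be able to read this share.
    EXEC msdb.dbo.sp_add_jobstep
        @job_name  = N'Nightly SSIS load',                            -- placeholder job name
        @step_name = N'Run package from share',
        @subsystem = N'SSIS',
        @command   = N'/FILE "\\FileServer\Packages\LoadData.dtsx"'   -- placeholder UNC path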
Would any expert here please give me some guidance about which data mining tasks can be automated and scheduled via Integration Services packages? Also, if we automate the tasks, can we also automatically save the results of the tasks somewhere? For example, if we automate assessing the accuracy of a mining model, we will want to know the mining model's accuracy later, so we need to save all the results of the automated actions. Is it possible to realize this?
Thanks a lot in advance for any guidance and help with this.
We are using the MSDE database engine (1.0 = SQL 7) as a report scheduler on a web server. I can use SQL 7 Enterprise Manager to connect to and view MSDE, but I cannot use DTS and job scheduling, as MSDE seems to lack the needed components. Using Microsoft Knowledge Base article Q241397 as a guide, I was able to create a backup of the msdb database using stored procedures, including sp_add_job, sp_add_jobserver, etc., that are already present in the msdb database.
My problem is that I need to create and schedule a job for another MSDE database called CE8. This database was created by the reporting app and contains no stored procedures. It looks to me like only msdb has the needed job scheduling procedures. Does this imply that I am supposed to schedule all jobs for all databases via the msdb database, or will I need to copy all the job scheduling stored procedures into the CE8 database and then create a job? Hope this makes sense, Randy
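For what it's worth, the job procedures are meant to live only in msdb; the job itself is always defined there, and each T-SQL step can point at any database through the @database_name parameter, so nothing needs to be copied into CE8. A minimal sketch, with a made-up job name and step command:

    USE msdb
    GO
    -- The job is defined in msdb...
    EXEC dbo.sp_add_job @job_name = N'CE8 nightly maintenance'            -- hypothetical job name

    -- ...but the step runs against CE8.
    EXEC dbo.sp_add_jobstep
        @job_name      = N'CE8 nightly maintenance',
        @step_name     = N'Run maintenance script',
        @subsystem     = N'TSQL',
        @database_name = N'CE8',
        @command       = N'EXEC dbo.SomeMaintenanceProc'                  -- placeholder command

    -- Register with the local server and schedule it daily at 02:00.
    EXEC dbo.sp_add_jobserver @job_name = N'CE8 nightly maintenance'

    EXEC dbo.sp_add_jobschedule
        @job_name  = N'CE8 nightly maintenance',
        @name      = N'Daily 2am',
        @freq_type = 4, @freq_interval = 1,
        @active_start_time = 020000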
I have inherited a deployment running inside a custom .NET application. I'm looking to deploy an updated version of a report to the report server. My question is: is there a manual way of deploying an RDL? I'm afraid of taking down the application entirely, and I do not have detailed enough documentation to bring it back up. Thanks!
Interested in creating a manual standby database. I would like to implement this with the SQL Server Standard Edition we have available. Any ideas or recommendations? Thanks, TY
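In case it helps, a manual standby on Standard Edition usually amounts to hand-rolled log shipping: back up the log on the primary, copy the file across, and restore it on the standby WITH STANDBY (or NORECOVERY). A rough sketch with placeholder database names and paths:

    -- On the primary: one full backup, then periodic log backups.
    BACKUP DATABASE MyDb TO DISK = N'\\Standby\Backups\MyDb_full.bak'      -- placeholder path
    BACKUP LOG MyDb TO DISK = N'\\Standby\Backups\MyDb_0100.trn'

    -- On the standby: restore the full backup once, leaving the database able to accept further logs.
    RESTORE DATABASE MyDb FROM DISK = N'D:\Backups\MyDb_full.bak'
        WITH STANDBY = N'D:\Backups\MyDb_undo.dat'                         -- readable between restores

    -- Apply each log backup in order as it arrives.
    RESTORE LOG MyDb FROM DISK = N'D:\Backups\MyDb_0100.trn'
        WITH STANDBY = N'D:\Backups\MyDb_undo.dat'

    -- At failover time, bring the standby fully online.
    RESTORE DATABASE MyDb WITH RECOVERY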
Assuming that I have a table [users] in an MSSQL 2005 database, and the table has two columns, [userId] and [password]. I have an ASPX page with C# code-behind, with two textboxes and a button. I want to let the user log in, or send him an error message if the password or the username does not match the table in the database. I always made this possible through the template database when administering the website, but I never tried to do it as simply as that. I'm lost here! How can I do it?
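Not a complete answer, but the database side of that check is usually just a parameterized lookup like the sketch below (column names taken from the post, parameter names made up). The C# page would supply @userId and @password through a parameterized SqlCommand rather than concatenating the textbox values into the SQL string, and in a real application the [password] column would normally hold a hash rather than plain text:

    -- Returns one row when the credentials match, no rows otherwise.
    SELECT u.[userId]
    FROM dbo.[users] AS u
    WHERE u.[userId] = @userId
      AND u.[password] = @password

    -- If the reader returns a row, log the user in; otherwise show the error message.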
I would like to automate a simple process which involves truncating a table and importing records back in. What would be the simplest solution to automate this kind of process? I'd like to run this process twice a day.
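One common way to do this is a SQL Agent job with a single T-SQL step that truncates and reloads, plus a schedule that fires twice a day. A sketch with made-up table, file, and job names; the BULK INSERT could just as easily be an INSERT ... SELECT from wherever the records come from:

    USE msdb
    GO
    EXEC dbo.sp_add_job @job_name = N'Reload staging table'               -- hypothetical job name

    EXEC dbo.sp_add_jobstep
        @job_name      = N'Reload staging table',
        @step_name     = N'Truncate and reload',
        @subsystem     = N'TSQL',
        @database_name = N'MyDatabase',                                   -- placeholder database
        @command       = N'TRUNCATE TABLE dbo.StagingTable;
                           BULK INSERT dbo.StagingTable
                           FROM ''D:\Feeds\staging.csv''
                           WITH (FIELDTERMINATOR = '','');'

    EXEC dbo.sp_add_jobserver @job_name = N'Reload staging table'

    -- Twice a day: every 12 hours starting at 06:00.
    EXEC dbo.sp_add_jobschedule
        @job_name             = N'Reload staging table',
        @name                 = N'Twice daily',
        @freq_type            = 4,    -- daily
        @freq_interval        = 1,
        @freq_subday_type     = 8,    -- units of hours
        @freq_subday_interval = 12,
        @active_start_time    = 060000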
I have only ever created and deployed SQL 2005 reports to the report server using Business Intelligence Studio. I have a client whose network will not allow me to connect remotely to their reporting server so that I can deploy their reports.
If I deploy the reports to my own report server, how can I then manually upload them to their report server using their Report Manager instead of my Business Intelligence Studio? The Upload File option only accepts one file at a time, whereas the Business Intelligence solution has the files split up instead of in one neat package.
I never had to use DTS in MSSQL 2000, but I'm finding a need to use SSIS quite a bit in a new position. I know this is subjective, but I'm looking for suggestions for a reference book with good examples and tutorials.
I recently built a SQL Server x64 server, where I changed the default install path of SSIS. My reason for doing so was to move the SSIS engine off of the system volume, onto a different set of spindles, to reduce potential contention between the OS, the SQL Engine and the SSIS Engine. It is also documented in BOL as an option for the components.
The 64-bit runtime installed fine on the new volume; however, the 32-bit runtime was not installed at all. I checked Program Files (x86) and the directories I chose during the initial install, but only the 64-bit runtime exists on the server. Since 64-bit drivers are limited (to put it mildly), I would say about half of the packages I run through jobs require the 32-bit runtime to work.
BOL doesn't seem to indicate that the 32-bit runtime will not install if the default path is changed, but the following note indicates that I would not even be able to uninstall/reinstall the components back to the system drive:
A single installation path is shared between SQL Server Integration Services, Notification Services, and Client Components. Changing the installation path for one component also changes it for other components. Subsequent installations install components to the same location as the original installation.
I am wondering if there is a command-line install option to ensure the 32-bit runtime is installed, or if there is a way to install the 32-bit SSIS runtime manually. And if not, can the registry be edited to force a re-install back to the default path?
I'm trying to install SQL 2005 Express Edition, the advanced version that includes SSRS. The installation fails because it cannot upgrade the existing SSRS 2000. I tried to uninstall the latter from Add/Remove Programs, and I get "Fatal error during installation". IIS was stopped prior to the uninstall attempt.
SSRS 2000 will thus not uninstall. How can I do a manual uninstall of it? I need to retain my SQL 2000 instance, so the manual uninstall of SSRS must not harm it.
Hello all programmers, I need your help. I use a SQL Server Express database with C#. I entered some data into the DB by clicking on the table in Server Explorer in the Visual Studio 2005 IDE, and then I run SQL in code like this:

    if (e.KeyCode == Keys.Return)
    {
        bool res;
        int OldPlayerId = 0;
        string ConnectionStr = Program.Member_Connect;

        // Look up the player ID for the code typed into the textbox.
        string sqlQuery = "select Player_ID from Player where Player_Code = '" + textBox1.Text + "'";

        SqlConnection Sql_connection = new SqlConnection(ConnectionStr);
        Sql_connection.Open();

        SqlCommand command = new SqlCommand(sqlQuery, Sql_connection);
        SqlDataReader dataReader = command.ExecuteReader();

        if (dataReader.HasRows)
        {
            if (dataReader.Read())
                OldPlayerId = dataReader.GetInt32(0);
            else
                MessageBox.Show("rtet");
        }
        else
        {
            res = false;
        }

        dataReader.Close();
        comboBox2.SelectedValue = textBox1.Text;

        command.Dispose();
        Sql_connection.Close();
        Sql_connection.Dispose();

        comboBox1.SelectedValue = OldPlayerId;
    }

It works correctly on the old data that I entered manually, but when I enter new data from my project, this query doesn't return the correct value; it still works correctly only on the old data.
I tried to get the same data through a combobox bound to a view that contains the same SQL, and that always works, but I need the manual SQL. Please help me; I am very late delivering my project. Thank you very much.
I've got a Java application that connects to a SQL Server 2000 database. The application must access the database with full permissions, but I don't want anybody to be able to insert or delete data with the corporate administrator of SQL Server 2000. How can I lock the corporate administrator out so that manual manipulation is not permitted, but my application can still work properly? Thanks!
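One pattern sometimes used for this, assuming the people you want to block are not sysadmins (whom you cannot really restrict), is an application role: ordinary logins get no direct permissions on the tables, and only the application, which knows the role's password, activates the role after connecting. A rough sketch with placeholder names:

    -- Create an application role; only the application knows this password.
    EXEC sp_addapprole @rolename = N'MyAppRole', @password = N'StrongPasswordHere'   -- placeholder password

    -- Give the role the rights the application needs.
    GRANT SELECT, INSERT, UPDATE, DELETE ON dbo.Orders TO MyAppRole                   -- placeholder table

    -- Ordinary users keep no direct table permissions, so they cannot change data
    -- from Enterprise Manager or Query Analyzer with their own logins.

    -- The application activates the role right after opening its connection:
    EXEC sp_setapprole @rolename = N'MyAppRole', @password = N'StrongPasswordHere'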
Hi, I am attempting to take a manual backup of a SQL 2005 database via Right Click on MyDatabase -- Tasks -- Backup [specify the filename, path, and type of backup]. However, when I try to specify the filename or, in fact, do anything, I receive the following error:

Property BackupDirectory is not available for Settings 'Microsoft.SqlServer.Management.Smo.Settings'. This property may not exist for this object, or may not be retrievable due to insufficient access rights. (Microsoft.SqlServer.Smo)

The user I'm logged into the database with, however, has full rights within SQL. Does anyone have any ideas? Thanks!
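While you chase down the SMO error, the same backup can be taken directly from a query window; a minimal sketch, with a placeholder path (the SQL Server service account needs write access to it):

    BACKUP DATABASE MyDatabase
    TO DISK = N'D:\Backups\MyDatabase_full.bak'      -- placeholder path
    WITH INIT, NAME = N'MyDatabase full backup'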
Hi, I'm using the JDBC APIs to access an MS SQL 2000 DB. I have a manual-commit transaction block involving multiple inserts and updates. One of the steps involves retrieving the auto-generated key (identity column), which I presume can be done with @@IDENTITY. My question is: will @@IDENTITY work if auto-commit is turned off?
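For reference, identity retrieval does not depend on the commit; the value is visible inside the open transaction as soon as the INSERT runs. A sketch you can run in Query Analyzer to confirm the behavior (table name made up); SCOPE_IDENTITY() is usually preferred over @@IDENTITY because it is not affected by inserts performed by triggers:

    BEGIN TRANSACTION      -- the same situation as auto-commit being off

    INSERT INTO dbo.Orders (CustomerName) VALUES (N'Test customer')   -- placeholder table

    -- Both return the new identity value even though nothing has been committed yet.
    SELECT @@IDENTITY       AS LastIdentityAnyScope,
           SCOPE_IDENTITY() AS LastIdentityThisScope

    COMMIT TRANSACTION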
Is there a way to start validation of external metadata manually?
My problem is this:
The package uses a variable as the connection string for the flat file source, and another variable for the destination table. Running the package gives a warning about external metadata that needs to be updated. Normally I update this metadata by just opening the data flow and answering the update prompt with yes. This time that doesn't work, I think because the variable is not set, so there cannot be any conflicts with the external metadata.
I don't want to disable validation; I just want to validate one time and then save the package.
I have got the error "a network-related or instance-specific error occurred" on SQL Server 2012. I have enabled TCP/IP and restarted the services. The SQL Server service is getting stopped even after a manual restart. I checked Event Viewer and noticed an error. Here it is: The log scan number (43:456:1) passed to log scan in database 'model' is not valid. This error may indicate data corruption or that the log file (.ldf) does not match the data file (.mdf). If this error occurred during replication, re-create the publication.
We have a scenario where we want to manually lock and unlock some rows in a table in a SQL CE database. First, the rows we need should be locked. Using a select query, the locked rows will be read and processed in the front end. Only when the process is completed in the front end should I manually unlock the rows I locked earlier. While rows are in the locked state, nobody should be allowed to access those rows (nobody should even be able to run a select query against the locked rows). At the same time, appending rows to the table should always be allowed. Is it possible to achieve the above scenario?
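Holding real database locks across user interaction isn't practical in SQL Server Compact, so this is usually simulated in the schema instead: a "locked by" column that rows are claimed through, with every reader filtering claimed rows out. That only protects the rows by convention (a query that ignores the column will still see them), but a sketch of the idea, with made-up table and column names, looks like this:

    -- Claim (lock) a batch of rows for this client; rows already claimed by someone else are skipped.
    UPDATE WorkItems
    SET    LockedBy = @clientId, LockedAt = GETDATE()
    WHERE  ItemId IN (1, 2, 3)            -- placeholder keys
      AND  LockedBy IS NULL

    -- Readers co-operate by only touching unclaimed rows; inserts into the table are unaffected.
    SELECT ItemId, Payload
    FROM   WorkItems
    WHERE  LockedBy IS NULL

    -- When the front end finishes, release the claim.
    UPDATE WorkItems
    SET    LockedBy = NULL, LockedAt = NULL
    WHERE  LockedBy = @clientId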
I'm trying to install SQL Mobile from a Windows XP SP machine to a Dell Axim X51v mobile device.
I've read the installation procedures and I've been unable to locate the CAB files. I have located DLLs and ZIP files but no CAB files. Where else might I find the CAB files?
I've run into a problem with SSIS packages wherein tasks that write or copy files, or create or delete directories, quit execution without any hint of an error or a failure message when called from an ASP.NET 2.0 application running on any machine other than the one where the package was created. By all indications it appears to be an identity/permissions problem.
Our application involves a separate web server and database server. Both have SQL Server 2005 installed, but the application server originally had only Integration Services. The packages are deployed to the file system on the application server and are called using Microsoft.SqlServer.Dts.Runtime methods. For all packages that involve file system tasks, the above problem occurs.
When the above packages are run using the command prompt (either DTEXEC or DTEXECUI) the packages execute just fine. This is expected since we are using an administrative account. However when a ShellExecute of the same command is called from ASP.NET, the same problem occurs.
I've tried giving administrative permissions to the ASPNET worker process user to no avail.
I have likewise attempted to use the SQL Server Agent job approach but that approach might not be acceptable for our clients since it means installing SQL Server 2005 Database services on the application server.
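For completeness, the Agent-job approach mentioned above usually amounts to the web application asking the database server to start a job (so the package runs under the SQL Agent account rather than the ASP.NET identity) instead of launching dtexec itself. A sketch of the server-side pieces, with placeholder names and paths:

    -- On the database server: a job whose single step runs the package under the SQL Agent service account.
    EXEC msdb.dbo.sp_add_job @job_name = N'Run file copy package'         -- hypothetical job name

    EXEC msdb.dbo.sp_add_jobstep
        @job_name  = N'Run file copy package',
        @step_name = N'Execute package',
        @subsystem = N'SSIS',
        @command   = N'/FILE "D:\Packages\FileTasks.dtsx"'                -- placeholder path

    EXEC msdb.dbo.sp_add_jobserver @job_name = N'Run file copy package'

    -- The web application then only needs permission to start it:
    EXEC msdb.dbo.sp_start_job @job_name = N'Run file copy package'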
I have read the relevant threads in this forum, namely http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1044739&SiteID=1 and http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=927084&SiteID=1, but failed to find any solution appropriate for our setup.