Hi All,
When creating steps in a job, is it possible to execute several DOS commands in one CmdExec step, instead of creating several steps with a single DOS command in each?
For example:
If I create a job that executes those three commands in three separate steps, it works fine. But if I put all three DOS commands in one step, it won't work. Somehow SQL doesn't 'understand' that it should start the next command after the previous one ends, or I've missed something here (apparently so!).
I know that if I put all three DOS commands into a DOIT.BAT and execute it, it will work. But I want to use a SQL job to schedule it to run on a regular basis.
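A minimal sketch of a single step that chains several commands, assuming hypothetical file names and a hypothetical job: the commands are joined with && and handed to cmd.exe explicitly, so one CmdExec step runs them in sequence and stops at the first one that fails.

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'MyNightlyJob',          -- hypothetical job name
     @step_name = N'Run three DOS commands',
     @subsystem = N'CmdExec',
     @command   = N'cmd /c "copy C:\in\data.txt C:\work\ && C:\apps\process.exe C:\work\data.txt && del C:\work\data.txt"';

The && operator only runs the next command when the previous one succeeded; use & instead if every command should run regardless.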
Has anyone run into this same problem? Thanks in advance.
David Nguyen.
Here is an interesting problem I can't figure out. I have a job with 6 steps as follows:
Step 1 - Import text file 1
Step 2 - Import text file 2
Step 3 - Delete all data from address tables 1 and 2
Step 4 - Copy data from imported table 1 to address table 1
Step 5 - Copy data from imported table 2 to address table 2
Step 6 - Delete imported text file tables 1 and 2
Now, when I run each of these steps individually, running the DTS packages and stored procedures myself, it all works fine and the data in my tables is updated. But when I set the job to run automatically, it reports completion with no errors, yet my data hasn't updated. The job must be doing something close to what it is meant to, as it took about 40 seconds, which is normal.
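A small diagnostic sketch, assuming the job name below is a placeholder: the step-level rows in the job history often show which security context and message each scheduled step actually produced, which is usually where a "successful but nothing changed" run gives itself away.

SELECT  j.name, h.step_id, h.step_name, h.run_status,
        h.run_date, h.run_time, h.message
FROM    msdb.dbo.sysjobhistory AS h
JOIN    msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE   j.name = N'Import address data'      -- hypothetical job name
ORDER BY h.instance_id DESC;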
On my production server all the jobs are scheduled and run under the sa user ID. Sometimes the steps in the jobs are not visible in SQL Server Management Studio even though the jobs are working, and sometimes the steps are visible. Do I have to check or change any settings?
Hi, I have two jobs, J1 and J2, each with 10 steps. Now I want J2 to become the 11th step of J1, and I don't want to manually retype all the steps of J2 as steps 11-20 of J1. Is there an easy way to do this through T-SQL? sp_add_jobstep only works when the step is an OS command or a script, not a whole job.
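A sketch of one way to do it with T-SQL, assuming both jobs already exist, the instance is SQL 2005 or later (for nvarchar(max)), and only the fields shown need to be carried over: read J2's step definitions from msdb and re-add them to J1 with sp_add_jobstep.

DECLARE @nextStep int;
SELECT  @nextStep = MAX(s.step_id)
FROM    msdb.dbo.sysjobsteps AS s
JOIN    msdb.dbo.sysjobs     AS j ON j.job_id = s.job_id
WHERE   j.name = N'J1';

DECLARE step_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT s.step_name, s.subsystem, s.command, s.database_name
    FROM   msdb.dbo.sysjobsteps AS s
    JOIN   msdb.dbo.sysjobs     AS j ON j.job_id = s.job_id
    WHERE  j.name = N'J2'
    ORDER BY s.step_id;

DECLARE @step_name sysname, @subsystem nvarchar(40),
        @command nvarchar(max), @db sysname;

OPEN step_cur;
FETCH NEXT FROM step_cur INTO @step_name, @subsystem, @command, @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @nextStep = @nextStep + 1;
    EXEC msdb.dbo.sp_add_jobstep
         @job_name          = N'J1',
         @step_id           = @nextStep,
         @step_name         = @step_name,
         @subsystem         = @subsystem,
         @command           = @command,
         @database_name     = @db,
         @on_success_action = 3;   -- go to the next step
    FETCH NEXT FROM step_cur INTO @step_name, @subsystem, @command, @db;
END
CLOSE step_cur;
DEALLOCATE step_cur;

-- Afterwards, set the last copied step back to "quit with success"
-- with sp_update_jobstep (@on_success_action = 1).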
I have tried to make a DTS task run a batch file, on the same server that SQL Server is running on, using the CmdExec command.
My command is as follows:
CmdExec \\server\path\batchname.bat
The problem is that every time the step runs, it reports the following error:
"The system cannot find the file specified."
In the .bat file I am also trying to un-map/map a network drive with a different login name/password than the one used on the SQL Server. Is this a problem?
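A sketch of how this is often set up, assuming hypothetical server, share, and account names: reference the batch file by a path the SQL Server Agent service account can actually see (the service has no drive mappings of its own), and let the batch file map and unmap the drive it needs.

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Run nightly batch',      -- hypothetical job name
     @step_name = N'Run batch file',
     @subsystem = N'CmdExec',
     @command   = N'cmd /c \\MyServer\MyShare\MyBatch.bat';

-- Inside MyBatch.bat (sketch):
--   net use Z: \\OtherServer\OtherShare SomePassword /user:MYDOMAIN\SomeAccount
--   rem ... copy work here ...
--   net use Z: /delete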
Hope to hear from you soon, this is a bit urgent!!
Please send tips to this address: qtip@bigfoot.com
I have to verify that a .CSV file exists before I run a BULK INSERT. I am using xp_fileexist in SQL 2000 to accomplish this. After the bulk insert is completed and validated, I need to rename the file and move it to an archive folder. For testing, I figure that if I can rename the file I can move it. I suspect I have permission issues and need to give the SQL Server Agent permissions to this folder and file. I have my PC set up as a SQL 2000 server and am attempting to get this step working on my local machine only. I created a nightly job that renames a file I created, and that is all it does. I am running the job as SA but am still having issues.
The step being executed by the job is "Ren C:\MyTestFile.csv C:\MyTestFile1.csv" (with the quotes). If I run this statement (without the quotes) from a command prompt, the file is renamed.
I have set the type to "Operating System Command (CmdExec)". The job history shows "The process could not be created for step 2 of job 0x71D51027F920A140A2913234DB7FF509 (reason: The system cannot find the file specified). The step failed."
As I said, I suspect that it is a permissions issue, as the command works from the command prompt. What is the Windows account that the SQL Server Agent uses to run these commands? I added "Everyone" with full access to the folder and I still get the same failure.
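One detail that often causes exactly this error, shown as a sketch with the paths from the post and a hypothetical job name: ren (like copy and del) is a cmd.exe built-in, not an executable, so a CmdExec step has to launch cmd.exe explicitly; "The system cannot find the file specified" usually refers to a missing ren.exe rather than to the .csv. Note also that ren takes the new name without a drive or path.

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Nightly CSV load',       -- hypothetical job name
     @step_name = N'Rename CSV',
     @subsystem = N'CmdExec',
     @command   = N'cmd /c ren C:\MyTestFile.csv MyTestFile1.csv';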
I would appreciate any assistance anyone could provide. Thanks in advance!
We have a number of SQL 6.5 servers, all of which run many scheduled tasks. However, on one of them, all tasks that use CmdExec to run batch files fail after 2-3 seconds. The history returns 'No message' and the error logs just indicate that the tasks have failed.
The identical tasks run fine on other servers with what appear to be identical configurations and setups.
I would like to trap the return value from a scheduled CmdExec step. The CmdExec command returns 0 if it succeeds and something other than 0 if it doesn't.
Can I raise an error from a command file? The command file calls a console application (i.e., it has no interface).
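A sketch, assuming a hypothetical job name and path: SQL Server Agent compares the exit code of the process it launched against the step's "process exit code of a successful command" (0 by default), so the console application's own return value, or an explicit exit code from a batch file, is what passes or fails the step.

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'MyScheduledJob',           -- hypothetical job name
     @step_name = N'Run console app',
     @subsystem = N'CmdExec',
     @command   = N'D:\Apps\MyConsoleApp.exe', -- hypothetical path
     @on_success_action = 1,                   -- quit reporting success
     @on_fail_action    = 2;                   -- quit reporting failure

-- If the step runs a batch file instead, ending the batch with
--   exit /b 1
-- (or any other non-zero value) is one way to raise an error from the command file.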
I have a job with a CmdExec job step that executes a batch file. In SQL Server 7.0 it runs fine, but in SQL 2000, when I try to run that job, it gives the following error and fails:
"The process could not be created for step 1 of job 0x677EF599B13FA743AA2D501D4C211AC4 (reason: The system cannot find the file specified). The step failed."
In fact, I am not able to execute any CmdExec job in SQL 2000. The owner of the above job is sa. Does it have to do with the SQLAgentCmdExec account, which is set to corporate/administrator and has the required permissions?
Thanks for the invitation to post a question, so I will post one.
I need to create a job step that uses CmdExec.
This is the command line I am entering: D:\odbc\TimeClockUpdates\bin\Release\TimeKeepingNonLogouts.exe
When I run this job the job fails. When I look at the job history, the only information I get is the date and time, user that ran the job and the fact that it failed. I haven't been able to get any CmdExec job to run at all. Can anyone tell me what I'm doing wrong?
Facts:
1. This exact same command is used by my network administrator in Windows Scheduler on the server. The only reason he wants me to create a SQL Server job is that it's mostly SQL functions.
2. I know the SQL Server Agent is running, because I have other jobs that run.
3. I have verified that I have permission to run the file, because I can go to the actual directory and run the exe.
4. Do I need to enclose my command in quotes, i.e. "D:\odbc\TimeClockUpdates\bin\Release\TimeKeepingNonLogouts.exe"? (See the sketch below.)
5. The path of the file I need to run is the path on the server, not the path on my local machine.
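A sketch of the step, using the reconstructed path from the post and a hypothetical job name: quotes are only required when the path contains spaces, so here they change nothing; what usually matters is that the SQL Server Agent service account (not the user who created the job) can reach and run the .exe on the server.

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'TimeClock update',       -- hypothetical job name
     @step_name = N'Run TimeKeepingNonLogouts',
     @subsystem = N'CmdExec',
     @command   = N'"D:\odbc\TimeClockUpdates\bin\Release\TimeKeepingNonLogouts.exe"';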
If you need any other information, please let me know. Thanks for your help. GEM
I've been working on this for a while, not full time, but seeking a solution... Here is what I want to do and what I've done so far:
I want to build a SQL job so I can run a package (which loads 2 Excel sheets into 2 tables) passing "dynamic" parameters, like convert(varchar, getdate(), 112) in the format YYYYMMDD. From what I've found, I can do this with a stored procedure, which first sets this variable and then builds a statement that can be run by xp_cmdshell. For example:
And this statement runs with no problem in the SP until it reaches the step that loads the Excel sheets into tables. There it gives an error about the JET driver. From what I've read in the forum and from the error output, the problem seems to be that this statement executes the 64-bit dtexec.exe; even after following the suggestion to set the Run64BitRuntime property to False, it still runs the 64-bit dtexec.exe.
So I changed the statement to point to the 32-bit dtexec.exe:
But this statement does not even run. The error is: 'C:\Program' is not recognized as an internal or external command, operable program or batch file.
This was very weird to me, because this approach has been mentioned often by the moderators.
So I copied this same statement and created a job with a step type of "Operating System (CmdExec)", and it runs great, with no problem extracting from the Excel source.
Now my questions are: 1. Why does the CmdExec job step recognize the path of the 32-bit dtexec.exe ('C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe'), but trying to run it with xp_cmdshell gives the error mentioned above? And is there another way to point at the 32-bit dtexec.exe besides this one?
2. If I cannot run it through xp_cmdshell, how could I pass a parameter like convert(varchar, getdate(), 112) in the format YYYYMMDD instead of the static parameter 20070101?
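A sketch for both questions, assuming the package exposes a variable called User::FileDate and lives in msdb under a hypothetical folder: the 'C:\Program' error comes from the space in "Program Files (x86)" once the command shell strips quotes, and a common workaround for xp_cmdshell is to quote the dtexec path and wrap the whole command line in one extra outer pair of quotes. The date is built in T-SQL and concatenated in, replacing the static 20070101.

DECLARE @cmd  varchar(1000),
        @date char(8);

SET @date = CONVERT(char(8), GETDATE(), 112);   -- YYYYMMDD

SET @cmd = '""C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe"'
         + ' /SQL \MyFolder\LoadExcelPackage /SERVER MyServer'    -- hypothetical package and server
         + ' /SET \Package.Variables[User::FileDate].Properties[Value];' + @date
         + '"';

EXEC master.dbo.xp_cmdshell @cmd;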
I want to copy a backup from the production server to another server periodically. The following is the DOS command, and it works at the DOS prompt (d: is the local drive and g: is the drive mapped to the other server):
"copy d:\backup\backup_CustomerDB.dat g:\mssql\backup\backup_on_3Wednesday\backup_CustomerDB.dat"
I tried to use the above command in a scheduled task with CmdExec as the type, and it failed. Then I tried "cmd c/ copy d:\backup\backup_CustomerDB.dat g:\mssql\backup\backup_on_3Wednesday\backup_CustomerDB.dat".
This time it started to run with no error, but it ran for 40 minutes and I cancelled it, since it should only take about 10 minutes.
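A sketch of the step, with hypothetical server and share names: the g: mapping belongs to the interactive user rather than to the service account running the scheduled task, so a UNC path is the usual choice, and copy is a cmd.exe built-in, so the switch is written /c (note that "cmd c/" would just start a command shell that never exits, which could explain the 40-minute run).

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Copy backup offsite',    -- hypothetical job name
     @step_name = N'Copy backup file',
     @subsystem = N'CmdExec',
     @command   = N'cmd /c copy d:\backup\backup_CustomerDB.dat \\OtherServer\mssql\backup\backup_on_3Wednesday\';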
I'm a bit confused. On the command line of the job step property I entered dtexec /SQL... and got an error saying the file was not found; I assumed dtexec itself couldn't be found. So I tried /SQL ... by itself and got something that looked more like a security error. If I make the step type "SSIS", the job appears to run fine: I receive my package's on-success (rather than on-failure) email, but I know everything isn't fine, because even if no data is ETL'd, the first executable is supposed to (and always has at the client) insert a row into an audit table, and it doesn't. If I set the job step type to T-SQL and simply Database Mail myself with a T-SQL command, everything is fine.
The first question is: wouldn't dtexec need to be specified? How else could SQL Agent know what I'm trying to run? If the answer is yes, what's wrong with my syntax or environment?
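A sketch of a CmdExec step command line, with hypothetical package and server names: for the CmdExec step type the executable does need to be named, and since there is no guarantee dtexec is on the Agent service account's PATH, the full (quoted) path is the safer form. The SSIS step type, by contrast, runs the package for you, which is why no dtexec is needed there.

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Run ETL package',        -- hypothetical job name
     @step_name = N'dtexec step',
     @subsystem = N'CmdExec',
     @command   = N'"C:\Program Files\Microsoft SQL Server\90\DTS\Binn\dtexec.exe" /SQL \MyFolder\MyPackage /SERVER MyServer';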
Can someone tell me how I can generate an exit code (CmdExec)? I've tried running an .exe file in a job, but when I do that the job hangs. When I check the history, it says the exit code is somewhere around 27000. Now I'm wondering if it is possible for me to generate an exit code when the executable closes. PS: when I run the file from a batch file it works great.
I have a SQL 2000 job that has been set up as an "Operating System Command (CmdExec)" job. I am logged into the SQL Server as DomainName\SQLAdmin; this domain account is part of the Administrators group on the SQL Server that the job runs on. The account has sysadmin rights and is also the account that starts the MSSQLSERVER and SQLSERVERAGENT services on the same machine.
The job just copies files from one directory to another; here's the code:
D:
cd \MSSQLBACKUP\AP\AP_Primary
xcopy *.* D:\MSSQLBACKUP\AP /s /y /d
When run as a job, with the owner of the job being DomainName\SQLAdmin, the job fails with the following error message: Executed as user: DomainName\SQLAdmin. The process could not be created for step 1 of job 0x822E9AD29DCAAF4196369A46C7FE212A (reason: Access is denied). The step failed.
Here's the weird part: if I open a command window on the SQL Server and run the batch, it works fine.
I even tried executing the commands via xp_cmdshell, but that didn't work either; I received the message "(1 row(s) affected)" with the output being NULL, and the file was never copied.
"Error executing extended stored procedure: Specified user can not login"
I have tried this through Enterprise Manager and get identical results, of course.
I have also tried all of the following:
- different OS user accounts, including local system accounts with local admin rights;
- assigning the OS account to a SQL login with the System Administrators role/rights;
- specifically granting that SQL login EXEC rights on the master.dbo.xp_cmdshell procedure;
- verifying local security policy settings, as per the following link: http://support.microsoft.com/?id=283811;
- pulling out my hair and banging my head against the wall.
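Two checks that often narrow this down, sketched with placeholder account names; the xp_sqlagent_proxy_account syntax is taken from the SQL 2000 SP3-era documentation, so verify it against Books Online for your build. The first shows which Windows account is actually executing the shell command; the second resets the Agent proxy account that non-sysadmin CmdExec steps and xp_cmdshell calls fall back to, which is the usual source of "Specified user can not login".

-- Which account is really running the command?
EXEC master.dbo.xp_cmdshell 'echo %USERDOMAIN%\%USERNAME%';

-- Reconfigure (and then read back) the SQL Agent proxy account on SQL 2000 SP3+:
EXEC master.dbo.xp_sqlagent_proxy_account N'SET',
     N'MYDOMAIN', N'SqlAgentProxyUser', N'StrongPasswordHere';   -- placeholders
EXEC master.dbo.xp_sqlagent_proxy_account N'GET';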
I have created an Integration Services package on my development machine. The package contains a configuration file which, let's say, is stored in c:\projects\MyIntegrationServicesProject\myConfigfile.dtsConfig (on my dev machine).
Then I have another "production" machine where I import the SSIS package into a SQL database. I then create a SQL job with only one step, to run my SSIS package. This works fine if I configure the step as a "SQL Server Integration Services Package" step and point it at my configuration file.
However, I would like to configure this package as a CmdExec step. On the command line, I specify /CONFIGFILE "d:....myConfigfile.dtsConfig" (the correct path on the prod machine). But it seems to be ignored, because when I execute the package I get an error telling me that the configuration file c:\projects\MyIntegrationServicesProject\myConfigfile.dtsConfig cannot be found.
What I'm trying to say is, it seems to ignore the config file I specify on the command line and tries to reach the config file at a location that is probably stored somewhere in the SSIS package from the time it was created on my development machine.
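A sketch of such a CmdExec step, with hypothetical folder, package, and file names: the config file is passed explicitly with /CONFIGFILE. The warning about the dev-machine path can still appear because the configuration entry saved inside the package points there; editing that stored configuration (or overriding the individual values with /SET) removes the dependency on the development path.

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Run SSIS package',       -- hypothetical job name
     @step_name = N'dtexec with config file',
     @subsystem = N'CmdExec',
     @command   = N'"C:\Program Files\Microsoft SQL Server\90\DTS\Binn\dtexec.exe" /SQL \MyFolder\MyPackage /SERVER MyProdServer /CONFIGFILE "D:\SSISConfig\myConfigfile.dtsConfig"';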
I have an SSIS package that runs fine from the command prompt, but when I try to run it from a SQL Server Agent job CmdExec step it bombs out. Please help; this has me stumped. The SSIS package uses an XML configuration file so that key settings such as connection strings and email info can be changed easily. Currently this is all on the same machine; I have not moved it beyond where I am developing.
On the command line I am using the following command...
Microsoft (R) SQL Server Execute Package Utility Version 9.00.1399.06 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 6:59:40 PM
Progress: 2006-08-16 18:59:41.29 Source: Data Flow Task Validating: 0% complete End Progress
Progress: 2006-08-16 18:59:41.29 Source: Data Flow Task Validating: 33% complete End Progress
Progress: 2006-08-16 18:59:41.71 Source: Data Flow Task Validating: 66% complete End Progress
Progress: 2006-08-16 18:59:41.73 Source: Data Flow Task Validating: 100% complete End Progress
Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 0% complete End Progress
Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 33% complete End Progress
Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 66% complete End Progress
Progress: 2006-08-16 18:59:41.77 Source: Data Flow Task Validating: 100% complete End Progress
Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 0% complete End Progress
Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 33% complete End Progress
Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 66% complete End Progress
Progress: 2006-08-16 18:59:41.79 Source: Data Flow Task Prepare for Execute: 100% complete End Progress
Progress: 2006-08-16 18:59:41.81 Source: Data Flow Task Pre-Execute: 0% complete End Progress
Progress: 2006-08-16 18:59:41.84 Source: Data Flow Task Pre-Execute: 33% complete End Progress
Progress: 2006-08-16 18:59:41.90 Source: Data Flow Task Pre-Execute: 66% complete End Progress
Progress: 2006-08-16 18:59:41.90 Source: Data Flow Task Pre-Execute: 100% complete End Progress
Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 0% complete End Progress
Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 33% complete End Progress
Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 66% complete End Progress
Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Post Execute: 100% complete End Progress
Progress: 2006-08-16 18:59:41.92 Source: Data Flow Task Cleanup: 0% complete End Progress
Progress: 2006-08-16 18:59:41.93 Source: Data Flow Task Cleanup: 33% complete End Progress
Progress: 2006-08-16 18:59:41.93 Source: Data Flow Task Cleanup: 66% complete End Progress
Progress: 2006-08-16 18:59:41.93 Source: Data Flow Task Cleanup: 100% complete End Progress
Progress: 2006-08-16 18:59:41.95 Source: Send Mail Task The SendMail task is initiated.: 0% complete End Progress
Progress: 2006-08-16 18:59:42.09 Source: Send Mail Task The SendMail task is completed.: 100% complete End Progress
DTExec: The package execution returned DTSER_SUCCESS (0).
Started: 6:59:40 PM
Finished: 6:59:42 PM
Elapsed: 1.984 seconds
When I try to use the same command within a SQL Server Agent job, using a CmdExec step, I get the following error...
Description: The package is attempting to configure from the XML file "S:\connections\contacts.dtsConfig". End Info
Warning: 2006-08-16 18:40:03.15 Code: 0x80012012 Source: contactsPackage Description: The configuration file name "S:\connections\contacts.dtsConfig" is not valid. Check the configuration file name. End Warning
Warning: 2006-08-16 18:40:03.15 Code: 0x80012059 Source: contactsPackage Description: Failed to load at least one of the configuration entries for the package. Check configurations entries and previous warnings to see descriptions of which configuration failed. End Warning
Info: 2006-08-16 18:40:03.20 ...
Process Exit Code 1. The step failed.
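A sketch of the step with the drive dependency removed, using hypothetical server, folder, and package locations: S: is a mapping that exists for the interactive user but not for the SQL Server Agent service account, so pointing /CONFIGFILE at a UNC or local path the service account can reach is the usual fix.

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Contacts import',        -- hypothetical job name
     @step_name = N'Run contactsPackage',
     @subsystem = N'CmdExec',
     @command   = N'"C:\Program Files\Microsoft SQL Server\90\DTS\Binn\dtexec.exe" /FILE "D:\SSIS\contactsPackage.dtsx" /CONFIGFILE "\\FileServer\connections\contacts.dtsConfig"';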
I'm using SQL Server 7.0. I have a job which runs DTS packages (one package per step). When a task fails within my DTS package, I'd like an error returned for that step in the job, thus stopping the job and not starting the next step (DTS package). As it stands right now, if a task fails within the DTS package, that step in the job still reports successful completion. Has anyone seen this before, and is there something I can do to get DTS to report a failure for that step in the job?
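A sketch of one common arrangement, with placeholder server and package names: run each package through dtsrun in a CmdExec step so the step inherits dtsrun's exit code, and make sure the package's "Fail package on first error" option is checked in the DTS designer; with that option off, a failed task can still let the package, and therefore the step, finish as a success.

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Nightly DTS chain',      -- hypothetical job name
     @step_name = N'Run package 1',
     @subsystem = N'CmdExec',
     @command   = N'dtsrun /S MyServer /E /N "MyPackage1"',
     @on_success_action = 3,   -- go to the next step
     @on_fail_action    = 2;   -- quit the job reporting failure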
I am going to be moving multiple databases to a new server. Everything should go smoothly, but I need to change a lot of the DTS packages that reference the old server name and replace it with the database's DNS record.
Is there an easy way to get a list of which DTS packages reference the old server explicitly (not via the DNS name)?
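A rough heuristic, sketched under two assumptions: the msdb instance you query supports varbinary(max) casts (SQL 2005 or later), and the server name sits in the package stream as Unicode text at an even byte offset, so misses are possible and hits should be spot-checked. Replace OLDSERVER with the actual name.

SELECT DISTINCT p.name
FROM   msdb.dbo.sysdtspackages AS p
WHERE  CHARINDEX(N'OLDSERVER',
                 CAST(CAST(p.packagedata AS varbinary(max)) AS nvarchar(max))) > 0;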
Hi! I try to connect to my local SQL Server instance, but it always fails. Can you please tell me the steps and options to use to install SQL Server on my machine? I think I need to use the Personal Edition rather than Standard, as it will be on my machine and not on a server. Please help.
Can anybody tell me how many steps it's possible to put in one job? The reason I ask is that we have a job with over 500 steps (importing data from Excel files into SQL tables), and every time it runs, different steps fail.
Does the fact that the Excel file was dropped and recreated change the DTS ID?
Hi, I am new to replication. I have to replicate a DB on SQL 7.0 SP5. It's going to be transactional. Is there any article which explains everything, from where to start to where to end? I mean everything, step by step. TIA.
Hello, SQL Server 2005 Enterprise and new hardware have been ordered for our department. We currently run SQL Server 2000 (sp4). We have almost 500 DTS packages, 293 Jobs, and 14 user databases with hundreds of objects within.
Is there any documentation out there on how to scrutinize a current system? I have searched, and most of what I can find addresses migration planning with the assumption that the databases, packages, jobs, security, etc. are ready to move over. We have a lot to think about before we can do that. We know we have redundancy problems (like view proliferation), table schema issues, obsolete DTS packages and jobs, and otherwise a host of opportunities to 'clean house' and/or improve. We would really like to get a handle on what we are migrating before we migrate.
If you have any ideas or resources you feel would be worth looking at, please share.
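A small first-pass inventory sketch against the SQL 2000 instance; it only counts what is there (deciding what is obsolete still takes a manual pass), and sp_MSforeachdb is an undocumented but commonly used procedure.

SELECT COUNT(*) AS jobs FROM msdb.dbo.sysjobs;
SELECT COUNT(DISTINCT name) AS dts_packages FROM msdb.dbo.sysdtspackages;

-- User objects per database (tables, views, procedures, functions):
EXEC sp_MSforeachdb
     'SELECT ''?'' AS db_name, COUNT(*) AS user_objects
      FROM [?].dbo.sysobjects
      WHERE xtype IN (''U'', ''V'', ''P'', ''FN'', ''TF'', ''IF'')';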
Generally speaking, when you want to optimize an application that relies on a database, what is the order in which to apply the following optimization techniques?
a) optimizing the spread of the physical elements of the database across different disks of the server
b) optimizing the use of RAM
c) optimizing the SQL
d) optimizing the OS
I've created SQL Server Agent jobs through Management Studio on SQL Server 2005. I can view and edit these jobs when I am logged into the server via Remote Desktop, but when trying to administer these jobs through Management Studio on a different machine, the steps do not appear in the job properties window. Anybody else see this behavior? Know why it occurs? Is it a bug, or another wonderful "feature" of Manglement Studio?
Welcome everybody. I want to publish my SQL 2005 server through my ISA 2004 server, so I did the following steps, and I want to know whether there is anything wrong in them or whether another step is missing:
1. I edited the router configuration file to NAT requests on my real IP to the external interface of my ISA server.
2. At ISA I made a SQL publishing rule to forward requests to the IP of the SQL Server (from: anywhere; to: IP of the SQL Server; listener: external; protocol: Microsoft SQL Server; requests: appear from ISA, not the original client; ports: default port 1433).
3. At the SQL Server I enabled allowing remote and local connections over TCP only.
4. At the SQL Server I enabled Remote Desktop.
5. At the SQL Server I enabled the firewall, and in the Exceptions tab I added Remote Desktop and port 1433.
But still, when I try to connect from the Internet using the Management Studio Express tool with the real IP address (tcp:{my real IP address}) and the SQL login information, an error occurs and no connection is opened. Note: SCW was installed and I uninstalled it.
So what is the problem? Why can't SQL be published? I also made another rule at ISA to allow Remote Desktop to my SQL Server using the RDP protocol, but when I try to connect to the SQL Server via Remote Desktop it fails, whereas connecting to any other internal server works successfully.
What in SSIS replaces DTS task steps? In DTS you could build tasks and assign them an order in which to execute. How is this replaced in the SSIS control flow? Thanks.
I am trying to create a SQL Agent job with 3 steps.
I want to delete three tables.
Step 1 ...delete table "X"
Step 2 ...delete table "X"
Step 3 ...delete table "X"
The problem is that after I create the three steps and start the job, it never seems to finish the first step, and none of the other steps run; the job looks like it is executing and never finishes. If I break the job into three jobs, each completes fine. I need this to run as one job. Any ideas?
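A sketch of the whole job in T-SQL, with hypothetical database and table names (DELETE is used here; swap in DROP TABLE if the tables themselves should go): the three T-SQL steps are chained with explicit "go to the next step" flow control. If the first step still never finishes, check for blocking (for example with sp_who2) while the job is running, since a delete that completes interactively can sit behind a lock when scheduled.

EXEC msdb.dbo.sp_add_job      @job_name = N'Delete three tables';

EXEC msdb.dbo.sp_add_jobstep  @job_name = N'Delete three tables',
     @step_id = 1, @step_name = N'Delete table X1',
     @subsystem = N'TSQL', @database_name = N'MyDatabase',
     @command = N'DELETE FROM dbo.X1;', @on_success_action = 3;   -- go to next step

EXEC msdb.dbo.sp_add_jobstep  @job_name = N'Delete three tables',
     @step_id = 2, @step_name = N'Delete table X2',
     @subsystem = N'TSQL', @database_name = N'MyDatabase',
     @command = N'DELETE FROM dbo.X2;', @on_success_action = 3;   -- go to next step

EXEC msdb.dbo.sp_add_jobstep  @job_name = N'Delete three tables',
     @step_id = 3, @step_name = N'Delete table X3',
     @subsystem = N'TSQL', @database_name = N'MyDatabase',
     @command = N'DELETE FROM dbo.X3;', @on_success_action = 1;   -- quit with success

EXEC msdb.dbo.sp_add_jobserver @job_name = N'Delete three tables';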