Can I Tell The SQL Server Agent To Wait For A Batch File To Finish Before Proceeding?
May 29, 2008
I'm setting up a Maintenance Plan to back up my SQL databases at night, but before the backup starts I have created a SQL Server Agent job that runs a DOS batch (.bat) file using CmdExec. The problem is that I need this process to finish before the rest of the tasks in the Maintenance Plan run. It seems like the SQL Server Agent job step tells the batch file to start running (and it runs fine), but the plan just continues to the next task and does not wait for it to finish.
Is there any way I can configure this to wait before proceeding?
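One detail worth noting: xp_cmdshell runs its command synchronously and does not return until the launched process exits. So a minimal sketch of a blocking alternative, assuming xp_cmdshell is enabled and the batch file path is only illustrative, would be to call the batch file from an Execute T-SQL Statement task placed ahead of the backup task:

-- xp_cmdshell blocks until the command finishes, so the plan waits here
EXEC master..xp_cmdshell 'C:\Jobs\nightly_prep.bat';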
I've built an SSIS package which generates a file from a legacy system and then downloads the file into a designated folder on the server. I need the file watcher task to wait for the file to completely finish loading before it reports it is complete. Currently, as soon as the file is created, the WMI step finishes.
Here's my dilemma: I want to run a stored procedure that starts another stored procedure running, but does not wait for that stored procedure to complete execution.
The calling procedure should return immediately and leave the other procedure to complete running in the background. Is there any way to do this?
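A minimal sketch of the usual fire-and-forget approach, assuming a SQL Server Agent job (here hypothetically named 'RunLongProc') wraps the long-running procedure; sp_start_job queues the job and returns immediately, so the calling procedure does not wait for it:

-- returns as soon as the job is queued; the wrapped procedure keeps running in the background
EXEC msdb.dbo.sp_start_job @job_name = N'RunLongProc';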
I need to execute a DTS package that has a couple of steps, one of which is a process task that simply calls an exe file I made that sends an email to warn the user.
What happens here is that the process task launches my exe file but doesn't wait for it to complete; it fires the next task and then finishes.
Is there any way to make a "while" statement wait until my exe application finishes?
Hi again! First, I would like to thank the two people who answered my previous question. Thanks a lot!
But it didn't quite answer my question, so I'll give a bit more info. I would like to know if I can make a batch file that executes or starts an active script in SQL Server Agent (that is, in SQL Enterprise Manager). If I can, can you give me more info on how to do it (command line, parameters), a web site, anything that can help me do that.
If I cannot do that, let me know please. Thanks a lot!
[I originally posted this in the General Tools forum, but perhaps I'll get better response here...]
I am wondering if this is a bug or a (strange) feature: when a SQL Server Agent job step is a CmdExec job (properly configured for owner/proxy, etc), the step is invoked with the proper user credentials (of the 'Run As' account); HOWEVER, the job step doesn't ever seem to be logged in and thus the step does not pick up the critical local group membership(s) such as BATCH (my preference) or SERVICE, etc. Provided that the command consists solely of built-ins or specifically-permissioned executable invocations (permissioned to the specific 'Run as' account), all is well. However, if the command attempts to invoke general OS exe's (cmd.exe as a sub-shell, whoami, ftp, etc -- anything that is generally permissioned via a local group such as BATCH or SERVICE), such invocations fail with access denied. Specifying a CmdExec step with 'whoami /all' as the command demonstrates the missing group memberships.
(The only workaround seems to be to either crawl through the OS exes and assign permissions, or to grant administrator membership to the proxy credentials account.)
So, design feature or major annoyance?
[Additional info I left out: SS05 (same behavior on SP2 and SP1); on WS03R2-x64; in an Active Directory domain; 'Run as' account granted log on as service, log on as batch, etc.]
I need to take a variable from a table in SQL Server, pass it to a batch file, and execute the batch file. Right now I can exec the batch file with XP_CMDSHELL, but how can I pass the variable to the batch file and loop through all the variables?
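A minimal sketch of one way to do this, assuming a hypothetical table dbo.BatchArgs(Arg varchar(100)) holds the values and a batch file at C:\Jobs\process.bat accepts one argument (%1); a cursor builds a command line per row and xp_cmdshell passes the value as a batch parameter:

DECLARE @arg varchar(100), @cmd varchar(400);

DECLARE arg_cursor CURSOR FOR
    SELECT Arg FROM dbo.BatchArgs;

OPEN arg_cursor;
FETCH NEXT FROM arg_cursor INTO @arg;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- build the command line and hand the value to the batch file as %1
    SET @cmd = 'C:\Jobs\process.bat "' + @arg + '"';
    EXEC master..xp_cmdshell @cmd;

    FETCH NEXT FROM arg_cursor INTO @arg;
END

CLOSE arg_cursor;
DEALLOCATE arg_cursor;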
I am using the following batch file to execute a script that creates a db and all its objects in the local sql express:
sqlcmd -S (local)\SQLExpress -i C:\CreateDB.sql
This works fine, but I'm wondering if there's an easy way to put the script in the batch file itself, so users don't have to worry about putting the script on the C drive. I tried getting rid of the -i parameter and pasting the script from the .sql file into the batch file, but it didn't work.
Hi all, I have the "Northwind" database in my Sql Server Management Studio Express.
In my C:ProSSEAppsSamplesForChapter02Chapter02 folder, I have the following 2 files:
(1) ListColumnValues (MS-DOS Batch File):
sqlcmd -S .\sqlexpress -v DBName = "Northwind" CName = "CompanyName" TName = "Shippers" -i c:prosseappschapter02ListListColumnVales.sql -o c:prosseappschapter02ColumnValuesOut.rpt
(2) ListColumnValues (Microsoft SQL Server Query File):
USE $(Northwind)
GO
SELECT $(CompanyName) FROM $(Shippers)
GO
When I ran ListColumnValues.bat from the C:ProSSEAppsSamplesForChapter02Chapter02 folder, I got the following "ColumnValuesOut.rpt" with error messages:
'Northwind' scripting variable not defined.
Msg 102, Level 15, State 1, Server L1P2P3\SQLEXPRESS, Line 1
Incorrect syntax near '$'.
'CompanyName' scripting variable not defined.
'Shippers' scripting variable not defined.
Msg 102, Level 15, State 1, Server L1P2P3\SQLEXPRESS, Line 1
Incorrect syntax near 'CompanyName'.
I copied these T-SQL statements from a book and I do not know how to correct them. Please help and tell me how to correct these errors.
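For what it's worth, the errors come from the query file referencing the variable values instead of the variable names defined with -v. A corrected version of the query file, using the names DBName, CName and TName from the batch file, would look like this:

USE $(DBName)
GO
SELECT $(CName) FROM $(TName)
GO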
Hi, and thanks for your help. I have created a simple batch file that xcopies files from one directory to another, shared directory (on another server). Here is the code:
xcopy c:\OUT_TRANSIT E:\BACKUP_CESWEB /y
E:\BACKUP_CESWEB is on another server, where I mapped the BACKUP_CESWEB folder and made it shared.
The batch file is located in c:\code. When I double-click the batch file, the files are copied into E:\BACKUP_CESWEB. But when I use the Windows 2000 scheduler or try to run the batch code from within SQL Server Query Analyzer, I get no results.
Any idea how to solve this? The bottom line is that I want to copy files from one server to another server. Thanks for your help.
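A likely cause is that the mapped E: drive exists only in the interactive logon session, so the scheduler and SQL Server service accounts cannot see it. A minimal sketch of the usual workaround, with the target server name purely illustrative, is to use a UNC path instead of the drive letter, for example from Query Analyzer:

-- a UNC path works for the service account, which never sees the E: mapping
EXEC master..xp_cmdshell 'xcopy c:\OUT_TRANSIT \\OTHERSERVER\BACKUP_CESWEB /y';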
Hi, I'm trying to write a trigger so that on the insert/update of a column, a directory is created. On insert, create the directory; on update, rename the existing directory. Has anyone done this before? Any pointers would be appreciated. Thanks.
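A minimal sketch of the insert side, assuming a hypothetical table dbo.Projects(ProjectName), a base path of C:\Projects, single-row inserts, and xp_cmdshell being enabled; the rename-on-update case would use a 'move' command in the same way:

CREATE TRIGGER trg_Projects_CreateDir
ON dbo.Projects
AFTER INSERT
AS
BEGIN
    DECLARE @dir varchar(260), @cmd varchar(300)

    -- assumes a single-row insert; a cursor over "inserted" would be needed otherwise
    SELECT @dir = 'C:\Projects\' + ProjectName FROM inserted

    SET @cmd = 'mkdir "' + @dir + '"'
    EXEC master..xp_cmdshell @cmd
END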
I will start off with the default warning message: I am a beginner. That said, I have an SSIS process that calls an external executable to transform a data file through a homegrown C program. (This will eventually be converted, but for the moment needs to remain.) The end of the run creates a *.done file. How do I use the SSIS tasks to pause/wait, checking periodically, for the existence of this file before continuing with processing tasks? I apologize if this is easy, but I am stumped.
Thanks in advance, Roger
Info: SQL 2005 Microsoft SQL Server Integration Services Designer Version 9.00.3042.00
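One approach, sketched here under the assumption that the .done file path is illustrative and that the undocumented xp_fileexist procedure is available to the executing account, is to put a polling loop in an Execute SQL Task ahead of the downstream tasks:

DECLARE @exists int
SET @exists = 0

WHILE @exists = 0
BEGIN
    -- xp_fileexist sets the output parameter to 1 once the file is present
    EXEC master.dbo.xp_fileexist 'C:\Transforms\output.done', @exists OUTPUT

    IF @exists = 0
        WAITFOR DELAY '00:00:30'   -- check again in 30 seconds
END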
I am able to run SSIS packages as SQL Server Agent jobs with a Control Flow "File System Task" if I move a file (test.txt) from a drive (C:) on the server (where the SQL Agent jobs run) to a subdirectory on the same drive. But if I try to move a file on a network drive, the package fails.
My client will be receiving a .dbf file which needs to be uploaded into a SQL Server database table (as an append) every week. They are NOT computer savvy, and I would like to automate this process rather than go into Enterprise Manager and run a data transformation. Is there any way to write a batch file or a set of command lines which will do this? Thanks. Monica
I have created a batch file to execute a stored proc to import data.
When I run it from the server (Remote Desktop) it works fine, but if I share the folder and try to run it from my PC, it doesn't do anything. I don't get an error; it just doesn't do anything. My Windows user has admin rights in SQL. Why is it not executing from my PC?
We have a job that calls a 2005 SSIS package that does some processing and executes a BAT file. The job is called by a web application. The BAT file creates a folder and names it based on the current date (YYYY_MM), e.g. 2015_07.
It was working okay on the SQL Agent 2005 server until we moved to the new SQL Agent 2012 server using the same 2005 SSIS package. Now the issue is that instead of creating the folder based on YYYY_MM, it is being created as YYYY_DD. I've checked the regional settings of both servers and they have the same "English (United States)" format. I even ran the code below on both and they return the same output:
echo %date:~10,4%_%date:~4,2%
I know the BAT file can be improved by not depending on the current Windows locale, but I just want to understand how this issue occurs and how the regional setting is being overridden.
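Since the BAT file only needs a YYYY_MM folder name, a locale-independent sketch (base path illustrative, and assuming xp_create_subdir is acceptable in a job step) is to build the name in T-SQL instead of parsing %date%:

DECLARE @folder varchar(260);
SET @folder = 'D:\Exports\' + CONVERT(char(4), YEAR(GETDATE()))
            + '_' + RIGHT('0' + CONVERT(varchar(2), MONTH(GETDATE())), 2);
-- creates the YYYY_MM folder regardless of the Windows regional settings
EXEC master.dbo.xp_create_subdir @folder;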
I've completed my first SQL project, for which I've built a DTS package. The first thing it does is drop all records in the destination table, before importing new records from a txt file and then massaging them.
After I got done, I realized that if the data source is not available for some reason, the records will still be dropped, the process will fail, and the destination table will be left empty. In this case, leaving the existing records intact would be preferable to not having any.
How can I test that the txt file exists before dropping the records?
Thanks,
Randy
ps: Users will maintain a link to the table. I plan to update the table after business hours. If someone happens to have their linked application open while I'm trying to update the table, will it fail?
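On the first question, a minimal sketch of a guard step that could run before the delete step, assuming the source file path is illustrative and relying on the undocumented xp_fileexist procedure:

DECLARE @exists int
EXEC master.dbo.xp_fileexist 'C:\Import\source.txt', @exists OUTPUT

IF @exists = 0
    -- severity 16 fails the step, so the delete/import steps never run
    RAISERROR('Source file not found; aborting before records are dropped.', 16, 1)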
I'm trying to get an SSIS package to run as a SQL Server Agent job. The package uses a config file. When I try to add the config file to the job under the Configurations tab and click the 'Add' button, I get the following error:
An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)
The EXECUTE permission was denied on the object 'xp_availablemedia', database 'mssqlsystemresource', schema 'sys'. The user does not have permission to perform this action. The statement has been terminated. (Microsoft SQL Server, Error:229)
I'm trying to schedule an Integration Services package, to no avail. I have tried running it in Agent both under the Integration Services subsystem and as an operating system (CmdExec) step, and it still won't work. Under CmdExec, the command is /FILE "C:Documents and SettingsebuahMy DocumentsVisual Studio 2005ProjectsSwamp_SolutionFIRSTTIME_DOWNLOADSSIS_ACT.dtsx" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF. In the Advanced tab, I checked the output file option to a log.txt file so I could read the log if there were any errors. The account running the job is the same account that the SQL Server Agent service runs under, which should mean I don't really need a proxy account to run the job. When I run the job under CmdExec, I get this error:
Message Executed as user: CYPRESS\Administrator. The process could not be created for step 1 of job 0x529C255F20199B40A83858B1CCC875F2 (reason: The system cannot find the file specified). The step failed.
When I run the same package under the Integration Services subsystem, the error I get in the log file is:
<record> <event>OnError</event> <message>System.Data.Odbc.OdbcException: ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified at Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSConnectionManager90.AcquireConnection(Object pTransaction) at Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter.AcquireConnections(Object transaction) at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostAcquireConnections(IDTSManagedComponentWrapper90 wrapper, Object transaction)</message> <computer>SPIDERMAN</computer> <operator>CYPRESSAdministrator</operator> <source>ACT TABLE 250</source> <sourceid>{99b92f4c-686d-47a6-b136-4d1ec698c837}</sourceid> <executionid>{CD95BB1A-7FE4-468B-904D-C02B3FEBF1B0}</executionid> <starttime>4/17/2006 1:20:47 PM</starttime> <endtime>4/17/2006 1:20:47 PM</endtime> <datacode>-1073450910</datacode> <databytes>0x</databytes> </record>
If you ask me, I don't know what's going on, but I am very frustrated. I have been working on this for 4 days now. I really, really need help.
I have a table EmployeeAccess (MasterID, LoginID, AccessID, Storage1, Storage2, Storage3) that needs to be updated using the data in the following spreadsheet: NewEmployeeAccessData (ID, MasterID, AccessID1, LoginID1).
There is a 1:1 relationship between the two tables. I'm trying to code a pair of update statements on the EmployeeAccess table (one for LoginID, one for AccessID) with the following logic:
If LoginID is NULL, then update LoginID with the new LoginID1 value.
If LoginID is not NULL and Storage1 is NULL, then update Storage1 with the new LoginID1 value.
If LoginID is not NULL, Storage1 is not NULL, and Storage2 is NULL, then update Storage2 with the new LoginID1 value.
etc., etc...
The same applies when trying to populate the AccessID column
If AccessID is NULL, then update AccessID with the new AccessID1 value.
If AccessID is not NULL and Storage1 is NULL, then update Storage1 with the new AccessID1 value.
If AccessID is not NULL, Storage1 is not NULL, and Storage2 is NULL, then update Storage2 with the new AccessID1 value.
etc., etc.
I have no control over the schema of this table, so I'm trying to work out the logic for updating the target column only if it is NULL, and otherwise updating the first Storage column that is still empty.
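A minimal sketch of the LoginID pass, assuming MasterID is the join key between the two tables; because every CASE expression sees the pre-update values of the row, only the first empty slot changes, and the AccessID pass would follow the same shape with AccessID/AccessID1:

UPDATE ea
SET LoginID  = CASE WHEN ea.LoginID IS NULL THEN n.LoginID1 ELSE ea.LoginID END,
    Storage1 = CASE WHEN ea.LoginID IS NOT NULL AND ea.Storage1 IS NULL
                    THEN n.LoginID1 ELSE ea.Storage1 END,
    Storage2 = CASE WHEN ea.LoginID IS NOT NULL AND ea.Storage1 IS NOT NULL
                     AND ea.Storage2 IS NULL
                    THEN n.LoginID1 ELSE ea.Storage2 END,
    Storage3 = CASE WHEN ea.LoginID IS NOT NULL AND ea.Storage1 IS NOT NULL
                     AND ea.Storage2 IS NOT NULL AND ea.Storage3 IS NULL
                    THEN n.LoginID1 ELSE ea.Storage3 END
FROM dbo.EmployeeAccess AS ea
JOIN dbo.NewEmployeeAccessData AS n ON n.MasterID = ea.MasterID;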
Hello, I have created a package that runs without problem. I run the package with the command dtexec /F "package_name.dtsx" > package_name.txt.
Then I ran the same package from SQL Server Agent, and everything was OK.
Then, I tried to edit the command line to have the output file, but I got an error.
The command line is: dtexec /F "package_name.dtsx" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING E > package_name.txt. (/MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING E are created by default.)
We are not certain if this has happened due to the SSIS FTP Task, but incidentally the Excel file that is being copied from the FTP site using an SSIS FTP Task got corrupted a couple of hours after the package was scheduled as a SQL Server Agent job on SQL Server 2005.
I had a SQL Server 2000 DTS package doing the same thing, but it was never an issue then. I was using the FTP task there along with an Excel data source, and it had been working for at least a couple of years with never any corruption-related issues.
In the SQL Server 2005 SSIS package I am using an FTP Task with an Excel Connection Manager and Excel source, and the Excel file got corrupted within a couple of hours of the package being scheduled as a SQL Server Agent job.
Has anyone experienced this issue? Any inputs will be appreciated.
Just as an FYI, the Excel file has a lot of VLOOKUPs.
When adding an SSIS step to a SQL Server Agent job, when selecting the location of a config file, the dialog lets you select from the database server you're working with. If selecting the location of the package itself (when the source is File System), the dialog lets you select from the machine where Management Studio is sitting instead of from the database server. Is that intentional? And if so, why? Should I just use a fully qualified file name for the package location rather than one using a drive letter?
I recently installed a standalone version of SQL Server 2014 Standard on my work computer. I used Access before, but I want to use a SQL Server instead.
We have a shared drive where a file gets deposited every day at midnight. I want to be able to get this file and import it into the server (it's basically a list of names).
Here what I have done so far:
I created the database
Created the file and successfully imported data into it using the Import Data feature.
I saved the SSIS package
Scheduled an Agent Job for this package to run at certain time,daily
At first the jobs would fail with an "Access is denied" error. I added a credential using my network account (which has admin rights on the work computer), and also added a proxy for that credential.
Now the jobs fail with a "Cannot open data file" error. I have tried changing things here and there, but I can't get it to work.
I want to design a DTS task which will: a) copy certain given files from one directory to another; b) import the files into the tables; c) upon successful import, delete the files from the original directory.
I don't know much about scripting and need help in figuring out steps a) and c). Thanks, Zoey
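A minimal sketch of steps a) and c) as Execute SQL tasks around the import step, with paths purely illustrative and assuming xp_cmdshell is available; the delete step should be wired to run only on success of the import:

-- step a): copy the incoming files to the working directory
EXEC master..xp_cmdshell 'xcopy C:\Incoming\*.txt C:\Working\ /y';

-- step c): after a successful import, remove the originals
EXEC master..xp_cmdshell 'del /q C:\Incoming\*.txt';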