This script shows how to load a SQL script file into an nvarchar(max) local variable and then execute it.
-- Execute_Script_From_File.sql
/*
Load a script from a text file and execute it.
The script must not contain a GO Statement
Runs in SQL Server 2005
*/
set nocount on
declare @FileName nvarchar(260)
declare @SQLLoad nvarchar(max)
declare @Script nvarchar(max)
if object_id('tempdb..#FileData','U') is not null
begin drop table #FileData end
-- Create temp table to receive the file contents
create table #FileData (FileData nvarchar(max))
-- Set the file to load SQL Script from
set @FileName = 'C:\MyDir\MyScriptFile.sql'
-- Create command to load the file into temp table #FileData
set @SQLLoad =
N'insert into #FileData
select fd.*
from openrowset(bulk ''' + @FileName + ''',SINGLE_CLOB ) as FD'
-- Load file data into temp table #FileData
exec sp_executesql @SQLLoad
-- Load SQL into a local variable from temp table #FileData
select top 1 @Script = FileData from #FileData
-- Execute the loaded script
exec sp_executesql @Script
I was wondering if anyone out there has code to automate the execution of a SQL Agent job? The purpose of this is to run a .vbs file locally (on the local machine), connect to SQL Server and run the SQL job. While I can run DTS packages, I am unable to "find" the SQL object to run the SQL job. I would appreciate any help. Thanks again.
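One sketch, assuming the job name (a placeholder here): a SQL Agent job can be started with msdb.dbo.sp_start_job, and a .vbs file can send that single statement over an ADO connection (or shell out to osql) rather than hunting for a job object:

exec msdb.dbo.sp_start_job @job_name = 'MyJobName'   -- placeholder job name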
Dim parameters(1) As ParameterValue
parameters(0) = New ParameterValue()
parameters(0).Name = "s_no"
parameters(0).Value = param1
My question is: how do I pass dynamic values in param1? I have a variable in my stored proc called @s_no which I try to pass like this, however it is not working.
Hi, I would like to ask someone who could help me with one small problem:
I would like to run a SQL Server task on one of my servers that will connect to all the other servers (including ones that are not SQL Servers, e.g. Exchange, a test server, etc.), one at a time, and execute a batch file (update.bat) that resides on each server on the C: drive.
Batch file will copy some things from all of the servers to one server.
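A rough sketch of one way to drive this from T-SQL, assuming a table you maintain yourself (dbo.ServerList is a made-up name) that lists the target servers. Note that xp_cmdshell executes the batch file on the SQL Server machine (reading it over the administrative share), not on the remote server itself, so the copy commands inside update.bat would need to use UNC paths:

declare @server sysname, @cmd varchar(500)
declare server_cursor cursor for select ServerName from dbo.ServerList
open server_cursor
fetch next from server_cursor into @server
while @@FETCH_STATUS = 0
begin
    -- run the batch file found on each server's administrative share
    set @cmd = '\\' + @server + '\C$\update.bat'
    exec master..xp_cmdshell @cmd
    fetch next from server_cursor into @server
end
close server_cursor
deallocate server_cursor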
Hi, I have a couple of .sql files. Each file contains some table creation and stored procedure creation statements. I don't want to open each SQL file in Query Analyzer and execute it one by one; I want to execute all the .sql files at once. How can I do this?
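As a sketch of one way to run them all in one go from T-SQL (the folder, server and database names below are placeholders): xp_cmdshell hands the string to cmd.exe, so a single for loop can feed every .sql file in a folder to sqlcmd:

-- run every .sql file in C:\Scripts against MyDatabase using Windows authentication
exec master..xp_cmdshell 'for %f in (C:\Scripts\*.sql) do sqlcmd -S MyServer -d MyDatabase -E -i "%f"'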
I need to execute an excel file (built with macro) whenever a date change is detected between 2 fields in my sql table.
if @Date1 <> @Date2
begin
    DECLARE @command varchar(1000)
    SET @command = 'd:\Refresh_ABC.bat'
    exec xp_cmdshell @command
    . . .
In my Refresh_ABC.bat, I have the following line: Start Excel.exe "D:\ABC.xls"
When I tested the above script (from the Declare down to the @command line) in Query Analyzer, nothing seemed to happen. It showed "The command(s) completed successfully", but my Excel file was not refreshed at all. When I ran the batch file Refresh_ABC.bat on its own, ABC.xls was refreshed, so the batch file is working.
Next test: I placed the line Start Excel.exe "D:\ABC.xls" directly on the SET @command = ... line. Same problem, nothing happened.
Any idea what went wrong? Please advise. Thank you.
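One way to see what is really happening is to capture the xp_cmdshell output into a table, as in the small debugging sketch below. Also bear in mind that xp_cmdshell runs non-interactively under the SQL Server service account, so Excel will not open a visible window in that context and may not run at all:

declare @command varchar(1000)
set @command = 'd:\Refresh_ABC.bat'
create table #output (line nvarchar(4000))
-- capture everything the command writes to the console
insert into #output exec master..xp_cmdshell @command
select * from #output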
I'm trying to schedule a batch file to run as a job from SQL Server Agent. The batch file copies files from one server directory to a directory on another server. The batch file works properly when executed directly. The job is being executed under the SQL service account login. I've given the service account access to both the source and destination directories.
When I try to run the job it fails with an "Access is denied" error on both the source and destination directories (as read from job history).
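One common fix, sketched below with placeholder account and proxy names (not tested against this setup), is to run the CmdExec job step under a proxy tied to a Windows account that has rights on both shares, instead of the service account:

use master
-- credential holding the Windows account that can reach both directories
create credential FileCopyCredential with identity = 'DOMAIN\FileCopyUser', secret = 'StrongPasswordHere'
use msdb
exec dbo.sp_add_proxy @proxy_name = 'FileCopyProxy', @credential_name = 'FileCopyCredential', @enabled = 1
exec dbo.sp_grant_proxy_to_subsystem @proxy_name = 'FileCopyProxy', @subsystem_name = 'CmdExec'
-- only needed if the job owner is not a sysadmin
exec dbo.sp_grant_login_to_proxy @login_name = 'DOMAIN\JobOwner', @proxy_name = 'FileCopyProxy'
-- then pick FileCopyProxy as the "Run as" account on the CmdExec job step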
Hi all, I would like to embed some stored procedure calls in a batch file and execute it from the command prompt on Windows. I have no idea how to embed a stored proc call in a .bat file. Can you please point me to a solution for this?
Is it possible to use SQL to execute a batch file? I would like to execute the following: "C:BTWartend.exe /f=C:BTWToolboxFormatscarnum.btw /p". Thanks, Matt
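As a minimal sketch (the path and switches below are placeholders, not the real ones from the post), xp_cmdshell will hand a command line like that straight to cmd.exe:

-- run an external program with its switches from T-SQL
exec master..xp_cmdshell '"C:\SomeFolder\SomeProgram.exe" /f="C:\SomeFolder\format.btw" /p'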
I continue to try and find "easy" solutions to what should be a straightforward problem of outputting the results from a stored procedure to a file.
I tried using both the XML task and the file system task, thinking that one of them would be able to output a file from a variable. But both tasks threw fits when I tried different variable types (the file system task required a string, but the XML result set never seemed to produce anything but an object), so I decided to just try a script task and do everything "manually".
So my latest gyrations have been thus:
1) Set execute sql task to output XML and push to a script task to write a file
2) Set execute sql task to output a full result set and push to a script task to write a file
Number 1 was the only one I could get working, because with Number 2 I kept getting an error saying the variable wasn't a recordset (maybe it was null?).
I can actually create files now via the script task, but it seems like the variable that should get the results from the stored procedure isn't getting anything. I tried using a MsgBox to see what was actually being passed to the script task, and all I got was the number 0 which I'm assuming is the default for the object type.
What's the best way to debug this? The package runs without errors, and I'm not familiar with debugging in SSIS. How can I tell if the stored procedure is returning results into the result set variable?
I have a stored procedure and I want to run a program from it. I think I need to use API functions, but how can I do that? If there is another way, please tell me.
I am having trouble getting a job step to execute an executable file. One of the options for a job step is to enter an operating system command. I have tried entering the following operating system command to execute a file:
'start c:\myfile.exe'
For some reason this doesn't work. How can I get SQL Server to execute an executable file?
Hi, and thanks for your help. I have created a simple batch file that xcopies files from one directory to another, shared directory (on another server). Here is the code: xcopy c:\OUT_TRANSIT E:\BACKUP_CESWEB /y
E:\BACKUP_CESWEB is on another server, where I mapped the BACKUP_CESWEB folder and made it shared.
The batch file is located in c:\code. When I double-click the batch file, the files are copied into E:\BACKUP_CESWEB. But when I use the Windows 2000 scheduler or try to run the batch from within SQL Server Query Analyzer, I get no results.
Any idea how to solve this? The bottom line is that I want to copy files from one server to another. Thanks for your help.
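A sketch of the same copy run from T-SQL: the mapped E: drive is usually not visible to the SQL Server service account, so referring to the share by its UNC path (the server name below is a placeholder) avoids that problem:

exec master..xp_cmdshell 'xcopy c:\OUT_TRANSIT \\OtherServer\BACKUP_CESWEB /y'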
I am in a situation where we have a table named Project, with the following columns:
Code:
------------------------------------------
ID | ClientCode | ProjectName
------------------------------------------
1  | AAA        | Dubai Airport Phase I
2  | AAA        | Dubai Airport Phase II
3  | ARC        | Salala
4  | MIZ        | UMBC Building
------------------------------------------
Now my task is: whenever a project name and its other details are created, a folder should be created on the server itself under the path E:\ProjectFolder, in the following way:
You can see here that a folder and sub-folder are created from the project's client code & ID.
I used the following trigger to do this:
Code:
CREATE TRIGGER [dbo].[CreateFolderName] ON [dbo].[Project]
after INSERT
AS
SET NOCOUNT ON
BEGIN
    declare @chkdirectory as nvarchar(4000), @folderName varchar(100), @mainfolderName varchar(100)
    declare @folder_exists as int
    SET @mainfolderName = (SELECT ClientCode AS Project FROM INSERTED)
    SET @folderName = (SELECT (ClientCode + cast(ID as varchar(10))) AS Project FROM INSERTED)
    set @chkdirectory = 'E:\ProjectFolder\' + @mainfolderName + '\' + @folderName
[code].....
This worked like a charm. Now my next task: using the same trigger, I have to create a BAT file inside that sub-folder. The T-SQL for creating the BAT file is as follows:
Code:
DECLARE @FileName varchar(50), @bcpCommand varchar(2000)
SET @FileName = REPLACE('E:\ProjectFolder\[red](select ClientCode from INSERTED)[/red]\[red](select ClientCode + cast(ID as varchar(10)) from INSERTED)[/red]\xcopy_' + (SELECT cast(ID as varchar(10)) FROM INSERTED) + '.bat','/','-')
SET @bcpCommand = 'bcp "[red]SELECT 'xcopy "E:\ProjectFolder\' + clientCode + '" "\\10.0.0.35\ProjectFolder" /T /E /I' FROM INSERTED[/red]" queryout "'
SET @bcpCommand = @bcpCommand + @FileName + '" -U SQLServerUsername -P SQLServerPassword -c'
EXEC master..xp_cmdshell @bcpCommand
Here I am not understanding how to insert the above T-SQL into the trigger; also, the above T-SQL is not right. What's wrong with it?
The last query to be included in the trigger will execute the newly created bat file.
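A rough sketch (not a corrected final version) of how the bat-file step could be written inside the same trigger, reusing @mainfolderName and @folderName from the code above and making the same single-row assumption; the share \\10.0.0.35\ProjectFolder comes from the post, everything else is illustrative. Writing the single xcopy line with echo sidesteps the bcp/queryout quoting problems:

    declare @ID int, @BatFile varchar(260), @BatLine varchar(1000), @cmd varchar(2000)
    set @ID = (select ID from INSERTED)    -- same single-row assumption as the rest of the trigger
    set @BatFile = 'E:\ProjectFolder\' + @mainfolderName + '\' + @folderName + '\xcopy_' + cast(@ID as varchar(10)) + '.bat'
    set @BatLine = 'xcopy "E:\ProjectFolder\' + @mainfolderName + '" "\\10.0.0.35\ProjectFolder" /E /I /Y'
    -- write the one-line bat file with echo instead of bcp ... queryout
    set @cmd = 'echo ' + @BatLine + ' > "' + @BatFile + '"'
    exec master..xp_cmdshell @cmd
    -- last step: execute the newly created bat file
    exec master..xp_cmdshell @BatFile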
I have designed DTS packages. I have a script component that picks up the 'Path' of a stored file. The path is a column in the database, obtained from an OLE DB source.
Now I need to zip the files in the path given above and then store them in some location. How do I do that?
Hi, how can we execute SQL scripts using a batch file? I think the batch file should contain the username, password, database and scripts... Using that file, the scripts should run. How can I supply the username, password, database and all those things? Please send me details on how to do that, if possible with an example. Thanks in advance, RR.
I have a situation where I need to make sure a task executes regardless of whether the package starts from a checkpoint or not. Is this possible?
Here's the scenario:
I have a package with 3 tasks {TaskA, TaskB, TaskC} that execute serially using OnSuccess precedence constraints. The package is set up to use checkpoints, so that if a task fails the package will restart from that failed task. TaskA is insignificant here. TaskB fetches some data and puts it in a raw file. TaskC inserts that raw file data into a table.
Problem is that the insertion violates an integrity constraint in the database - so it fails. The problem is easily fixed but it needs to be fixed in TaskB because that is where the data is sourced.
So, I need to be able to execute TaskB again, even though it was TaskC that failed. Currently the package restarts from TaskC which reuses the raw file (which has the bad data in it) so the package continues to fail even though the cause of the problem has been fixed.
How do I configure the package in order to execute TaskB again? Is it even possible?
I want to execute my package when a set of files exists in a directory. What is the best way of doing this?
I have been successful in creating a WMI Event Watcher Task that executes when any file (the first file) is added to a directory. But I cannot figure out the WQL for a specific file or set of files.
What have you done with in SSIS to trigger the package?
Hi all, I have to execute a stored procedure from a code file:

string constr = ConfigurationSettings.AppSettings["ConnectionString"];
SqlConnection con = new SqlConnection(constr);
con.Open();
SqlCommand cmd = new SqlCommand("GetTax", con);
cmd.CommandType = CommandType.StoredProcedure;
SqlParameter paramFrom = new SqlParameter("@from", SqlDbType.VarChar, 50);
paramFrom.Value = "JFK";
SqlParameter paramTo = new SqlParameter("@To", SqlDbType.VarChar, 50);
paramTo.Value = "HOU";
SqlParameter paramAirline = new SqlParameter("@Airline", SqlDbType.VarChar, 50);
paramAirline.Value = "US";
SqlParameter rpTax = new SqlParameter("@Tax", SqlDbType.Int);
rpTax.Direction = ParameterDirection.Output;
cmd.Parameters.Add(rpTax);

Instead of doing it this way, can I execute the stored procedure another way, like exec MystoredProc 'param1', 'param2', 'param3'? I appreciate your help.
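For reference, a direct T-SQL version of the same call (the procedure and parameter names are taken from the C# code above); the output parameter still has to be declared and marked OUTPUT:

declare @Tax int
exec GetTax @from = 'JFK', @To = 'HOU', @Airline = 'US', @Tax = @Tax output
select @Tax as Tax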
I have a stored procedure that generates some data and dumps it into a table. I need to export data using bcp based on the data that this procedure creates. I know how to use bcp, but I don't know how to execute the procedure and pass it the two variables that it needs. I googled it and sqlcmd looks promising, but I can't get the syntax right. The two variables are the current year and school number, i.e. 1112 and 0021.
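A hedged sketch with made-up procedure, table, server and path names: run the procedure for the two values from T-SQL, then shell out to bcp to export whatever it dumped into the table:

declare @Year varchar(4), @School varchar(4), @cmd varchar(1000)
set @Year = '1112'
set @School = '0021'
exec dbo.BuildExportData @Year, @School        -- hypothetical procedure name
-- export the populated table; -c = character mode, -T = trusted connection
set @cmd = 'bcp "select * from MyDb.dbo.ExportData" queryout "C:\Exports\export_' + @School + '_' + @Year + '.txt" -c -T -S MyServer'
exec master..xp_cmdshell @cmd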
-The first task in that package (an Exec SQL Task) retrieves a timestamp value from an external source
-That timestamp value is used in data-flows to pull data that has been changed since then
-At the end of the package the value in the db is updated
Now...if something fails then on the next execution the checkpoint file will kick in. But this means that the first task (which retrieves the timestamp) will not execute and therefore all the data-flows will be pulling the wrong data.
The only way I can think of getting around this problem is to specify that a task should execute regardless of the presence of a checkpoint file. Unfortunately it seems this cannot be done.
Another option might be to put an OnPreExecute task on the package that gets the timestamp value.
Anyone got any advice about how I should progress? Short of a suggestion from elsewhere I'm going to go with the tactic of using the OnPreExecute to retrieve the timestamp.
Hi all, in abc.aspx I use a GridView and a SqlDataSource with a SelectCommand. The GridView's DataSourceID is the SqlDataSource. In abc.aspx.cs, I would like to use an IF statement so that if a criterion is not satisfied, I use the SqlDataSource with a different SelectCommand string. Unfortunately, I don't yet know how to write the code to do that with the SqlDataSource. Please help me out!