Running Multiple SQL Query Files
Jan 8, 2008
I have several SQL files that make various table updates and data changes.
How would I create something (maybe an SSIS package?) or even write a T-SQL script that runs through these files and executes the queries contained in them?
(I put this thread in SQL General as well because I wasn't sure whether this could be implemented in T-SQL or not.)
Right now what I have is a ForEachLoop container with an ExecuteSQL task component inside of it. I'm thinking that the contents of each of the .sql files would be copied over to a "Temp.sql" file and then executed in the loop. So I've created a Temp.sql file for this purpose, and the properties of the ExecuteSQL task component are set up as follows:
(I've created my connection managers appropriately with an OLEDB source and a File Connection pointing to Temp.SQL)
ConnectionType: OLEDB
Connection: servertest
SQLSourceType: File connection
FileConnection: Temp.sql
And for my ForEachLoop container, my properties have been set as:
Collection
--> Enumerator: Foreach File Enumerator
--- Enumerator Configurations ---
Folder: C:\blahblahblah
Files: *.sql
with Retrieve File Name set to Fully Qualified
Have I set this up properly? (Obviously not, because it's not working.)
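One likely culprit: the Execute SQL task never sees new file contents unless something actually copies each enumerated file into Temp.sql. A simpler SSIS pattern is to map the enumerator's fully qualified file name to a package variable and put an expression on the file connection manager's ConnectionString property that points at that variable, so no copying is needed. Since the thread also asks about T-SQL, here is a minimal sketch that does the whole job in T-SQL instead, assuming xp_cmdshell is enabled and the sqlcmd utility is installed; the folder path is a placeholder:
Code Snippet
-- Sketch: execute every .sql file in a folder (path is a placeholder).
DECLARE @folder VARCHAR(260), @cmd VARCHAR(1000)
SET @folder = 'C:\MyScripts'

-- Collect the file names with dir /b.
CREATE TABLE #files (fname VARCHAR(260))
SET @cmd = 'dir /b "' + @folder + '\*.sql"'
INSERT INTO #files EXEC master..xp_cmdshell @cmd

-- Run each file through sqlcmd against the target server.
DECLARE @f VARCHAR(260)
DECLARE c CURSOR FOR SELECT fname FROM #files WHERE fname LIKE '%.sql'
OPEN c
FETCH NEXT FROM c INTO @f
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'sqlcmd -S servertest -E -i "' + @folder + '\' + @f + '"'
    EXEC master..xp_cmdshell @cmd
    FETCH NEXT FROM c INTO @f
END
CLOSE c
DEALLOCATE c
DROP TABLE #files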
Mar 15, 2007
Can multiple instances of SQL 2005 Express attach to the same database files on a network share? I have seen this done before with MSDE, where the database files are stored on the server; but instead of having a SQL Server running on the network and then connecting to it, only the database files exist on the network share and the users connect through MSDE running on the local machine. Is this possible with SQL 2005 Express? I do not have the ability to share an SQL instance from one workstation to another, nor do I have the ability to install an instance on the corporate server. Is it as simple as creating the database and storing the files on the share, then attaching the database to the SQL instance on each workstation?
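For reference, attaching from each workstation would look like the sketch below (server and file names are placeholders). Two cautions, hedged: SQL Server 2005 refuses database files on a network share unless trace flag 1807 is enabled, and an instance holds exclusive locks on its data files while the database is online, so two instances generally cannot keep the same MDF attached at the same time.
Code Snippet
-- Sketch only: attach existing files on a share (paths are placeholders).
-- Trace flag 1807 is needed for networked database files on SQL 2005.
CREATE DATABASE SharedDb
ON (FILENAME = '\\fileserver\share\SharedDb.mdf'),
   (FILENAME = '\\fileserver\share\SharedDb_log.ldf')
FOR ATTACH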
Jun 12, 2015
I want the query below to run for 24 hours: once the insert completes, run it again, and so on for 24 hours.
There is a way to run it every second as a job, but I want it to run only after the previous run completes. Is there a way to run the query after every complete run, and keep it in a job?
INSERT INTO [dbo].[Audit_Active]
([SPID],[LoginName],[HostName],[ProgramName],[Command],[LastQuery],[DBName],[ServerName])
SELECT
--DISTINCT
p.SPID,
p.LogiName,
p.HostName,
[Code] ....
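One hedged approach: schedule the job once a day and make its single step a WHILE loop, so each new run starts only after the previous insert finishes. The wrapper procedure name below is hypothetical; it stands in for the INSERT ... SELECT above.
Code Snippet
-- Sketch: repeat the insert back-to-back for 24 hours inside one job step.
DECLARE @stop DATETIME
SET @stop = DATEADD(HOUR, 24, GETDATE())

WHILE GETDATE() < @stop
BEGIN
    EXEC dbo.usp_InsertAuditActive   -- hypothetical proc wrapping the INSERT above
    WAITFOR DELAY '00:00:01'         -- brief pause so the loop does not spin hot
END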
Jun 16, 2015
I have a requirement wherein I have around 15 different flat files; the filenames are fixed, but the folder path can be changed (I think I should use a variable for the folder path). The data from these 15 files should go to their respective tables in the database.
Do I need to create a separate data flow task for each file, or a separate package? In addition to this, an example: while importing product data into the product table, if a product ID already exists, we need to ignore it and load only the new records.
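For the load-only-new-records part, one common pattern (a sketch; the table and column names are assumptions) is to land the file in a staging table in the data flow and then run an Execute SQL task like this:
Code Snippet
-- Sketch: move only new product IDs from staging into the product table.
INSERT INTO dbo.Product (ProductID, ProductName)
SELECT s.ProductID, s.ProductName
FROM dbo.Product_Staging AS s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.Product AS p
                  WHERE p.ProductID = s.ProductID)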
Jun 27, 2006
I have a couple of hundred flat files to import into database tables using SSIS.
The files can be divided into groups by the format they use. I understand that I could import each group of files that have a common format at the same time using a Foreach Loop Container.
However, the example for the Foreach Loop Container has multiple files all being imported into the same database table. In my case, each file needs to be imported into a different database table.
Is it possible to import each set of files with the same format into different tables in a simple loop? I can't see a way to make a Data Flow Destination item accept its table name dynamically, which seems to prevent me from doing this.
I suppose I could make a different Data Flow Destination item for each file, in the Data Flow. Would that be a reasonable solution, or is there a simpler solution, or should I just resign myself to making a separate Data Flow for every single file?
Aug 14, 2012
I am trying to restore multiple .bak backup files onto a new server. However, I have found that it will not allow me to restore multiple databases at once. Is there a way to do this so that I do not have to restore them manually one at a time? I tried adding all the .bak files at once to the backup device window, but it only did the first one listed. It would be so much easier to restore them all at once so that I do not have to continue this manual process. I am restoring them via device.
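There is no one-click multi-restore in the GUI that I know of, but a script can loop over the backups. A minimal sketch, assuming each .bak holds a database named after the file and the file layout needs no WITH MOVE; the database names and path are placeholders:
Code Snippet
-- Sketch: restore a list of databases from same-named .bak files.
DECLARE @db SYSNAME, @sql NVARCHAR(1000)
DECLARE c CURSOR FOR
    SELECT name FROM (VALUES ('SalesDb'), ('HRDb'), ('InventoryDb')) AS d(name)
OPEN c
FETCH NEXT FROM c INTO @db
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'RESTORE DATABASE ' + QUOTENAME(@db)
             + N' FROM DISK = N''C:\Backups\' + @db + N'.bak'' WITH RECOVERY'
    EXEC (@sql)
    FETCH NEXT FROM c INTO @db
END
CLOSE c
DEALLOCATE c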
Feb 15, 2008
I need to be able to bulk insert a bunch of tables from their corresponding flat file. I have created an XML file (see below) which has the file name/table name pair at each node. I then created a ForEachLoop task and used the Node enumeration type and the following OuterXpathString: ReferenceFiles/File. At this point I get lost. How do I pass the 2 inside node values (file name and table name) to variables which I can then use as expressions for the bulk insert task inside the Foreach?
Here is XML file:
Code Snippet
<ReferenceFiles>
<File>
<FileName>Ref_Categories.txt</FileName>
<TableName>Ref_Categories</TableName>
</File>
<File>
<FileName>Ref_Configs.txt</FileName>
<TableName>Ref_Configs</TableName>
</File>
</ReferenceFiles>
Thanks.
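If the Foreach variable mapping proves fiddly, one alternative, sketched here in plain T-SQL rather than SSIS, reads the same XML and issues one BULK INSERT per File node; the file system path and the delimiter settings are assumptions:
Code Snippet
-- Sketch: drive BULK INSERT from the XML file (path/delimiters are assumptions).
DECLARE @x XML
SELECT @x = CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK 'C:\Ref\ReferenceFiles.xml', SINGLE_BLOB) AS b

DECLARE @file NVARCHAR(260), @table NVARCHAR(128), @sql NVARCHAR(1000)
DECLARE c CURSOR FOR
    SELECT f.value('(FileName)[1]', 'NVARCHAR(260)'),
           f.value('(TableName)[1]', 'NVARCHAR(128)')
    FROM @x.nodes('/ReferenceFiles/File') AS t(f)
OPEN c
FETCH NEXT FROM c INTO @file, @table
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'BULK INSERT dbo.' + QUOTENAME(@table)
             + N' FROM ''C:\Ref\' + @file
             + N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
    EXEC (@sql)
    FETCH NEXT FROM c INTO @file, @table
END
CLOSE c
DEALLOCATE c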
Nov 29, 2007
I used the data export wizard to export a single table to a single flat file (multiple wasn't allowed). I saved the package as a *.dtsx file which I'm attempting to edit to add the additional tables.
Creating additional sources is fairly easy: copy the first source and change the table name.
I've tried copying the destination connection and changing to a new text file, but can't get past having to add each column manually to the new destination.
How can I duplicate the mapping that must be taking place in the wizard in the *.dtsx editing environment?
This seems like a simple / common task, but I've been unable to find a solution.
Thanks, Richard
Sep 1, 2006
If I start a long running query running on a background thread is there a way to abort the query so that it does not continue running on SQL server?
The query would be running on SQL Server 2005 from a Windows Forms application using the BackgroundWorker component, so the query would have been started from the BackgroundWorker's DoWork event using ADO.NET. If the user clicks an abort button in the UI, I would want the query to die so that it does not continue to use SQL Server resources.
Is there a way to do this?
Thanks
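From the client side, ADO.NET's SqlCommand.Cancel method (called from the UI thread while DoWork is blocked in ExecuteReader/ExecuteNonQuery) is the usual way to make SQL Server abandon the running statement. If you ever need a server-side fallback, a hedged sketch: find the session and KILL it. The program-name filter is an assumption; set your own Application Name in the connection string so the session is identifiable.
Code Snippet
-- Sketch: server-side fallback to kill the runaway session.
DECLARE @spid INT, @sql NVARCHAR(100)
SELECT @spid = spid                      -- picks one matching session
FROM master.dbo.sysprocesses
WHERE program_name = 'MyWinFormsApp'     -- assumed Application Name from the conn string

IF @spid IS NOT NULL
BEGIN
    SET @sql = N'KILL ' + CAST(@spid AS NVARCHAR(10))
    EXEC (@sql)
END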
Oct 26, 2007
I have 20 .sql files that were written to create new views in a database. We are moving over to a new database and need to run all of these scripts against the new database to add the views (each time we refresh for testing). Is there a way to write a procedure that calls all of these .sql files so they do not have to be opened and run individually? I am thinking something like
EXEC xp_sqlmaint '
or
EXEC Master..xp_cmdshell
but I cannot figure out the right syntax, or else these are the wrong direction. What do you suggest?
Thanks,
Swoozie
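One pattern that avoids xp_cmdshell entirely: a master script of :r include directives, run with the sqlcmd utility (or in SQLCMD mode in Management Studio). A sketch; the file names are placeholders:
Code Snippet
-- Master.sql: run with  sqlcmd -S YourServer -d YourDb -E -i Master.sql
-- Each :r pulls in and executes one of the 20 view scripts.
:r C:\ViewScripts\vw_Customers.sql
:r C:\ViewScripts\vw_Orders.sql
:r C:\ViewScripts\vw_Invoices.sql
-- ...one :r line per script file...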
Jun 1, 2007
Hi,
I have searched but not found quite the best way to look at this so far..
I have an application that outputs data to several text files (up to 30). These have commonality by an object name, but then contain completely different column data.
In DTS I had each of the source text file connections going to one OLE DB connection and then individual transform data tasks pointing to the one OLE DB connection.
Looking at SSIS, it would appear that I would need to have one source and one destination for each of these and therefore 30 parallel data flows?
Just wondering if there is a neater way of doing this?
It is a regular data import that happens a few times a day; the text files are named the same as the SQL tables, i.e. app_userdata.txt goes to the app_userdata table.
Hope that explains ok and thanks in advance.
Mike
Feb 28, 2002
Hi,
I'm trying to run a simple batch file from a SQL job (SQL 7.0 SP4). No errors are received, but the job does not complete. When I try to run the job manually, I get a message stating "Error 22022: SQLServerAgent Error: Request to run job my_job (from User DomainAdminUser) refused because the job is already running from a request by Schedule 127 (Schedule 1)."
The services are running as a domain admin account.
Help!
Thanks in advance.
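As a first diagnostic step, it may help to ask the Agent which jobs it currently believes are executing (a sketch using the standard msdb procedure):
Code Snippet
-- List jobs the Agent currently considers to be executing.
EXEC msdb.dbo.sp_help_job @execution_status = 1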
Aug 27, 2007
Folks -
Have you had any success running .bat files (Execute Process Task) from SSIS through SQL Server Agent jobs? My package succeeds when I run it from my machine and when I ask the DBA to run it manually from the server, but not when we run it from a job.
The job hangs and the .bat file does not seem to be executed. The Executable property does evaluate to the right path, and the package owner does have write/execute permissions on the folder where the .bat file is located.
Everything I could find close to that is under <http://support.microsoft.com/kb/918760>. Would you shed any light on this? When I run the same bat file from a DTS through SQL Server Agent job, it will work with no issues. Any help would be very much appreciated.
Thanks.
Mar 19, 2007
I have created a master controller package which runs as follows
deletes all the log files -> deletes few flat files on different drives -> preprocess task(execute package task) -> c# executable (execute process tasks) -> postprocess tasks (execute package tasks)
I need to create a task just before the preprocess task that asks for user input on whether to run a particular batch file before proceeding to preprocess. If the user says yes, it should run the batch file followed by the preprocess task, the C# executable, and postprocess; otherwise it should go directly to preprocess, C#, and postprocess (skipping the batch file task).
Can anyone help me with how to do this?
Sep 20, 2005
Hi, my data files sit in the default directories, and I think they are causing my partition to run out of space. I mainly use one DB that I created but don't use the others (i.e. master, model, tempdb, etc.). Yet I see their MDF and LDF files growing. What can I do to shrink them down, or perhaps move them off to a larger partition after shrinking?
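A hedged sketch of both options for a user database (names and paths are placeholders). Note that the system databases are special: tempdb is moved with ALTER DATABASE ... MODIFY FILE and relocates at the next restart, and master cannot be detached at all.
Code Snippet
-- Shrink a data file to a target size (in MB).
USE MyBigDb
DBCC SHRINKFILE (MyBigDb_Data, 1024)

-- Move a user database to a larger partition: detach (from master, with no
-- other connections), copy the files at the OS level, then re-attach.
USE master
EXEC sp_detach_db 'MyBigDb'
-- ...copy the .mdf/.ldf to D:\SQLData, then:
EXEC sp_attach_db 'MyBigDb',
     'D:\SQLData\MyBigDb.mdf',
     'D:\SQLData\MyBigDb_log.ldf'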
Aug 4, 2004
I am trying to write (my first, unfortunately) DTS, and am having some problems.
I need to be able to import multiple flat files (all in the same format, just with different schema), each one going into a different table. I have written an application to call my DTS, sending it variables for the table name and the file name. This works fine when I test it on a single flat file.
My problem is, the Transformation object does not reset after each DTS call, so I get "Column does not exist" errors after the first successful import. I can go into the DTS Manager and reset the Transformation options, but that would defeat the purpose of automation. Is there any way to reset the Transformation object, or another technique, so that it will continuously work on files that use different schema?
I am very new at DTS, so please consider me "ignorant" when replying.
Thanks in advance.
- Jordan
Aug 26, 2013
We are in the process of upgrading to a new SQL 2012 server, but we have many packages that load data from DBF files created with FoxPro into one of our databases. We have not converted the packages and still run them with DTS, but we get the following error:
Error: 2013-08-26 11:05:27.36
Code: 0xC0209303
Source: BenchmarkLoad Connection manager "OLEDB NPIONE.Investment.middleTierSQL"
Description: The requested OLE DB provider SQLNCLI.1 is not registered. If the 64-bit driver is not installed, run the package in 32-bit mode. Error code: 0x00000000.
[Code] .....
I searched for OLE DB and ODBC drivers for SQL 2012 64-bit but cannot find any newer than the 2005 ones we have. What can I do, short of changing the source files, to overcome this issue?
Sep 9, 2005
I have many, many T-SQL scripts that create stored procedures and grant the proper permissions (uploading to production). I've been looking around for a way to run them all at once with a single script that pulls them into Query Analyzer. Not copying and pasting them all, there are too many, but reading and executing. Anyone know a link with a sample to do this?
Thanks,
Zath
Jun 20, 2002
We have a DTS package that is submitted every minute by a SQL Agent job (2000). We see instances where before the first job is completed, the second one starts running.
Has anyone encountered this type of behavior before?
Oct 1, 2007
I get the following message when I execute a maintenance plan to delete files older than 1 day.
Error # -1073548784
Executing the query "EXECUTE master.dbo.xp_delete_file 0,N'',N'',N'2007-09-30T07:56:09'
" failed with the following error: "Error executing extended stored procedure: Invalid Parameter". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Any help would be appreciated!!
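Notice that in the logged command the folder and extension arguments are empty strings (N'',N''), which is consistent with the "Invalid Parameter" complaint; the maintenance plan task probably has no folder or extension configured. For comparison, a sketch of what a well-formed call typically looks like (xp_delete_file is undocumented, and the path and extension here are placeholders):
Code Snippet
-- 0 = backup files, folder, extension (no dot), cutoff date, 0 = no subfolders.
EXECUTE master.dbo.xp_delete_file 0, N'C:\Backups\', N'bak', N'2007-09-30T07:56:09', 0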
Mar 2, 2005
Hi,
I was wondering if this can be done...
I have a complex query which has to do a few calculations. I'm using subqueries to do the calcs, but most of the calcs have to use a value gotten from the first subquery. I don't want to have to type the subquery out each time, so is there a way of assigning it to a variable or putting it in a UDF or SP?
E.g.
I have a table with 2 cols - amount, date.
SELECT total_amount, closing_amount
FROM table1
GROUP BY month(date)
Total amount is the SUM(amount) for the month.
Closing amount is the Total Amount plus the amounts for the current month with a few extra calcs.
As I have to use SUM(amount) in the second subquery, is there a way I can do it without having to type the subquery out again?
This is only a basic example, what I'm trying to do will invovle a lot more calcultions.
Hope someone can help,
Thanks,
Stuart
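One way that avoids retyping the SUM: compute it once in a derived table (or, on SQL 2005+, a CTE) and reference the column by name in the outer SELECT. A sketch against the two-column table described above; the closing-amount arithmetic is a placeholder for the real calcs:
Code Snippet
SELECT mth,
       total_amount,
       total_amount * 1.05 AS closing_amount   -- placeholder for the extra calcs
FROM (SELECT MONTH(date) AS mth,
             SUM(amount) AS total_amount
      FROM table1
      GROUP BY MONTH(date)) AS monthly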
Feb 18, 2004
I've got the stored procedure which first creates and then starts a job.
This stored procedure can be invoked from a number of triggers (after insert, after update)
The problem is:
when a sequence of statements is executed one by one (like an insert immediately followed by several updates) and the stored procedure is invoked from the proper insert or update trigger once for each statement, the jobs are created in the right order (first for the insert statement, then for the updates in the order of the initial statements), but are executed in the wrong order.
Is there any explanation to this? And any solution?
The order of such jobs execution is vital for my application.
Thanx a lot in advance.
Please see the enclosed screenshot for an example list of jobs. The name of each job contains the time of its creation, and another column shows the time the job was executed.
Feb 29, 2008
Hi,
I have zero experience running any databases that spread further than 1 machine, so I have a few theory questions here that hopefully someone can help with. Hopefully this is the right forum, I'm not sure if it classifies as 'clustering'.
Anyways, we are launching a web app that is going to start with just 1 webserver/db server. For speed reasons, after some growth we might have to have a load balanced setup with a webserver in europe and one in north america. Basically the webservers are going to be serving 100,000's of files and each time a file is served it needs to be recorded in the database.
I think that if I'm connecting my European webserver across the internet to my DB server, that's defeating the purpose of having a webserver in Europe for faster responses.
I am thinking that this European webserver/DB server is only going to be logging the files served. Is there a way to import those logs into the North American database every night?
I'm not sure what the best approach would be for something like this, but any suggestions are greatly appreciated.
Thanks very much,
mike123
Jun 23, 2007
hi !
I am using sql server 2005 sp1 standard edition.
I have a transactional replication set up from one server to another.
For the last couple of days I have been noticing two Distribution Agents running for the same subscription in Replication Monitor. Before this, only one Distribution Agent was running. Is there any problem with having two Distribution Agents running in parallel for the same subscription on a server?
Any help would be appreciated.
Thanks
Oct 9, 2007
Hi Techs,
I have created several DTS packages that are working fine, tested and verified.
I need to run them with a single click, either via a batch file or some sort of command.
Thanks
Iqbal Ansari
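A hedged sketch using the dtsrun utility, which runs saved DTS packages from a command line; the server and package names are placeholders. The same three dtsrun lines, minus the EXEC wrapper, can go straight into a .bat file for a true single-click run.
Code Snippet
-- Run each saved package via dtsrun (/S server, /E trusted connection, /N package name).
EXEC master..xp_cmdshell 'dtsrun /S MyServer /E /N "Package1"'
EXEC master..xp_cmdshell 'dtsrun /S MyServer /E /N "Package2"'
EXEC master..xp_cmdshell 'dtsrun /S MyServer /E /N "Package3"'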
Jun 25, 2007
I’m binding the distinct values from each of 9 columns to 9 drop-down lists using a stored procedure. The SP accepts two parameters, one of which is the column name. I’m using the code below, which opens and closes the database connection 9 times. Is there a more efficient way of doing this?
newSqlCommand = New SqlCommand("getDistinctValues", newConn)
newSqlCommand.CommandType = CommandType.StoredProcedure

Dim ownrParam As New SqlParameter("@owner_id", SqlDbType.Int)
Dim colParam As New SqlParameter("@column_name", SqlDbType.VarChar)
newSqlCommand.Parameters.Add(ownrParam)
newSqlCommand.Parameters.Add(colParam)

ownrParam.Value = OwnerID

colParam.Value = "Make"
newConn.Open()
ddlMake.DataSource = newSqlCommand.ExecuteReader()
ddlMake.DataTextField = "distinct_result"
ddlMake.DataBind()
newConn.Close()

colParam.Value = "Model"
newConn.Open()
ddlModel.DataSource = newSqlCommand.ExecuteReader()
ddlModel.DataTextField = "distinct_result"
ddlModel.DataBind()
newConn.Close()
and so on for 9 columns…
Jan 26, 2006
I have several sequence containers in one package that fire off Execute Package tasks. I would like each of the sequence containers to start at the same time when the job starts running. However, when I set them up to do that, I get an error that the variable cannot be read because it is locked. I have the variables set up as read-only, so I am not sure why they are being locked. When I run the package with each sequence container firing off after the previous one ends, it runs fine.
Any ideas on how to get around this?
Thanks!
Feb 13, 2006
In my application code I am trying to invoke multiple threads in which each thread is loading an instance of the same SSIS package and would initialize the package variables with different values and execute the different instances in parallel. In each thread - after the package execution has completed successfully - I read that instance's SSIS package variables to get result information from that Instance run.
When I load the same package in different threads using the LoadFromSqlServer() method:
- does the code create multiple instances of the SSIS package and load a distinct instance in each of the threads?
- Will the Package Execution ID be different for the different instances?
- Are the package-level variables instance-safe?
Jul 12, 2007
During a customer engagement, I was told that there is a Microsoft add-on that can be installed on SQL 2005 so that SQL 2005 can be made to behave like SQL 2000, SQL 2000 SP4, or SQL 7.0 in different SQL instances.
Is this true? Are there URLs about this topic?
Thanks a lot.
Po-Shan.
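I'm not aware of an add-on that makes a 2005 instance impersonate older engine versions wholesale; the closest built-in knob is the per-database compatibility level, a sketch of which follows (the database name is a placeholder):
Code Snippet
-- Make a database on a 2005 instance follow SQL 2000 (level 80) behavior rules.
EXEC sp_dbcmptlevel 'MyDatabase', 80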
Dec 17, 2003
I have multiple .sql files, and executing each one manually is a pain, so how do you run multiple .sql files all at once? Besides creating a batch file, are there T-SQL commands that could execute .sql files?
Thanks!
May 18, 2004
I have a rather large sale transaction DB. Basic header, and detail tables. I am providing a third party company with daily sales information, and I need to give them back data from about 8 or 9 months ago. I currently have a DTS package that gets sales for the current day, but since I have to go back, I have to manually edit the query in the DTS package, and change the date range...UNLESS ...
Blah, blah, blah. The problem is that they can only take the data in daily files, so there would be ONE file for each day. I really don't need to be manually running these jobs, so I'm wondering if someone could point me to a way of writing a package (maybe ActiveX, not sure) that would basically run through a loop of dates and create a separate file for each day, versus having to edit a generic DTS package and changing the date range 350 times...
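One way to avoid 350 manual edits: loop over the date range in T-SQL and shell out to bcp for one file per day. A sketch, assuming xp_cmdshell is available; the table, column, server, and path names are all placeholders:
Code Snippet
-- Sketch: one export file per day across the back-dated range.
DECLARE @d DATETIME, @stop DATETIME, @cmd VARCHAR(2000)
SET @d    = '2003-09-01'
SET @stop = '2004-05-18'

WHILE @d < @stop
BEGIN
    SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.SalesHeader WHERE SaleDate >= '''
             + CONVERT(VARCHAR(10), @d, 120) + ''' AND SaleDate < '''
             + CONVERT(VARCHAR(10), DATEADD(DAY, 1, @d), 120)
             + '''" queryout "C:\Out\sales_' + CONVERT(VARCHAR(8), @d, 112)
             + '.txt" -c -T -S MyServer'
    EXEC master..xp_cmdshell @cmd
    SET @d = DATEADD(DAY, 1, @d)
END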
Jan 24, 2007
I need to restore a database that has a bak file and many log files. Is there a way I can do this with one recovery step?
Or do I need to use a different restore object for each file?
MTmace
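It cannot be a single RESTORE statement, but it can be one script: restore the full backup WITH NORECOVERY, apply each log in sequence, and finish the last one WITH RECOVERY. A sketch; the file names and paths are placeholders:
Code Snippet
RESTORE DATABASE MyDb FROM DISK = N'C:\Backups\MyDb.bak' WITH NORECOVERY
RESTORE LOG MyDb FROM DISK = N'C:\Backups\MyDb_log1.trn' WITH NORECOVERY
RESTORE LOG MyDb FROM DISK = N'C:\Backups\MyDb_log2.trn' WITH NORECOVERY
-- ...one RESTORE LOG per log file, in sequence, then the final one:
RESTORE LOG MyDb FROM DISK = N'C:\Backups\MyDb_logN.trn' WITH RECOVERY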
Jun 7, 2006
Using an expression to set the log filename to include the date and time results in 3 log files being created.
Ummm. Why? I only ran the package once. Is SSIS not sharing my log file connection among the different components?
On first thought my workaround would involve using a script task to "set" the log file name to a variable and use an expression to set the log connection to the variable. But the problem with that is that logging starts BEFORE the script task is run...