Writing Sqlcmd Batch Files To Do Bulk File Copies Or Imports
Sep 23, 2007
I want to write a batch file that will do just that. The problem is that bcp or bulk copy never works with my code; the command isn't recognized. Maybe I'm going about this the wrong way, but I could use some help. Ideally I'll make the batch file import a text file into a table via bulk copy or something like it, and use Windows Scheduler to automate it. Be as specific as you can please, I'm very new to SQL Server.
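For reference, a minimal sketch of such a batch file, assuming a local SQL Express instance and a comma-delimited text file; the database, table, and paths are placeholders. If the shell reports that bcp is "not recognized", the SQL Server tools folder is probably missing from PATH, and spelling out the full path to bcp.exe is the usual workaround:

rem import.bat - bulk-import a text file with bcp
rem -T = Windows authentication, -c = character data, -t, = comma field terminator
bcp SalesDB.dbo.ImportedOrders in "C:\data\orders.txt" -S (local)\SQLExpress -T -c -t, -e "C:\data\bcp_errors.log"

Pointing Windows Task Scheduler at the .bat covers the automation part.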
I am using the following batch file to execute a script that creates a db and all its objects in the local SQL Express instance:
sqlcmd -S (local)\SQLExpress -i C:\CreateDB.sql
This works fine, but I'm wondering if there's an easy way to put the script in the batch file itself, so users don't have to worry about putting the script on the C drive. I tried getting rid of the -i parameter and pasting the script from the .sql file into the batch file, but it didn't work.
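One way to avoid shipping a separate .sql file, sketched with a placeholder database name: pass short statements inline with -Q, or have the batch file write the script to a temp file at run time and point -i at that:

rem short scripts can be passed inline
sqlcmd -S (local)\SQLExpress -Q "IF DB_ID('MyDB') IS NULL CREATE DATABASE MyDB"

rem longer scripts can be echoed into a temp file that the batch file carries with it
echo CREATE TABLE MyDB.dbo.Demo (ID int) > "%TEMP%\createdb.sql"
echo GO >> "%TEMP%\createdb.sql"
sqlcmd -S (local)\SQLExpress -i "%TEMP%\createdb.sql"
del "%TEMP%\createdb.sql"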
OK, I have 2 batch files and I have to run them one after another. I am using:
call batch1.bat
call batch2.bat
It runs the 1st batch file successfully, but it does not run the second one. I used a pause to see the error; it reports some internal or external command error.
OK, batch1 is on the desktop, and batch2 is in one of the folders on the desktop.
The nature of batch1 is that it runs successfully.
The nature of batch2 is that it gives an error if I execute it individually. But when I run them together, it does not show the error.
Please, if you did not understand this situation, at least show me how to run two batch files from the command line.
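For the command-line part, a sketch of a wrapper that runs the two in sequence (the folder name is a placeholder). Full paths matter here: when batch2 is started from the desktop, its own relative paths resolve against the desktop, which fits the "internal or external command" symptom:

@echo off
call "%USERPROFILE%\Desktop\batch1.bat"
rem pushd makes batch2's folder current so its relative paths resolve
pushd "%USERPROFILE%\Desktop\MyFolder"
call batch2.bat
popd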
Hi all, I have the "Northwind" database in my Sql Server Management Studio Express.
In my C:\ProSSEApps\SamplesForChapter02\Chapter02 folder, I have the following 2 files:

(1) ListColumnValues (MS-DOS Batch File):

sqlcmd -S .\sqlexpress -v DBName = "Northwind" CName = "CompanyName" TName = "Shippers" -i c:\prosseapps\chapter02\List\ListColumnVales.sql -o c:\prosseapps\chapter02\ColumnValuesOut.rpt

(2) ListColumnValues (Microsoft SQL Server Query File):

USE $(Northwind)
GO
SELECT $(CompanyName) FROM $(Shippers)
GO

When I ran the batch file from C:\ProSSEApps\SamplesForChapter02\Chapter02>ListColumnValues.bat, I got the following "ColumnValuesOut.rpt" with error messages:
'Northwind' scripting variable not defined.
Msg 102, Level 15, State 1, Server L1P2P3\SQLEXPRESS, Line 1
Incorrect syntax near '$'.
'CompanyName' scripting variable not defined.
'Shippers' scripting variable not defined.
Msg 102, Level 15, State 1, Server L1P2P3\SQLEXPRESS, Line 1
Incorrect syntax near 'CompanyName'.
I copied these T-SQL statements from a book and I do not know how to correct them. Please help me correct these errors.
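The root cause is that the .sql file references the variable values instead of the variable names defined with -v. A corrected pair might look like this (a sketch; the -v assignments are written without spaces around "=", the form the sqlcmd documentation shows):

The batch file:

sqlcmd -S .\sqlexpress -v DBName="Northwind" CName="CompanyName" TName="Shippers" -i c:\prosseapps\chapter02\List\ListColumnVales.sql -o c:\prosseapps\chapter02\ColumnValuesOut.rpt

The query file:

USE $(DBName)
GO
SELECT $(CName) FROM $(TName)
GO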
I'm trying to run a simple batch file from a SQL job (SQL 7.0 SP4). No errors are received, but the job does not complete. When I try to run the job manually, I get a message stating "Error 22022: SQLServerAgent Error: Request to run job my_job (from User Domain\AdminUser) refused because the job is already running from a request by Schedule 127 (Schedule 1)."
The services are running as a domain admin account.
I need to be able to query a db on server A from a batch file, return the result (a DateTime value) into a variable, and then use that date as a parameter in a query that I will run against server B.
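A sketch of that pattern with sqlcmd and for /f; the server, database, table, and column names are all placeholders:

rem capture a scalar DateTime from server A into an environment variable
rem -h -1 drops headers, -W trims padding, SET NOCOUNT ON suppresses the row-count line
for /f "delims=" %%d in ('sqlcmd -S ServerA -d SourceDB -h -1 -W -Q "SET NOCOUNT ON; SELECT CONVERT(varchar(23), MAX(LoadDate), 121) FROM dbo.LoadLog"') do set LASTDATE=%%d

rem use the captured value as a parameter against server B
sqlcmd -S ServerB -d TargetDB -Q "SELECT * FROM dbo.Orders WHERE OrderDate > '%LASTDATE%'"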
Hi, I'm relatively new to SQL and ASP.NET and I'm after some assistance with what I hope is an easily achievable goal. I need to upload some files (.pdf, .doc, .jpg - whatever) to a directory on my webserver. Once this has been done, I'd like to write some data to a SQL database table that will hold information about these files, shown in a datagrid (or something similar) along with a link to download them. E.g. I upload car.jpg, which is located at /images/secure/car.jpg and is a photo of a car. When a user logs in, they will be shown a datagrid which pulls information from said SQL database, and in this datagrid will be something like:

Filename   Description      Link
car.jpg    A pic of a car   Download
If anyone is able to help me out with regards to linking to a file from an SQL dB it would be greatly appreciated, and if you require any further information just let me know :) Ben.
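A hypothetical shape for that metadata table (all names invented); the grid's Download column just renders FilePath as a hyperlink:

CREATE TABLE dbo.UploadedFiles (
    FileID int IDENTITY(1,1) PRIMARY KEY,
    FileName nvarchar(260) NOT NULL,
    Description nvarchar(500) NULL,
    FilePath nvarchar(500) NOT NULL   -- e.g. /images/secure/car.jpg
);

INSERT INTO dbo.UploadedFiles (FileName, Description, FilePath)
VALUES ('car.jpg', 'A pic of a car', '/images/secure/car.jpg');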
Hi guys, I have a stored procedure that I will post below this message. What the stored procedure does is read some specific tables in a database and create XML from them. The problem comes in when the data is bigger than a few hundred/thousand characters: the data is truncated and the XML file is not fully made. Now, I have been reading some posts by some people and tried using the TEXT/NTEXT/IMAGE types to write the files, but they give me errors which, again, I am adding to this email.

STORED PROC:

CREATE PROCEDURE usrp_sp_makexmlfile
@str varchar(50),
@templatefile varchar(255),
@ReturnValue as image OUT
AS
SET NOCOUNT ON
DECLARE @strVar as varchar(8000)
/* DECLARE @ReturnValue as varchar(255) */
Set @strVar = 'Select * from ' + @str + ' FOR XML AUTO'
Set @templatefile = 'c:\template.tpl'
EXEC (@strVar)
SELECT @ReturnValue
GO

THIS IS WHAT I GET WHEN I USE IMAGE/NTEXT/TEXT TYPE:

An unhandled exception of type 'System.InvalidOperationException' occurred in system.data.dll
Additional information: Parameter 2: '@ReturnValue' of type: Byte[], the property Size has an invalid size: 0

Also, if I try to give the image/text/ntext parameter an integer size like 8000 or something less than that, it tells me that the result has to be discarded because the current process has had some error. Can someone give me a suggestion on how to resolve this, or an example that would illustrate the solution? Thank you all in advance. Pi.
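One hedged workaround on SQL Server 2005 is to drop the image parameter and assign the FOR XML result to an xml output parameter through sp_executesql, which sidesteps the varchar(8000) truncation; the procedure and parameter names below are invented:

CREATE PROCEDURE usp_MakeXml_Sketch
    @tableName sysname,
    @result xml OUTPUT
AS
SET NOCOUNT ON
DECLARE @sql nvarchar(max)
-- QUOTENAME guards the dynamic table name
SET @sql = N'SELECT @x = (SELECT * FROM ' + QUOTENAME(@tableName) + N' FOR XML AUTO)'
EXEC sp_executesql @sql, N'@x xml OUTPUT', @x = @result OUTPUT
GO

On the client side the parameter then comes back as a complete XML string rather than a sized byte buffer.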
Hi guys, I have been trying to write data from a database to XML files, and although I have been successful in doing that, the file(s) that are made are not of the form that I would want. I am using:
SQLDataSet.WriteXml(strFileName, XmlWriteMode.WriteSchema);
to generate the files, where strFileName is the name of the file. The files, when viewed with an XML editor, are seen as lines of data instead of the regularly generated XML where each row is one below the other. When viewed with WordPad/Notepad I see elements '/><' between the data, and somehow I think this has something to do with the problem. Can someone help me with this? Any help or suggestion would be greatly appreciated. Thanks.
I need to write text files bigger than 1GB! I first thought of doing this with BCP, but fetching data from the stored proc was taking too long (more than one hour), so BCP did not seem to be the right option, as we have less flexibility in terms of handling errors, knowing how far the writing has progressed, etc. Most importantly, getting such a huge amount of data did not seem to be a one-time job! So I am now thinking about writing a CLR procedure which fetches a smaller amount of data at a time and writes it to the file, something like: open connection, get data, close connection, write to file, and continue in cycles of this till the whole file is written. Does this make sense considering the amount of data to be fetched?
Any suggestion/comment on this would be very helpful!
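Before committing to CLR, one hedged alternative is chunking the export with bcp queryout, since each chunk is a natural checkpoint for error handling and restartability (server, database, and the ChunkID column are assumptions about how the data could be partitioned):

rem export a 1GB+ result in pieces; each bcp call is one retryable unit
for /l %%i in (0,1,9) do (
    bcp "SELECT * FROM BigDB.dbo.BigTable WHERE ChunkID = %%i" queryout "C:\export\part%%i.txt" -S MyServer -T -c
    if errorlevel 1 echo chunk %%i failed >> C:\export\export.log
)

That said, a CLR procedure that streams smaller result sets in a loop is also reasonable; either way, the key is keeping each unit of work small enough to retry.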
I have created a master controller package which runs as follows:
deletes all the log files -> deletes a few flat files on different drives -> preprocess task (Execute Package Task) -> C# executable (Execute Process Task) -> postprocess tasks (Execute Package Tasks)
I need to create a task just before the preprocess task that asks the user whether he wants to run a particular batch file before proceeding to preprocess. If the user says yes, it should run the batch file followed by the preprocess task, the C# executable, and the postprocess tasks; otherwise it should go directly to preprocess, C#, and postprocess (skipping the batch-file task).
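SSIS has no built-in interactive prompt, so one hedged pattern is a wrapper batch file that asks the question and passes the answer into the package as a variable; a precedence-constraint expression on the path into the batch-file task can then test it. The package path and variable name below are invented, and the choice command's syntax varies across Windows versions:

@echo off
choice /M "Run the pre-load batch file first"
rem choice sets errorlevel 1 for Y, 2 for N
if errorlevel 2 (set RUNBATCH=False) else (set RUNBATCH=True)
dtexec /F "C:\packages\MasterController.dtsx" /SET "\Package.Variables[User::RunBatchFile].Properties[Value];%RUNBATCH%"

Inside the package, the constraint leading to the Execute Process Task would use an expression like @[User::RunBatchFile] == "True", with the alternate path skipping straight to the preprocess task.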
Hi; before I start, let me say I know this is a silly way to go about things, but it is one of those "my company made me do it" things. My company is in the process of (finishing) migrating from FoxPro to SQL Server. They have a FoxPro program that does a lot of various updates, and I would like to write a T-SQL script to replace it. The trick is that the T-SQL script would have to have some procedural programming artifacts that I don't think it has.

I/my company would like the script to generate output messages that the user can see while the script is running. Every time I have run a script in Query Analyzer, I usually don't see the output ("print" statements, etc.) until all of the work is done. Is there a way to run things so I can see the output statements as the work is being done? Can print statements work from within a T-SQL loop?

The other thing is that I/my company wants the T-SQL script to write output messages to a log file while it is operating (for error checking and other logging). Is this possible with T-SQL? Thanks in advance for the info. Steve
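On both counts the usual answer is: PRINT output is buffered, but RAISERROR with severity 0 and WITH NOWAIT is sent to the client immediately, and it works inside loops. For the log file, running the script with osql -i script.sql -o run.log captures everything printed. A small sketch:

DECLARE @i int
SET @i = 1
WHILE @i <= 5
BEGIN
    -- severity 0 = informational; WITH NOWAIT pushes the message out right away
    RAISERROR('processing step %d of 5...', 0, 1, @i) WITH NOWAIT
    -- ... the actual update work would go here ...
    SET @i = @i + 1
END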
I am having a scenario where all the stored procedures are stored in a folder (one .sql file per sproc). The stored procedures do not have 'IF EXISTS ... DROP PROCEDURE' in the script, so before creating them we have to drop all sprocs manually.
Can anyone help me write a script / SSIS process to loop through each file in the folder and write "IF EXISTS ... DROP PROCEDURE" with the procedure name into it?
I can create a package that loops through each file with a Foreach Loop task, but I don't know how to write to the file using .NET.
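The question asks for .NET, but for completeness here is a plain batch-file alternative, a sketch that assumes each file is named after the procedure it creates and uses invented folder paths:

@echo off
for %%f in ("C:\sprocs\*.sql") do (
    rem write the guard first, then append the original script under it
    > "C:\sprocs\out\%%~nxf" echo IF EXISTS ^(SELECT 1 FROM sys.procedures WHERE name = '%%~nf'^) DROP PROCEDURE [%%~nf]
    >> "C:\sprocs\out\%%~nxf" echo GO
    type "%%f" >> "C:\sprocs\out\%%~nxf"
)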
I am new to Sql Server [Express version] and am not even sure I'm making the right choice. So here I am, seeking advice.
My database needs are nothing sophisticated. They just involve:
(a) create tens of thousands of separate data files, each under a unique file name of up to 8 characters, with file names read in from a pre-determined file name list;
(b) store/insert VOLUMINOUS numerical data into each of the data files, with the data indexed by date & time, plus maybe one or two additional character or string fields;
(c) for each data file, retrieve a subset of its data, perform extensive numerical calculations, and then store the results in one or more separate corresponding files, e.g. if a file name in (b) is F12345, (c) creates F12345R1, F12345R2, F12345R3, etc., which store different sets of calculated results.
Thus, this is purely a console application, doing a batch job, and requiring no graphical user interface. Both automation and speed are important here, due to the large number of data files that must be created and/or updated, and the very extensive numerical calculations on the data.
The goal is to automate the daily or weekly creation of each of the tens of thousands of Sql Server database files, insert fresh data (read in from a fresh ASCII file) into each file, numerically process the data and then store the results in one or more separate, corresponding result data files, with all the steps automated and without need for GUI. Once coding is done, the entire data processing session is expected to run for many hours, or even days, in an automated manner, and without human intervention.
What would be the most efficient way of doing this under Visual Basic Express (which is what I'm learning to use) by directly calling Sql Server Express without having to go through GUI to create database files? What is the proper interface utility or library to use to enable direct database function calls without the need to learn SQL language? Is Visual Basic and/or Sql Server even good choices for what I want to do? I want to be able to call the basic, simple database functions directly and simply from program code in a non-GUI, non-interactive manner for the tens of thousands of separate data files that will be used.
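On the "no GUI" point: everything the designers do ultimately becomes plain SQL batches, so databases can be created straight from the file-name list without any visual tooling. A hedged sketch (paths and instance name are placeholders):

rem create one database per name listed in names.txt (one name per line)
for /f %%n in (C:\data\names.txt) do (
    sqlcmd -S .\SQLExpress -Q "IF DB_ID('%%n') IS NULL CREATE DATABASE [%%n]"
)

From VB Express the same statements can be sent programmatically through ADO.NET (SqlConnection/SqlCommand) instead of shelling out, which keeps the whole run inside one console application.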
I really miss the good old days when one can do a straightforward batch job via a console application, with simple, direct calls to create new data files, insert and index fresh data, retrieve any subset of data to do extensive calculations, create new files to store the results, etc. all under automated program control and iterating through unlimited number of data files, until the job is finished, all without human intervention during processing.
Or am I missing something because all this can still be done simply and easily under VB and Sql Server? I've several books here about Visual Basic 2005 and Visual Basic 2005 Express, all showing how to create a database via a GUI utility. That's fine if one needs to create just one or two databases, but not hundreds, or even tens of thousands (as in my case) of them on the fly.
So, I am looking for the simplest and most direct database interface that will allow me to do the above under VB program code alone, and easily. For something as simple as I have described above, I don't think I should have to learn the SQL language or manually create each database file.
As you can see, I just want to get some heavy duty numerical processing job done over tens of thousands of data files as simply and efficiently as possible, and with as little fanciful detour as possible. So, ironically, I am trying to use Visual Basic without being cluttered by having to learn its "Visual" aspects, yet GUI is what most VB books devote to or emphasize heavily. Similarly, I would much rather use simple, "lean and mean", direct database function calls than having to learn a new vocabulary of "English-like" SQL language.
Yes, I'm not used to this tedious detour of learning the GUI aspect of VB, or learning the Structured Query Language of Sql Server, just to try to do something simple that I need to do in batch mode via a console application.
Are there any good books or other helpful URLs that will help a guy like me? Am I even using the wrong language and the wrong database to do what I want to do? What are the better alternatives, if any? Any advice, experience and pointers on any of the above issues raised would be very much appreciated. Thank you!
We have a package that is using a ForEach loop container to access files on a network drive. For some reason I am getting a message that the ForEach enumerator is empty and did not find any files that matched the pattern. For the pattern I left the default *.* for testing purposes. I have specified the file folder as \\remoteserver\fileshare\subfolder and also as \\remoteserver\c$\fileshare\subfolder and have gotten the same message. However when I map a network drive and set the file folder to the network drive it finds the files. Is this a permissions issue?
After I finish processing the file I want to move it to a new directory. Once this is deployed in production, the package will not be running under a domain account and probably won't have access to the network folder. Is there any way to specify in the connection manager itself that it should use a specific account to access the folder?
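The file connection manager in SSIS has no per-connection credential setting; a common workaround (sketched here with invented account and share names) is to authenticate to the share just before the package runs, for example in the job step or a wrapper script:

rem the password is a placeholder; in practice it should come from a secured source
net use \\remoteserver\fileshare P@ssw0rd /user:MYDOMAIN\svc_ssis
rem ... run the package here, e.g. dtexec /F "C:\packages\LoadFiles.dtsx" ...
net use \\remoteserver\fileshare /delete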
I have about 1200 .sql files in one of my folders. Almost all of these files do data inserts and updates, so they should be run only once. As and when required, I have manually run around 150 of them already. Whenever I run any of these scripts, I log that file name into a log table in my SQL Server, including the execution time. Since running the 1000+ remaining files takes a lot of time, I want to automate running of these files through a batch file, but I also want to filter out the files that have already been run. My file list looks as follows:
select * from sqlfileexecutionlog

FileName                   RunTime                  Result
-------------------------  -----------------------  -------
DeleteDuplicateOrders.sql  03/12/2014 14:23:45:091  Success
UpdateInventory.sql        04/06/2014 08:44:17:176  Success
Now I want to create a batch file to run the remaining files from my directory against my SQL Server. I also want to wrap each of these .sql file executions in a transaction and log success/failure along with the runtime and filename into the sqlfileexecutionlog table. As I add new .sql files into this directory, I should be able to run the same batch file and have it execute only the files that have not been run.
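A sketch of such a batch file, with MyServer/MyDB and the folder path standing in for the real names, and the log-table columns taken from the listing above:

@echo off
setlocal enabledelayedexpansion
for %%f in ("C:\sqlfiles\*.sql") do (
    rem skip anything already recorded in sqlfileexecutionlog
    for /f "usebackq" %%c in (`sqlcmd -S MyServer -d MyDB -h -1 -W -Q "SET NOCOUNT ON; SELECT COUNT(*) FROM sqlfileexecutionlog WHERE FileName = '%%~nxf'"`) do set RAN=%%c
    if "!RAN!"=="0" (
        rem -b makes sqlcmd exit nonzero when the script raises an error
        sqlcmd -S MyServer -d MyDB -b -i "%%f"
        if errorlevel 1 (set RESULT=Failure) else (set RESULT=Success)
        sqlcmd -S MyServer -d MyDB -Q "INSERT INTO sqlfileexecutionlog (FileName, RunTime, Result) VALUES ('%%~nxf', GETDATE(), '!RESULT!')"
    )
)

For the per-file transaction, sqlcmd accepts multiple input files (-i begintran.sql,"%%f",committran.sql), so tiny wrapper scripts with SET XACT_ABORT ON / BEGIN TRAN and COMMIT can bracket each file and roll the whole file back on any error.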
Is there a quick way to extract a full dump of 50 tables to 50 corresponding text files?
i.e.
table_a has to be extracted to table_a.txt
table_b has to be extracted to table_b.txt
table_c has to be extracted to table_c.txt etc.
I don't want to have to add each one separately by hand in the DTSX package designer. I can't see any way to do it in a loop (because you have to do the field mapping). I can't seem to get the DTS Wizard to help - it only seems to be able to handle one table-to-text extract at a time. And I've tried editing the DTSX file directly (in XML), but it looks like it's going to be rather complex, even if I only do it to define the connection managers. Feel free to suggest any better way to do this, though the specification has already been agreed, so I'm unlikely to be able to change it. Thanks
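If a non-SSIS route is acceptable, a hedged sketch that drives bcp from the table list itself, so the 50 extracts need no manual field mapping (server and database names invented):

@echo off
for /f "usebackq" %%t in (`sqlcmd -S MyServer -d MyDB -h -1 -W -Q "SET NOCOUNT ON; SELECT name FROM sys.tables ORDER BY name"`) do (
    bcp "MyDB.dbo.%%t" out "C:\export\%%t.txt" -S MyServer -T -c
)

Character-mode bcp (-c) derives the layout from the table, which is what makes the loop possible without per-table mapping.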
I have 2 copies of the SP I work with: an original and a backup. I usually work with the backup. Is there any way to update the original after updating the backup? I usually forget what I have changed by the end of the day.
At work all the databases I use are only accessible via the local network. I want to take a copy of the databases and save them as database files so I can burn them onto CD and take them home with me, so I can get some extra work done on the weekend. (I'd also like to start doing this so I have backup copies of the databases in case anything happens to them.)
I was wondering how I can do this. Are there any tutorials on the web that someone can point me to? I have SQL Server 2005 Management Studio Express.
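A sketch of the usual backup/restore route, which Management Studio Express can run from a query window (database and path names invented); the .bak is a single self-contained file, so it burns to CD and restores cleanly at home:

-- at work
BACKUP DATABASE WorkDB TO DISK = 'C:\backups\WorkDB.bak' WITH INIT;

-- at home (the logical names 'WorkDB'/'WorkDB_log' are assumptions; RESTORE FILELISTONLY lists the real ones)
RESTORE DATABASE WorkDB FROM DISK = 'D:\WorkDB.bak'
WITH MOVE 'WorkDB' TO 'C:\data\WorkDB.mdf',
     MOVE 'WorkDB_log' TO 'C:\data\WorkDB_log.ldf';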
I'm getting 3 copies of the result set I expected; could someone take a look and tell me why? I know I covered this in school, but I can't remember the issue. Thanks
SELECT P.Quantity as Qty, P.ItemID, P.VendorCode, P.Descr as Description, P.UnitPrice as Price, P.Amount, I.Freight, Rcvd = 0, I.QtyRcvd as Ship FROM PurchaseOrderItems P, ReceivedItems I, Received R WHERE P.POID = R.POID AND R.IntRcdID = I.PRID AND P.POID = 193
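A hedged diagnosis: with comma-style joins it is easy for one row in PurchaseOrderItems to match several rows in Received/ReceivedItems; if Received holds three receipt rows for POID 193, every item row comes back three times. Rewriting with explicit JOINs (same logic, just reorganized) makes the multiplication easier to spot:

SELECT P.Quantity AS Qty, P.ItemID, P.VendorCode, P.Descr AS Description,
       P.UnitPrice AS Price, P.Amount, I.Freight, Rcvd = 0, I.QtyRcvd AS Ship
FROM PurchaseOrderItems P
JOIN Received R ON R.POID = P.POID
JOIN ReceivedItems I ON I.PRID = R.IntRcdID
WHERE P.POID = 193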
I work for a Geographic Information Systems contractor and we are new to SQL 2005. One of our clients sent us a copy of an .mdf file. I have tried to attach the file, but they did not send us the associated log file. What is the most efficient way for us to exchange copies of SQL 2005 databases with clients? Most of our clients are novices with SQL 2005 also.
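For the immediate problem, SQL Server 2005 can attach a lone .mdf and rebuild a fresh log (database name and path invented). For routine exchange with novice clients, full .bak backups tend to be simpler, since they are single self-contained files:

CREATE DATABASE ClientDB
ON (FILENAME = 'C:\data\ClientDB.mdf')
FOR ATTACH_REBUILD_LOG;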
In previous threads I saw that in a scenario where Log Shipping is active, any other log backup activities should be avoided in order to keep the log chain intact. Until now we just used mirroring plus full and log backups. Introducing Log Shipping to a third server in a separate location would mean that the existing log backup jobs must be removed, or Log Shipping will not work.
I am glad to have read these threads before falling into that trap myself.
A question was raised by my colleague who is responsible for system and network administration: are the log backups that Log Shipping performs just differential? He would not like to see Log Shipping pushing or pulling logs that grow over time.
I gave him two reasons why they must be, in effect. A log backup truncates the existing log, which implies that the next log backup contains just the difference since the last backup. Also, transactions can only be committed once anyway, so the log backups Log Shipping takes are only as big as the transactions that came in since the last log backup.
I'm using CLR integration on SQL Server 2005 to call functions from a COM dll made with VB6. I do this to keep some important parts in packed unmanaged code, to make reverse engineering harder.
I keep the COM dll in the system directory. When it's referenced from a .net project, VS copies and uses it as 'interop.myCOM.dll'. When I add the assembly to SQL Server, it needs the referenced 'interop.myCOM.dll' file added. When a class from the COM dll is called by a CLR stored procedure, it needs to locate the real COM dll.
So, the same dll has to be in two different places in two different forms: one as an assembly (the interop), and one in its own place. I guess the interop one is not really used; it just helps to retrieve class information during design time and initialization or something. Everything works OK, but thinking about the packaging step and the rest, this makes things messy. How can I avoid needing two copies of my COM dll?
I load the COM class as follows:
Code Snippet
Private Shared ReadOnly m_myClass As myCOM.clsGeneric = New myCOM.clsGeneric
then map stored procedures (shared functions) to related functions of the m_myClass object.
Is it OK to run VSS (shadow copies) on a SQL Server machine?
I have IIS and SQL on the same server. I would like to make shadow copies of my IIS files, but don't want to do anything that could corrupt my SQL databases.
I'm working in a team, and need to share a database between the group - we don't have the luxury of sharing a server and connecting to it.
So what we'd like to do is make a copy of the database schema, and then share that through our repository.
I figured I just need to copy the .mdf file from the SQL Server data folder and put it in the Visual Studio directory to work on it?
So I've tried this, but when I try to create a TableAdapter against the database through Visual Studio, it gives me an "access denied" error. I figure this is probably something to do with security settings?
Do you think I'll need to change my Database connection from windows authentication to sql authentication?
I don't really know exactly how to do what I'm trying to do, so any help will be appreciated.
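The "access denied" usually means the SQL Server service still holds (or owns) the .mdf, so a plain file copy fails or the copy is unreadable. A sketch of the detach-then-copy route, with invented names:

-- detach so the file handle is released and the file can be copied
EXEC sp_detach_db 'TeamDB';
-- copy TeamDB.mdf into the project folder, then re-attach the original
CREATE DATABASE TeamDB
ON (FILENAME = 'C:\Data\TeamDB.mdf')
FOR ATTACH_REBUILD_LOG;

The copied .mdf can then be opened in Visual Studio through a user-instance connection (AttachDbFilename); switching to SQL authentication matters less here than the NTFS permissions on the copied file.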
There are two developers in different offices who started with the same DB and keep modifying the stored procedures. At some point they want to update their copies of the DB with each other's changes. Is there an easy way to do this in MS SQL Server 2005?
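Short of replication or a schema-compare tool, a hedged low-tech option is for each developer to dump procedure definitions and diff/merge the two dumps:

-- list every stored procedure's definition so the two copies can be compared
SELECT name, OBJECT_DEFINITION(object_id) AS definition
FROM sys.procedures
ORDER BY name;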
I am creating a program that will take a master database and create separate databases for classroom training. I am writing my own app to do this since it will have other stuff to do. I will have a master database that I need to create multiple copies of: 2-20 copies, and it is about 7GB large. It is used in a classroom training course for our company software. The app will also copy a folder on the server into multiple subfolders. Each computer in the classroom will access its own copy of the database/Windows folders.
What I am looking for is a fast/reliable way to create the multiple database copies. Then when the training class is over and a new one is getting started, we will run my program to reset everything back to the start. Should I detach/copy/attach, or create a master backup and restore it 20 times? What kind of user-access pitfalls will I need to look out for?
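Restore from a master backup is usually the friendlier option, since the master never has to go offline the way detach/copy/attach requires. A sketch of the reset loop (server, database names, and the logical file names are assumptions; RESTORE FILELISTONLY shows the real logical names):

rem rebuild 20 classroom copies from one master backup
for /l %%i in (1,1,20) do (
    sqlcmd -S MyServer -b -Q "RESTORE DATABASE Training%%i FROM DISK = 'C:\backups\Master.bak' WITH MOVE 'MasterDB' TO 'C:\data\Training%%i.mdf', MOVE 'MasterDB_log' TO 'C:\data\Training%%i.ldf', REPLACE"
)

On the user-access pitfalls: a restore fails if anyone is connected, so setting each copy to single user first (ALTER DATABASE ... SET SINGLE_USER WITH ROLLBACK IMMEDIATE) before the RESTORE avoids the "database in use" error.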
Wondering how this is handled by the Multicast component. If multiple copies are created in memory and the size of the dataset is large, this could cause some performance problems. Any thoughts? TIA