SQL 2012 :: Export Data To CSV Using Batch Scripting?
Jan 15, 2015
Execute a pre-written SQL script on double-clicking a batch file, which will then export the results to a CSV.
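A minimal sketch of one common approach: a batch file that calls sqlcmd (assuming sqlcmd is installed and Windows authentication; MyServer, MyDb, query.sql and results.csv are placeholder names):
@echo off
rem export.bat - runs a pre-written SQL file and writes comma-separated output
rem -E = Windows authentication, -s"," = comma separator, -W = trim trailing spaces
sqlcmd -S MyServer -d MyDb -E -i query.sql -o results.csv -s"," -W
Putting SET NOCOUNT ON at the top of query.sql keeps the "(n rows affected)" line out of the CSV.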
Hi!
I'm trying to do the following in a batch-file:
1) Install SQLExpress unattended providing settings in an .ini-file
2) Run an osql-command which creates a database and some tables etc.
My batch-script looks like the following:
REM Install SQLExpress
start /wait .SQLExpresssetup.exe /qb /settings %CD%mysettings.ini
osql -S (local)MyExpress -U sa -P passwd -i .createdb.sql
...
The problem is that the SQLExpress setup updates %PATH%, but the change is not visible in my cmd context, so the osql command fails.
It seems like I have to start a new cmd-window from the windows shell to get the updated environment.
Is there any way to make the environment changes visible to my script?
This might be more a sort of general batch-programming topic, but any help is appreciated. :)
Regards,
Sigurd Ringbakken
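One workaround for the stale %PATH% (a sketch, assuming the default SQL Server 2005 Express tools location; adjust for your install) is to call osql by its full path so the environment change doesn't matter:
REM call osql via an absolute path instead of relying on the updated %PATH%
"%ProgramFiles%\Microsoft SQL Server\90\Tools\Binn\osql.exe" -S (local)\MyExpress -U sa -P passwd -i .\createdb.sql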
Hi to all
Is there any option in SQL Server DTS, or any other third-party tool, that can script data? By scripting data I mean that
if a table "Employee" contains 50 rows, I want the tool to write 50 INSERT queries for me so that I can run them anywhere.
The problem is I have to insert data on a remote server where I cannot use DTS. I just have a text area to write my query and press the run button.
Hope you understand my problem. If anything needs explanation, please reply. Waiting for your response. Thanks in advance.
Bye to all
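For what it's worth, a minimal T-SQL sketch of the generate-INSERTs idea, hard-coded for a hypothetical Employee table with columns (ID int, Name varchar); a real tool (or a cursor over INFORMATION_SCHEMA.COLUMNS) generalizes this per table:
SELECT 'INSERT INTO Employee (ID, Name) VALUES ('
     + CAST(ID AS varchar(10)) + ', '''
     + REPLACE(Name, '''', '''''') + ''');'  -- double any embedded quotes
FROM Employee;
The output is one runnable INSERT statement per row, which can be pasted into the remote text area.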
Background: in my current company the business users maintain a huge quantity of master data using Excel. Then a series of SSIS jobs are edited and manually executed.
Goal: the challenge is to replace this process using MDS. One of the requested features is the possibility for the users to edit or insert new master data using the Web UI or the Excel Add-in and, when they are done, perform a merge of the master data into the target, in this case the reporting DB.
The perfect solution for me would be something like triggering the execution of an SSIS package that exports the data from the subscription views to the reporting DB after the business rules are applied to a specific entity.
We have no certified SQL DBAs. Mainly because 95% of our production data is on Oracle and DB2 databases.
One of our Oracle DBAs is trying to write scripts that can be run by the AS400 robot to export data and back up the databases. I (being the only person with any SQL experience) have been asked to ensure that the scripts she wrote will not only work but not bring down the server.
Can anyone point me to the correct place (web, book, etc.) to find out how this is to be done? In years past my SQL servers have been stand-alone dedicated units, with db backups scheduled through the SQL Manager and tape backups handled by the sysadmin.
Please help,
Cwells
I need to export the data directly using a query from SQL Server. This is just a temporary extract. Copy-pasting the result into Excel is giving misalignment.
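One sketch of a direct extract that avoids the copy-paste misalignment: tab-delimited bcp output, run from a command prompt (MyDb.dbo.MyTable, MyServer and the path are placeholders), which Excel usually splits into columns cleanly:
bcp "SELECT col1, col2 FROM MyDb.dbo.MyTable" queryout C:\temp\extract.txt -c -t\t -T -S MyServer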
I have a database in 3rd normal form and want to export data and import it into a copy of that database. Is there a way for the Import/Export Wizard to unload and reload the data in the right order, so that parent tables are loaded before child tables? If the Import/Export Wizard can't do that, what other options are available?
I'm trying to export a query to a csv file from a batch file.
batch file:
sqlcmd -S loni-sbs -d lonikahn0207 -E -Q "select * from ael" -o "AELORI.csv" -h-1 -s"," -w 700
BUT, I'm getting a user name and password issue inside the csv file:
Cannot open database "lonikahn0207" requested by the login. The login failed.
Msg 18456
Login failed for user 'LONIBOOKS\Administrator'.
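The message means the Windows account running the batch file has no user mapped in that database. A sketch of one fix, run by an administrator, using the names from the error (if the login itself is missing, a CREATE LOGIN ... FROM WINDOWS would be needed first):
USE [lonikahn0207];
CREATE USER [LONIBOOKS\Administrator] FOR LOGIN [LONIBOOKS\Administrator];
EXEC sp_addrolemember 'db_datareader', 'LONIBOOKS\Administrator';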
Using Server 2012 on a local machine, I created an SSIS package that executes fine in Integration Services and from the Visual Studio solution, but will not work when created as a job. Other solutions work well except when exporting data. The package pulls data from a query and exports it into a .csv file. The messages I get are:
Data Flow Task 1: Error: Destination - Stage.csv failed the pre-execute phase and returned error code 0xC020200E
and
Data Flow Task 1: Error: Cannot open the datafile "pathStage.csv".
Version:
Microsoft SQL Server Management Studio 11.0.3128.0
Microsoft Analysis Services Client Tools 11.0.3128.0
Microsoft Data Access Components (MDAC) 6.1.7601.17514
Microsoft MSXML 3.0 6.0
Microsoft Internet Explorer 9.11.9600.17041
Microsoft .NET Framework 4.0.30319.18444
Operating System 6.1.7601
I am using the script below to export the result of a SELECT statement to .xls:
declare @sql varchar(8000)
select @sql = 'bcp "select * from Databases..Table" queryout c:\bcpTom.xls -c -t, -T -S' + @@servername
exec master..xp_cmdshell @sql
But the result is not split into separate columns; all 4 columns' details are exported into a single cell.
How do I export the data into separate columns/tabs in Excel?
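Worth noting: bcp writes plain text, not a real workbook, so a comma-delimited file saved as .xls can land everything in one cell. A hedged sketch: use a tab terminator and a .csv or .txt extension, which Excel splits into columns (true multi-worksheet output needs SSIS or OPENROWSET against an actual Excel file):
declare @sql varchar(8000)
select @sql = 'bcp "select * from Databases..Table" queryout c:\bcpTom.csv -c -t\t -T -S' + @@servername
exec master..xp_cmdshell @sql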
I face a task to edit the definitions for a bunch of stored procedures: removing the <IF EXISTS THEN DROP> preamble and changing CREATE to ALTER.
Do you think there is a good way to do it in batch, taking the original source from the system tables (even though we have this as physical files in VSS)?
Sure enough, <IF EXISTS THEN DROP> is not uniform; there could be variations in syntax and in the number of lines: IF EXISTS... vs. OBJECT_ID IS NOT NULL, etc.
CREATE__sp_PROCEDUREA vs. CREATE_spPROcedure (with different spacing).
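A hedged sketch of the batch approach: the definition stored in sys.sql_modules contains only the CREATE PROCEDURE statement (the IF EXISTS / DROP preamble lives in your VSS files, not in the system tables), so generating ALTER scripts from there sidesteps all the preamble variations. This assumes a simple first-occurrence swap of CREATE for ALTER:
SET NOCOUNT ON;
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS SchemaName,
       OBJECT_NAME(m.object_id) AS ProcName,
       -- swap only the first CREATE (the procedure header) for ALTER
       STUFF(m.definition, CHARINDEX('CREATE', m.definition), LEN('CREATE'), 'ALTER') AS NewDefinition
FROM sys.sql_modules AS m
JOIN sys.objects AS o ON o.object_id = m.object_id
WHERE o.type = 'P';
Review the output before applying it; odd spacing after the keyword is handled because only the keyword itself is replaced.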
I have an interesting problem. A number of spids are being blocked by a single select statement. The select statement is the same as returned from sp_who2, sysprocesses, sp_whoisactive or dbcc inputbuffer. It is not waiting on anything and has status sleeping.
Clearly it is not 'just' a sleeping select statement as I can see over a thousand locks held by the spid on 2 user databases and tempdb. I'm working on the theory there is a begin transaction with a bunch of statements and no closing commit. But I want to be able to prove that. How can I show what statements were previously executed as part of this transaction?
Additional Info: SQL 2012 Enterprise Edition. This is a test server but is a reproduction of a live issue. At this point the application team cannot isolate the code causing the problem, only the set of processes the code resides in.
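The engine doesn't retain the earlier statements of an open transaction, so proving it after the fact is hard; going forward, an Extended Events session (or server-side trace) capturing completed batches for that spid would show them. Meanwhile, a sketch to confirm the sleeping session really is holding an open transaction, and what its most recent batch was (standard 2012 DMVs):
SELECT s.session_id,
       t.transaction_id,
       s.status,
       txt.text AS most_recent_batch
FROM sys.dm_tran_session_transactions AS t
JOIN sys.dm_exec_sessions AS s ON s.session_id = t.session_id
JOIN sys.dm_exec_connections AS c ON c.session_id = s.session_id
CROSS APPLY sys.dm_exec_sql_text(c.most_recent_sql_handle) AS txt;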
I am using Variables to configure a lot in an SSIS Packages.
Over the years you add new variables that are useful to your default package.
However, to use those in "older" packages you have to open them and add the variables manually.
Is there any way (e.g. a script) to add a set of variables to every SSIS package in a folder?
I have to use sqlcmd to run diagnostic queries.
I need to run multiple DMVs as a batch file and store the DMV results somewhere.
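A minimal sketch, assuming each DMV query is saved as its own .sql file in a folder (C:\dmv and MyServer are placeholders); each result lands in a text file next to its script:
@echo off
rem run every diagnostic script in the folder and capture each result
for %%f in ("C:\dmv\*.sql") do (
    sqlcmd -S MyServer -E -i "%%f" -o "%%f.out.txt" -s"," -W
)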
I have backed up databases from a 2008 server and now I need to restore them to a 2012 server. The only issue is that I need a script because I have over a hundred databases.
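A sketch of a script generator, assuming all the .bak files sit in one folder and are named after their databases (D:\backups is a placeholder; xp_dirtree is undocumented but widely used). It only prints RESTORE statements for review; add WITH MOVE clauses if the 2012 server's file paths differ:
CREATE TABLE #files (subdirectory nvarchar(260), depth int, [file] int);
INSERT #files EXEC master.sys.xp_dirtree N'D:\backups', 1, 1;
SELECT 'RESTORE DATABASE [' + LEFT(subdirectory, LEN(subdirectory) - 4) + '] FROM DISK = N''D:\backups\' + subdirectory + ''' WITH RECOVERY;'
FROM #files
WHERE [file] = 1 AND subdirectory LIKE '%.bak';
DROP TABLE #files;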
I have a table with about 466 million rows. In this table there is an int column called WeeksToRetain, as well as an EventDate column containing the date the row was inserted. I am trying to delete all the rows that should be deleted according to WeeksToRetain. For example, if the EventDate is 5/07/15 with a 1 in the WeeksToRetain column, the row should be removed by 5/14/15. I am not sure what days SQL considers the beginning and end of the week. However, the core issue I am having is the sheer mass of deletions I must do and the log growth.
So I am trying to do the delete in batches. More specifically, I want to load a temporary table with a million rows, then use the temporary table to load a sub temporary table with 100,000 rows, joining this temporary table to the table I want to delete from, looping through 10 times to get 1 million. The Logging.EvenLog table, which is the table I'm trying to purge, has a clustered index on EventDate (ASC). I would like to run this in a scheduled job with enough time between executions for log backups to run.
DECLARE @i int
DECLARE @RowCount int
DECLARE @NextBatchDate datetime
CREATE TABLE #BatchProcess
(
EventDate datetime,
ApplicationID int,
[Code] .....
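For comparison, a minimal sketch of a batched purge that skips the temp tables entirely, assuming the Logging.EvenLog name from the post; each chunk is its own small transaction, so log backups between iterations keep the log in check:
DECLARE @BatchSize int, @Deleted int;
SET @BatchSize = 100000;
SET @Deleted = 1;
WHILE @Deleted > 0
BEGIN
    DELETE TOP (@BatchSize)
    FROM Logging.EvenLog
    WHERE DATEADD(WEEK, WeeksToRetain, EventDate) < GETDATE();
    SET @Deleted = @@ROWCOUNT;
    -- optionally WAITFOR DELAY '00:00:05' here to give log backups room
END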
Hi, I need to take data from a SQL Server 2005 database and load it into a remote 2000 database. I've already been able to script and create the database objects (MS SQL Server 2005 has a nifty option which allows scripting for SQL Server 2000 compliance). Now I just need to get the data in. Is there a tool or utility out there that I can use to generate INSERT statements for all the tables in the database? Thanks much for any advice regarding this.
I'm working on SQL 2000. I would like to generate the db script and also have the data of the static tables,
so that if I run the script, it creates my db and also fills my static tables' data at once.
Please help me. Urgent.
I have a remote batch file on machine B that I need to execute using 'Execute process task' control from a package on machine A.
How can I achieve this?
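One hedged option (an assumption, not the only route): the Execute Process Task on machine A can launch Sysinternals PsExec, which runs the file remotely on machine B, provided PsExec is available and the credentials have rights there (all names below are placeholders):
psexec \\MachineB -u DOMAIN\user -p password cmd /c "C:\jobs\remote.bat"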
In another forum post, a poster was deleting large numbers of rows from a table in batches of 50,000.
In the bad old days ('80s - '90s), I used to have to delete rows in batches of 500, then 1000, then 5000, due to the size of the transaction rollback segments (yes - Oracle).
I always found that increasing the number of deleted rows in a single statement/transaction improved overall process speed - up to some magic point, at which some overhead in the system began slowing the deletes down, so that deleting a single batch of 10,000 rows took more than twice as much time as deleting two batches of 5,000 rows each.
Are there good rule-of-thumb numbers (or even better, some actual statistics and/or explanations) as to how many records should be deleted in a single transaction/statement for optimum speed? 50,000 - 100,000 - 1,000,000 or unlimited? Are there significant differences between 2008, 2012, 2014?
Hi all SQL gurus, I've searched for samples on how to automatically script SQL 2000 tables to export data between databases via a SQL script, something like:
INSERT INTO [ges1gara].[dbo].[CategAtleti] ([CodCat], [Denominazione], [LimiteBassoDonne], [LimiteBasso], [LimiteAltoDonne], [LimiteAlto])
VALUES (<CodCat,smallint,3>, <Denominazione,varchar(50),"Maschietti/Bambine">, <LimiteBassoDonne,smallint,6>, <LimiteBasso,smallint,6>, <LimiteAltoDonne,smallint,6>, <LimiteAlto,smallint,6>)
INSERT INTO [ges1gara].[dbo].[CategAtleti] ([CodCat], [Denominazione], [LimiteBassoDonne], [LimiteBasso], [LimiteAltoDonne], [LimiteAlto])
VALUES (<CodCat,smallint,4>, <Denominazione,varchar(50),"Giovanissimi/Giovanissime">, <LimiteBassoDonne,smallint,7>, <LimiteBasso,smallint,7>, <LimiteAltoDonne,smallint,7>, <LimiteAlto,smallint,7>)
Can you please point me in the right direction? TIA, from tesis - Italy
How can I export a database with foreign keys and primary keys?
The operation is:
SQL2005 Management Studio / Database / Tasks / Export Data
In the previous version, SQL2000, we could select Copy Objects and Data Between Servers, use the default options, and check Copy Indexes, Copy Foreign/Primary Keys, etc.
But this option is not found in the SQL2005 Management Studio / Database / Tasks / Export Data wizard, or I can't find it.
How can I export foreign keys and primary keys with the SQL2005 Management Studio / Database / Tasks / Export Data wizard?
Best Regards,
Athena.
Hi all,
I've got to change values in my source database as follows:
Source   Target
X        1
Y        1
Z        2
Can I create a lookup table and use a Lookup task in SSIS to do this, or do I need to script it?
Thanks
F
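A lookup table works fine for this; a T-SQL sketch with the mapping from the post (dbo.SourceTable and its Code column are hypothetical names). In SSIS, the same CodeMap table can feed a Lookup transformation, so no Script task is needed:
CREATE TABLE dbo.CodeMap (SourceValue char(1) PRIMARY KEY, TargetValue int NOT NULL);
INSERT dbo.CodeMap (SourceValue, TargetValue) VALUES ('X', 1);
INSERT dbo.CodeMap (SourceValue, TargetValue) VALUES ('Y', 1);
INSERT dbo.CodeMap (SourceValue, TargetValue) VALUES ('Z', 2);
SELECT s.*, m.TargetValue
FROM dbo.SourceTable AS s
JOIN dbo.CodeMap AS m ON m.SourceValue = s.Code;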
Hi,
I need to transfer data from my test server to the deployment server, is there any way or tool to achieve that. Any help will be much appreciated.
Thanks.
Kabir
I am learning SSIS and am trying to figure out how to run a SQL batch that returns a result set and export it to Excel using a Data Flow. I am using an OLE DB Source with the SQL batch shown below, and the destination is an Excel file.
How is this done?
So far I have had no luck getting the tasks to run. I need more than just simple queries.
The SQL is below:
SET NOCOUNT ON
DECLARE Tables CURSOR FAST_FORWARD FOR
SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE != 'VIEW'
DECLARE @RowCount INT
DECLARE @TableSchema NVARCHAR(128)
DECLARE @TableName NVARCHAR(128)
DECLARE @SqlCmd NVARCHAR(1024)
CREATE TABLE #temp ([Name] NVARCHAR(128), [RowCount] INT)
OPEN Tables
FETCH NEXT FROM Tables INTO @TableSchema, @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
SET @SqlCmd = 'SELECT ''[' + @TableSchema + '].[' + @TableName + ']'', COUNT(*) FROM [' + @TableSchema + '].[' + @TableName + ']'
PRINT @SqlCmd
INSERT #temp EXEC sp_executesql @SqlCmd
FETCH NEXT FROM Tables INTO @TableSchema, @TableName
END
SELECT * FROM #temp
CLOSE Tables
DEALLOCATE Tables
DROP TABLE #temp
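One likely snag (an assumption based on the batch above): an OLE DB Source can't derive column metadata from a batch that builds and reads a temp table. A sketch of a single set-based query that returns the same schema/table row counts and parses cleanly, using the catalog views (counts from sys.partitions are maintained by the engine and can lag slightly under heavy activity):
SELECT s.name + '.' + t.name AS [Name],
       SUM(p.rows) AS [RowCount]
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
JOIN sys.partitions AS p ON p.object_id = t.object_id
                        AND p.index_id IN (0, 1)   -- heap or clustered index only
GROUP BY s.name, t.name;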
Hi,
I have an ASP.NET Web Service that accepts a DataSet object passed to it. This DataSet will contain a large number of records in it's table. What I want to do (if possible) is insert all records in a SQL table in a batch mode (one go). Is this doable?
Thanks,
Hello, Everyone: Greetings!
I am new to Sql Server [Express version] and am not even sure I'm making the right choice. So here I am, seeking advice.
My database needs are nothing sophisticated. They just involve:
(a) create tens of thousands of separate data files each under a unique file name of up to 8 characters, with file names read in from a pre-determined file name list.
(b) store/insert VOLUMINOUS numerical data into each of the data files, with the data indexed by date&time, plus maybe one or two additional character or string fields.
(c) for each data file, retrieve a subset of its data, perform extensive numerical calculations, and then store the results in one or more separate corresponding files, e.g. if a file name in (b) is F12345, (c) creates F12345R1, F12345R2, F12345R3, etc. which stores different sets of calculated results.
Thus, this is purely a console application, doing a batch job, and requiring no graphical user interface. Both automation and speed are important here, due to the large number of data files that must be created and/or updated, and the very extensive numerical calculations on the data.
The goal is to automate the daily or weekly creation of each of the tens of thousands of Sql Server database files, insert fresh data (read in from a fresh ASCII file) into each file, numerically process the data and then store the results in one or more separate, corresponding result data files, with all the steps automated and without need for GUI. Once coding is done, the entire data processing session is expected to run for many hours, or even days, in an automated manner, and without human intervention.
What would be the most efficient way of doing this under Visual Basic Express (which is what I'm learning to use) by directly calling Sql Server Express without having to go through GUI to create database files? What is the proper interface utility or library to use to enable direct database function calls without the need to learn SQL language? Is Visual Basic and/or Sql Server even good choices for what I want to do? I want to be able to call the basic, simple database functions directly and simply from program code in a non-GUI, non-interactive manner for the tens of thousands of separate data files that will be used.
I really miss the good old days when one can do a straightforward batch job via a console application, with simple, direct calls to create new data files, insert and index fresh data, retrieve any subset of data to do extensive calculations, create new files to store the results, etc. all under automated program control and iterating through unlimited number of data files, until the job is finished, all without human intervention during processing.
Or am I missing something because all this can still be done simply and easily under VB and Sql Server? I've several books here about Visual Basic 2005 and Visual Basic 2005 Express, all showing how to create a database via a GUI utility. That's fine if one needs to create just one or two databases, but not hundreds, or even tens of thousands (as in my case) of them on the fly.
So, I am looking for the simplest and most direct database interface that will allow me to do the above under VB program code alone, and easily. For something as simple as I have described above, I don't think I should have to learn the SQL language or manually create each database file.
As you can see, I just want to get some heavy duty numerical processing job done over tens of thousands of data files as simply and efficiently as possible, and with as little fanciful detour as possible. So, ironically, I am trying to use Visual Basic without being cluttered by having to learn its "Visual" aspects, yet GUI is what most VB books devote to or emphasize heavily. Similarly, I would much rather use simple, "lean and mean", direct database function calls than having to learn a new vocabulary of "English-like" SQL language.
Yes, I'm not used to this tedious detour of learning the GUI aspect of VB, or learning the Structured Query Language of Sql Server, just to try to do something simple that I need to do in batch mode via a console application.
Are there any good books or other helpful URLs that will help a guy like me? Am I even using the wrong language and the wrong database to do what I want to do? What are the better alternatives, if any? Any advice, experience and pointers on any of the above issues raised would be very much appreciated. Thank you!
Regards,
ConsoleApp
I created a package in SSIS to export data from multiple SQL Server tables to a single flat file. Once the export is completed I need to update two tables (a batch table and a history table). In the batch table I need to insert a record with batch # (system generated), batch date, status (success/failed), record count (number of records in the flat file), and batch filename. In the history table, I need to insert a record for each of the rows in the flat file with the following info: batch number, datetime, status (success/failed).
My question is: how do I get the batch status, record count, and batch filename for the batch table update, and how do I update the history table?
BTW, I am executing this package as a SQL Server Agent job.
I am new to Integration Services and feel lost.
Any help ?
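One common shape for this (a sketch, with hypothetical names): a Row Count transformation in the Data Flow writes the record count into an SSIS variable (e.g. User::RecordCount), and an Execute SQL Task after the Data Flow inserts the batch row, mapping variables to the ? placeholders in its parameter list:
INSERT INTO dbo.BatchTable (BatchDate, Status, [RecordCount], BatchFileName)
VALUES (GETDATE(), ?, ?, ?);
The history rows can be written the same way from a second Execute SQL Task, or by the Data Flow itself with an extra OLE DB Destination.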
I am loading a lot of Excel and CSV files to SQL Server. Some loads may fail for various reasons. I want a file to either be loaded as a whole, or not at all. Currently I keep a list of failed filenames and remove them at the end (I add a column for the source file name).
Any better way to make sure a file is loaded as a whole or nothing?
Thanks,
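One pattern that gives all-or-nothing behavior (a sketch; dbo.Staging, dbo.Target and the columns are hypothetical): load each file into a staging table first, then move the rows in a single transaction, so a failure leaves the real table untouched:
DECLARE @FileName nvarchar(260);
SET @FileName = N'somefile.csv';  -- placeholder for the file being loaded
BEGIN TRY
    BEGIN TRAN;
    INSERT INTO dbo.Target (Col1, Col2, SourceFileName)
    SELECT Col1, Col2, @FileName
    FROM dbo.Staging;
    COMMIT;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK;
    -- record @FileName in the failed-files list here
END CATCH;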
Matt writes "Greetings! Warning, I'm a rookie. I wrote a stored procedure to pull data in order to do a nightly export/import from one system to another. I have a batch file that looks like this:
bcp "exec WinSNAP_retrieveStudents '0607'" queryout c:WinSNAPData06.txt -c -U user -P password
Sometimes, the file works and I get perfectly formed data, with everything just as I've requested (mostly basic demographic information: names, addresses, etc.).
But, other times the output file contains nothing but garbage characters, like this:
剒乏†††††††††ठ䅍啎䱅†††䴉㐉㐷‰䕓䄠䅐䡃⁅剄
The file size looks right, but it contains nothing but characters like this from beginning to end. I can find no pattern as to why/when good data gets pulled versus the corrupt data. I can run the batch file one minute and get good data, and run it the next minute and it's all corrupt. We have the batch file scheduled late at night when no users are online, and I get the same results -- one day it works, the next it doesn't.
Forgive me if this is a well-documented issue -- my searches so far haven't turned up a thing!
Thanks much for any advice you can provide!!
Matt Smith
DeSoto County School District
Arcadia, FL"
I'm trying to export some tables from SQL Server 2012 into .xml format in a local drive and have run into a problem when reading the file in Excel for example.
The code I'm currently using is:
DECLARE @FileName varchar(50),
@bcpCommand varchar(8000)
SET @FileName = 'D:\SQLextracts\dbo.vwPOLICIES_Test'+'.xml'
SET @bcpCommand = 'bcp "SELECT pvTransactionID, pvPolicyID, pvParentPolicyID, pvProductTypeName, pvClientID,
[Code] ...
I've gone through each of the fields exported and the column pvInsuredName is what's causing the error. If I include column pvInsuredName in the script I end up with the Excel error message 'invalid file reference. the path to the file is invalid, or one or more of the referenced schemas could not be found' .
I'm guessing there's an issue with some of the characters contained within the column pvInsuredName causing the error.
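A sketch to test that guess: hunt for rows whose pvInsuredName contains anything outside the printable ASCII range (control characters below 0x20 are illegal in XML 1.0 and are the usual culprit); the view name is taken from the post:
SELECT pvTransactionID, pvInsuredName
FROM dbo.vwPOLICIES_Test
WHERE pvInsuredName LIKE '%[^ -~]%' COLLATE Latin1_General_BIN;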
I need to have a process extract some data, export it to CSV and then send it as a plain (non-MIME) message, or as a single-part MIME message. Multi-part MIME messages can't be used.
This process will be called about 20 times daily with different subsets of data.
What is the best way to approach this?
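One hedged approach: Database Mail can append the query result as CSV text directly in the message body (no attachment, so no multipart), via sp_send_dbmail; the profile, recipient, and query below are placeholders, and whether the receiving system sees the message as truly single-part is worth testing:
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'MyProfile',
    @recipients = 'someone@example.com',
    @subject = 'Daily extract',
    @body = 'CSV follows:',
    @body_format = 'TEXT',
    @query = 'SET NOCOUNT ON; SELECT * FROM dbo.MyExtract;',
    @attach_query_result_as_file = 0,   -- 0 = results go in the body
    @query_result_separator = ',',
    @query_result_no_padding = 1;
For 20 calls a day with different subsets, a stored procedure that takes the query text and recipients as parameters and wraps this call keeps it maintainable.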
While exporting a database table to MS Excel, I am getting a yellow warning icon on selected columns. Why is that?