SQL Server 2012 :: Run Script Within A Table / String And Output Results To File?
Sep 21, 2015
I have been given a request at work that requires us to run the SQL scripts held in a report configuration table and output the results as .csv or .xls files in a desired folder, with a file name that is also specified within the report config table.
An example of the table is in the script below, with column A being the desired file name to be created and column B containing the script that will be executed.
Ideally I require a script that I can drop into a stored proc that will work down each row in the table and export each row's results as a separate file in the desired folder.
------------------------------------------------------------------------------------------
-----SAMPLE TABLE SCRIPT -----------------------------------------------------------
------------------------------------------------------------------------------------------
/****** Object: Table [dbo].[Test_Table_PS] Script Date: 21/09/2015 09:14:57 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Test_Table_PS](
[File_Name] [nvarchar](100) NULL,
[Code] ....
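A minimal sketch of one way to approach this: loop the config table and shell each script out to bcp so every row's result set lands in its own file (xp_cmdshell must be enabled; the export folder, the CSV delimiter and the script column name are assumptions):
DECLARE @FileName nvarchar(100), @Script nvarchar(4000), @cmd varchar(8000);
DECLARE cfg CURSOR LOCAL FAST_FORWARD FOR
    SELECT [File_Name], [Report_SQL] FROM dbo.Test_Table_PS;  -- [Report_SQL] = assumed name of the script column
OPEN cfg;
FETCH NEXT FROM cfg INTO @FileName, @Script;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- run each stored script through bcp queryout and write it as a CSV
    -- (scripts containing double quotes would need extra escaping)
    SET @cmd = 'bcp "' + @Script + '" queryout "C:\Exports\' + @FileName
             + '.csv" -c -t, -T -S' + @@SERVERNAME;
    EXEC master..xp_cmdshell @cmd;
    FETCH NEXT FROM cfg INTO @FileName, @Script;
END
CLOSE cfg;
DEALLOCATE cfg;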
View 7 Replies
Jul 31, 2015
I have the code below where I need to have all of the query results output into a .csv file to use in a VBA macro. The issue I am running into is that the data is not delimiting correctly and my rows are being shifted incorrectly. Is there a better way of outputting the results into a .csv file with a common delimiter?
-- Declare the variables
DECLARE @CMD VARCHAR(4000),
@DelCMD VARCHAR(4000),
@HEADERCMD VARCHAR(4000),
@Combine VARCHAR(4000),
@Path VARCHAR(4000),
@COLUMNS VARCHAR(4000)
[Code] ...
Output from the query (please paste it into a text editor; the line starting with "(only" should be on line 1 after "20PKS" but is shifted onto a new line):
557898^1^9885E25^80082^9.0 CM GLASS FIBER PADS 20PKS
(only 12 pks in stock that will ship today)
^12.00000000^.00000000^18.32000000^219.84000000000^28.30000000
^339.60000000000^9.98000000^35.2650176678445^DR9146322^0^
[Code] ....
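The shifting usually comes from embedded carriage returns/line feeds inside the description text rather than from the delimiter itself. A minimal sketch, with assumed table and column names, that strips CR/LF in the query and exports through bcp with an explicit field terminator:
DECLARE @cmd varchar(4000);
SET @cmd = 'bcp "SELECT OrderID, ItemNo, '
         + 'REPLACE(REPLACE(ItemDescription, CHAR(13), '' ''), CHAR(10), '' ''), '
         + 'Qty, UnitPrice FROM MyDB.dbo.OrderDetail" '
         + 'queryout "C:\Exports\orders.csv" -c -t^^ -T';  -- ^ doubled because xp_cmdshell goes through cmd.exe
EXEC master..xp_cmdshell @cmd;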
View 1 Replies
View Related
Jan 21, 2008
I am using Visual Studio 2005 and SQL Server 2005 to manage a large database. I need to create a series of text files based on the value of one of the columns. Is there a SQL command that will pipe the output directly to a file?
Thanks,
Mike
View 1 Replies
View Related
Jan 20, 2000
What would be the correct syntax if I wanted to output the results of a stored procedure to a file instead of printing them?
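A minimal sketch of one common approach: run the procedure through bcp queryout from xp_cmdshell so the result set is written straight to a file (procedure, database and path names are assumptions):
DECLARE @cmd varchar(4000);
SET @cmd = 'bcp "EXEC MyDB.dbo.MyProc" queryout "C:\Output\MyProcResults.txt" -c -T -S' + @@SERVERNAME;
EXEC master..xp_cmdshell @cmd;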
View 1 Replies
View Related
Apr 3, 2001
I have come across manual queries being run daily and the results saved to a txt file on the network.
I basically want to set these up as a stored procedure and schedule it as a task to run daily.
Does anyone know if you can automatically save the results of a query/stored procedure to a text file on a network share?
Many Thanks..
View 1 Replies
View Related
Feb 3, 2008
I have a stored procedure that outputs an XML result. As it is right now, I periodically go into SQL Server Management Studio and run the procedure, then save the XML output to an XML file that my website uses. The procedure takes too long to run (it recurses through the entire data set) to have the website call it on its own and wait for the results.
My question is this: can I create a trigger that will run this stored procedure and save the results to a file? Is that possible? The trigger would call the procedure any time a specific table is updated (which will be a rare occurrence), then save the file to the path provided so the website users could always have the most up-to-date information without the long wait for the procedure to run.
Any advice on how best to accomplish this will be appreciated.
Thanks in advance.
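A minimal sketch of the trigger idea, assuming the procedure is dbo.BuildSiteXml, the watched table is dbo.SourceTable, and xp_cmdshell is enabled; note that the bcp call runs inside the triggering transaction, so the update that fires it waits for the file to be written:
CREATE TRIGGER trg_RefreshSiteXml ON dbo.SourceTable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- run the XML-producing procedure and write its output where the website reads it
    EXEC master..xp_cmdshell
        'bcp "EXEC MyDB.dbo.BuildSiteXml" queryout "C:\inetpub\wwwroot\data\site.xml" -c -T',
        no_output;
END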
View 1 Replies
View Related
Jul 23, 2005
I have a dynamic database that will be periodically queried to select the data from a blob field. This blob data field is text of a variable length. The data will be selected using an id field and a date range. There will be multiple blob fields returned that I would like to output into a txt file in a local folder. I have the blob fields showing up as text in the field and not a referring link. Can someone point me to an output-to-text solution? Thanks
View 1 Replies
View Related
Aug 13, 2014
I have the following T-SQL block that I'm executing where I want part of the output printed on the next line, but I am not sure how to do this:
This is printing everything in one line:
SET @BODYTEXT = ' Dear ' + @fname +',We realize you may have taken ' + @course_due + ' in '+ @month_last_taken +'.'
How do I do this:
SET @BODYTEXT = ' Dear ' + @fname +
',We realize you may have taken ' + @course_due + ' in '+ @month_last_taken +'.'
Also how can I create a table in this variable, something like this:
(TABLE) LIST THE COURSE CODE, COURSE NAME , EMPLOYEE ID, EMPLOYEE NAME
(Course Name) (Last Completed) (Now due in Month/year)
My T-SQL code:
DECLARE @email varchar(500)
,@intFlag INT
,@INTFLAGMAX int
,@TABLE_NAME VARCHAR(100)
,@EMP_ID INT
,@fname varchar(100)
[Code] ......
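A carriage-return/line-feed pair is what pushes text onto the next line, and since sp_send_dbmail can send HTML the course list can be built as an HTML table; a minimal sketch reusing the variables above (the dbo.CoursesDue source table is an assumption):
DECLARE @crlf char(2) = CHAR(13) + CHAR(10);
-- plain-text body: the CR/LF starts a new line
SET @BODYTEXT = ' Dear ' + @fname + ',' + @crlf
              + 'We realize you may have taken ' + @course_due
              + ' in ' + @month_last_taken + '.';
-- HTML alternative: build a table and send with @body_format = 'HTML'
DECLARE @tbl nvarchar(max) =
      N'<table border="1"><tr><th>Course Code</th><th>Course Name</th>'
    + N'<th>Employee ID</th><th>Employee Name</th></tr>'
    + CAST((SELECT td = c.Course_Code, '',
                   td = c.Course_Name, '',
                   td = c.Emp_ID, '',
                   td = c.Emp_Name, ''
            FROM dbo.CoursesDue AS c
            FOR XML PATH('tr'), TYPE) AS nvarchar(max))
    + N'</table>';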
View 2 Replies
View Related
Apr 17, 2014
I need to display the output of a table in a specific format; I have attached the sample screenshot and code for reference.
I tried with PIVOT but it does not seem to be working.
View 2 Replies
View Related
Jun 9, 2014
I have a single column returned from a select statement. How can I have this returned as a single delimited string? I looked into using PIVOT but my scenario seems too simple to use it. I'm not requiring any aggregate functions or anything.
So, I want to turn this:
SELECT CountyNames + ',' FROM Counties
---------------------
County1 ,
County2 ,
County3 ,
..etc
TO:
County1, County2, County3...etc
Please note that the list of rows is unknown. Could be 1. Could be 50.
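A minimal sketch of the usual FOR XML PATH / STUFF concatenation, using the table and column from the query above:
SELECT STUFF((SELECT ', ' + CountyNames
              FROM Counties
              ORDER BY CountyNames
              FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'),
             1, 2, '') AS CountyList;  -- County1, County2, County3 ... regardless of row count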
View 2 Replies
View Related
Jul 13, 2006
I have a nasty situation in SQL Server 7.0. I have a table in which one column contains a string-delimited list of IDs pointing to another table, called "Ratings" (Ratings is small, containing less than ten values, but is subject to change). For example:
[ratingID/descr]
1/Bronze
2/Silver
3/Gold
4/Platinum
When I record rows in my table, they look something like this:
[uniqueid/ratingIDs/etc...]
1/2, 4/...
2/null/...
3/1, 2, 3/...
My dilemma is that I can't efficiently read rows in my table, match the string of ratingIDs with the values in the Ratings table, and return that in a reasonable fashion to my jsp. My current stored procedure does the following:
1) Query my table with the specified criteria, returning ratingIDs as a column
2) Split the tokens in ratingIDs into a table
3) Join this small table with the Ratings table
4) Use a CURSOR to iterate through the rows and append it to a string
5) Return the string.
My query then returns...
1/"Silver, Platinum"
2/""
3/"Bronze, Silver, Gold"
...and is easy to output.
This is super SLOW! Queries on ~100 rows that took <1 sec now take 12 secs. Should I:
a) Create a junction table to store the IDs initially (I didn't think this would be necessary because the Ratings table has so few values)
b) Create a stored procedure that does a "SELECT * FROM Ratings," put the ratings in a hashtable/map, and match the values up in Java, since Java is better for string manipulation?
c) Search for alternate SQL syntax, although I don't believe there is anything useful for this problem pre-SQL Server 2005.
Thanks!
Adam
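For what it is worth, option (a) is the normalized route; a minimal sketch of the junction table and the lookup join (names are assumptions), with the per-row string concatenation left to the application or to the existing string-building step:
CREATE TABLE ItemRatings (
    ItemID   int NOT NULL,   -- uniqueid from the main table
    RatingID int NOT NULL,   -- points at Ratings.ratingID
    PRIMARY KEY (ItemID, RatingID)
);
-- one row per item/rating pair; no string splitting at query time
SELECT ir.ItemID, r.descr
FROM ItemRatings AS ir
INNER JOIN Ratings AS r ON r.ratingID = ir.RatingID
ORDER BY ir.ItemID, r.ratingID;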
View 2 Replies
View Related
May 6, 2015
I am getting inconsistent results when BULK INSERTING data from a tab-delimited text file. As part of my testing, I run the same code on the same file again and again, and I get different results every time! I get this on SQL 2005 and SQL 2012 R2.
We have an application that imports data from a spreadsheet. The sheet contains section headers with account numbers and detail rows with transactions by date:
AAAA.1234 /* (account number)*/
1/1/2015 $150 First Transaction
1/3/2015 $24.233 Second Transaction
BBBB.5678
1/1/2015 $350 Third Transaction
1/3/2015 $24.233 Fourth Transaction
My import program saves this spreadsheet as tab-delimited text, then I use BULK INSERT to bring the data into a generic table full of varchar(255) fields. There are about 90,000 rows in each day's data; after the BULK INSERT about half of them are removed for various reasons.
Next I add a RowID column to the table with the IDENTITY (1,1) property. This gives my raw data unique row numbers.
I then run a routine that converts and copies those records into another holding table that's a copy of the final destination table. That routine parses though the data, assigning the account number in the section header to each detail row. It ends up looking like this:
AAAA.1234 1/1/2015 $150 First Purchase
AAAA.1234 1/3/2015 $24.233 Second Purchase
BBBB.5678 1/1/2015 $350 Third Purchase
BBBB.5678 1/3/2015 $24.233 Fourth Purchase
My technique: I use a cursor to get the starting RowID for each Account Number: I then use the upper and lower RowIDs to do an INSERT into the final table. The query looks like this:
SELECT RowID, SUBSTRING(RowHeader, 6,4) + '.UBC1' AS AccountNumber
FROM GenericTable
WHERE RowHeader LIKE '____.____%'
Results look like this:
But every time I run the routine, I get different numbers!
Needless to say, my results are not accurate. I get inconsistent results EVERY TIME. Here is my code, with table, field and account names changed for business confidentiality.
TRUNCATE TABLE GenericImportTable;
ALTER TABLE GenericImportTable DROP COLUMN RowID;
BULK INSERT GenericImportTable FROM 'SERVERGeneralAppnameDataFile.2015.05.04.tab.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 6)
ALTER TABLE GenericImportTable ADD RowID int IDENTITY(1,1) NOT NULL
SELECT RowID, SUBSTRING(RowHeader, 6,4) + '.UBC1' AS AccountNumber
FROM GenericImportTable
WHERE RowHeader LIKE '____.____%'
View 3 Replies
View Related
Mar 5, 2015
I have a table which is updated daily using a MERGE statement. As records are inserted, updated and deleted, I am saving the OUTPUT from the MERGE statement into a history table with a timestamp and action$ column appended to the record.
Using this history table, I'd like to rebuild the data as of a specific past date. I was able to create a stored procedure that inspects each record in the history table and applies it to the data in a temp table. The stored procedure solution uses multiple queries to rebuild the data at a point in time. I was curious if there is an easier and more efficient solution using a table function.
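A minimal sketch of an inline table-valued function over the history table: keep the most recent version of each key as of the requested date and drop keys whose last action was a delete (the key, timestamp, action and payload column names are assumptions):
CREATE FUNCTION dbo.DataAsOf (@AsOf datetime2)
RETURNS TABLE
AS RETURN
(
    SELECT x.KeyID, x.Col1, x.Col2                  -- payload columns assumed
    FROM (
        SELECT h.KeyID, h.Col1, h.Col2, h.[Action$],
               ROW_NUMBER() OVER (PARTITION BY h.KeyID
                                  ORDER BY h.ChangeTimestamp DESC) AS rn
        FROM dbo.MergeHistory AS h
        WHERE h.ChangeTimestamp <= @AsOf
    ) AS x
    WHERE x.rn = 1
      AND x.[Action$] <> 'DELETE'                   -- latest action was not a delete
);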
View 2 Replies
View Related
Apr 19, 2014
I have written this sample query to search a full-text indexed table and return the results. If the word occurs more than once I want it to be returned as a new record, with the results showing a short summary of the location. I was using LIKE, but the full table scans were really slowing it down. Can performance be improved for the following? (The results returned by my query are accurate.)
Query
DECLARE @searchString nvarchar(255);
DECLARE @searchStringWild nvarchar(275);
SET @searchString = 'first';
SET @searchStringWild = UPPER('"' +@searchString+ '*"');
SELECT id, '...' + SUBSTRING(searchResults.MatchedCell, searchResults.StartPosition, searchResults.EndPosition - searchResults.StartPosition) + '...' as Result
[Code] ....
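A minimal sketch of swapping the wildcard LIKE for a CONTAINS predicate, which can be served by the full-text index instead of a table scan (the table name is an assumption; the variables follow the query above):
DECLARE @searchString nvarchar(255) = N'first';
DECLARE @searchStringWild nvarchar(275) = N'"' + @searchString + N'*"';
SELECT id, MatchedCell
FROM dbo.SearchTable
WHERE CONTAINS(MatchedCell, @searchStringWild);  -- prefix search against the full-text index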
View 2 Replies
View Related
Oct 7, 2015
I'm trying to replace a particular part of a value in a table with a new value.
The column is called "DataPath" and it has a lot of values like so:
\\mash\Operations\Component Data Files\Santec
I want to run a query to replace the \\mash part with our DFS namespace share name \\company\shares\Departments but keep everything past the \\mash part of the original value.
I'm currently running this query; it says it is altering 30,000 rows, but it doesn't look like it's doing anything at all...
UPDATE dbo.Part
SET DataPath = REPLACE(DataPath,'\\company.local\shares\Departments','\\mash')
WHERE DataPath like '\\mash\%'
So for example, it would change the \\mash above to
\\company.local\shares\Departments\Operations\Component Data Files\Santec
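If the intent is to swap the old \\mash prefix for the DFS path, note that REPLACE takes the value to search for as its second argument and the replacement as its third; in the statement above they appear to be reversed, which would explain 30,000 rows being "updated" with no visible change. A sketch of the corrected statement, assuming the paths above:
UPDATE dbo.Part
SET DataPath = REPLACE(DataPath, '\\mash\', '\\company.local\shares\Departments\')
WHERE DataPath LIKE '\\mash\%';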
View 2 Replies
View Related
Nov 16, 2013
I'm a newbie to SQL. I'm using SQL Server 2012 on my local machine and I need to find a way to output my queries as Excel files. I came across this code for Interactive SQL (what is Interactive SQL, by the way?) but it doesn't work in the SQL query window:
SELECT * FROM SalesOrders;
OUTPUT USING 'Driver=Microsoft Excel Driver (*.xls);
DBQ=c:\test\sales.xls;
READONLY=0' INTO "newSalesData";
It seems the "OUTPUT" command is not a valid command. I really liked that piece of code (may be because it is so simple and carries over very few lines)! Do we have something similar for SQL Server 2012 that can do the job?
View 9 Replies
View Related
Sep 26, 2014
I want to use a string of IDs passed into a stored procedure as a VARCHAR parameter ('1,2,100,1020,') in an IN clause without parsing the list into a temp table or table variable. Here's the situation: I've got a stored procedure that is called all the time. It's working with some larger tables (100+ million rows). The procedure passes in as one of the variables a list of IDs for the large table. This list can have anywhere from 1 to ~100 IDs passed to it.
Currently, we are using a function to parse the list of IDs into a temp table and then joining the temp table in the query:
CREATE PROCEDURE [dbo].[GetStuff] (
@IdList varchar(max)
)
AS
SET NOCOUNT ON
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
[Code] .....
The problem we're running into is that since this proc gets called so often, we sometimes run into tempDB contention that slows this down. In my testing (unfortunately I don't have a good way of generating a production load) swapping the #table for an @table didn't make any difference which makes sense to me given that they are both allocated in the tempDB. One approach that I tried was that since the SELECT query is pretty simple, I moved it to dynamic SQL:
CREATE PROCEDURE [dbo].[GetStuff] (
@IdList varchar(max)
)
AS
SET NOCOUNT ON
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
[Code] ....
The problem I had there is that it creates an ad hoc plan for the query and only reuses it if the same list of parameters is passed in, so I get a higher CPU cost because it compiles a plan, and it also causes the plan cache to bloat since the parameter list is almost always different. Is there an approach that I haven't considered that may get the best of both worlds, avoiding or minimizing tempdb contention but also not having to compile a new plan every time the proc is run?
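One option that sidesteps both the string parsing and the ad hoc plan bloat is a table-valued parameter: the caller fills a table type with the IDs instead of building a delimited string, and the proc keeps a single cached plan. A minimal sketch (type, table and column names are assumptions):
CREATE TYPE dbo.IdList AS TABLE (Id int NOT NULL PRIMARY KEY);
GO
CREATE PROCEDURE dbo.GetStuff
    @Ids dbo.IdList READONLY
AS
SET NOCOUNT ON;
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT b.*
FROM dbo.BigTable AS b               -- the 100M+ row table
INNER JOIN @Ids AS i ON i.Id = b.Id;
GO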
View 9 Replies
View Related
Nov 5, 2015
I am downloading a webpage as a text file in order to read a specific string and assign it as a variable/parameter to build an output file name. I would like to know how I would be able to look for a specific string and output it as another variable for the rest of the package.
2015 Conforming Loan Limits
------------------------------------------------------------------------
o _Loan Limits for Calendar Year 2015--All Counties _[XLS]
</DataTools/Downloads/Documents/Conforming-Loan-Limits/FullCountyLoanLimitList2015_HERA-BASED_FINAL_FLAT.xlsx>_ ,
_[PDF]
</DataTools/Downloads/Documents/Conforming-Loan-Limits/FullCountyLoanLimitList2015_HERA-BASED_FINAL.pdf>_
o _List of 46 Counties with Increases in Loan Limits for 2015
[Code] ...
To explain it a better way, I have a sample of the webpage text here. I should be searching for "FullCountyLoanLimitList" appended with the current year (like FullCountyLoanLimitList2015), copy the entire file name from the text file, and assign it to another variable so that I can download that specific file using a WebClient connection.
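A minimal sketch of pulling that name out of the downloaded text with CHARINDEX, assuming the page text is already in a variable and the wanted name ends at its .xlsx extension:
DECLARE @page nvarchar(max) = N'...downloaded page text...';  -- loaded earlier in the package
DECLARE @token nvarchar(100) = N'FullCountyLoanLimitList' + CAST(YEAR(GETDATE()) AS nvarchar(4));
DECLARE @start int = CHARINDEX(@token, @page);
DECLARE @fileName nvarchar(400);
IF @start > 0
    SET @fileName = SUBSTRING(@page, @start,
                              CHARINDEX(N'.xlsx', @page, @start) + 5 - @start);
SELECT @fileName AS ExtractedFileName;  -- e.g. FullCountyLoanLimitList2015_HERA-BASED_FINAL_FLAT.xlsx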
View 4 Replies
View Related
May 16, 2008
Hey,
I'm looking for a good way to rewrite my T-SQL code that uses 'bcp' as an SSIS package; any help would be greatly appreciated.
I have table1 containing account numbers and an output filename for each account.
I need to join table2 and table3 to get data for each account in table1, and then export it as a txt file.
Here is my code using bcp (bulk copy)
DECLARE @RowCnt int,
@TotalRows int,
@AccountNumber char(11),
@sql varchar(8000),
@date char(10),
@ArchPath varchar(500),
@output_filename varchar(255) -- per-account output file name from table1
SET @RowCnt = 1
SET @date = CONVERT(CHAR(10),GETDATE(),110)
SET @ArchPath = '\D$EDATAWorkFoldersSendSendData'
SELECT @TotalRows = count(*) FROM table1
--select @ArchPath
WHILE (@RowCnt <= @TotalRows)
BEGIN
SELECT @AccountNumber = AccountNumber, @output_filename = Output_Filename FROM table1 WHERE Identity_Number = @RowCnt -- Output_Filename column name is assumed
--PRINT @AccountNumber --test
SELECT @sql = N'bcp "SELECT h.HeaderText, d.RECORD FROM table2 d INNER JOIN table3 h ON d.HeaderID = h.HeaderID WHERE d.AccountNumber = '''
+ @AccountNumber+'''" queryout "'+@ArchPath+ @output_filename + '.txt" -T -c'
--PRINT @sql
EXEC master..xp_cmdshell @sql
SELECT @RowCnt = @RowCnt + 1
END
View 7 Replies
View Related
Jul 8, 2015
I would like to know if it is possible to date stamp the output file for the DBCC integrity check?
The following cmd works fine; however, I have tried different ways to date stamp the output file with no luck.
sqlcmd -U backup -P Password -S Server -Q"DBCC CHECKDB('DatabaseName') WITH ALL_ERRORMSGS" -o"G:\LOGS\DBCCResults.txt"
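One hedged way to stamp the name is to build the sqlcmd line in T-SQL, so GETDATE() can be concatenated in, and run it through xp_cmdshell, reusing the command above:
DECLARE @cmd varchar(1000);
SET @cmd = 'sqlcmd -U backup -P Password -S Server '
         + '-Q"DBCC CHECKDB(''DatabaseName'') WITH ALL_ERRORMSGS" '
         + '-o"G:\LOGS\DBCCResults_' + CONVERT(char(8), GETDATE(), 112) + '.txt"';
EXEC master..xp_cmdshell @cmd;  -- writes e.g. G:\LOGS\DBCCResults_20150605.txt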
View 5 Replies
View Related
Jun 5, 2014
I have a process to rollover prior quarter data to new quarter in a table.
For example, i have a table with (col1, col2, year, qtr) with data like ( Note: col1 is identity(1,1) )
1,'today',2014,1
2,'tomorrow',2014,1
3,'friday',2014,1
Now when i run my process, above 3 records will be rolled over new quarter 2014 Q2 and the table will be like
1,'today',2014,1
2,'tomorrow',2014,1
3,'friday',2014,1
4,'today',2014,2
5,'tomorrow',2014,2
6,'friday',2014,2
Row 1 with identity 1 has rolled over to new quarter row 4 with identity 4 ( qtr fields are changed )
Row 2 with identity 2 has rolled over to new quarter row 5 with identity 5. Same with last row as well.
Here, I have another table called "ident_map" with columns like (old_identity, new_identity), and during rollover I am supposed to load the ident_map table with the old and new identity. So after rollover is complete, ident_map should look like
1,4
2,5
3,6
I know that using the OUTPUT clause I can capture the new identity values (4, 5, 6 in this case). But is there any way to capture both the old identity and the new identity during rollover so that I can load the ident_map table with the old and new identity?
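The usual trick is to do the rollover with a MERGE whose ON predicate never matches, because MERGE's OUTPUT clause (unlike INSERT's) can reference the source columns alongside the newly inserted identity values; a minimal sketch using the columns above (the table name is an assumption):
MERGE dbo.QtrData AS tgt
USING (SELECT col1, col2, [year], qtr
       FROM dbo.QtrData
       WHERE [year] = 2014 AND qtr = 1) AS src
ON 1 = 0                                    -- never matches, so every source row is inserted
WHEN NOT MATCHED THEN
    INSERT (col2, [year], qtr)
    VALUES (src.col2, 2014, 2)
OUTPUT src.col1, inserted.col1 INTO ident_map (old_identity, new_identity);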
View 9 Replies
View Related
Mar 2, 2014
I am trying to create an SSIS package with a dynamic CSV file as output, where the file name contains query output.
sample file name:
Unique identifier + query output + systemdate();
The expression is looking like this.
@[User::FilePath] + @[User::FileName] + ".CSV"
-- User::FilePath is a variable from the SSIS package. The file name is the output from a SQL query; using a script task I have assigned the value to @[User::FileName].
When I debug the script task the value is assigned properly, but when I use the same variable for the flat file destination it does not work.
View 3 Replies
View Related
May 28, 2015
I have this table:
with actividades_secundarias as (
select a.*, r.Antigo, r.Novo, rn = row_number()
over (PARTITION BY a.nif_antigo, r.novo ORDER BY a.nif_antigo)
from ACT_SECUNDARIAS a inner join
((select
[Code] ....
I want to make a delete statement like this:
select * into #table1 from actividades_secundarias where rn>1
Delete from act_secundarias where act_secundarias.nif_antigo = #table1.nif_antigo and act_secundarias.cod_cae = #table1.cod_cae
But it seems that I can't delete like this.
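T-SQL cannot correlate a second table in a plain DELETE ... WHERE like that; the temp table has to be joined in, and the SELECT INTO must directly follow the CTE definition. A minimal sketch:
SELECT * INTO #table1 FROM actividades_secundarias WHERE rn > 1;
DELETE a
FROM act_secundarias AS a
INNER JOIN #table1 AS t
    ON  a.nif_antigo = t.nif_antigo
    AND a.cod_cae = t.cod_cae;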
View 5 Replies
View Related
Aug 31, 2015
I have a FileTable as part of an application that a client uses. In general the response of the SQL Server has been great. However, they have started to make use of a part of the application that handles images. The application has a FileTable set up. However, there is a Stored Procedure that retrieves the path of the requested file. It is taking the Server over two seconds to return a single file path.
USE [BMIPictureDB]
GO
/****** Object: StoredProcedure [dbo].[GetFileTablePath] Script Date: 8/31/2015 6:07:37 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[GetFileTablePath]
[Code] ....
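For reference, the full UNC path of a FileTable row can be returned directly by the file_stream column's GetFileNamespacePath method; a minimal sketch of what such a procedure can reduce to (the FileTable name and parameter are assumptions), seeking on the stream_id key:
ALTER PROCEDURE [dbo].[GetFileTablePath]
    @stream_id uniqueidentifier
AS
SET NOCOUNT ON;
SELECT file_stream.GetFileNamespacePath(1) AS FullUncPath  -- 1 = include the FileTable root in the path
FROM dbo.Pictures
WHERE stream_id = @stream_id;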
View 2 Replies
View Related
Aug 31, 2015
I have been tasked with creating a data table and stored procedure to extract a specially formatted XML file that arrives as an attachment within a standard XML envelope. The XML file is an attachment in a node within the XML wrapper. There are other MIME files (PDFs) that are handled by a separate procedure, but I need to extract just the XML file attached along with those and put it into the data table with some other PK/FK fields.
Is a blob the best data type? How do I insert that XML file into it?
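If the attachment is well-formed XML, the native xml data type (or varbinary(max) if it must be kept byte-for-byte) is a reasonable target, and OPENROWSET ... SINGLE_BLOB can load it from disk; a minimal sketch with assumed names:
CREATE TABLE dbo.XmlAttachments (
    AttachmentID int IDENTITY(1,1) PRIMARY KEY,
    ParentID     int NOT NULL,          -- FK back to the envelope record (assumed)
    Payload      xml NOT NULL
);
INSERT INTO dbo.XmlAttachments (ParentID, Payload)
SELECT 42, CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'C:\Inbound\attachment.xml', SINGLE_BLOB) AS src;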
View 2 Replies
View Related
Mar 20, 2015
I'd like to know how to get the output of the Windows events script below into a SQL Server table.
Get-WinEvent -LogName application -MaxEvents 200 | where {$_.LevelDisplayName -eq "Error"}
View 1 Replies
View Related
May 2, 2014
I've been trying to access the file table shares exposed on a server (I can select, delete, update, etc via SQL) via untransacted access with no luck (the share just sits there doing something when I try to access it, no error message of any kind).
So I thought I'd look to see if there were any open file handles via
SELECT * FROM sys.dm_filestream_non_transacted_handles;
I found there were 10 non transacted handles open, and they've been open for 3 days for automated processes and the files are about 1-2Kb in size.
so I tried to kill them via
exec sp_kill_filestream_non_transacted_handles
that's normally not something that takes very long (in fact when I cleared two for a different database it took just a second or two).
So leaving the stored proc running, I figured I'd connect and fire up Adam Machanic's sp_WhoIsActive.
I see the processes to kill the open file handles, with a wait state of FFT_NSO_FILEOBJECT
View 2 Replies
View Related
Jun 6, 2015
How do I import a text file with a list of NI numbers into a new table with a column listing all the NI numbers? I think I use the SELECT INTO clause, but I am not sure how to do this.
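SELECT INTO creates a table from a query rather than from a file; a minimal sketch of doing it with CREATE TABLE plus BULK INSERT instead (the file path and column size are assumptions):
CREATE TABLE dbo.NINumbers (NINumber varchar(13) NOT NULL);
BULK INSERT dbo.NINumbers
FROM 'C:\Import\ni_numbers.txt'
WITH (ROWTERMINATOR = '\n');   -- one NI number per line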
View 1 Replies
View Related
Apr 15, 2014
To get the results of a stored proc into a table you always had to know the structure of the output and create the table upfront.
CREATE TABLE #Tmp (
[Key] INT NOT NULL, -- Key is a reserved word, so bracketed
Data varchar(100) ); -- length assumed
INSERT INTO #Tmp
EXEC dbo.MyProc
Has SQL 2012 (or SQL 2014) got any features or new tricks to let me discover the table structure of a stored procedure's output, i.e. treat it as a table?
EXEC dbo.MyProc
INTO #NewTmp
or
SELECT *
INTO #NewTmp
FROM ( EXEC dbo.MyProc )
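SQL Server 2012 added sp_describe_first_result_set (and WITH RESULT SETS) to inspect a procedure's output shape, and the old OPENROWSET loopback trick can still build the temp table on the fly; both are sketched below, the second assuming ad hoc distributed queries are enabled:
-- describe the columns of the procedure's first result set
EXEC sp_describe_first_result_set @tsql = N'EXEC dbo.MyProc';
-- or materialise it without declaring the table up front (loopback connection)
SELECT *
INTO #NewTmp
FROM OPENROWSET('SQLNCLI',
                'Server=(local);Trusted_Connection=yes;',
                'EXEC MyDB.dbo.MyProc') AS r;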
View 9 Replies
View Related
Mar 20, 2014
We have some URLs within a bulk block of text, some of which are very long. I need to identify rows where such URLs exceed, say, 100 characters in length in amongst other text. So the rule would be: return a record if within the string there is a run of characters (without spaces) longer than 100 characters.
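A minimal sketch using a LIKE pattern of 100 consecutive non-space characters built with REPLICATE (table and column names are assumptions; tab and line-break characters could be added to the character class if needed):
SELECT d.ID, d.BodyText
FROM dbo.Documents AS d
WHERE d.BodyText LIKE '%' + REPLICATE('[^ ]', 100) + '%';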
View 9 Replies
View Related
Aug 25, 2015
If I use msdb..sp_send_dbmail or save query results as text (using sqlcmd) and include the column headers I get the dashed separator line.
e.g.
custID, name
------, ------
1,bob
2,james
I would like this:
custID, name
1, bob
2 ,james
I found this method [URL] ....
::
sqlcmd -E -S (local) -d myDB -W -w 1024 -s "," -i "SELECT * FROM tblCust" | findstr /V /C:- /B > C:\temp.csv
Can the same result be achieved sending as attachment with dbmail?
EXEC msdb..sp_send_dbmail @attach_query_result_as_file = 1
I don't want to have to add column names as part of the query
Change the query to return column headers in resultset
SELECT 'CustID' as f1, 'name' as f2
UNION ALL
SELECT CAST(CustID as Varchar(10)), name FROM tblCust
and set
msdb..sp_send_dbmail @query_result_header = 0
View 0 Replies
View Related
Jul 22, 2013
Overall goal: Write a Bulk Insert statement using the UNC path of a filetable directory.
Issue: When using the UNC path of the filetable directory in a Bulk Insert Statement, receiving "Operating system error code 50(The request is not supported.)" Looking for confirmation as to whether this is truly not supported.
Environment: SQL Server 2012 Standard. Windows Server 2008 R2 Standard
View 9 Replies
View Related
May 29, 2015
I am looking for a way to convert the following format into a SQL table. The format is BibTeX.
Essentially there would be a new row in the table for each entry, denoted by an @ sign, and each column is denoted by an =. As you can see from the example data, no one entry contains all the possible columns and some fields can run over two lines.
To load this I was considering loading it into a table with each line as a row, adding a row number, then a column counting the @ signs in order to essentially group each record; then, for each group, running through and looking for the column keywords 'author', 'title', etc., and splitting the data out into its constituent parts using SUBSTRING and CHARINDEX.
@Book{hicks2001,
author = "von Hicks, III, Michael",
title = "Design of a Carbon Fiber Composite Grid Structure for the GLAST
Spacecraft Using a Novel Manufacturing Technique",
publisher = "Stanford Press",
year = 2001,
[Code] ....
View 9 Replies
View Related