SQL 2012 :: How To Do Selective Read Of File Stored In File Table
Jul 2, 2015
I have a filetable that contains a binary file. I need to do a selective read of the file stored in the file table. I can write a C# CLR function that will open the file and read n bytes from a starting byte. Or I can write a SQL statement that reads the stream in the filetable into a VARBINARY variable using SUBSTRING, beginning at the starting byte (offset from 1) for the same n bytes.
Both give me the same result. However, the SQL statement takes considerably longer to read. I know there is overhead in reading through SQL (an interpreted language), but the difference in performance is substantial, and I can only attribute this performance degradation to SQL first trying to "load" the entire stream before it identifies the portion of the stream it needs to read beginning at the starting byte offset.
I wonder if this is the case or if there is another option to read a stream from a filetable directly through SQL queries that is more efficient.
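For reference, a minimal sketch of the SQL-side read I mean, where the FileTable name, file name, and offsets are all illustrative, not my real ones:

DECLARE @start BIGINT = 1048576;  -- starting byte, 1-based offset (illustrative)
DECLARE @n     INT    = 4096;     -- number of bytes to read (illustrative)

SELECT SUBSTRING(file_stream, @start, @n) AS Chunk
FROM dbo.Documents
WHERE name = N'MyFile.bin';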
I have a test server (TEST1) running SQL 2012 and Windows 2012R2. One of the developers wants to use OPENROWSET to read in data from an Excel 2010 file (an xlsx file).
I have installed the Microsoft drivers from "AccessDatabaseEngine_64.exe" and enabled the Ad Hoc Distributed Queries option in SQL.
This is the sample code we are working with:
SELECT X.MEMBID
FROM OPENROWSET(
    'MSDASQL',
    'Driver={Microsoft Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)};DBQ=\\networkserver\Folder1\Folder2\MEMBIDs.xlsx',
    'SELECT * FROM [Sheet1$]'
) AS X
I can run the sample query from my laptop with SSMS (I have admin rights) and I can also run it as SA from my laptop. So all is good, right?
But if I RDC into TEST1, I cannot run the query. I get this error:
OLE DB provider "MSDASQL" for linked server "(null)" returned message "[Microsoft][ODBC Excel Driver] Your network access was interrupted. To continue, close the database, and then open it again.". OLE DB provider "MSDASQL" for linked server "(null)" returned message "[Microsoft][ODBC Excel Driver]General error Unable to open registry key Temporary (volatile) Ace DSN for process 0x574 Thread 0xb74 DBC 0x1d07f08 Excel'.". OLE DB provider "MSDASQL" for linked server "(null)" returned message "[Microsoft][ODBC Excel Driver]General error Unable to open registry key Temporary (volatile) Ace DSN for process 0x574 Thread 0xb74 DBC 0x1d07f08 Excel'.". Msg 7303, Level 16, State 1, Line 1 Cannot initialize the data source object of OLE DB provider "MSDASQL" for linked server "(null)".But wait! It gets better.
I can run the query as SA from TEST1.
And of course, the developer can't run it either.
And it works fine in the production server.
I'm thinking the basics are there, but something isn't right with a permission somewhere.
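One thing worth trying (just a sketch, not a confirmed fix): going through the ACE OLE DB provider directly instead of MSDASQL/ODBC, which sidesteps the volatile ODBC DSN the error complains about. The UNC path below is illustrative:

SELECT X.MEMBID
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=\\networkserver\Folder1\Folder2\MEMBIDs.xlsx;HDR=YES',
    'SELECT * FROM [Sheet1$]'
) AS X;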
I'm reading a flat file. The first record contains special info that is needed to prepare the database before a bulk insert of the remaining lines.
How could I realize this: just read one line, store some of that data into variables, execute some procs, and then read the rest of the file?
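A minimal sketch of one way to do this in T-SQL, where the file path and target table are placeholders: read only the first record with LASTROW = 1, then bulk insert the rest starting at FIRSTROW = 2.

CREATE TABLE #Header (HeaderLine VARCHAR(8000));

BULK INSERT #Header
FROM 'C:\Data\input.txt'
WITH (ROWTERMINATOR = '\n', LASTROW = 1);   -- read only the first record

-- parse #Header into variables and execute the preparation procs here

BULK INSERT dbo.TargetTable
FROM 'C:\Data\input.txt'
WITH (ROWTERMINATOR = '\n', FIRSTROW = 2);  -- skip the special first record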
I would like to store and read a .doc file in a table of SQL Server using C#. Can somebody help me on this? My id is padam76@gmail.com. Thanks & Regards, Padam Kumar Tripathi
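A minimal sketch of the storage side in T-SQL, where the table, column, and path names are assumptions; from C#, the application would then insert the file bytes through a SqlParameter of type VarBinary and read them back as a byte[] from a SqlDataReader.

CREATE TABLE dbo.WordDocs (
    Id      INT IDENTITY PRIMARY KEY,
    DocName NVARCHAR(260),
    DocData VARBINARY(MAX)
);

-- server-side load for testing; the application would insert via a parameter
INSERT INTO dbo.WordDocs (DocName, DocData)
SELECT N'sample.doc', BulkColumn
FROM OPENROWSET(BULK 'C:\Files\sample.doc', SINGLE_BLOB) AS f;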
I was wondering how I could pass on the following parameters from an ini file to a stored procedure within a DTS package. The parameters in the ini file look like:
[DatabaseCleaner]
! -- TableToBeCleaned_N=<table name>,<months to hold on db>,<months to hold on hd>
! -- <N> must be a successive number starting from 1
...
TableToBeCleaned_1=Transactions,24,24
TableToBeCleaned_2=Payments,24,24
TableToBeCleaned_3=PresenceTickets,24,24
As I do not know how many tables will be declared in the ini file, I have to loop through until the last parameter and pass each one over to the SP.
Hello Experts, I am creating one task (user control) in SSIS. I have a property grid in my GUI and 2 buttons (OK & Cancel). The PropertyGrid has properties like SourceConnection, OutputConnection, etc. Right now I am able to populate connections in the list box next to the Source and Output properties.
Now my question to you guys is: depending on the Source Connection, it should read the text file associated with that connection manager. After validation it should pick the header (first line of the text file, based on record type) and write it into a new file when the task is executed. I have the following code for your reference. Please let me know whether I am going in the right direction or not. What should go here, under Class A?
Is there any easy way for a batch file or automated process to read from one db and table and, for whatever entries are missing from another database and table, write those missing entries to it?
This is a simple table in one db that is filled with usernames. I want to see if there are usernames missing in another db and table, and write those entries.
db1.usr_table.usr_name = jdoenew

If jdoenew is missing in the 2nd db I will need to write entries like:

db1.usr_table.usr_name = jdoenew
db1.usr_table.password = tmppassword
db1.usr_table.active = 1
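A minimal T-SQL sketch, assuming both databases live on the same instance and use the dbo schema (db2 and the literal password below are placeholders):

INSERT INTO db2.dbo.usr_table (usr_name, password, active)
SELECT s.usr_name, 'tmppassword', 1
FROM db1.dbo.usr_table AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM db2.dbo.usr_table AS t
    WHERE t.usr_name = s.usr_name
);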
Hi everybody, in my project I want to read data from a CSV file and insert it into a SQL Server database table. The CSV file may contain n number of columns, but I want only certain columns from it inserted into the database table. How can I achieve this? Please help me, it is urgent. Thanks in advance. Thanks and regards, Biju S. G.
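A hedged sketch of the staging approach, where the file path and all table/column names are placeholders: bulk load the whole CSV into a staging table, then insert only the columns you need.

CREATE TABLE #StageCsv (
    Col1 VARCHAR(200), Col2 VARCHAR(200),
    Col3 VARCHAR(200), Col4 VARCHAR(200)
);

BULK INSERT #StageCsv
FROM 'C:\Data\input.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);  -- assumes a header row

INSERT INTO dbo.Target (ColA, ColB)
SELECT Col1, Col3        -- keep only the columns of interest
FROM #StageCsv;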
INPUT: A table contains 2 columns with the data I need: column A = FileName and column B = FileContent.
PROCESS: I need to loop through each record in the table and retrieve columns A and B. Then, for each record, I need to write the content held in column B into the file named in column A.
So far I have found out that I need an Execute SQL Task in the Control Flow querying the table to get columns A and B into 2 variables, plus a 3rd variable holding the result-set object. Then the output goes into a Foreach Loop container. From this point I don't know how to continue. I tried to put a Data Flow Task inside the Foreach Loop, but couldn't find out how to get the 2 variables into the Data Flow Task and use them for the file to be written and the content to be placed in the file.
Is there any similar example, so I could learn how to start on that?
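Sketching what I understand so far (names are illustrative): the Execute SQL Task would run something like the query below, with its full result set mapped to the Object variable, and the Foreach Loop would then use the Foreach ADO enumerator over that variable to map each row's two columns into the FileName and FileContent variables.

SELECT FileName, FileContent
FROM dbo.FileExport;   -- full result set goes into the Object variable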
I've a dataflow task in a For Each Loop container in the control flow of an SSIS package. This For Each Loop container reads the CSV files from the specified location one by one and populates a variable with the current file name. Note, the tables where I would like to push the data from each CSV file also have the same names as the CSV file names. On the dataflow task, I've a Flat File component as a source; this component uses the above variable to read the data of a particular file.
Now here my question comes: how can I move the data to the destination SQL table using the same variable name? I've tried to set up the OLE DB destination component dynamically, but it executes well only the first time. It does not change the mappings as per the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns. These files need to be migrated to SQL tables in the optimum way.
Which is the best way to set up the Dataflow task for this requirement? Also, I cannot use the Bulk Insert task here, as we would like to keep a log of corrupted rows.
It would be a stored procedure that would read the XML attributes into 2 tables; the number of attributes could be 1-N, so I thought XML would be a good choice. Also, one of the attributes could be up to 4000 characters. I think this may limit our options: can 100-150 strings of 4000 characters each be passed in a standard call to a query/proc in SQL?
Currently the client application makes round-trip network calls to save upwards of 100 pairs of data: 1 header row and many detail rows, all within a transaction.
I think if we move an XML file to the SQL box, then do all the import/save work on the "server" side, it would be much better, cutting the transaction time down a lot by not doing so many round-trips at network speed...
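A hedged sketch of the server-side proc, where the element and attribute names are invented since no schema is settled yet; an XML parameter whose attribute values run to 4000 characters is not a problem for SQL Server.

CREATE PROCEDURE dbo.SaveHeaderAndDetails
    @Payload XML
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.Detail (Attr1, BigText)
    SELECT
        r.value('@attr1',   'VARCHAR(100)'),
        r.value('@bigText', 'VARCHAR(4000)')   -- a 4000-character attribute is fine
    FROM @Payload.nodes('/rows/row') AS t(r);
END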
OBJECTIVE: I would like to read a text file from SQL Server 2000, read the text file content, and load its contents into a RichTextBox.

THINGS I'VE DONE AND HAVE WORKING:

1) I've successfully loaded a text file (ex: textFile.txt) into a SQL Server database table column (with datatype Image).

2) I've also been able to load the file using a Handler as below:

using System;
using System.Web;
using System.Data.SqlClient;

public class HandlerImage : IHttpHandler
{
    string connectionString;

    public void ProcessRequest(HttpContext context)
    {
        connectionString = System.Configuration.ConfigurationManager.ConnectionStrings["NWS_ScheduleSQL2000"].ConnectionString;
        int ImageID = Convert.ToInt32(context.Request.QueryString["id"]);
        SqlConnection myConnection = new SqlConnection(connectionString);
        string Command = "SELECT [Image], Image_Type FROM Images WHERE Image_Id=@Image_Id";
        SqlCommand cmd = new SqlCommand(Command, myConnection);
        cmd.Parameters.Add("@Image_Id", System.Data.SqlDbType.Int).Value = ImageID;
        SqlDataReader dr;
        myConnection.Open();
        cmd.Prepare();
        dr = cmd.ExecuteReader();
        if (dr.Read())
        {
            // write the stored bytes to the browser with the stored content type
            context.Response.ContentType = dr["Image_Type"].ToString();
            context.Response.BinaryWrite((byte[])dr["Image"]);
        }
        myConnection.Close();
    }

    public bool IsReusable
    {
        get { return false; }
    }
}

<a href='<%# "HandlerDocument.ashx?id=" + Eval("Doc_ID") %>'>File</a>

Clicking on this link, I'll be able to download or view the file.

WHAT I WANT TO DO, BUT HAVE A PROBLEM WITH: I would like to be able to read the CONTENT of this file and load it into a string, as below:

StreamReader SR = File.OpenText("File.txt");
string contentText = SR.ReadLine();
txtBox.Text = contentText;

BUT THIS ONLY WORKS FOR files on the server. I would like to be able to read FILE CONTENTS from SQL Server. PLEASE HELP. I really appreciate it.
I need to write a process to get the file size in KB and the record count of a file. I was planning on writing a C# console app that takes the file path and name as parameters; however, should I use a CLR function instead?
I can't put a script in the SSIS package that brings the file down, because it has been deemed that we only use SSIS for file consumption.
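Before committing to a CLR or console app, a possible pure T-SQL sketch (the path is a placeholder; it assumes the file fits comfortably in memory and that rows end with LF):

SELECT
    DATALENGTH(BulkColumn) / 1024.0 AS SizeKB,
    -- count newlines to get the record count (LEN ignores trailing spaces,
    -- which is negligible here)
    LEN(BulkColumn) - LEN(REPLACE(BulkColumn, CHAR(10), '')) + 1 AS RecordCount
FROM OPENROWSET(BULK 'C:\Data\input.txt', SINGLE_CLOB) AS f;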
I need to find out the number of columns in a flat file before I process that particular file. I have the file name in the @filename variable and the file path in the @filepath variable. But I do not know how to check the column names before I process that file.
@filePath = C:\Database\SourceFiles\CAHCVS\SourceFiles, and I am using a For Each Loop container to read the files one by one and put the file name in the @filename variable, and my file name is like ...
Now what I have to do is make sure that ID, Name, City, County, Phone are there in the flat file. If they are not there, then I have to send a mail to the client saying that the file is not valid. I also need to calculate the size of the flat file.
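A hedged sketch of the header check (the path is a placeholder; it assumes a comma-delimited header on line 1):

DECLARE @firstLine VARCHAR(4000);

-- grab everything up to the first line feed
SELECT @firstLine = LEFT(BulkColumn, CHARINDEX(CHAR(10), BulkColumn + CHAR(10)) - 1)
FROM OPENROWSET(BULK 'C:\Database\SourceFiles\somefile.csv', SINGLE_CLOB) AS f;

IF REPLACE(@firstLine, CHAR(13), '') <> 'ID,Name,City,County,Phone'
    PRINT 'Invalid file -- trigger the notification mail here';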
The TEMPDB transaction log file keeps growing. The database server is new, and the transaction log was presized to 1 GB on installation. After installing a number of databases, the log file grew over a day to 38 GB. Issuing a manual checkpoint was the only way to free some space to allow it to be shrunk back to a usable size. The usage of the file is still going up.
I am struggling to find what process is causing the log to be used so heavily. Looking at log_reuse_wait_desc for tempdb returns "NOTHING", and tempdb itself isn't being used very much or growing in size.
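For reference, the checks described above as a minimal sketch, plus a query that may point at the sessions currently writing to the tempdb log:

DBCC SQLPERF(LOGSPACE);   -- log size and percent used, per database

SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'tempdb';

-- open transactions and how much tempdb log each has generated
SELECT st.session_id, dt.database_transaction_log_bytes_used
FROM sys.dm_tran_database_transactions AS dt
JOIN sys.dm_tran_session_transactions AS st
    ON dt.transaction_id = st.transaction_id
WHERE dt.database_id = 2;   -- tempdb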
I have the Excel Connection Manager and Source to read the contents from an Excel file. For some reason, a couple of numeric fields from the Excel worksheet are brought over as NULLs, even though they have values of 300 and 150. I am not sure why this is happening. I looked into the format of the fields and they are set to General in Excel; I tried setting them to Numeric and that did not help.
All the other content from the Excel file is coming through except for the 2 numeric fields.
I tried to bring the contents from the Excel source to a text file in CSV format, and for some reason the 2 numeric fields came out as blank.
Any inputs on getting this addressed will be much appreciated.
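One known behavior that may explain this (an assumption about this particular workbook, not something confirmed by the post): the ACE/Jet driver guesses each column's type from the first several rows, and values that don't match the guessed type come through as NULL. Adding IMEX=1 to the connection's Extended Properties forces mixed-type columns to be read as text, e.g.:

Extended Properties="Excel 12.0;HDR=YES;IMEX=1"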
I want to import an XML file into a table. The XML file is on another network location than the database server I'm connected to, and I have access to this location with my Windows credentials.
I connect from my local PC to the Database server “RS1” with my Windows Credentials.
There I run this script
CREATE TABLE XMLwithOpenXML
(
    Id INT IDENTITY PRIMARY KEY,
    XMLData XML,
    LoadedDateTime DATETIME
);

INSERT INTO XMLwithOpenXML (XMLData, LoadedDateTime)
SELECT CONVERT(XML, BulkColumn) AS BulkColumn, GETDATE()
FROM OPENROWSET(BULK '<Networkname><MAP>Name_.xml', SINGLE_BLOB) AS x;
I get the error:
Msg 4861, Level 16, State 1, Line 10 Cannot bulk load because the file "<Networkname><MAP>Name_.xml'" could not be opened. Operating system error code 5(Access is denied.).
(<Networkname><MAP>Name_.xml' is not the real name )
So it looks like OPENROWSET connects to the network location with other credentials (presumably the SQL Server service account rather than my Windows login).
Where can I find (and change) the credentials that are used to connect to the network location?
I found this article on MSDN (Security Considerations) but cannot find the solution there. [URL]
Why does SHRINKFILE with EMPTYFILE not redistribute data evenly in the primary filegroup with multiple files?
Please run the script attached to see what the end result is.
This is what I set up last night on my test machine.
1) Create database [FGTest], size 200MB.
2) Create a table called TEST on PRIMARY.
3) Insert 40MB of data into TEST.
4) Create another file called temp in PRIMARY, size 200MB.
5) Shrinkfile('FGTest', EMPTYFILE) so that all data is transferred from FGTest into the temp file.
6) Add another 2 files called DATA2 and DATA3, both 200MB.
7) We now have 3 empty files that I want the data distributed evenly on: FGTest, DATA2 & DATA3.
8) Shrinkfile('temp', EMPTYFILE) to move all the data from temp over the 3 files evenly (steps 5, 6 and 8 are sketched as T-SQL below).
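For clarity, steps 5, 6 and 8 as T-SQL; the physical paths are illustrative, the logical file names follow the list above:

USE FGTest;
DBCC SHRINKFILE (N'FGTest', EMPTYFILE);   -- step 5: move all data into temp

ALTER DATABASE FGTest
    ADD FILE (NAME = N'DATA2', FILENAME = N'C:\Data\DATA2.ndf', SIZE = 200MB);
ALTER DATABASE FGTest
    ADD FILE (NAME = N'DATA3', FILENAME = N'C:\Data\DATA3.ndf', SIZE = 200MB);

DBCC SHRINKFILE (N'temp', EMPTYFILE);     -- step 8: spread data across the 3 files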
I would expect at this stage to have the following:
FGTest = 13MB, DATA2 = 13MB, DATA3 = 13MB
(40MB of data over 3 files should be about 13 MBish in each file)
What I actually end up with is this:
FGTest = 20MB, DATA2 = 10MB, DATA3 = 10MB
It looks as though SQL Server is allocating 50% of all data to the original file and then 50% evenly over the remaining files in PRIMARY.
We have a large 'History' database that is currently about 4.5TB, with most of that in a datafile that is 4.2TB. We wanted to stop growth on the one large data file and have SQL Server allocate new data to the other data files, but this throws an error when we attempt to change the MAXSIZE settings:
ALTER failed for Database 'History' MODIFY FILE failed. Specified size is less than or equal to current size.
The SQL Server is saying we can have a max size of 2TB, and anything over that is blocked. Since this is being blocked, the file continues to grow.
Is there any way to cap the growth of the 4.2TB file and not allow any more data to be written to it?
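A possible sketch (the logical file name is a placeholder): rather than lowering MAXSIZE below the current size, set the file's growth increment to 0 so no new space is ever allocated to it; new data then goes to the other files in the filegroup with free space.

ALTER DATABASE History
MODIFY FILE (NAME = N'History_Data1', FILEGROWTH = 0);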
I have a FileTable as part of an application that a client uses. In general the response of the SQL Server has been great. However, they have started to make use of a part of the application that handles images. The application has a FileTable set up. However, there is a Stored Procedure that retrieves the path of the requested file. It is taking the Server over two seconds to return a single file path.
USE [BMIPictureDB]
GO
/****** Object: StoredProcedure [dbo].[GetFileTablePath] Script Date: 8/31/2015 6:07:37 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[GetFileTablePath]
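Since the proc body is cut off above, here is a hedged sketch of the kind of lookup I assume it performs (the FileTable name and filter column are assumptions):

-- inside the proc, given a @FileName parameter
SELECT f.file_stream.GetFileNamespacePath(1) AS FullPath   -- 1 = full UNC path
FROM dbo.Pictures AS f
WHERE f.name = @FileName;

If the lookup filters on name like this, note that FileTables do not index the name column by default, so a table scan could account for the two-second response.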
I need to add an existing shared folder to a SQL FileTable. So this is the path; I created a SQL FileTable and now I need to add the folder's contents to it.
I have been tasked to create a data table and stored procedure to extract a specially formatted XML file that is an attachment within a standard XML envelope. The XML file is an attachment in a node within the XML wrapper. There are other MIME files (PDFs) that are handled by a separate procedure, but I need to extract just the attached XML file along with those and put it into the data table with some other PK/FK fields.
Is a blob the best datatype? How do I insert that XML file into it?
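A minimal sketch under assumptions (the table, column, and path names are invented): the XML datatype is usually a better fit than a plain blob here, because the attachment can later be queried with XQuery rather than just stored.

CREATE TABLE dbo.XmlAttachments (
    Id         INT IDENTITY PRIMARY KEY,
    EnvelopeId INT NOT NULL,       -- hypothetical FK back to the envelope record
    Attachment XML
);

INSERT INTO dbo.XmlAttachments (EnvelopeId, Attachment)
SELECT 1, CONVERT(XML, BulkColumn)
FROM OPENROWSET(BULK 'C:\Data\attachment.xml', SINGLE_BLOB) AS x;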