Transact SQL :: Can Store File Data Without Streaming On Server
Oct 14, 2015
I need to store file(s) in SQL Server without streaming/reading them on the server. I have created a Web API with AngularJS and SQL, e.g.:
var fileType = httpRequest.Files[file].ContentType;
var fileStrm = httpRequest.Files[file].InputStream;
var fileSize = httpRequest.Files[file].ContentLength;
byte[] fileRcrd = new byte[fileSize];
var file_Name = Path.GetFileName(filePath);
// Note: Stream.Read may return fewer bytes than requested; a robust read would loop.
fileStrm.Read(fileRcrd, 0, fileSize);
Is it possible to send file data to SQL Server (in bytes) without streaming/reading it at the server? I don't want to put a load on the server for large files; I just want to read the data and send it to SQL Server, where SQL Server will do the streaming and store the data as varbinary.
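One option, sketched below rather than confirmed for this poster's setup: if the upload can first be written to a share that the SQL Server service account can reach, OPENROWSET(BULK ... SINGLE_BLOB) makes the database engine read the bytes itself, so the web tier never buffers the whole file. Table and path names here are hypothetical, and the caller needs ADMINISTER BULK OPERATIONS permission.

-- Hypothetical names: dbo.UploadedFiles and the UNC path are illustrations only.
-- SINGLE_BLOB returns the whole file as a single varbinary(max) column named BulkColumn.
INSERT INTO dbo.UploadedFiles (FileName, FileData)
SELECT N'report.pdf', src.BulkColumn
FROM OPENROWSET(BULK N'\\appserver\uploads\report.pdf', SINGLE_BLOB) AS src;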
CREATE TABLE [dbo].[IntegrationMessages]
(
    [MessageId] [int] IDENTITY(1,1) NOT NULL,
    [MessagePayload] [varbinary](max) NOT NULL
)
I call the following InsertMessage stored proc from a C# class that reads a file.
ALTER PROCEDURE [dbo].[InsertMessage]
    @MessageId int OUTPUT
AS
BEGIN
    INSERT INTO [dbo].[IntegrationMessages] ( MessagePayload ) VALUES ( 0x0 )
    SELECT @MessageId = @@IDENTITY
END
The C# class then opens a FileStream, reads bytes into a byte[], and calls the UpdateMessage stored proc in a while loop to chunk the data into the MessagePayload column.
ALTER PROCEDURE [dbo].[UpdateMessage]
    @MessageId int,
    @MessagePayload varbinary(max)
AS
BEGIN
    UPDATE [dbo].[IntegrationMessages]
    SET MessagePayload.WRITE(@MessagePayload, NULL, 0)
    WHERE MessageId = @MessageId
END
My problem is that I always end up with a 0x0 value prepended to my data. So far I have not found a way to avoid this issue. I have tried making the MessagePayload column nullable, but .WRITE does not work with nullable columns.
My column contains the following: 0x0043555354317C... but it should really contain 0x43555354317C...
My goal is to be able to store an exact copy of the data I read from the file.
Here is my C# code:
public void TestMethod1()
{
    int bufferSize = 64;
    byte[] inBuffer = new byte[bufferSize];
    int bytesRead = 0;
    byte[] outBuffer;

    DBMessageLogger logger = new DBMessageLogger();

    FileStream streamCopy =
        new FileStream(@"C:\vsProjects\Sandbox\BTSMessageLoggerIn\SACustomer3Rows.txt", FileMode.Open);

    try
    {
        while ((bytesRead = streamCopy.Read(inBuffer, 0, bufferSize)) != 0)
        {
            outBuffer = new byte[bytesRead];
            // Copy the bytes actually read; without this, the chunk sent to SQL is all zeros.
            Buffer.BlockCopy(inBuffer, 0, outBuffer, 0, bytesRead);

            // This calls the UpdateMessage stored proc
            logger.LogMessageContentToDb(outBuffer);
        }
    }
    catch (Exception ex)
    {
        // Do not fail the pipeline if we cannot archive
        // Just log the failure to the event log
    }
}
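One likely culprit, offered as a guess rather than a confirmed diagnosis: InsertMessage seeds the row with 0x0, which is a one-byte value (0x00), and .WRITE(@MessagePayload, NULL, 0) appends after whatever is already there, so that byte stays in front of every payload. Seeding with 0x, a zero-length binary, keeps the column NOT NULL-friendly for .WRITE without the stray byte:

-- Sketch of a possible fix inside InsertMessage: 0x is a zero-length varbinary,
-- so the first .WRITE chunk lands at offset 0 instead of after a leading 0x00 byte.
INSERT INTO [dbo].[IntegrationMessages] ( MessagePayload ) VALUES ( 0x )
SELECT @MessageId = SCOPE_IDENTITY()   -- SCOPE_IDENTITY() avoids @@IDENTITY's trigger pitfalls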
Hello everyone. I have an interesting problem. We have a SQL database with a field that is an image type. In it are records where the image field contains text files. What I would like to do is look at these files, stream them line by line, and do some processing for each line read. So let's say a client machine uploaded a file called myfile.txt into this database. I would like the ASP.NET (VB) application to take this file and read each line and do some processing. I looked at MemoryStream and StreamReader but just cannot figure this out. Can you please help? Thank you, Kameron
Interested in feedback from the SQL grand wizards (and would-be wizards) that haunt these forums.
Let's say you need to constantly stream data into an OLTP system. We are talking multiple level hierarchies totaling upwards of 300 MB a day spread out not unlike a typical human sleep cycle (lower data during off-peak, still 24/7 requirements). All data originates from virtual machines running proprietary algorithms. The VM/data capture infrastructure needs to be massively scalable, meaning that incoming data is going to become more and more frequent and involve many different flat record formats.
The data has tremendous value when viewed both historically as well as in real-time (95% of real-time access will be read-only). The database infrastructure is in its infancy now and I'm trying to develop a growth plan that can meet the needs of the business as the data requirements grow. I have no doubt that the system will need to work with multiple terabytes of data within a year.
Current database environment is a single server composed of a Dell PowerEdge 2950 (Intel Quad Core 5355, 16 GB RAM, 2 x 73 GB 15K RPM SAS ) with an attached Dell PowerVault MD1000 (15 x 300 GB 10K RPM SAS in RAID 5+0 [2x7] w/hot spare) running Win 2k3 64-bit and SQL Server 2005 x64 Standard, 1-CPU.
I am interested in answering the following questions:
Based on the scaling requirements of the data capture and subsequent ETL, what transmission method would you find most favorable? For instance, we are weighing direct database writes via stored procedures for all VM systems versus establishing processes to collect, aggregate, and stream CSVs into a specialized ETL environment running SSIS packages that load data and then call SQL stored procedures to scrub and prepare for production import. The data will require scrub routines that need access to current production data, so distributing the core data structures to multiple ETL processing systems would be expensive and undesirable. Cost is very important to the overall solution design.

In terms of database infrastructure, how would you maximize business value while keeping cost as low as possible? For instance, do you think there is more value in an ACTIVE/ACTIVE cluster (2 x CPU licenses) where one system acts as ETL and the other as OLTP, or would you favor replication of production data from ETL to OLTP (or vice versa)? With the second scenario, am I mistaken in thinking we could get away with a Server/CAL licensing model for the ETL server?

Are there any third-party tools that I should research that would greatly aid me here?
I appreciate all feedback, criticism, and thoughts.
After the staging_temp data gets inserted into the main table, my problem is how to handle a file where the number of columns is greater than in the actual table.
If you look at the sample rows, there are 4 columns separated by "¯", but I actually have only 3 columns in my main table. So how can I get only the first 3 columns from the staging_temp table?
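A minimal sketch of pulling just the first three staged columns across; table and column names below are hypothetical placeholders.

-- col1..col3 and dbo.MainTable are placeholders; the fourth staged column is simply ignored.
INSERT INTO dbo.MainTable (col1, col2, col3)
SELECT col1, col2, col3
FROM dbo.staging_temp;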
When I send my query results to a file in SQL Server Management Studio, how come I'm seeing the following in Notepad++? "FH   TEST". "FH", which I thought should be in a CHAR(2) data column, is there, but "TEST" seems to start in column 6, not column 3 as I would have expected. I was expecting... FHTEST.
I am curious what the "best practice" is for exporting data programmatically from SQL Server to Excel. Is it best to do it straight from SQL Server, or should I do it within my C# code? My program is going to pull the data, put it in the Excel file, then email the file. So I could write an SP that gets the data and puts it in the file, then have the C# code run the SP and email the file; or I could have the code do everything: pull the data, export it, and email it.
If it is considered better to have the SP do it, why, and what is the best way? ROWSET functions?
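For the ROWSET-function route the question mentions, here is a heavily hedged sketch: it assumes the ACE OLE DB provider is installed on the SQL Server box, 'Ad Hoc Distributed Queries' is enabled, and the workbook already exists with a header row in Sheet1, none of which is a given. Table, column, and path names are hypothetical.

-- INSERT ... OPENROWSET appends query results into an existing Excel sheet.
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                       'Excel 12.0;Database=C:\Exports\report.xlsx;',
                       'SELECT CustomerName, OrderTotal FROM [Sheet1$]')
SELECT CustomerName, OrderTotal
FROM dbo.Orders;   -- hypothetical source table

Given how environment-dependent that is, many shops let SQL Server do only the query and have the C# side write the workbook and send the email.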
Hi. I want to store an RMVB file in SQL Server 2000 and read it back; I want to play the RMVB file on the web. The size of the RMVB file is more than 300 MB and less than 1 GB, so the SQL image field can hold it. Now my question is: how can I store and read the RMVB file from SQL Server 2000? I used SqlInsertCommand.ExecuteNonQuery() in my program, but it is too slow and it raises an unknown error. Thank you for your help.
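On SQL Server 2000, image columns are usually written in chunks through a text pointer rather than one giant INSERT, which is where a single ExecuteNonQuery on a 300 MB+ value tends to fall over. A rough sketch with hypothetical names; in practice @chunk would arrive from the client one round trip at a time, and reading back in chunks works the same way with READTEXT.

-- Seed the row with a zero-length value so a text pointer exists.
INSERT INTO dbo.Videos (VideoName, VideoData) VALUES ('movie.rmvb', 0x)

DECLARE @ptr binary(16), @chunk varbinary(8000)   -- @chunk is filled by the client per call
SELECT @ptr = TEXTPTR(VideoData) FROM dbo.Videos WHERE VideoName = 'movie.rmvb'

-- Repeat per chunk: a NULL insert offset appends to the end; 0 means delete nothing.
UPDATETEXT dbo.Videos.VideoData @ptr NULL 0 @chunk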
USE TEST
GO
/****** BULK INSERT ******/
BULK INSERT [Table01] FROM 'C:\temp\data.csv'
[code]....
I am using the above code to insert CSV file data which includes Arabic data. The upload is successful; however, the Arabic field data is uploaded with invalid characters and I get the following error: Msg 4864, Level 16, State 1, Line 3... Bulk load data conversion error (type mismatch or invalid character for the specified codepage)
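One common fix, assuming the file can be re-saved: store the CSV as Unicode (UTF-16) and tell BULK INSERT so; if the file must stay ANSI, CODEPAGE = '1256' (the Arabic code page) is the usual alternative. The path and terminators below are guesses.

-- Sketch: widechar tells BULK INSERT the file is UTF-16, which preserves Arabic text.
BULK INSERT [Table01]
FROM 'C:\temp\data.csv'
WITH (DATAFILETYPE = 'widechar', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');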
I am writing a T-SQL query for the following scenario: I have a SQL DB Audit file that gets populated with data as activity on the DB goes on. I have multiple monthly tables set up, and the import should go into these monthly tables based on the event_time value in the SQL DB Audit file. All data with event_time like '2015-08-25 15:59:39.033' should go to SQL table Audit_tbl_Aug2015. Query for reading the SQL DB Audit file:
SELECT * FROM sys.fn_get_audit_file('C:\backup\Audit*', DEFAULT, DEFAULT) ORDER BY event_time DESC
GO
--DML for Audit table
CREATE TABLE [dbo].[Audit_tbl_Aug2015](
    [id] [bigint] IDENTITY(1,1) NOT NULL,
    [event_time] [datetime2](7) NOT NULL,
    [sequence_number] [int] NULL,
    [action_id] [varchar](4) NULL,
    [succeeded] [bit] NOT NULL,
    [permission_bitmask] [bigint] NOT NULL,
    [is_column_permission] [bit] NOT NULL,
    [session_id] [smallint] NOT NULL,
    [server_principal_id] [int] NULL,
    [database_principal_id] [int]
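A sketch of the monthly routing: filter on event_time so each month's rows land in the matching table, with one such INSERT per monthly table (the column list follows the DDL above; the id column is the identity and is omitted).

INSERT INTO [dbo].[Audit_tbl_Aug2015]
       (event_time, sequence_number, action_id, succeeded, permission_bitmask,
        is_column_permission, session_id, server_principal_id, database_principal_id)
SELECT event_time, sequence_number, action_id, succeeded, permission_bitmask,
       is_column_permission, session_id, server_principal_id, database_principal_id
FROM sys.fn_get_audit_file('C:\backup\Audit*', DEFAULT, DEFAULT)
WHERE event_time >= '20150801' AND event_time < '20150901';   -- August only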
I have an existing MS SQL database (2008 R2). I have a very simple SQL script. I need to run this script on a daily basis and save the results as a .csv file.
I've tried creating a job in SQL Manager - that didn't work. Then I found a tutorial that had me use MS Visual Studio... couldn't get that to function either.
What program should I be using, and how do I go about creating this as a task/job that can run on a scheduled basis?
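One workable route, sketched with hypothetical server, database, view, and path names: a SQL Server Agent job whose single CmdExec step runs sqlcmd and writes the result set to a CSV, scheduled daily. The same sqlcmd command line also works from Windows Task Scheduler if Agent is not available (e.g., Express edition).

-- Sketch: create the Agent job, one CmdExec step, and a daily 06:00 schedule.
USE msdb;
GO
EXEC dbo.sp_add_job        @job_name = N'DailyCsvExport';
EXEC dbo.sp_add_jobstep    @job_name = N'DailyCsvExport', @step_name = N'Export',
                           @subsystem = N'CmdExec',
                           @command = N'sqlcmd -S MYSERVER -d MyDb -E -s"," -W -Q "SELECT * FROM dbo.MyView" -o "C:\Exports\daily.csv"';
EXEC dbo.sp_add_jobschedule @job_name = N'DailyCsvExport', @name = N'Daily0600',
                           @freq_type = 4, @freq_interval = 1,   -- daily, every day
                           @active_start_time = 060000;          -- 06:00:00
EXEC dbo.sp_add_jobserver  @job_name = N'DailyCsvExport';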
I am running Microsoft SQL Server 2012 SP on a Windows Server 2008 R2 Standard SP1 box. The SQL Server service is running as a simple Windows domain user (nothing special, no admin rights, etc.). I am having some issues using Bulk Insert when the data file is on a network share and I connect with Windows Authentication. What is known is that the SQL Server service account has access to the network resource, which is shown by logging into SQL Server with a SQL account and doing the Bulk Insert. I also have rights to the files on the share, as shown by the fact that I put the files there. My SQL is in the form of:
Bulk Insert [table name] From '\\[server]\[share]\[filename]' With (FirstRow = 2, FormatFile='FormatFile.xml')
Now, when connecting to SQL Server with Windows Authentication and running the Bulk Insert I get the following error:
Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "\\[server]\[share]\[filename]" could not be opened. Operating system error code 5 (Access is denied.).
I found this snip in BULK INSERT (Transact-SQL), under Security Account Delegation (Impersonation), which says, in part (emphasis mine):
To resolve this error [4861], use SQL Server Authentication and specify a SQL Server login that uses the security profile of the SQL Server process account, or configure Windows to enable security account delegation. For information about how to enable a user account to be trusted for delegation.
We followed How to Configure the Server to be Trusted for Delegation, and we tried unconstrained delegation and I rebooted the SQL Server, but it still does not work. Later we tried constrained delegation, and it still does not work.
I have verified the SPNs:
C:\> setspn ad\svc_sql
Registered ServicePrincipalNames for CN=SVC_SQL,OU=Service Accounts,OU=Users,OU=ad domain,DC=ad,DC=local:
    MSSQLSvc/SQLQA.ad.local:1433
    MSSQLSvc/SQLDev.ad.local:1433
    MSSQLSvc/SQLQA.ad.local
    MSSQLSvc/SQLDev.ad.local

I have verified that my SQL connection is TCP and I am getting/using a Kerberos security token.

C:\> sqlcmd -S tcp:SQLQA.ad.local,1433 -E
1> Select dec.net_transport, dec.auth_scheme From sys.dm_exec_connections As dec Where session_id = @@Spid;
2> go
net_transport auth_scheme
------------- -----------
TCP           KERBEROS

(1 rows affected)
1>
If I move the source file to a local drive (on the SQL Server), all works fine, but I need to be able to read from a file share.
I've imported an SSIS package into Management Studio (2008 R2) and I've set up a SQL Server Agent job to call the package but it fails due to error code: 0xc00160aa.
As far as I can tell this is because it is unable to read the location of the package, despite it being a file system location within Management Studio. Also, I can run the package manually within Management Studio, but when I try to call it via the job it fails.
I am using the SQL Server 2005 JDBC driver. I need to write a piece of code that makes use of a streaming ResultSet. That is, as soon as I get the first row, a worker thread should be able to begin processing it without waiting for the second row to arrive. When the second row arrives, the second worker thread should start processing the new row without waiting for the third row to arrive, and so on. Usually, with a ResultSet, I need to wait for all the rows to arrive before I can start navigating the ResultSet. But in my code, I need to start navigating the ResultSet even as more rows are pouring in from the DB server. How can I do that? Any pointers in this direction would be helpful. Further, I want to know whether setFetchSize will be of any help here. If I set the fetch size to 1, does that mean that as soon as I get the first row, I can start working on this row of the ResultSet (viz. start navigating the ResultSet) without waiting for the second row to arrive?
We're doing a website for a TV station, and they want to have a live TV broadcast on their website. Our experience is basically in CMS and portal solutions, but we have never had any experience with live streaming, so this is all new to us. We're looking for a technical and financial proposal with your suggestions and recommendations. These are the details of the project, with some of the inquiries we have:
- The signal is SDI (PAL, 25 fps)
- The output (online streaming) is 15 fps
- The user will have two options to view the video: 1. Low: 56K 2. High: 300K
- The video should come within a customized designed page, probably as code we can place inside an HTML table's cell, not as a link to an external page
- We're expecting about 200 users for the video each month; each user's logon time is expected to be 10 minutes
- We have Red Hat Linux on the server that will host the site, and we use PHP/MySQL technology
Inquiries:
1. Why is Windows Media better than Real in terms of performance? Can you provide us with a comparison table to include for our client?
2. If we choose Windows Media, what are the requirements we need to have, from both the TV station side and our Linux server? And how much would it cost?
4. Is there any hardware required on the TV station side, such as a DVR or some sort of encoder? If so, what are your recommendations (type/brand)? Please note that this is a TV station and it needs high-quality professional hardware.
5. How (and this is probably the most ambiguous task)
5.1. Do you broadcast the digital signal coming from the encoder (I guess) to the server?
5.2. And then place this signal on a PHP page? Do you give us a link to embed in our code?
6. If we choose to host the live streaming on a separate server and include the URL in our code, would that still require our server to support Windows Media, or does it not matter as long as the video streaming is hosted on a Windows streaming server?
7. How much disk space is needed to host a live stream (with the details mentioned above)?
We would really appreciate your comments and suggestions, as this is our first time dealing with live streaming.
I want to insert lots of data into two tables. For this I want to use table-valued parameters and a stored procedure. So, what is the better way for best performance: using two stored procedures, or a single procedure with two parameters? How does SQL Server consume the data if I use a single procedure with two parameters? Is it really streaming the data? I mean, does SQL Server already start inserting the first rows as soon as it gets them, even if the client is still sending more and more rows, and then the same with the next table? E.g., I have a procedure like this:
CREATE PROCEDURE [dbo].[usp_UpdateElements]
(
    @tvpElementsToInsert As [dbo].[tvpElements] Readonly,
    @tvpElementValuesToInsert As [dbo].[tvpElementValues] Readonly
)
AS
Begin
    Insert Into Elements Select * From @tvpElementsToInsert;
    Insert Into ElementValues Select * From @tvpElementValuesToInsert;
End
I am developing a simple ASP-based form that stores user info in MS SQL Server. I have created a table in the SQL server to store the data and defined the body field with this line: `body char(255)`. The problem is that if the user inputs a string longer than 255 characters, it gets chopped off. How would you suggest solving this problem? Should I use the `text` datatype instead? Any comments are very appreciated!
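If the server is SQL Server 2005 or later, varchar(max) removes the cap; on older versions, text is the usual fallback, as the question suspects. The table name below is hypothetical.

-- Sketch: varchar(max) stores up to 2 GB while still behaving like an ordinary string column.
ALTER TABLE dbo.UserInfo ALTER COLUMN body varchar(max) NULL;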
Security IDs seem to be made up of at least three 32-bit unsigned numbers and a few smaller numbers. We believe their lengths vary. We don't mind dropping the "S" from the front. What data type do you recommend for their storage? We expect only limited joins and user visibility on this column. We may wish to create an index on this column. We think varchar and varbinary are the two major choices.
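For what it's worth, SQL Server itself models Windows SIDs as varbinary(85) (SUSER_SID() returns one), so that type handles the varying lengths and still indexes cleanly. A sketch with hypothetical names:

-- varbinary(85) matches the type SQL Server uses internally for login SIDs.
CREATE TABLE dbo.SecurityIds (
    row_id int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    sid    varbinary(85) NOT NULL
);
CREATE NONCLUSTERED INDEX IX_SecurityIds_sid ON dbo.SecurityIds (sid);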
I am using Windows 2003 Server, English version. I want to store Big5 data, so I installed SQL Server 2000 the same way I would install it on Windows 2000, with a server collation of Chinese_Taiwan_Stroke_CI_AS. However, the data is stored in the database server in Unicode instead of Big5, unlike on the Windows 2000 OS. I would like to ask how I can set things up so that SQL Server 2000 can store the Big5 data.
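The distinction to check, sketched below with a hypothetical table: char/varchar columns under a Chinese_Taiwan_Stroke collation are stored in the Big5 code page, while nchar/nvarchar columns are always Unicode regardless of the server collation, so the column types chosen matter more than the OS.

CREATE TABLE dbo.Big5Demo (
    big5_text varchar(100) COLLATE Chinese_Taiwan_Stroke_CI_AS,  -- stored as Big5 bytes
    uni_text  nvarchar(100)                                      -- always Unicode (UCS-2)
);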
A DB2 stored procedure returns two data sets when executed from SSMS using a linked server. Do we have any simple way to save the two data sets into two different tables?
We are storing changed data of tables in XML format for auditing purposes. The functionality is already achieved. We are using the FOR XML PATH clause to convert relational data of tables into XML format.
Now, a table has a column whose name contains '('. For example, the name of the column is ColumnName(). In this case we cannot convert to XML using the FOR XML clause. It shows the error:
Column name 'columnName()' contains an invalid XML identifier as required by FOR XML; '(' (0x0028) is the first character at fault.
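Aliasing the offending column in the SELECT list gives FOR XML a legal element name; a sketch with a hypothetical table:

-- The bracketed name refers to the real column "ColumnName()"; the alias is what FOR XML emits.
SELECT [ColumnName()] AS [ColumnName]
FROM dbo.ChangedRows
FOR XML PATH('row'), ROOT('rows');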
I have a stored procedure which runs periodically. What I want is: when it first runs, store that date permanently, or until the next time it runs. Then I need to take my data from that stored proc where hist_date is between the date stored on the first run (for example, 08/21/2015) and today's date (for example, 08/24/2015).
The next time it runs, the stored date should be updated; in this case it should become 08/24/2015.
How can I do this in a stored proc? I tried to use a temp table, but it didn't work.
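A temp table will not survive between runs; a small permanent control table will. A minimal sketch with hypothetical proc and table names:

-- One row per proc, holding the date of the last completed run.
CREATE TABLE dbo.ProcRunControl (
    proc_name   sysname NOT NULL PRIMARY KEY,
    last_run_on date    NOT NULL
);
GO
ALTER PROCEDURE dbo.MyPeriodicProc   -- hypothetical; dbo.History below is too
AS
BEGIN
    DECLARE @today date = CAST(GETDATE() AS date), @from date;

    -- First run: seed the control row with today's date.
    IF NOT EXISTS (SELECT 1 FROM dbo.ProcRunControl WHERE proc_name = 'MyPeriodicProc')
        INSERT dbo.ProcRunControl (proc_name, last_run_on) VALUES ('MyPeriodicProc', @today);

    SELECT @from = last_run_on FROM dbo.ProcRunControl WHERE proc_name = 'MyPeriodicProc';

    SELECT * FROM dbo.History WHERE hist_date BETWEEN @from AND @today;

    -- Remember this run for next time.
    UPDATE dbo.ProcRunControl SET last_run_on = @today WHERE proc_name = 'MyPeriodicProc';
END
GO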
I used a new query to execute my stored procedure, but it didn't return any value. Is there any error in my SQL statement?
USE [Pharmacy_posicnet]
GO
/****** Object: StoredProcedure [dbo].[usp_sysconf] Script Date: 22/07/2015 4:01:38 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[usp_sysconf]