Timeouts On Delete With Image And Varbinary(max) Data
Oct 8, 2007
I have a database that I am using as an archive for emails and am storing them in varbinary(max) data types. The database works fine for inserts and retrieval, but I cannot delete large sets of records (> 30K) without it timing out. I assume this is because SQL Server isn't simply releasing the handle to the blob but is instead doing a lot of work reclaiming the space.
Is there any flag I can set or approach I can use to resolve this issue?
I was considering moving the blobs into a separate simple table and putting async triggers on the primary table to delete the blobs; will this work?
Any ideas are appreciated. Besides the "store files in the file system" idea.
- Christopher.
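One hedged workaround for the timeout itself is to delete in batches, so each statement touches a bounded number of rows and each transaction stays short. A minimal sketch, assuming a hypothetical dbo.EmailArchive table and an archival cutoff column (adjust the names and the WHERE clause to the real schema):
DECLARE @rows INT;
SET @rows = 1;
WHILE @rows > 0
BEGIN
    -- delete a small slice per iteration; each batch commits on its own
    DELETE TOP (5000)
    FROM dbo.EmailArchive
    WHERE ReceivedDate < '20070101';   -- hypothetical cutoff
    SET @rows = @@ROWCOUNT;
END
Run from a job or from a loop in the client, each batch finishes well inside the command timeout, and the LOB deallocation work is spread across many small transactions instead of one huge one.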
Hi. I moved to SQL Server 2005 (from 2000) and noticed there is a better data type for dealing with binary arrays. I have a table that holds 9 columns of images (BLOBs); each array has a different size and can be larger than 4K. I've changed the column data type from image to varbinary(max). After the change, the size of the table grew from 22MB to 26MB. Any idea why? I thought the new type should be better. I have another table with 3 binary columns where the arrays are much smaller, varying from 32 bytes to 150 bytes long; when I changed the data type to varbinary there, the size of the table shrunk by half!! Now I'm completely confused... Gilad.
What is the best way of comparing image/varbinary and nullable columns in SQL Server? This is what I do to find the differences in the image column between 2 tables sharing the same structure:
Select * From TblA s Left Join TblB c On s.Id = c.Id And Coalesce(Convert(VARBINARY(MAX), c.Image),'') <> Coalesce(Convert(VARBINARY(MAX), s.Image),'') Where c.Id IS NULL
Alternatively, I was thinking of comparing the columns by size, but I worry about the accuracy:
Select * From TblA s Left Join TblB c On s.Id = c.Id And ISNULL(Datalength(c.Image),0) <> ISNULL(Datalength(s.Image),0) Where c.Id IS NULL
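A hedged alternative, assuming the two tables really do share the Id key: join on Id alone and do the value (and NULL-ness) comparison in the WHERE clause, so differing rows are returned directly instead of being filtered through the LEFT JOIN / IS NULL pattern:
SELECT s.Id
FROM TblA s
JOIN TblB c ON c.Id = s.Id
WHERE CONVERT(VARBINARY(MAX), s.[Image]) <> CONVERT(VARBINARY(MAX), c.[Image])
   OR (s.[Image] IS NULL AND c.[Image] IS NOT NULL)
   OR (s.[Image] IS NOT NULL AND c.[Image] IS NULL);
DATALENGTH alone can miss rows where two blobs happen to be the same size but hold different bytes, so it works better as a cheap pre-filter than as the full comparison.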
We used SSIS to move data from a table in SQL 2000 which had a column with the image datatype to a column in a table in SQL 2005 that has a datatype of varbinary(max). No errors were produced from the SSIS package.
There were a number of records where the DATALENGTH of the column with the image datatype was greater than 8000. Was the data truncated for these records?
This is probably a very elementary question, but I am not familiar with the application or the data.
Below is the source table in SQL 2000 and a select count(*) ...
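If both databases can be reached from one instance (directly or through a linked server), a DATALENGTH comparison shows whether anything was cut off; the table, key, and column names below are illustrative:
SELECT s.Id,
       DATALENGTH(s.ImageCol) AS SourceBytes,
       DATALENGTH(d.BlobCol)  AS DestBytes
FROM Sql2000Db.dbo.SourceTable s
JOIN Sql2005Db.dbo.DestTable d ON d.Id = s.Id
WHERE DATALENGTH(s.ImageCol) <> DATALENGTH(d.BlobCol);
An empty result suggests nothing was truncated; varbinary(max) can hold up to 2GB, so the 8000-byte limit only applies to varbinary(n).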
When -all- records from a table with a varbinary(max) column are deleted (not via truncate), the table properties still show the data space size from before the delete operation. Inserting new blob records only grows the allocated space without reusing the empty, already-allocated space.
Running commands like dbcc updateusage/checkdb/cleantable/reindex or sp_spaceused @updateusage = N'TRUE' seems to have no effect.
Does anyone know when the space allocated by a varbinary column is released?
thanks, Derk
running SQL 2005 STD ed SP2. DB is in simple recovery mode.
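One thing that sometimes helps on 2005 is reorganizing the table's indexes with LOB compaction, which compacts the LOB allocation units left behind by the deletes; the table name below is a placeholder:
ALTER INDEX ALL ON dbo.MyBlobTable
REORGANIZE WITH (LOB_COMPACTION = ON);
-- then re-check the allocation figures
EXEC sp_spaceused @objname = N'dbo.MyBlobTable', @updateusage = N'TRUE';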
I am using the 3-tiered architecture design (presentation, business logic, and data access layers). I am stuck on how to send the image the user selects in the upload file control to the BLL and then to the DAL, because the DAL does all the inserts into the database. I would like to be able to check the file type in the BLL to make sure the file being uploaded is indeed a picture. Is there a way I can send the location of the file to the BLL, check the file type, then upload the file and have the DAL insert the image into the database? I have seen examples where people use streams to upload the file directly from their presentation layer, but I would like to keep everything separated in the three classes if possible. I also wasn't sure what variable type the image would be in the function in the BLL that receives the image from the PL. If there are any examples or tips anyone can give me, that would be appreciated.
I am trying to create a clustered index on a view of a table that has an xml datatype. This indexing ran for two days and still did not complete. I tried to leave it running while continuing to use the database, but the SELECT statements were executing too slowly and the DML statements were timing out. Is there a way to control the server/CPU resources used by an indexing process? How can I determine the completion percentage of the indexing process? How can I make indexing the view with the xml data type take less time?
We serialize a custom object into a byte array (byte[1000]) and store it in a SQL Server 2005 table column as varbinary(1000). There are a lot of rows retrieved with each SqlDataReader from C# code: up to 3,456,000 rows at a time (that is, roughly 3.5 million rows).
I am trying to figure out what will be more efficient in terms of CPU usage and processing time. We have come up with quite a few approaches to solve this problem.
In order to try a few of them, I have to know how I can extract certain "pieces" of data from a varbinary value using T-SQL.
For example, out of those 1000 bytes, at any given moment we need only the first 250 bytes and the third 250 bytes. Total: 1000 -> [250-select][250-no-need][250-select][250-no-need]
One approach would be to get everything and parse it in C#: get the 1st and the 3rd chunks of data and discard the unneeded 2nd and 4th. This is WAY TOO BAD.
Another approach would be to delegate the "filtering" job to SQL Server so that SqlDataReader gets only what it needs.
I am going to try a SQL-CLR stored procedure, but when I compared performance of T-SQL vs. SQL-CLR stored procs a few weeks ago, I saw that the same job is done by T-SQL a bit faster AND (more importantly for us) with less CPU consumption than SQL-CLR.
So, my question is: how do I select certain "pieces" of varbinary column data using T-SQL?..
In other words, instead of SELECT MyVarbinary1000 FROM MyTable how do I do this: SELECT <first 250 from MyVarbinary1000>, <third 250 from MyVarbinary1000> FROM MyTable ?
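SUBSTRING accepts varbinary and returns varbinary, so the byte ranges can be carved out server-side. A sketch against the column and table names used above (offsets are 1-based, so the third 250-byte block starts at byte 501):
SELECT SUBSTRING(MyVarbinary1000,   1, 250) AS FirstChunk,
       SUBSTRING(MyVarbinary1000, 501, 250) AS ThirdChunk
FROM MyTable;
The SqlDataReader then only ever sees 500 bytes per row instead of 1000.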
I need to store a 256-bit hash (SHA-2 algorithm) in one of the table's primary keys. I would prefer to use a numeric data type rather than varchar etc.
* The decimal datatype range is -10^38 +1 to 10^38 -1. I can split my 256-bit hash into two decimal(38, 0) columns as a composite key.
* I can store the hash as varbinary. I have never used it and don't have much understanding of the query-writing complexities and of dealing with it through ADO (data type etc.).
It would be a heavy OLTP type of system, with the hash-based primary key used in joins for data retrieval as well. Please provide your expert comments on this.
Regards
Anil
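A sketch of the binary option, assuming the digest is always exactly 32 bytes: a fixed-length BINARY(32) column maps directly onto a 256-bit value and can serve as a primary key and join column like any other type (the names below are illustrative):
CREATE TABLE dbo.HashedEntity
(
    EntityHash BINARY(32)    NOT NULL PRIMARY KEY,  -- 256-bit SHA-2 digest
    Payload    NVARCHAR(100) NULL                   -- illustrative extra column
);
-- lookups and joins compare the raw bytes directly
SELECT e.Payload
FROM dbo.HashedEntity e
WHERE e.EntityHash = 0x0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF;
From ADO the value travels as a byte array bound to a binary/varbinary parameter, which avoids the two-column split that the decimal approach would need.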
I need to put .doc data into a varbinary column for full text searching. I have created the db and columns but am unsure as to how to insert the varbinary data. I have found some discussions about inserting images but nothing explicitly on .doc files. Can anyone suggest resources or sample code?
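One way to get a .doc from disk into a varbinary(max) column from T-SQL (SQL Server 2005 and later) is OPENROWSET with SINGLE_BLOB; the file must be readable by the SQL Server service account, and the table, columns, and path below are only illustrative. Keeping a document-type column (the extension) alongside the data is what full-text search uses to pick the right filter:
INSERT INTO dbo.Documents (FileName, FileExtension, FileData)
SELECT N'Contract.doc', N'.doc', doc.BulkColumn
FROM OPENROWSET(BULK N'C:\Docs\Contract.doc', SINGLE_BLOB) AS doc;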
Hey all, I have a profile table for members of a website that allows them to upload an avatar for use throughout the site. I decided to add the option to just remove the avatar if they no longer want to use one at all, and I am stuck. I am running an update statement against their entire profile in case they change more than just removing their avatar, but of course when I try this statement: myCommand.Parameters.AddWithValue("@avatar", "") I get a conversion error, which makes sense. So instead of "", what do I need in code to just insert empty data to overwrite their current avatar data? Any help would be great, thanks for your time.
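On the T-SQL side, "no avatar" is usually represented either as NULL or as a zero-length binary value (0x); a sketch with assumed table and column names is below. From ADO.NET, a zero-length byte array (new byte[0]) binds naturally to a varbinary parameter, and for NULL the parameter should be explicitly typed as varbinary with a value of DBNull.Value rather than an empty string:
DECLARE @MemberId INT;
SET @MemberId = 42;   -- illustrative key value
UPDATE dbo.MemberProfile
SET Avatar = NULL          -- or: Avatar = 0x for an empty, non-NULL value
WHERE MemberId = @MemberId;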
Hey all, another varbinary question. I am trying to move an image stored in a varbinary(max) table column directly from one table to another programmatically. The rest of the data is just nvarchar(50), so I just use a T-SQL select statement in the code-behind and feed all of the data into a SqlDataReader. I do this because there is some user interaction when the data is moved, so there may be some new values updated when transferring from one table to another. Once the old (and possibly new) data is stored in separate variables, I use a T-SQL insert statement to move all of the data into the other table. The problem is I am not really sure how to handle the image data (varbinary(max)) to just do a straight-up transfer from the one table to the other. I get conversion errors when trying to handle the data as a string, which makes sense. Not sure what to put for code examples since I really am stumped with this, but here is what is not working:
Dim imageX As String
SqlDataReader code - imageX = reader("imageData")
Insert code - myCommand.Parameters.AddWithValue("@imageData", imageX)
Thanks in advance,
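One hedged way around it, assuming the row can be identified by a key: keep the blob on the server entirely and let an INSERT...SELECT copy it, passing only the user-edited nvarchar values as parameters (all names below are assumptions):
INSERT INTO dbo.TargetTable (Name, Description, ImageData)
SELECT @NewName,        -- value possibly edited by the user
       @NewDescription, -- value possibly edited by the user
       s.ImageData      -- copied server-side, never read into a string
FROM dbo.SourceTable s
WHERE s.Id = @SourceId;
If the bytes really must pass through the client, the reader value can be pulled out as a byte array (e.g. DirectCast(reader("imageData"), Byte()) in VB) and bound to a varbinary parameter rather than a String.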
I'm unable to create a field with the type varbinary(max). When I try doing this with Management Studio, it tells me that the maximum length is 8000 bytes. I've also tried creating the field with DDL as shown below, but that doesn't work either. If I create the varbinary field with a length of 8000 or less, it works fine. Is there some trick to using varbinary(max)?
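For reference, this is all the DDL that should be required on SQL Server 2005 or later (names are illustrative); if it still reports an 8000-byte maximum, the likely culprit is that the connection is actually to a SQL Server 2000 instance, or that 2000-era tools are generating the script, since varbinary(max) does not exist before 2005:
CREATE TABLE dbo.BlobTest
(
    BlobId   INT IDENTITY(1,1) PRIMARY KEY,
    BlobData VARBINARY(MAX) NULL
);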
I have some source tables which contain timestamp fields (that's timestamp data type not datetime). My dimension table holds the maximum timestamp value as a varbinary(8).
I want my package to have a variable that holds that value but I don't know which data type to use for this. The reason for this is because I want to use that variable to then retrieve all records from my source table that have a timestamp value greater than the value stored in the variable.
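A common workaround, since SSIS variables have no varbinary type, is to carry the high-water mark as a bigint (a timestamp/rowversion casts to bigint cleanly) or as a hex string. A sketch with placeholder table and column names; @MaxVersion stands in for however the package supplies the variable (for example a ? parameter in an OLE DB source or Execute SQL Task):
-- read the current high-water mark into a bigint-friendly form
SELECT MAX(CAST(RowVersionCol AS BIGINT)) AS MaxVersion
FROM dbo.SourceTable;
-- incremental extract using the stored value
SELECT *
FROM dbo.SourceTable
WHERE CAST(RowVersionCol AS BIGINT) > @MaxVersion;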
Could someone help me with writing the code to export the contents of a varbinary field in my database to the local hard drive? I can do this in Visual FoxPro, but my company wants it in C# and I have no clue about C#.
I have a SQL script file which creates a table and inserts all the data required by the client app into the table.
The table also holds images in a varbinary field. How do I insert an image held on the file system (e.g. "C:\Images\MyImage.gif") into a varbinary field using a script that is run with sqlcmd on the command line?
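A sketch that fits a sqlcmd-driven script: OPENROWSET(BULK ... SINGLE_BLOB) reads the file, and a sqlcmd scripting variable supplies the path, which is substituted into the script before it runs. Note the path is opened by the SQL Server service itself, so it has to be visible from the server; table and column names are assumptions. Invocation would look something like: sqlcmd -S myServer -d myDb -i InsertImage.sql -v ImagePath="C:\Images\MyImage.gif"
INSERT INTO dbo.Pictures (PictureName, PictureData)
SELECT N'MyImage.gif', img.BulkColumn
FROM OPENROWSET(BULK N'$(ImagePath)', SINGLE_BLOB) AS img;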
I have several tables with a varbinary column in a database. They have names like CSB_BLOB or OBJECT_BLOB. I am having only intermittent success getting the data out.
For example, this query returns readable text from this data:
0x46726F6D3A20226465616E6E6167726.....etc --data as stored in the column
SELECT CAST(CSB_BLOB AS VARCHAR(MAX)) AS 'Message' FROM OBJECT_BLOB
However, this column in another table holds the following data and gives these query results:
0x0001000000FFFFFFFF01000000000000000C....etc. --data as stored in column
--this query returns empty result
SELECT CAST(CSB_BLOB AS VARCHAR(MAX)) AS 'Message' FROM CSB_STATUS_LOG
--this query returns no change???
SELECT CONVERT(VARCHAR(MAX), CONVERT(VARBINARY(MAX), CSB_BLOB, 2), 2) FROM CSB_STATUS_LOG 0001000000FFFFFFFF01000000000000000C....etc
Obviously there is a difference between the two but I am not educated enough to interpret this difference. What do I need to learn / read so I can look at the data in one of these BLOB columns and know how to convert it to something meaningful?
Something like:
1. Try to cast as varchar to see if it is text.
2. Turn into a byte array and see if it is a jpg.
3. Turn into a byte array and see if it is a pdf.
4. Convert it to hex and then cast as varchar.
5. etc....
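A hedged starting point for steps 2 onward: many formats can be recognized from their first few bytes ("magic numbers"), so a CASE over SUBSTRING of the blob narrows down what each row holds before deciding how to convert it. The signatures below are the well-known ones; the 0x0001000000FFFFFFFF prefix in the second sample is typical of a .NET BinaryFormatter-serialized object, which has to be deserialized by .NET code rather than CAST in T-SQL. Column and table names follow the queries above:
SELECT CSB_BLOB,
       CASE
           WHEN SUBSTRING(CSB_BLOB, 1, 3) = 0xFFD8FF             THEN 'JPEG image'
           WHEN SUBSTRING(CSB_BLOB, 1, 4) = 0x25504446           THEN 'PDF (%PDF)'
           WHEN SUBSTRING(CSB_BLOB, 1, 8) = 0xD0CF11E0A1B11AE1   THEN 'OLE compound file (legacy Office .doc/.xls)'
           WHEN SUBSTRING(CSB_BLOB, 1, 9) = 0x0001000000FFFFFFFF THEN '.NET BinaryFormatter payload'
           ELSE 'Unknown - possibly plain text, try CAST(... AS VARCHAR(MAX))'
       END AS LikelyContent
FROM CSB_STATUS_LOG;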
CREATE TABLE [dbo].[IntegrationMessages]
(
    [MessageId] [int] IDENTITY(1,1) NOT NULL,
    [MessagePayload] [varbinary](max) NOT NULL
)
I call the following InsertMessage stored proc from a C# class that reads a file.
ALTER PROCEDURE [dbo].[InsertMessage] @MessageId int OUTPUT AS BEGIN
INSERT INTO [dbo].[IntegrationMessages] ( MessagePayload ) VALUES ( 0x0 )
SELECT @MessageId = @@IDENTITY
END
The c# class then opens a filestream, reads bytes into a byte [] and calls UpdateMessage stored proc in a while loop to chunk the data into the MessagePayload column.
ALTER PROCEDURE [dbo].[UpdateMessage] @MessageId int ,@MessagePayload varbinary(max)
AS BEGIN
UPDATE [dbo].[IntegrationMessages] SET MessagePayload.WRITE(@MessagePayload, NULL, 0) WHERE MessageId = @MessageId
END
My problem is that I am always ending up with a 0x0 value prepended in my data. So far I have not found a way to avoid this issue. I have tried making the MessagePayload column NULLABLE but .WRITE does not work with columns that are NULLABLE.
My column contains the following: 0x0043555354317C... but it should really contain 0x43555354317C...
My goal is to be able to store an exact copy of the data I read from the file.
Here is my c# code:
public void TestMethod1()
{
    int bufferSize = 64;
    byte[] inBuffer = new byte[bufferSize];
    int bytesRead = 0;
    byte[] outBuffer;

    DBMessageLogger logger = new DBMessageLogger();

    FileStream streamCopy = new FileStream(@"C:vsProjectsSandboxBTSMessageLoggerInSACustomer3Rows.txt", FileMode.Open);

    try
    {
        while ((bytesRead = streamCopy.Read(inBuffer, 0, bufferSize)) != 0)
        {
            outBuffer = new byte[bytesRead];
            // copy the bytes just read into the outgoing buffer (presumably elided when the code was posted)
            Buffer.BlockCopy(inBuffer, 0, outBuffer, 0, bytesRead);

            // This calls the UpdateMessage stored proc
            logger.LogMessageContentToDb(outBuffer);
        }
    }
    catch (Exception ex)
    {
        // Do not fail the pipeline if we cannot archive
        // Just log the failure to the event log
    }
}
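The leading 0x00 appears to come from the seed row rather than from the chunking: 0x0 is a one-byte binary value (0x00), and .WRITE(@MessagePayload, NULL, 0) with a NULL offset appends after whatever is already there. Seeding with 0x, the zero-length binary constant, keeps the column non-NULL (so .WRITE still works) without contributing a byte:
-- in InsertMessage, seed with an empty (zero-length) value instead of 0x0
INSERT INTO [dbo].[IntegrationMessages] ( MessagePayload )
VALUES ( 0x )
-- the chunked UpdateMessage proc can then stay exactly as posted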
I cannot find the data type for parameter mapping from Execute SQL Task Editor to make this works.
1. Execute SQL Task 1 - select max(columnA) from tableA. ColumnA is varbinary(8); the result is stored in a variable whose data type is Object.
2. Execute SQL Task 2 - update tableB set columnB = ? What data type should I use to map the parameter? I tried different data types; none worked except GUID, but it returned the wrong result.
Does SSIS variable support varbinary data type? I know there's a bug issue with bigint data type and there's a work-around. Is it same situation with varbinary?
I have been using Master Data Services for a couple of months now. I can load, update, merge and soft delete data in MDS. Occasionally we even have to hard delete data from MDS. If we keep soft deleting records in an MDS table, eventually there will be a huge number of soft-deleted records. Is there an easy way to hard delete all the soft-deleted records from all MDS tables in a specific model?
Hi, this is driving me nuts. I have a table that stores notes regarding an operation in an IMAGE data type field in MS SQL Server 2000. I can read and write with no problem from Access using the StrConv function, and I can update the field correctly in T-SQL using:
DECLARE @ptrval varbinary(16)
SELECT @ptrval = TEXTPTR(BITS_data) FROM mytable_BINARY WHERE ID = 'RB215'
WRITETEXT OPERATION_BINARY.BITS @ptrval 'My notes for this operation'
However, I just cannot seem to convert the information back to text once it is stored, using T-SQL. My selects keep returning binary data. How do I do this? Thanks for your help. SD
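A hedged suggestion: image cannot be cast straight to character data, but it can be converted explicitly to varbinary and from there to varchar, so a double cast recovers the text (limited to 8000 bytes on SQL 2000; the MAX types on 2005+ lift that limit). Names follow the SELECT above:
SELECT CAST(CAST(BITS_data AS VARBINARY(8000)) AS VARCHAR(8000)) AS NotesText
FROM mytable_BINARY
WHERE ID = 'RB215'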
The ERP manufacturer used an image data type to store large text data fields. I am trying to move these data types from one database to another database using either Sql Queries or MS Access. I can cast them as an 8000 char varchar to read them directly but have no luck importing into these image data fields.
Access and Crystal are not able to read these fields directly.
Any suggestions? Most information about these fields has to do with loading files but I am just moving data.
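If both databases are visible from one instance (directly, or via a linked server with four-part names), a plain INSERT...SELECT moves image columns intact with no 8000-character cast at all; the names below are placeholders:
INSERT INTO TargetDb.dbo.TargetTable (Id, LargeTextField)
SELECT Id, LargeTextField
FROM SourceDb.dbo.SourceTable;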
Does anyone know the cause of query timeouts in SQL Server Management Studio?
I have increased the timeout setting in many places and still receive a timeout message within 40 seconds for certain update queries that involve large tables and many records. The only workaround is to break the job up into smaller queries, but this makes it unmanageable.
Is there a solid alternative to SQL Server Management Studio?
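When a single big UPDATE cannot finish inside whatever timeout is being enforced, the usual alternative to hand-splitting the job is to let one script commit it in slices; a sketch with placeholder names, which keeps repeating a TOP-limited update until nothing is left to change:
UPDATE TOP (10000) dbo.BigTable
SET SomeColumn = 'new value'
WHERE SomeColumn <> 'new value';

WHILE @@ROWCOUNT > 0
BEGIN
    UPDATE TOP (10000) dbo.BigTable
    SET SomeColumn = 'new value'
    WHERE SomeColumn <> 'new value';
END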
Hi! I am having some timeout issues. I am running SQL 2000 with 3 GB of available RAM. I did a 600,000-record delete on a table that gets written to by the active/production application, and my application timed out while it was doing the delete. Why did stored procedures that ran fine before I started the large delete slow down? Some of the procedures that slowed down were accessing the same table where I was doing the delete. Thanking you in advance!! parez
I am getting Timeout Errors quite often and cannot figure out why. I am using Enterprise Library 2.0 when accessing the database. It is not from any particular function or page either and when I check the database there may only be 2 or 3 connection from my app. Any ideas of what could be causing this? Below is my error and stack.
Error: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Stack: at System.Data.ProviderBase.DbConnectionPool.GetConnection(DbConnection owningObject) at System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection) at System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory) at System.Data.SqlClient.SqlConnection.Open() at Microsoft.Practices.EnterpriseLibrary.Data.Database.OpenConnection() at Microsoft.Practices.EnterpriseLibrary.Data.Database.ExecuteReader(DbCommand command) at HM.Security.SecurityData.GetAllUsers(Int32 filter) at HM.Admin.Security.SecuritySearch.SetUpAutoFill() in f:InetpubWwwroothomemiSectionsactSearch.aspx.cs:line 74 at HM.Admin.Security.SecuritySearch.Page_Load(Object sender, EventArgs e) in f:InetpubWwwroothomemiSectionsactSearch.aspx.cs:line 26 at System.Web.Util.CalliHelper.EventArgFunctionCaller(IntPtr fp, Object o, Object t, EventArgs e) at System.Web.Util.CalliEventHandlerDelegateProxy.Callback(Object sender, EventArgs e) at System.Web.UI.Control.OnLoad(EventArgs e) at System.Web.UI.Control.LoadRecursive() at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
Hello all, I am managing an NT infrastructure that supports a web application. The web server is configured to use an ODBC System DSN to access data on a SQL Server over a 100MB private segment. When users run a particular query through the web page, they get an ODBC "Timeout Expired" message. Does anyone know a fix for this? I could not find anything that explicitly identified this problem in MS Support. The query runs fine from an Enterprise Manager T-SQL window both on the server and remotely.
Thanks in advance, Ed Molinari Technical Architect Eamerald Solutions
A timeout issue is occurring resulting in following error. This issue is sporadic and may be related to network. This error occurs on the client: Microsoft OLE DB Provider for SQL Server error `80040e14` OLE/DB provider returned message: Timeout expired "conn_info".asp, line 444
Subject: SQL Server job timeouts? We have a job that uses WinHTTP inside a stored procedure. We have another SP wrapper that runs this for a couple hundred records. When we add this as a job using SQL Agent, it times out halfway through. Running it from SQL Query Analyzer, it completes to the end. Is there any way to set the timeouts for jobs?
I've had my SQL Server database running for two years now without a problem. However, just today one of the main tables started returning an error. The table is contained within a database called engineering. I back it up once a week and the file size is up to about 40 MB. The error returned when trying to return data from one table (DbLucent) is:
"[Microsoft][ODBC SQL Server Driver]Timeout expired"
I can open/query any of the other tables in the database. I can open the design view for this table, but it won't return any query. I'm debating whether to restore the database from the last backup. Any suggestions would be appreciated. Being located remotely, I would rather not fly back to the city where the server is and work on it there. -David