Database Blobs & Performance
Jun 25, 2007
Hi geniuses,
I have a core database question in MS Access and SQL Server.
Assume the following table in MS Access or SQL Server:
Employees table:
EmpID, Name, Hiredate, Photo;
and the following query:
"SELECT EmpID, Name, Hiredate FROM Employees;"
The question is: does the blob column (Photo) affect the performance of the query even when it is not included in the SELECT statement?
In other words, would I gain some additional performance by separating the blob fields into another table?
Thanks for the answer.
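In case it helps to make the idea concrete, this is the kind of split meant by "separating the blob fields" (just a sketch; the extra table and its names are assumptions, not part of any existing schema):

-- Keep the rarely needed blob in its own table, 1:1 with Employees
CREATE TABLE EmployeePhotos (
    EmpID int NOT NULL PRIMARY KEY REFERENCES Employees (EmpID),
    Photo image NULL
);

-- Queries that don't need the photo never touch the blob table
SELECT EmpID, Name, Hiredate FROM Employees;

-- Join only when the photo is actually wanted
SELECT e.EmpID, e.Name, p.Photo
FROM Employees e
JOIN EmployeePhotos p ON p.EmpID = e.EmpID
WHERE e.EmpID = 42;

As far as I know, SQL Server stores text/image data off-row by default with only a 16-byte pointer in the row, so a SELECT that skips Photo usually avoids the blob pages anyway; the split mainly buys denser data pages for scans and simpler maintenance.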
View 4 Replies
Mar 4, 2004
I'm trying to store files such as PDFs in my SQL Server as blobs for a particular .NET application, and I'm noticing a rather large performance hit compared to when the PDF is simply stored in the web server's file system.
I have a PDF of 7 MB. If it sits on the web server it takes roughly 9 seconds to download. When I store the same PDF in my database and then write it onto the page using the code below, it takes roughly 40 seconds to display. Can anyone help me improve this performance difference? Should I not be using this method to store large files?
Dim PdfCol As Integer = 0   ' ordinal of the BLOB column in the result set
Dim dr As SqlDataReader = cmd.ExecuteReader()
If dr.Read() Then
    ' With a Nothing buffer, GetBytes returns the field length as a Long,
    ' so cast it before sizing the byte array (a Long bound will not compile).
    Dim pdfLength As Long = dr.GetBytes(PdfCol, 0, Nothing, 0, Integer.MaxValue)
    Dim mypdf(CInt(pdfLength) - 1) As Byte
    dr.GetBytes(PdfCol, 0, mypdf, 0, mypdf.Length)
    dr.Close()

    Response.Clear()
    Response.Buffer = True
    Response.ClearContent()
    Response.ClearHeaders()
    Response.ContentType = "application/pdf"
    Response.BinaryWrite(mypdf)
    Response.Flush()
End If
Response.Close()
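If the single 7 MB GetBytes call turns out to be the bottleneck, chunked reads are worth experimenting with. On the client side that means CommandBehavior.SequentialAccess and writing to Response.OutputStream in a loop; on the server side the old text/image API can page through the blob as well. A rough sketch of the latter (table and column names are assumptions):

-- Read the stored PDF in 8000-byte chunks instead of one 7 MB value
DECLARE @ptr varbinary(16)
SELECT @ptr = TEXTPTR(PdfData) FROM Documents WHERE DocID = 1
READTEXT Documents.PdfData @ptr 0 8000   -- offset 0, size 8000; repeat with a larger offset for the next chunk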
View 4 Replies
View Related
Apr 29, 2015
I currently have a lot of XML blobs being stored in several of the production databases that I look after. They're there as an application "logging" feature, so that the developers can investigate issues if they ever occur.
I want to get these out of my OLTP databases asap and am currently looking at options. I know Brent Ozar recently posted an article about Content Addressable Storage systems [URL] ....
How do you handle point in time recovery? Any other ways of storing this data?
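One common pattern for this (sketch only; the object names are invented) is a scheduled sweep that moves older log rows into a separate archive database, which can then be backed up and recovered on its own schedule while the OLTP database stays small:

-- Move XML log rows older than 30 days into an archive database, then remove them
DECLARE @cutoff datetime;
SET @cutoff = DATEADD(day, -30, GETDATE());

BEGIN TRANSACTION;

INSERT INTO ArchiveDB.dbo.AppXmlLog (LogID, LoggedAt, XmlPayload)
SELECT LogID, LoggedAt, XmlPayload
FROM dbo.AppXmlLog
WHERE LoggedAt < @cutoff;

DELETE FROM dbo.AppXmlLog
WHERE LoggedAt < @cutoff;

COMMIT;

Point-in-time recovery then only has to cover the much smaller OLTP copy; the archive can get by with simple recovery and less frequent backups if losing a few hours of old log entries is acceptable.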
View 9 Replies
View Related
Apr 24, 2008
I have a table with embedded files in it, plus FTP server/user/password fields, so that each record has all the data and information required. I need to configure an SSIS package to FTP each file to the destination specified in its record. At the moment I have a data flow task which uses a Derived Column and an Export Column transformation to write the files to a temp folder, and then a ForEach loop container back on the control flow FTPs each file and moves it to an archive folder.
All I think I still need to work out is how to drive the FTP connection settings from the fields in the database within this ForEach loop, given that it is a for-each-file collection. I also need to update each record to say it has been FTP'd.
Do I need an initial query outside of this ForEach loop to populate an array which can be used for the 4 FTP connection fields (server/user/pwd/directory), and if so, how?
Any ideas as to how to correctly finish this package?
Thanks
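For what it's worth, the "initial query" asked about above would usually be an Execute SQL Task before the loop whose full result set feeds the ForEach (ADO enumerator), with the columns mapped to package variables that drive the FTP connection manager through expressions. A sketch of the two statements involved (column names are assumptions):

-- One row per file still to send, with everything the FTP connection needs
SELECT RecordID, FtpServer, FtpUser, FtpPassword, FtpDirectory, FileName
FROM dbo.FileTransfers
WHERE Sent = 0;

-- Run after each successful transfer to flag the record as sent
UPDATE dbo.FileTransfers
SET Sent = 1, SentDate = GETDATE()
WHERE RecordID = ?;   -- parameter mapped from the loop's RecordID variable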
View 2 Replies
View Related
May 24, 2004
I have searched high and low for this answer, but still to no avail...
Does anyone have a reliable way of storing serialized data in a table without using direct SQL on the client side? I am using stored procedures and simply need to be able to read the stored object back out of the table.
Inserting appears fine, but reading only ever seems to come back with 16 bytes of data and I just can't figure out why. What's the recommended way of doing this?
Matt.
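A 16-byte result usually means the code is getting back a text pointer (or an undersized varbinary output parameter) rather than the blob itself. A minimal working shape, with names assumed, is to return the image column as an ordinary result set and read it client-side with a data reader:

CREATE PROCEDURE dbo.GetSerializedObject
    @ObjectID int
AS
    -- Return the blob column itself, not TEXTPTR(ObjectData) and not an OUTPUT parameter
    SELECT ObjectData      -- image / varbinary column holding the serialized bytes
    FROM dbo.SerializedObjects
    WHERE ObjectID = @ObjectID;
GO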
View 2 Replies
View Related
Mar 27, 2000
Are there any issues with BLOBs in SQL 7.0 ?
View 2 Replies
View Related
May 4, 2004
Hello to all,
Please help me to get a right decision:
storing 5 MB BLOBs in the database, or storing just a URL/file path. I have seen a lot of articles about this matter and really cannot decide.
The database size is about 35 MB, and there are not too many DDL operations.
Any advice?
View 3 Replies
View Related
Jan 4, 1999
Howdy,
Anyone got any tricks for speeding up storing and retrieving image data? I have a typical 2 MB image and I am trying to improve speed... it currently takes 8 seconds to store the blob.
thanks
Dick Butler
View 1 Replies
View Related
Aug 3, 1998
Help! We're building a document management system that will store MS Word documents, etc. in SQL Server.
When we BCP a table which contains image data to another copy of the table,
the image data seems to get clobbered and is no longer valid. Has anyone had this problem?
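One thing worth checking (a guess, since the bcp switches in use aren't shown): character-mode bcp (-c) can mangle binary image data, while native mode (-n) is designed to round-trip it. Something along these lines, with server, path and login details obviously placeholders:

-- Native-format copy out of the source table, then back into the copy
EXEC master..xp_cmdshell 'bcp SourceDb.dbo.Documents out C:\temp\documents.dat -n -S MYSERVER -U sa -P YourPassword'
EXEC master..xp_cmdshell 'bcp TargetDb.dbo.Documents in C:\temp\documents.dat -n -S MYSERVER -U sa -P YourPassword'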
View 1 Replies
View Related
Feb 15, 2007
We have a blob in one table that is storing PDF files. I need to write a SELECT query that will grab that column and write those *.pdf files out to a location on my hard drive. Does anyone know how that can be done? All suggestions would be greatly appreciated....
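One low-tech way to do this is to drive the TEXTCOPY utility that ships in the SQL Server Binn directory from T-SQL. Table, column and path names below are assumptions, and the switches are from memory, so check TEXTCOPY /? before relying on them:

-- Write the PDF stored in dbo.Docs.PdfData for DocID 1 out to a file
DECLARE @cmd varchar(1000)
SET @cmd = 'textcopy /S MYSERVER /U sa /P YourPassword /D MyDb '
         + '/T Docs /C PdfData /W "WHERE DocID = 1" '
         + '/F C:\temp\doc1.pdf /O'   -- /O copies out of the table into the file
EXEC master..xp_cmdshell @cmd

Wrap that in a cursor over the table's key column to dump every row's PDF to its own file.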
View 2 Replies
View Related
Jul 8, 2007
Hi,
VS2005
SQl workgroup 2005
I have a DB where I store a lot of blobs, and the blobs are tagged with a lot of IDs. My concern is deleting them automatically on a regular basis, to keep the DB size down. I will be deleting them based on age (date/time) and on what category the blob is classed as.
- My concern is how to delete blobs so they stay roughly equally dispersed among the categories, so that no category ends up empty even though it may only contain old pictures.
Any tips, links or code is much welcome.
(my app will be accessing the DB through ADO.NET 3.0, C++/CLI)
Cheers,
Andreas
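Since this is SQL 2005, ROW_NUMBER makes a "keep the newest N per category" delete fairly painless. A sketch with invented table and column names:

-- Delete everything except the 100 most recent blobs in each category
WITH Ranked AS
(
    SELECT BlobID,
           ROW_NUMBER() OVER (PARTITION BY CategoryID
                              ORDER BY CreatedDate DESC) AS rn
    FROM dbo.BlobStore
)
DELETE FROM dbo.BlobStore
WHERE BlobID IN (SELECT BlobID FROM Ranked WHERE rn > 100);

That keeps every category populated no matter how old its pictures are; add an age condition inside the CTE if there should also be a hard maximum age.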
View 17 Replies
View Related
Aug 17, 2006
I want to store a message database in SQLEv. Each message contains a fixed header and a body. The body is UTF-16 encoded text. The average size of the body is about 700 symbols, with a standard deviation of about 1000 symbols and a maximum size of 200K symbols. The total count of messages is up to 2 million.
The only types I can use for the body are text or a binary blob. But blobs larger than 256 bytes are stored on separate pages. Because most message bodies are larger than 256 bytes, each message wastes on average PageSize / 2 of space that is occupied but not used for data.
For 2M messages this wasted space will be 4096 / 2 * 2M = 4 GB, but SQLEv has a 4 GB limit on database size!
Is it any way to store such data in SQLEv more efficiently?
View 5 Replies
View Related
Dec 14, 2006
Hi
I wanted to ask a question in terms of design (performance, scalability, maintenance, flexibility etc.): is it better to reference a document on the actual file system (on a file server), i.e. have the database hold a link to the actual file at a disk location,
e.g. in the database have something like \\fileserver2006\test\newtext.txt or C:\test\newtext.txt,
or is it better to store all the files in the database as blobs?
The question I would like to ask is more in the context of designing a document management system.
View 5 Replies
View Related
Oct 14, 1999
Greetings
I'm clear about the use of a DELETE trigger to "move" your deleted record to a second database as a sort of recycle bin.
But SS7 has the limitation, and it's mentioned in BOL, that it cannot reference your TEXT, NTEXT or IMAGE fields in the DELETED table. It says to join the original table with DELETED to get at those fields.
The only problem is the original table's record has been deleted! Even though the transaction has not yet been COMMITTED.
Here's my Trigger:
CREATE TRIGGER AuditTest ON Activity FOR DELETE AS
INSERT AuditDB.dbo.Activity
SELECT Activity.* FROM Activity INNER JOIN Deleted
ON Activity.ActivityID = Deleted.ActivityID
And for discussion, here's my Table:
ActivityID uniqueidentifier
OrgId uniqueidentifier
Title varchar(600)
Active bit
Comments text
LastUpdate datetime
Any suggestions? Has anyone been able to implement a DELETE Audit
Trigger on a table with BLOBs?
Thanks,
-Rich
Richard Hundhausen
Stuttgart, Germany
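For anyone on SQL 2000 or later, the usual way around this is an INSTEAD OF DELETE trigger: when it fires, the rows are still in the base table, so the text column can be copied before the actual delete happens. A sketch based on the table above:

CREATE TRIGGER AuditTest ON Activity INSTEAD OF DELETE AS
BEGIN
    -- The rows are still present in Activity here, so the text column is readable
    INSERT AuditDB.dbo.Activity
    SELECT a.* FROM Activity a
    INNER JOIN Deleted d ON a.ActivityID = d.ActivityID

    -- Now perform the delete the caller asked for
    DELETE a FROM Activity a
    INNER JOIN Deleted d ON a.ActivityID = d.ActivityID
END

On SQL 7 itself there are no INSTEAD OF triggers, so the copy has to happen in the stored procedure (or client code) that issues the DELETE.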
View 3 Replies
View Related
Apr 15, 2004
Hey guys, I know this may sound impossible, but let's say I have a number of fields, one of which is a LONGBLOB or LONGTEXT.
Is there a way to have MySQL search the blobs for keywords and then extract them to other fields?
Basically, what I am asking is: is it possible to parse a long text blob for keywords and then grab the data before or after those keywords?
Anyone know a way?
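In a pinch, yes: LOCATE finds the keyword's position inside the blob and SUBSTRING pulls out the surrounding text. A rough sketch in MySQL syntax (table, column and keyword are made up):

-- Copy the 200 characters starting at the keyword into a separate column
UPDATE documents
SET keyword_context = SUBSTRING(long_text_col,
                                LOCATE('invoice', long_text_col),
                                200)
WHERE LOCATE('invoice', long_text_col) > 0;

If the real goal is just finding rows that contain keywords, a FULLTEXT index on a TEXT column is usually a better tool than string functions over a blob.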
View 3 Replies
View Related
Jul 20, 2005
I've seen plenty of posts regarding the estimation of table size, usually in the process of planning for server storage needs. Well, I've got a different problem. I need to know how much data each of our customers is using in a database (1 SQL DB stores multiple customers).
Basically, I want to be able to say: Customer A: 45.5 MB, Customer B: 655 MB.
So, how can I ask SQL Server how much data each row in each table is taking up? I want to be able to calculate the total size nightly, so I would take each row in each table that belongs to the customer and add all the sizes together. I want to take into account blobs that are storing images and PDF files also.
Thanks in advance,
Jesse Wolgamott
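For what it's worth, DATALENGTH works on text and image columns too, so a nightly per-customer rollup can be done per table with something like this (column names are assumptions):

-- Approximate bytes per customer in one table, blobs included
SELECT CustomerID,
       SUM(  ISNULL(DATALENGTH(DocName), 0)
           + ISNULL(DATALENGTH(PdfFile), 0)
           + ISNULL(DATALENGTH(ImageFile), 0)) AS TotalBytes
FROM dbo.CustomerDocuments
GROUP BY CustomerID;

Repeat the same shape for each table that carries a CustomerID and add the results together; it ignores index and page overhead, but it answers the "who is using how much" question.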
View 1 Replies
View Related
Feb 5, 2004
I am taking over a Database set up where we have 2 databases. One is a SQL SERVER 2000 db and the other is an Oracle 9i db.
The previous DBA had/has a problem with transferring CLOB and BLOB datatypes from the Oracle db to SQL Server. He says that the OLE DB driver SQL Server provides has limitations, particularly on CLOB data, because it truncates data over 4000 characters in length.
For the Blob issue:
We basically want to store/move attachments (word documents etc)
into image datatypes (SS) from the blob datatype (9i).
I figured I could do this using DTS or even using the Linked server method. I figured worst case scenario I would just store the documents on the server and then use SQL SERVER's textcopy utility to upload the documents into the image datatype column of the SQL SERVER table.
For the CLOB issue I have no idea what to do, because I can't find anything online that mentions SQL Server having issues with importing CLOB data over 4000 characters in length; after all, the text datatype accommodates more than 4000 characters.
Does anyone have any suggestions for moving the BLOB and, most importantly, the CLOB datatypes from 9i to SQL Server without any truncation of the data?
View 5 Replies
View Related
Jul 20, 2005
Hi folks, I want to be able to store and retrieve UNSAVED Word documents as BLOBs. I got all the info for storing them if they're already saved on the file system. But what if they're not already saved? I could save them to a temp file first and then make it a blob, but I'd rather not put them on the file system at all. I could use Document.Contents to get the Range object for the whole doc and then BLOB that, but I'm not sure that would be quite the same as a .doc file; maybe things like Styles would be missing.
TIA!
Dave
View 2 Replies
View Related
May 28, 2008
I got this error yesterday running an SSIS package:
"The buffer manager cannot create a file to spool a long object on the directories named in the BLOBTempStoragePath property. Either an incorrect file name was provided, or there are no permissions."
This same package was working fine on a different server, so I compared the configurations of the two boxes. It turned out that the service account that runs the package was in the Windows Administrators group on one box but not the other. I added the service account to Administrators on the box that had the problem and the problem went away.
I have looked around and have not found a list of the necessary Windows permissions that an account must have in order to run an SSIS package. This list would include write permissions to the path specified in BLOBTempStoragePath. Has anyone out there configured an account with minimal permissions to run a package that has a Data Flow?
Thanks,
Ron Rice
View 5 Replies
View Related
Aug 13, 2007
I'm trying to write SSIS packages to import DB2 6000 tables to SQL Server 2005 tables. The package vendor has defined CLOB and BLOB datatypes for columns in the DB2 tables, and going through the Data Flow tasks from OLE DB for DB2 to OLE DB for SQL Server, it tries to map CLOBs to TEXT columns, and BLOBs to IMAGE columns. The vendor has no conversion utility available, so I have to write the SSIS packages. However, the task generates a goodly number of errors just trying to open the OLE DB DB2 dataset. Errors like the following:
Error: 0xC0202009 at Set 5 1 2, DB2 vendor_master [7127]: An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "Microsoft DB2 OLE DB Provider" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
Error: 0xC0208265 at Set 5 1 2, DB2 vendor_master [7127]: Failed to retrieve long data for column "COMPETITOR_COMMENT".
Error: 0xC020901C at Set 5 1 2, DB2 vendor_master [7127]: There was an error with output column "COMPETITOR_COMMENT" (8866) on output "OLE DB Source Output" (7138). The column status returned was: "DBSTATUS_UNAVAILABLE".
Error: 0xC0209029 at Set 5 1 2, DB2 vendor_master [7127]: The "output column "COMPETITOR_COMMENT" (8866)" failed because error code 0xC0209071 occurred, and the error row disposition on "output column "COMPETITOR_COMMENT" (8866)" specifies failure on error. An error occurred on the specified object of the specified component.
Error: 0xC0047038 at Set 5 1 2, DTS.Pipeline: The PrimeOutput method on component "DB2 vendor_master" (7127) returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Error: 0xC0047021 at Set 5 1 2, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.
Error: 0xC0047039 at Set 5 1 2, DTS.Pipeline: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Error: 0xC0047021 at Set 5 1 2, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0xC0047039.
Information: 0x40043008 at Set 5 1 2, DTS.Pipeline: Post Execute phase is beginning.
In this case, 'COMPETITOR_COMMENT' is defined as a CLOB(2M) column in the DB2 database table, and the output format in the SQL Server 2005 table is defined as a TEXT column.
Any assistance would be appreciated.
View 1 Replies
View Related
Nov 29, 2007
What is the performance comparison between XML files and a database?
Will using System.IO to read an XML file be slower than reading the data from a database, if I am only reading data and not sorting?
Will the RAM/CPU usage be higher when getting data from XML compared to getting it from the database?
View 5 Replies
View Related
Dec 14, 2000
Same database server, two databases, one a copy of the other; the original is giving bad performance. The new copy will return 300,000 rows in a second; the original will take thirty seconds to return the same data set. 7 users on the bad one, 3 on the good one. The bad one has been reindexed, CHECKDBed and NEWALLOCed with no errors, but it is still giving very bad performance. Anyone got any further ideas on what to do? Please help.
View 5 Replies
View Related
Jan 13, 2000
Hello
I'm an operator for backups (and many other things), but I have a problem
with SQL 6.5 and ARCserveIT. SQL is the database for ARCserve.
Many of the file servers (30 servers) store over 700,000 filenames
in one full backup session (every Friday) into the database
(the table is called astpdat).
When I search for one file in the database (via ARCserve, it's like clicking through
the explorer tree), the performance is veerryy slloww.
How can I speed this up?
The database is 5 GB; the server it runs on has 4 CPUs and 1 GB RAM
(HP LXr 8000), and SQL Server has 100 MB RAM and a 100 MB tempdb.
Thanks for any answer, Joe from (Bratwurst) Germany
View 1 Replies
View Related
Jul 20, 2005
Hi, I wonder if someone can answer a question for me: I'm modifying a database with the purpose of adding a new feature, address change history. My model would consist of one table for keeping client name/logon (for a public site) info, and address info in another table, because the login info would likely be more frequently accessed/changed than address updates. A group that does data entry internally through a web interface always needs to see the address.
For the first stage I don't want to change the old table, just have a new one for now. But moving forward, I thought it would be neat to have all address update records in one table and have a Profile type value to distinguish whether data entry or a public website user created the update record.
However, a thought occurred to me: if one table is responsible for showing the current address as well as adding records whenever there is an address change, would it hurt performance? Would I get better performance splitting the record types into two tables, or does it matter, since the table I'm thinking of creating would have no deletions: only insertions and modifying an expiry date field so we know which record to use. I'm not a specialist on database performance, so if any of you database gurus out there can advise me on that, that would be GREAT. Thanks a million guys.
Jonah A. Libster
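For what it's worth, the single history table described above usually ends up looking something like this (names invented), and with an index on (ClientID, ExpiryDate) the "current address" lookup stays cheap even as the history grows, so insert-only growth by itself rarely hurts read performance:

CREATE TABLE dbo.ClientAddress
(
    AddressID   int IDENTITY PRIMARY KEY,
    ClientID    int NOT NULL,
    ProfileType char(1) NOT NULL,        -- e.g. 'D' = data entry, 'W' = public web user
    AddressLine varchar(200) NOT NULL,
    City        varchar(100) NOT NULL,
    CreatedDate datetime NOT NULL DEFAULT GETDATE(),
    ExpiryDate  datetime NULL            -- NULL marks the current row
);

-- Current address for a client
SELECT AddressLine, City
FROM dbo.ClientAddress
WHERE ClientID = 123 AND ExpiryDate IS NULL;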
View 2 Replies
View Related
Apr 6, 2007
Hi All
In Oracle I can get performance variables like Library Cache Hits, Dictionary Cache Hits, Database Buffers Read, Redo Log Buffers Read etc. from the system dynamic tables.
I want to know how to get the same / related performance details in SQL Server 2000 and 2005 (which parameters, their optimal values, and which table/dynamic view to query).
Thanks in Advance
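The closest SQL Server equivalents are exposed as queryable performance counters: master.dbo.sysperfinfo on 2000 and sys.dm_os_performance_counters on 2005. For example, for the buffer cache hit ratio (the ratio is the first counter divided by its "base" counter):

-- SQL Server 2005
SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Buffer cache hit ratio', 'Buffer cache hit ratio base');

-- SQL Server 2000 equivalent
SELECT object_name, counter_name, cntr_value
FROM master.dbo.sysperfinfo
WHERE counter_name IN ('Buffer cache hit ratio', 'Buffer cache hit ratio base');

There is no official table of "optimal" values, but a buffer cache hit ratio that stays well below the high 90s on an OLTP box is generally read as a sign of memory pressure; the same views also expose plan cache, lock and log counters.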
View 9 Replies
View Related
Sep 6, 2007
I have an asp.net application on SQL Server 2005. I have completed indexing all the physical primary and foreign keys, virtual primary and foreign keys, sorting order, where clause fields and so on.
On first day, I only index all the physical primary and foreign keys, virtual primary and foreign keys. I noticed the loading performance has improved. So I continue with the remaining index process on the second day. This time, I noticed the loading performance is slower by 0.5 to 1 second. Is there any possibility that the loading performance will be slower after indexing?
Please advise. Thanks.
View 3 Replies
View Related
Oct 10, 2002
Hello world -
Does anyone know any good resources for ideas / scripts for measuring capacity and statistics on a SQL Server 2000 database or otherwise? I want to set up SQL script jobs that will email me various statistics every day and am trying to find a starting point.
Thanx!
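As one possible starting point (a sketch only: it assumes SQL Mail is configured so xp_sendmail works, and sp_MSforeachdb is undocumented but widely used):

-- Collect the size of every database, then mail the list
CREATE TABLE ##dbsizes (db sysname, size_mb int)

EXEC sp_MSforeachdb
    'INSERT ##dbsizes SELECT ''?'', SUM(size) * 8 / 1024 FROM [?]..sysfiles'

EXEC master.dbo.xp_sendmail
     @recipients = 'dba@yourcompany.com',
     @query      = 'SELECT db, size_mb FROM ##dbsizes ORDER BY size_mb DESC',
     @subject    = 'Nightly database size report'

DROP TABLE ##dbsizes

Schedule it as a SQL Agent job; sp_spaceused per database and the sysindexes rowcnt/reserved columns are useful next steps if per-table detail is wanted in the same mail.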
View 1 Replies
View Related
Dec 12, 2004
First of all, sorry if this is in the wrong section, didn't know where to put it.
I'm doing a university paper comparing Microsoft Access versus SQL Server 2000 and I want to run a benchmark on them to see which is faster. Does anyone know of any applications that will let me do this?
You may think this is silly since SQL Server is quite obviously faster; thing is, I can't just say that in my paper, I have to be more specific, so I was hoping I could run some benchmarks and show the scores in the paper.
I know I can run an ASP script that can time how long the query takes to run, however this can't test multiple concurrent users accessing the system (unless I get all my friends' computers around my house, bring up the page and get them to click refresh all at the same time :) ).
I basically want to run a simple SQL SELECT statement on an identical database in both database systems (Northwind), but for multiple users. Anyone know of any application?
Or does anyone know of any performance tests that have been done comparing Access with SQL Server? All I can find is material comparing high-end database against high-end database (Oracle vs SQL Server vs IBM DB2 etc.). If I can't do my own I can always use other people's. Cheers!
View 3 Replies
View Related
Mar 6, 2004
Hi
I am using a SQL Server database with ASP as the front end. I use the ASP Command object to call a SQL stored procedure. The procedure runs a WHILE loop over, say, 100,000 records and, based on IF conditions, calls particular stored procs which process each record.
I am running this app on a P4 IBM PC with Windows 2000 Server and IIS on the same machine. The CPU utilisation goes up to 98-99% and the process has started running very slowly of late. Is this slow processing speed a hardware/OS problem, or is it due to the calls to stored procs within the stored proc? How can I optimise the process? Each stored proc called has IF conditions, table scans etc.
please advise
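Hard to say without seeing the procedure, but a WHILE loop that calls other procs 100,000 times is usually the first thing to attack: most per-record IF logic can be collapsed into one set-based statement so the work is a single pass instead of 100,000 procedure calls. A made-up illustration of the shape:

-- Instead of: loop over rows, IF Status = 'A' EXEC ProcessTypeA @id ELSE EXEC ProcessTypeB @id
UPDATE dbo.Records
SET ProcessedValue = CASE Status
                         WHEN 'A' THEN Amount * 1.10   -- whatever ProcessTypeA computes
                         WHEN 'B' THEN Amount * 0.95   -- whatever ProcessTypeB computes
                         ELSE Amount
                     END,
    Processed = 1
WHERE Processed = 0;

If the called procs also do table scans, proper indexes on their filter columns matter as much as removing the loop.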
View 7 Replies
View Related
Mar 8, 2007
Hi there
This is the scenario:
I have a heavy duty database, that is being accessed very, very frequently (i.e. 100 times in a minute).
Now, I would like to make a backup of the database, just in case something goes wrong (recovery reasons, etc.).
My question is: how will making a backup impact the performance of the database, and how can I be sure that the backup is in a consistent state?
Thank you
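For reference, a plain full backup is an online operation and the restored copy is transactionally consistent as of the end of the backup, so something as simple as this can run while users keep working (the path is just an example):

-- Full backup; readers and writers are not blocked while it runs
BACKUP DATABASE MyHeavyDutyDb
TO DISK = 'D:\Backups\MyHeavyDutyDb.bak'
WITH INIT, NAME = 'MyHeavyDutyDb full backup';

The main cost while it runs is the extra I/O of reading every allocated page, so schedule it for the quietest window; if the database is in the FULL recovery model, add regular BACKUP LOG runs for point-in-time restores.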
View 6 Replies
View Related
Jul 31, 2007
dear friends,
I have dropped many objects and recreated them in a database, and suddenly the database became very slow. Can anyone suggest a solution?
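A first thing to try, assuming the slowdown comes from stale statistics and cached plans after all the drops and recreates:

-- Refresh statistics on every table, then flush cached plans so they are rebuilt
EXEC sp_updatestats;
DBCC FREEPROCCACHE;

If that does not help, rebuilding the indexes on the recreated tables (DBCC DBREINDEX or ALTER INDEX ... REBUILD, depending on version) would be the next step.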
View 9 Replies
View Related
Oct 23, 2007
I have the client tools installed on a server and have registered our 30+ instances, hosted on various servers, in this one SQL Server 2005 Management Studio.
Question:
How can I use this setup to send an e-mail distribution list a nice monthly chart showing the size, memory and CPU utilization of all the registered databases?
Many thanks for your help !!
seethem
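Registered servers in Management Studio only help interactively; for an automated monthly mail the usual approach is a small collection query run against each instance (via linked servers or a job per instance) with the result sent through Database Mail. A per-instance sketch, with the mail profile and recipient as placeholders:

-- Size of every database on this instance, in MB
SELECT d.name AS database_name,
       SUM(mf.size) * 8 / 1024 AS size_mb
FROM sys.master_files mf
JOIN sys.databases d ON d.database_id = mf.database_id
GROUP BY d.name;

-- Mail the same result with Database Mail (SQL 2005)
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'DBA Mail',
     @recipients   = 'dba-team@yourcompany.com',
     @subject      = 'Monthly database size report',
     @query        = 'SELECT d.name, SUM(mf.size) * 8 / 1024 AS size_mb
                      FROM sys.master_files mf
                      JOIN sys.databases d ON d.database_id = mf.database_id
                      GROUP BY d.name';

Memory and CPU figures can be pulled the same way from sys.dm_os_performance_counters and sys.dm_os_sys_info and appended to the report.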
View 3 Replies
View Related
Jan 15, 2008
Hi All,
From your experience with SQL 2005: is there any free software that can help improve performance or help identify performance bottlenecks? Two examples I usually use are a maintenance plan that does (check DB > reorganize index > rebuild index > update statistics), and the SQL 2005 DASHBOARD for reporting help.
Do you have any other free tools or tips for performance, or anything that I must have on my SQL 2005 servers?
Thx
View 3 Replies
View Related