Does a Large Amount of 'image' Data Affect Overall SQL Performance?
Jul 20, 2007
Recently we added a new table to our SQL 2000 database specifically to store scanned-in images of documents. This new table contains a PK field, a couple of datetime fields, a couple of char(1) fields and one 'image' field.
Before adding this table, the database size was approx 6GB. Six months after adding this new table, the database has grown to 18GB - 11GB of this is due to the scanned-in images.
Would this new table affect SQL performance with regard to accessing other data in the database that is unrelated to the new table?
If so, would moving this new table into its own database be recommended?
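One alternative to a separate database, offered as a hedged sketch: SQL 2000 lets you place the image pages on their own filegroup, so the LOB data lives in separate files (ideally on separate disks) from the rest of the database. All database, file, and table names below are illustrative assumptions.

    -- Assumed names and paths, for illustration only.
    ALTER DATABASE MyDb ADD FILEGROUP IMAGES_FG;
    ALTER DATABASE MyDb
        ADD FILE (NAME = MyDb_Images, FILENAME = 'E:\Data\MyDb_Images.ndf')
        TO FILEGROUP IMAGES_FG;

    CREATE TABLE dbo.ScannedDocs (
        doc_id     int IDENTITY(1,1) PRIMARY KEY,
        scanned_on datetime NOT NULL,
        flag1      char(1),
        doc_image  image
    ) ON [PRIMARY] TEXTIMAGE_ON IMAGES_FG;  -- image pages go to the new filegroup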
Hi, I am using SQL 2000 and have a table that contains more than 2 million rows of data (and growing). Right now, I have encountered 2 problems:
1) Sometimes, when I try to query against this table, I get a SQL command timeout. I did more testing with Query Analyzer and found that the same queries do not always take about the same time to execute. Could anyone please tell me what affects the speed of a query, and which factor matters most? (I can think of the open connections, the server's CPU/memory...)
2) I am not sure if 2 million rows is considered a lot or not; however, it is starting to take 5-10 seconds to finish some simple queries. I am wondering what the best practices are for handling this amount of data while keeping decent performance.
Thank you,
Charlie Chang
[Charlies224@hotmail.com]
I have a large number of dbs (150) on my SQL Server, and when using Enterprise Manager to do administrative tasks like Backup, Restore, etc., it takes 1.5 hours to open the Databases folder. The server is 2xP4, 3GB RAM. Any ideas on how to manage this number of dbs on the same server and instance of SQL? Cheers!
I have just created a test database and now need to insert a large number of records into one of the tables. We were thinking of about 1 million records. Has anyone got a SQL script that I could use to create these records?
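A minimal sketch of one way to do this, assuming a hypothetical dbo.TestData table: seed one row, then repeatedly double the row count by inserting the table into itself until it passes 1 million rows (20 doublings gives 1,048,576 rows).

    CREATE TABLE dbo.TestData (
        id      int IDENTITY(1,1) PRIMARY KEY,
        payload varchar(50) NOT NULL
    );

    INSERT INTO dbo.TestData (payload) VALUES ('seed row');

    -- Each pass doubles the row count: 1, 2, 4, ... past 1M after 20 passes.
    WHILE (SELECT COUNT(*) FROM dbo.TestData) < 1000000
        INSERT INTO dbo.TestData (payload)
        SELECT payload FROM dbo.TestData;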
This is a general question on the best way to import a large amount of data to a MS-SQL DB. I can have the data in just about any format I need to, I just don't know how to import the data. I have some experience with SQL but not much. There is about 1500 to 2000 lines of data. I am looking for the best way to get this amount of data in on a monthly basis. Any help is greatly appreciated!!
Mike Charney
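For a file-based monthly load, a hedged sketch using BULK INSERT; the table name, file path, and delimiters are assumptions, and a comma-delimited file with a header row is assumed.

    -- Assumes dbo.MonthlyData already matches the file's column layout.
    BULK INSERT dbo.MonthlyData
    FROM 'C:\imports\monthly_data.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 2    -- skip the header row, if present
    );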
I need to delete data from a particular table which has more than half a million records. More than 200,000 records need to be deleted from the table. What is the best way to delete the data from the table, other than importing it into a temporary table and performing the same operation?
Let me know if the following strategy is okay:
1. Drop all the triggers.
2. Drop all the indexes.
3. Write a procedure with a loop, setting ROWCOUNT to 1000, and delete the records (since if I try to delete all the rows at once it gives a timeout error). The procedure deletes 1000 records per batch inside the loop until it wipes out all the data for the specified condition (sketched below).
4. Recreate the indexes and triggers.
Please let me know if there are any other optimal solutions.
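A minimal sketch of the batched delete from step 3, assuming a hypothetical dbo.BigTable and an illustrative date condition. SET ROWCOUNT works for this on SQL 2000; on 2005 and later, DELETE TOP (n) is the usual form.

    DECLARE @cutoff datetime;
    SET @cutoff = '20070101';          -- illustrative condition

    SET ROWCOUNT 1000;                 -- limit each DELETE to 1000 rows

    WHILE 1 = 1
    BEGIN
        DELETE FROM dbo.BigTable
        WHERE created_on < @cutoff;    -- condition selecting the ~200,000 rows

        IF @@ROWCOUNT = 0 BREAK;       -- nothing left to delete
    END

    SET ROWCOUNT 0;                    -- restore normal behavior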
I have SQL 2000 and need to retrieve a fairly large amount of data (~50,000 characters) in XML format and then insert it into a field of the text type. As 'FOR XML' can't be used with local variables, INSERT INTO or SELECT INTO, this makes "XML support" quite useless in many aspects. Can anyone please help me in solving this? Thanks a lot for your help and time.
Pavel
I need to retrieve a large amount of data from the SQL Server database, make changes to one field, and put the data back using a console application. How do I do it?
Hi, my application needs to retrieve data from a table which has more than 15 lakh (1.5 million) records. The records keep increasing by thousands every 15 days. Is there any way I can reduce the retrieval time? Basically I have a select statement with a few conditions and a clause for the IDs of these records.
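Without the actual query it is hard to be specific, but a common first step is an index matching the filter columns. Everything below (table and column names, the filter shape) is an assumed illustration, not the poster's schema.

    -- Assumed shape: filtering on status/region plus an IN list of IDs.
    CREATE INDEX IX_Orders_Status_Region
        ON dbo.Orders (status, region, order_id);

    SELECT order_id, status, region
    FROM dbo.Orders
    WHERE status = 'A'
      AND region = 'WEST'
      AND order_id IN (1001, 1002, 1003);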
We have several databases linked via merge replication. Due to business requirements, we needed to delete 5M rows in one table, and we did it on one subscriber. However, the publisher kept uploading the deletion operations from the subscriber and blocked any downloading operation from publisher to subscriber. How can we accelerate the replication now, as this has already been running for 2 days and will continue for 1-2 more? Is it possible to set the publisher to take the download before the upload? How do you speed up large data deletion operations in a replication environment?
I have an interesting challenge. We are not allowed to give users direct access to data in our SQL Server. Audit requires us to take the data out of our production server and pass it to the user. My situation is I have a table in SQL with over 100,000 records and growing. I want to pass that to an Access database. I am utilizing DTS and a data transform.
Is there a better way? The package is running slow and even appears to freeze, and I see this amount of data growing as well.
I have a combo box where users select the customer name and can either go to the customer's info or open a list of the customer's orders. The RowSource for the combo box was a simple pass-through query:

    SELECT DISTINCT [Customer ID], [Company Name], [contact name], City, Region FROM Customers ORDER BY Customers.[Company Name];

This was working fine until a couple of weeks ago. Now whenever someone has the form open, this statement locks the entire Customers table. I thought a pass-through query was read-only, so how does this do a table lock? I changed the code to an unbound rowsource that asks for input of the first few characters first, then uses this SQL statement as the rowsource:

    SELECT [Customer ID], [Company Name], [contact name], City, Region FROM dbo_Customers WHERE [Company Name] like '" & txtInput & "*' ORDER BY [Company Name];

This helps, but if someone types only one letter, it could still be pulling a few thousand records and cause a table lock. What is the best way to populate a large combo box? I have too much data for the ADODB recordset to use the .AddItem method. I was trying to figure out how to use an ADODB connection so that I can make it read-only to eliminate the locking, but I'm striking out on my own. Any ideas would be appreciated.
Roy
(Using Access 2003 MDB with SQL Server 2000 back end)
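One hedged option for a read-only lookup like this: if occasionally reading uncommitted data is acceptable for a pick list, a pass-through query can ask SQL Server not to take shared locks at all via the NOLOCK hint. This trades consistency for concurrency, so it suits display-only lists, not updates; the prefix filter below is illustrative.

    -- Pass-through query text (runs on SQL Server, so use '%' not '*').
    SELECT [Customer ID], [Company Name], [contact name], City, Region
    FROM dbo.Customers WITH (NOLOCK)
    WHERE [Company Name] LIKE 'A%'     -- illustrative prefix filter
    ORDER BY [Company Name];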
Guys, I have a project that needs to move more than 100,000 records from one database table to another database table every week. Currently, users input a date range from the web UI, and my stored procedure takes those date ranges to INSERT records into a table in another database, then deletes the records, but it takes a really long time to finish this action (up to 1 or 2 hours). My question is whether there is some other way I should do this to speed up the action. I am thinking about using bcp to copy those records to a data file and then using bcp to insert it into the SQL Server table. Is this the right way to do it, or should I consider another solution (and then, what is the solution)? Thanks a lot!
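Before reaching for bcp, a hedged sketch of doing the move in batches inside T-SQL, so each transaction stays small and the log does not balloon. All object names and the date range are assumptions; the key temp table makes the INSERT and DELETE act on exactly the same rows.

    DECLARE @from datetime, @to datetime;
    SET @from = '20070101';
    SET @to   = '20070107';               -- illustrative date range

    SET ROWCOUNT 5000;                    -- 5000 key rows per pass

    WHILE 1 = 1
    BEGIN
        SELECT order_id INTO #batch      -- pick one batch of keys
        FROM dbo.Orders
        WHERE order_date BETWEEN @from AND @to;

        IF @@ROWCOUNT = 0
        BEGIN
            DROP TABLE #batch;
            BREAK;
        END

        SET ROWCOUNT 0;                   -- act on the whole batch below
        BEGIN TRAN;

        INSERT INTO ArchiveDb.dbo.OrdersArchive (order_id, order_date, amount)
        SELECT o.order_id, o.order_date, o.amount
        FROM dbo.Orders o
        JOIN #batch b ON o.order_id = b.order_id;

        DELETE o
        FROM dbo.Orders o
        JOIN #batch b ON o.order_id = b.order_id;

        COMMIT;
        DROP TABLE #batch;
        SET ROWCOUNT 5000;                -- re-arm the batch limit
    END

    SET ROWCOUNT 0;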
We think we're having performance problems, and among the areas of investigation is the tempdb database. Since it resets itself after SQL is restarted, is there a way to find out how big it has grown in the past? Does leaving it at the default size cause a performance hit? Right now it's 8.75 MB, with 7.38 MB available, which sounds pretty harmless.
Everywhere I read, it states that running SQL Profiler can affect the performance of your SQL Server. My question is: how much of an impact will it really make? Will I see a 1% degradation in performance? 5%? 50%? I haven't been able to find a good answer. We currently have SQL Profiler running all day long, and have for almost 3 years, and the databases are still humming.
Is it the amount of data you are requesting from the trace that affects performance? There are some compliance tools out there (Idera Compliance Manager, IPLocks, etc.) that run a Profiler trace to get data. There are other DBAs in my organization who don't want to use them because "Profiler traces will degrade my SQL Server performance". How true is this, really?
Any help I can get would be greatly appreciated.
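One factor that is generally agreed on: the Profiler GUI streaming events to a workstation costs more than a server-side trace writing to a local file. A hedged sketch of a minimal server-side trace follows; the file path is an assumption, and the IDs come from the sp_trace documentation (event 10 = RPC:Completed, column 1 = TextData).

    DECLARE @TraceID int, @maxsize bigint;
    SET @maxsize = 50;                     -- MB per trace file

    EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\traces\audit', @maxsize, NULL;

    -- Capture TextData (column 1) for RPC:Completed (event 10).
    EXEC sp_trace_setevent @TraceID, 10, 1, 1;

    EXEC sp_trace_setstatus @TraceID, 1;   -- start the trace
    -- ... later: EXEC sp_trace_setstatus @TraceID, 0;  to stop it.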
Does multiplication by 1 affect query performance? I have a stored procedure that converts results to another unit if required. In alternative 1 below, the results are returned with a separate select statement if no conversion is necessary - in other words, no multiplication by a conversion factor is required. However, the code is not very nice, since I need to repeat the select statement again in case a conversion is required, this time including the conversion factor. Alternative 2 uses cleaner-looking code. The conversion factor is set to 1 if no conversion is required, and a single SELECT statement is used to return the data. The @factor variable is defined as a float. I would rather use alternative 2, but I wonder if there is any performance penalty for doing that when no conversion is required, since the results are always multiplied by @factor? Or can SQL Server somehow understand that @factor = 1 and no multiplication is required?

--- Alternative 1: ---

    IF @fromunit_sid = @tounit_sid
        -- Return unconverted results
        SELECT ISNULL(ls_totalWaterConsumption, 0) AS ls_totalWaterConsumption,
               ls_theoreticalWaterConsumption AS ls_theoretical_WaterConsumption,
               ls_totalWaterConsumption - ls_theoreticalWaterConsumption AS ls_extra_WaterConsumption
        FROM Results WHERE scenario_id = @scenario_id
    ELSE
    BEGIN
        -- Get conversion factor
        EXEC getConversionFactor @fromunit_sid, @tounit_sid, @factor OUTPUT
        -- Get the converted results
        SELECT ISNULL(ls_totalWaterConsumption * @factor, 0) AS ls_totalWaterConsumption,
               ls_theoreticalWaterConsumption * @factor AS ls_theoretical_WaterConsumption,
               (ls_totalWaterConsumption - ls_theoreticalWaterConsumption) * @factor AS ls_extra_WaterConsumption
        FROM Results WHERE scenario_id = @scenario_id
    END

--- Alternative 2: ---

    IF @fromunit_sid = @tounit_sid
        SET @factor = 1
    ELSE
        -- Get conversion factor
        EXEC getConversionFactor @fromunit_sid, @tounit_sid, @factor OUTPUT

    -- Get the converted results
    SELECT ISNULL(ls_totalWaterConsumption * @factor, 0) AS ls_totalWaterConsumption,
           ls_theoreticalWaterConsumption * @factor AS ls_theoretical_WaterConsumption,
           (ls_totalWaterConsumption - ls_theoreticalWaterConsumption) * @factor AS ls_extra_WaterConsumption
    FROM Results WHERE scenario_id = @scenario_id

And another question: is using an IF statement considerably faster than making a call to another stored procedure? In alternative 2 above, I use an IF statement to check if @fromunit_sid = @tounit_sid. But in fact the function getConversionFactor that I'm calling does exactly the same thing: if I pass in identical from- and to-values, it simply returns 1, so I could omit the IF statement completely and just use alternative 3. But is it slower?

--- Alternative 3: ---

    -- Get conversion factor
    EXEC getConversionFactor @fromunit_sid, @tounit_sid, @factor OUTPUT
    -- (followed by the same single SELECT as in alternative 2)
I have a db over which I have little control of most of its makeup because of the vendor-supplied tools. We currently have over 700 tables and 19,000 columns. Has anyone seen a problem or saturation point with these kinds of numbers? The database delivered to clients will be from 2-50 gig depending on the site. I can probably throw hardware at problems, but if anyone has been down this road, any suggestions are appreciated.
I'm considering adding domain integrity checks to some of my database table items. How does adding such constraints affect SQL Server performance? For example, I have a simple constraint that restricts a couple of columns to having values within the values assigned in my application by an enumeration:

    (([Condition] >= 0 and [Condition] <= 3) and ([Type] >= 0 and [Type] <= 2))

This enforces domain integrity for two enumerations having values 0, 1, 2, 3 and 0, 1, 2 in the application. Is this an efficient way of performing such checks? What are the pitfalls of domain integrity checking?
Thanks
Robin
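For reference, a hedged sketch of how that expression attaches as a table-level CHECK constraint (the table and constraint names are assumptions). Such checks are evaluated only when the constrained columns are inserted or updated, not on reads.

    ALTER TABLE dbo.Items
        ADD CONSTRAINT CK_Items_EnumRanges
        CHECK (([Condition] >= 0 AND [Condition] <= 3)
           AND ([Type] >= 0 AND [Type] <= 2));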
We have created a database in SQL Server 7 for support issues. We are having trouble with only one aspect of the database, and that is the body of the support problem.
The database is basically a Question/Answer support database which houses Frequently Asked Questions. When the user does a search on a specific problem, a list of matching questions shows on the screen (links). When a question is clicked on, it shows the question with the answer of possible fixes.
Our problem lies in the fact that SQL Server is truncating the answer portion. I have tried different data types with maximum lengths with no success; it keeps truncating. Right now I am on data type nvarchar with a length of 4000.
If anyone has any pointers on how to solve this problem, all input is appreciated. You can post here to the forum or e-mail me directly at jason@flnet.com
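One likely cause, offered as a hedged note: in SQL Server 7, nvarchar tops out at 4,000 characters, so longer answers have to go in a text or ntext column. A sketch with assumed table and column names:

    -- Add an ntext column and copy the existing answers into it.
    ALTER TABLE dbo.FAQ ADD AnswerFull ntext NULL;
    UPDATE dbo.FAQ SET AnswerFull = Answer;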
Hello, I have noticed that some of my tables occupy an extremely large number of pages but with few rows. An example is a table with 37 rows over 22,000 pages! The columns are simple integer and char. I fixed the problem by introducing a clustered index. Now it only uses 1 page. But can anyone explain this behaviour in SQL Server 2000?
Regards, Jakob Mathiasen
Is there any way to create indexes after the insertion of a certain number of records (I don't want to create an index after the insertion of every record, but, for example, after the insertion of 1000 records)?
I heard it should be possible with "bulk insert" or with transactions. Is that right? I need to do this with MS SQL Server 2005 (Workgroup Edition).
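A hedged sketch of the usual pattern on SQL Server 2005: indexes are not rebuilt per batch; instead you drop them, load in bulk, and build them once at the end. Table, index, column, and file names are assumptions.

    -- 1. Drop the nonclustered index before the load.
    DROP INDEX IX_Staging_Code ON dbo.Staging;

    -- 2. Load the data in one bulk operation.
    BULK INSERT dbo.Staging
    FROM 'C:\loads\staging.dat'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

    -- 3. Rebuild the index once, after all rows are in.
    CREATE INDEX IX_Staging_Code ON dbo.Staging (code);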
I have been in contact with my hosting provider, who told me that my website seems to be issuing a large number of SQL commands and locking the tables.
Hello all - I currently have a project that has a gridview on the front end. The user can select multiple items from this grid, hit a button, and all of the selected records should be updated in the process. However, this resultset can have a large amount of data coming back, and I'm stumped on how to pass all of the IDs to the SP. I'd rather not call the SP for each record selected, as there could be 1,000 items selected, and well, I'd rather not call the SP 1,000 times :p. I thought of generating a comma-delimited list as I'm looping through the grid and using dynamic SQL, but the IDs are about 6-7 numbers long, and, including the comma, that would take up almost all of the max space in a varchar. Are there any good solutions to this problem? Passing the items as an array? Generating a data table in .NET and passing that? Any help would be appreciated. -Jaime
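A hedged sketch of the classic workaround: pass one long string (on SQL 2005, varchar(max) avoids the 8,000-character ceiling) and split it inside the procedure with a numbers table. dbo.Numbers - a table of integers 1..N with at least as many rows as the string is long - and the Items table are assumed helpers.

    CREATE PROCEDURE dbo.UpdateSelected
        @ids varchar(max)                 -- e.g. '1000001,1000002,1000003'
    AS
    BEGIN
        SET @ids = ',' + @ids + ',';      -- sentinel commas simplify the split

        UPDATE t
        SET    t.processed = 1            -- illustrative update
        FROM   dbo.Items t
        JOIN  (SELECT CAST(SUBSTRING(@ids, n.n + 1,
                            CHARINDEX(',', @ids, n.n + 1) - n.n - 1) AS int) AS id
               FROM dbo.Numbers n
               WHERE n.n < LEN(@ids)
                 AND SUBSTRING(@ids, n.n, 1) = ',') AS s
          ON   t.item_id = s.id;
    END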
How can I make a recursive algorithm in TSQL without using a large amount of resources?
New hires are put on a 6-month probation period. Their probation ends at the 6-month mark plus the sick time, isnull(floor(sick_hours/8),0), they accrued during those 6 months. Their accrued sick days are added to the 6-month probation period to establish a new probation end date.
I then have to do the following loop:
1) Determine if any of the days between the original probation end date and the new probation end date are weekend days or holidays for a particular employee.
2) If so, I add those days to the new probation end date to create a newer probation end date, and then go back up to step 1).
3) If not, I jump out of the loop and look at the next employee on probation.
Note: I already have code for determining the number of sick days accumulated during an employee's probation period. Recursive CTE? Recursive function? Other?
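A hedged sketch of the loop above for a single employee, written as a plain WHILE loop rather than a recursive CTE. dbo.Calendar (one row per date, with an is_workday flag covering weekends and holidays) and the starting values are assumptions.

    DECLARE @orig_end datetime, @sick_days int;
    SET @orig_end  = '20070601';           -- illustrative 6-month mark
    SET @sick_days = 3;                    -- isnull(floor(sick_hours/8),0)

    DECLARE @window_start datetime, @window_end datetime, @nonwork int;
    SET @window_start = @orig_end;
    SET @window_end   = DATEADD(day, @sick_days, @orig_end);

    SELECT @nonwork = COUNT(*)             -- step 1: non-working days in window?
    FROM dbo.Calendar
    WHERE cal_date >  @window_start
      AND cal_date <= @window_end
      AND is_workday = 0;

    WHILE @nonwork > 0
    BEGIN
        SET @window_start = @window_end;   -- step 2: extend by those days
        SET @window_end   = DATEADD(day, @nonwork, @window_end);

        SELECT @nonwork = COUNT(*)         -- back to step 1 for the added span
        FROM dbo.Calendar
        WHERE cal_date >  @window_start
          AND cal_date <= @window_end
          AND is_workday = 0;
    END
    -- @window_end now holds the final probation end date for this employee.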
A friend reminded me of a problem we tried to solve a few years ago and were unsuccessful. Below is a copy of the email he sent me. We would very much appreciate any ideas from the community. Thanks! Let's start with a simple schema where you have 4 tables:
Hi everyone, I am facing a problem in printing reports from the browser, and also when I export to PDF. The problem I am facing is that blank pages appear when a report column gets a large amount of text, around 2,500 characters, in the column value. Can anyone help me with this issue? If the report is getting an acceptable amount of data, it prints properly, i.e. no blank pages at all. I maintained all properties such that margins + body size < page size.
How do I insert and update a large number of records (4 lakh, i.e. 400,000) into a destination table through Business Intelligence Studio using an SSIS package? How do I achieve this? I tried with a left outer join and conditional split, but the problem is it's not able to insert and update records simultaneously. Can anyone give me a sample?
Thanks & Regards, Jeyakumar.M
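A hedged sketch of one common SSIS pattern: bulk-load everything into a staging table with the data flow, then run an Execute SQL task that updates the matches and inserts the rest. All table and column names below are assumptions.

    -- Step 1 (data flow): load the 400,000 incoming rows into dbo.Staging.

    -- Step 2 (Execute SQL task): update rows that already exist...
    UPDATE d
    SET    d.amount = s.amount
    FROM   dbo.Destination d
    JOIN   dbo.Staging s ON s.business_key = d.business_key;

    -- ...then insert the rows that don't.
    INSERT INTO dbo.Destination (business_key, amount)
    SELECT s.business_key, s.amount
    FROM   dbo.Staging s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.Destination d
                       WHERE d.business_key = s.business_key);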
Hi, I want to display the image records of each employee, with 10 images each. The number of records is more than one billion. Please provide me with an optimized query that I can use in an SP and display in a GridView.
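With row counts that large, the usual approach is to page the results so the SP never returns more than one screenful to the GridView. A hedged sketch on SQL 2005 using ROW_NUMBER (table and column names assumed; ordering by the clustered key keeps the numbering cheap):

    CREATE PROCEDURE dbo.GetEmployeeImagesPage
        @page_num  int,
        @page_size int
    AS
    BEGIN
        WITH Numbered AS (
            SELECT emp_id, image_id, img,
                   ROW_NUMBER() OVER (ORDER BY emp_id, image_id) AS rn
            FROM dbo.EmployeeImages
        )
        SELECT emp_id, image_id, img
        FROM Numbered
        WHERE rn BETWEEN (@page_num - 1) * @page_size + 1
                     AND @page_num * @page_size;
    END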
In the full recovery model, if I run a transaction that inserts 10MB of data into a table, then 10MB of data is written to the data file. Does this mean the log file will grow by exactly 10MB as well?
I understand that all transactions are logged to the log file to enable rollback and point-in-time recovery, but what is actually physically stored in the log file for this transaction's record? Is it the text of the command from the transaction, or the actual physical data from that transaction?
I ask because, say I have two drives: one with 5MB/s write speed for the log file and one with 10MB/s write speed for the data file. If I start trying to insert 10MB of data per second into the table, am I going to be limited to 5MB/s by the log file drive, or is SQL Server not going to try to log all 10MB each second to the log file?
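One way to answer this empirically, as a hedged sketch: snapshot log usage before and after the insert with DBCC SQLPERF(LOGSPACE), which reports log size and percent used per database, and compare the two readings. The load table and the dbo.Numbers helper (integers 1..N) are assumptions.

    -- Snapshot log usage, run a ~10MB insert, then snapshot again.
    DBCC SQLPERF(LOGSPACE);

    INSERT INTO dbo.LoadTest (payload)      -- illustrative ~10MB insert
    SELECT REPLICATE('x', 1000)             -- ~1KB per row
    FROM dbo.Numbers WHERE n <= 10000;      -- 10,000 rows

    DBCC SQLPERF(LOGSPACE);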