ENVIRONMENT:
I have SQL Server 6.5 running on a dedicated NT Server. The NT configuration includes dual Pentium 200 MHz processors, 256 MB RAM, and a RAID disk system.
The database size is 1 GB, with about 500 MB of actual data.
PROBLEM:
I have an application that uses many joins to produce its results. My SELECT query runs too slowly, even when I run it directly on the server.
I have updated the statistics and rebuilt all the indexes on the tables used by the query.
Any suggestions on using SQL Trace and tuning the server/database are welcome.
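For reference, the maintenance steps described above can be sketched roughly as follows on SQL Server 6.5. The table and column names are placeholders, not from the original post; the showplan output can help confirm whether the query is actually using the rebuilt indexes:

```sql
-- Refresh optimizer statistics for a table used by the slow query
-- (Orders is a placeholder table name)
UPDATE STATISTICS Orders

-- Rebuild every index on the table ('' = all indexes, 0 = keep current fillfactor)
DBCC DBREINDEX ('Orders', '', 0)

-- Show the plan for the slow query to verify which indexes are chosen
SET SHOWPLAN ON
GO
SELECT o.OrderID, c.CustomerName
FROM Orders o JOIN Customers c ON o.CustomerID = c.CustomerID
GO
SET SHOWPLAN OFF
GO
```

If the showplan output reports table scans on the large tables despite the rebuilt indexes, the join or WHERE clauses may not match the index key columns, which would be the next thing to investigate.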
Hi,
I have a table, TableA, with around 10 columns of varchar and numeric datatypes. It has 500 million records, and sp_spaceused reports its size as 999999999 KB (I believe that is KB). The index_size was also pretty big, in six digits.
Looking at TableA, it did not have any primary key, and hence no clustered index. It had these other indexes:
IX_1 on ColA
IX_2 on ColB
IX_3 on ColC
IX_4 on ColA, ColB, and ColC together
Queries against this table are very slow, and I have been asked to tune it. I know only as much as you do now. Data prior to 2004 can be archived into another table; I need to run a query to find out how many records that is.
I am thinking of the following, but don't know if I am correct:
1. Add a new PK column (which will increase the size of TableA) with a clustered index, since right now there are no clustered indexes.
2. I would like help understanding whether I should remove IX_1, IX_2, and IX_3, since their columns are all part of IX_4 anyway.
3. I forget what the textbox on the index page is called; it is set to 0 and can be set from 0 to 100. What would be a good value for it?
Thank you.
RS
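A sketch of the steps described in the post above. The date column name is an assumption (the post does not give one), and note that IX_4 on (ColA, ColB, ColC) only makes IX_1 redundant, since a seek on ColB or ColC alone cannot use an index that leads on ColA:

```sql
-- How many rows would be archived (CreatedDate is an assumed column name)
SELECT COUNT(*) AS archive_candidates
FROM TableA
WHERE CreatedDate < '20040101'

-- Add a surrogate key and make it the clustered primary key.
-- Building a clustered index on 500 million rows rewrites the whole table,
-- so plan for a long, heavily logged operation and plenty of free space.
ALTER TABLE TableA ADD TableA_ID int IDENTITY(1,1) NOT NULL
ALTER TABLE TableA ADD CONSTRAINT PK_TableA PRIMARY KEY CLUSTERED (TableA_ID)

-- IX_1 (ColA) duplicates the leading column of IX_4 and is a drop candidate;
-- IX_2 (ColB) and IX_3 (ColC) are NOT covered by IX_4 for single-column seeks,
-- so check actual query usage before dropping them
DROP INDEX TableA.IX_1
```

The 0-to-100 setting mentioned in point 3 is most likely the index fill factor; 0 is the server default, and lower values leave free space in index pages to reduce page splits on insert-heavy tables.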
A problem occurs when I run queries against a large database: the running speed is very slow.
For example, if I want to find the records of 1,500 employees, with about 15 records per person per month over one year (12 months), that means I have to retrieve about 1500 * 15 * 12 rows.
Is there a way to solve this problem? Is this what is called tuning the SQL Server / indexes / views?
What is the difference between tuning the SQL Server with an index versus with a view?
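On the index-versus-view question: a view is just a saved query and does not by itself change how rows are found, whereas an index changes the physical access path. A sketch with hypothetical table and column names (none of these appear in the original post):

```sql
-- Hypothetical schema: one row per employee record
CREATE TABLE EmployeeRecords (
    EmployeeID  int          NOT NULL,
    RecordDate  datetime     NOT NULL,
    Detail      varchar(255) NULL
)

-- A composite index on (EmployeeID, RecordDate) turns the per-employee,
-- per-period lookup into an index range seek instead of a full table scan
CREATE INDEX IX_EmployeeRecords_Emp_Date
    ON EmployeeRecords (EmployeeID, RecordDate)

-- Fetch one employee's records for a year with a single range seek
SELECT EmployeeID, RecordDate, Detail
FROM EmployeeRecords
WHERE EmployeeID = 1042
  AND RecordDate >= '19990101' AND RecordDate < '20000101'
```

With an index like this, retrieving 1500 * 15 * 12 rows costs roughly one seek per employee rather than a scan of the whole table per lookup.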
I have written a query that fetches data from a table with a huge amount of data. The query is used inside a stored procedure, and it takes a lot of time, which slows down my SP.
Here is the query that I'm building in the stored procedure:
-------------------------------------------------------------
SET @selectLeadsInPeriodString = '
    SELECT LM_Dealer, LM_Brand, SUM(LM_ImpressionCount)
    FROM LM_ImpressionCount_Dealer'

IF (@timeCriterion IS NULL OR LTRIM(RTRIM(@timeCriterion)) = '' OR LTRIM(RTRIM(@timeCriterion)) = 'NULL')
    SET @selectLeadsInPeriodString = @selectLeadsInPeriodString +
        ' WHERE (CONVERT(varchar,[LM_ImpressionCount_Dealer].[LM_ImpressionDate],102) >= ''' +
        CONVERT(varchar,@startDateTime,102) +
        ''' AND CONVERT(varchar,[LM_ImpressionCount_Dealer].[LM_ImpressionDate],102) <= ''' +
        CONVERT(varchar,@endDateTime,102) + ''')'
ELSE IF (@timeCriterion = 'CurrentMonth')
    SET @selectLeadsInPeriodString = @selectLeadsInPeriodString +
        ' WHERE MONTH([LM_ImpressionCount_Dealer].[LM_ImpressionDate]) = MONTH(GETDATE())'
ELSE IF (@timeCriterion = 'PreviousMonth')
    SET @selectLeadsInPeriodString = @selectLeadsInPeriodString +
        ' WHERE MONTH([LM_ImpressionCount_Dealer].[LM_ImpressionDate]) = MONTH(GETDATE()) - 1'
ELSE IF (@timeCriterion = 'YearToDate')
    SET @selectLeadsInPeriodString = @selectLeadsInPeriodString +
        ' WHERE ([LM_ImpressionCount_Dealer].[LM_ImpressionDate] >= cast((''1/1/''+cast(year(getdate()) AS varchar(4))) AS datetime)' +
        ' AND CONVERT(varchar,[LM_ImpressionCount_Dealer].[LM_ImpressionDate],102) <= CONVERT(varchar,GETDATE(),102))'

IF (@brand IS NOT NULL AND LTRIM(RTRIM(@brand)) <> '')
    SET @selectLeadsInPeriodString = @selectLeadsInPeriodString +
        ' AND [LM_ImpressionCount_Dealer].[LM_Brand] = ''' + @brand + ''''

SET @selectLeadsInPeriodString = @selectLeadsInPeriodString + ' GROUP BY LM_Dealer, LM_Brand'
The variables used in the query formation above are passed as input parameters to the SP. The table being queried has columns 'LM_Dealer', 'LM_Brand', 'LM_ImpressionCount' and 'LM_ImpressionDate'. Also, the table has a non-clustered index on the column LM_ImpressionDate.
With all this information, can anyone suggest how I can optimize the query above?
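One likely culprit in the query above: wrapping the LM_ImpressionDate column in CONVERT (or MONTH) on every row prevents the non-clustered index on that column from being used for a seek. A sketch of an equivalent, index-friendly filter for the date-range branch; the half-open upper bound is my assumption, based on the style-102 comparisons looking like date-only semantics:

```sql
-- Compare the datetime column directly against datetime bounds rather than
-- CONVERTing the column to varchar per row, so the index on LM_ImpressionDate
-- can be used for a range seek
SELECT LM_Dealer, LM_Brand, SUM(LM_ImpressionCount) AS LM_Impressions
FROM LM_ImpressionCount_Dealer
WHERE LM_ImpressionDate >= @startDateTime                 -- assumes date-only values
  AND LM_ImpressionDate <  DATEADD(day, 1, @endDateTime)  -- half-open: include all of the end day
GROUP BY LM_Dealer, LM_Brand
```

The month-based branches have the same problem, and two additional hazards worth noting: `MONTH(date) = MONTH(GETDATE())` matches that month in every year, and `MONTH(GETDATE()) - 1` yields 0 in January, so those branches are worth rewriting as explicit date ranges as well.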
I have the SQL desktop version installed, and for the last few days it has really slowed down. I have run several anti-virus scans and all is okay on that front.
Any tips on how I can tune things up? What should I look for, and how do I go about it? Deleting the log, etc.?
If anyone is able to provide advice on tuning the query below in SQL Server, it is much appreciated. Any index suggestions are also appreciated, as I have access to the tables. Thank you.

select a.id,
    isnull(b.advisement_satisfaction_yes, 0) as advisement_satisfaction_yes,
    isnull(c.advisement_satisfaction_no, 0) as advisement_satisfaction_no,
    case
        when isnull(b.advisement_satisfaction_yes, 0) > isnull(c.advisement_satisfaction_no, 0) then 'YES'
        when isnull(b.advisement_satisfaction_yes, 0) < isnull(c.advisement_satisfaction_no, 0) then 'NO'
        when isnull(b.advisement_satisfaction_yes, 0) = isnull(c.advisement_satisfaction_no, 0) then 'TIE'
    end as Satisfied_With_Advisement
from a
left join (
    select id, count(answer_text) as Advisement_Satisfaction_yes
    from a
    where question = 'The level of Academic Advisement I received from the University staff during this course was appropriate.'
      and answer_text = 'yes'
    group by id
) b on a.id = b.id
left join (
    select id, count(answer_text) as Advisement_Satisfaction_NO
    from a
    where question = 'The level of Academic Advisement I received from the University staff during this course was appropriate.'
      and answer_text = 'NO'
    group by id
) c on a.id = c.id
where question = 'The level of Academic Advisement I received from the University staff during this course was appropriate.'
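The two derived tables above scan the same rows of `a` twice. Assuming `a` holds one row per answer (an assumption, since the schema is not shown), the same result can be produced in a single pass with conditional aggregation:

```sql
-- One scan of the filtered rows; each SUM(CASE ...) counts one answer type
SELECT id,
       SUM(CASE WHEN answer_text = 'yes' THEN 1 ELSE 0 END) AS advisement_satisfaction_yes,
       SUM(CASE WHEN answer_text = 'NO'  THEN 1 ELSE 0 END) AS advisement_satisfaction_no,
       CASE
           WHEN SUM(CASE WHEN answer_text = 'yes' THEN 1 ELSE 0 END) >
                SUM(CASE WHEN answer_text = 'NO'  THEN 1 ELSE 0 END) THEN 'YES'
           WHEN SUM(CASE WHEN answer_text = 'yes' THEN 1 ELSE 0 END) <
                SUM(CASE WHEN answer_text = 'NO'  THEN 1 ELSE 0 END) THEN 'NO'
           ELSE 'TIE'
       END AS Satisfied_With_Advisement
FROM a
WHERE question = 'The level of Academic Advisement I received from the University staff during this course was appropriate.'
GROUP BY id
```

For indexing, a composite index leading on the filter column, such as (question, id, answer_text), would let this run as a single range seek plus aggregation; exact column order should be validated against the real data distribution.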
How can I trim the code and make the procedure run faster?
CREATE PROCEDURE Disbursements_Cats
    (@startdate datetime, @enddate datetime)
AS
BEGIN
    SELECT Transaction_Record.loan_No AS loan_no,
           Transaction_Record.transaction_Date AS Transaction_Date,
           Transaction_Record.transaction_type AS Transaction_type,
           Transaction_Record.transaction_Amount AS Transaction_Amount,
           Product.product AS Product,
           Product_Type.product_Type AS product_type,
           Product_Type.loan_Type AS Loan_type,
           Customer.first_name AS first_name,
           Customer.initials AS initials,
           Customer.second_name AS Second_Name,
           Customer.surname AS surname,
           Customer_identification.idno AS ID_No,
           Bank.Bank_name AS Bank_Name,
           Bank_detail.Account_no AS Account,
           Bank_detail.Branch AS Branch
    FROM Transaction_Record
         CROSS JOIN Bank_detail
         CROSS JOIN Bank
         CROSS JOIN Customer
         CROSS JOIN Product
         CROSS JOIN Loan_Type
         CROSS JOIN Product_Type
         CROSS JOIN Customer_identification
END;
GO
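The CROSS JOINs above produce the Cartesian product of eight tables, so the row count explodes multiplicatively; that, not the SELECT list, is what makes the procedure slow. The fix is to join on key columns and use the date parameters the procedure already accepts. Every key column name below is an assumption for illustration, since the schema's actual keys are not shown:

```sql
-- Hypothetical rewrite: join on assumed foreign keys and filter by the
-- @startdate/@enddate parameters (half-open range on the upper bound)
CREATE PROCEDURE Disbursements_Cats
    (@startdate datetime, @enddate datetime)
AS
BEGIN
    SELECT tr.loan_No, tr.transaction_Date, tr.transaction_type, tr.transaction_Amount,
           p.product, pt.product_Type, pt.loan_Type,
           c.first_name, c.initials, c.second_name, c.surname,
           ci.idno AS ID_No, b.Bank_name, bd.Account_no AS Account, bd.Branch
    FROM Transaction_Record tr
         JOIN Customer c               ON c.customer_id = tr.customer_id      -- assumed key
         JOIN Customer_identification ci ON ci.customer_id = c.customer_id    -- assumed key
         JOIN Bank_detail bd           ON bd.customer_id = c.customer_id      -- assumed key
         JOIN Bank b                   ON b.bank_id = bd.bank_id              -- assumed key
         JOIN Product p                ON p.product_id = tr.product_id        -- assumed key
         JOIN Product_Type pt          ON pt.product_type_id = p.product_type_id
    WHERE tr.transaction_Date >= @startdate
      AND tr.transaction_Date <  @enddate
END;
GO
```

Note also that the original joins Loan_Type but never selects from it; if it is not needed for filtering, dropping it removes one more multiplier from the product.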
Hi, I have a question asked by one of my seniors and I am trying to find the answer: what is the step-by-step procedure for tuning a large SQL query? In other words, how do we tune a large SQL query with so many joins?
HELP!!! I am trying to fine-tune or rewrite my SELECT statement, which has a combination of SUM and CASE expressions. The values are accurate, but the query is slow.

BUSINESS RULE
=============
1. Add up Count1 when FIELD_1 has a value and FIELD_2 is NULL, or both have a value.
2. Add up Count2 when FIELD_2 has a value and FIELD_1 is NULL.
3. Add up NoneCount when both FIELD_1 and FIELD_2 are NULL.
4. TotalCount = Count1 + Count2 (below, I basically had to reuse the SQL from both Count1 and Count2).

SQL Code
========
SELECT
    SUM(CASE
            WHEN (FIELD_1 IS NOT NULL AND FIELD_2 IS NULL)
              OR (FIELD_1 IS NOT NULL AND FIELD_2 IS NOT NULL)
            THEN 1 ELSE 0
        END) AS Count1,
    SUM(CASE
            WHEN FIELD_1 IS NULL AND FIELD_2 IS NOT NULL
            THEN 1 ELSE 0
        END) AS Count2,
    SUM(CASE
            WHEN FIELD_1 IS NULL AND FIELD_2 IS NOT NULL
            THEN 1
            ELSE CASE
                     WHEN (FIELD_1 IS NOT NULL AND FIELD_2 IS NULL)
                       OR (FIELD_1 IS NOT NULL AND FIELD_2 IS NOT NULL)
                     THEN 1 ELSE 0
                 END
        END) AS Total_Count,
    SUM(CASE
            WHEN FIELD_1 IS NULL AND FIELD_2 IS NULL
            THEN 1 ELSE 0
        END) AS None_Count
FROM TABLE_1
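The predicates above can be simplified logically: Count1's two branches collapse to `FIELD_1 IS NOT NULL`, and Total_Count is then every row where at least one field is non-NULL. A logically equivalent sketch:

```sql
-- Equivalent counts with the boolean logic reduced:
--   Count1:      FIELD_1 present (regardless of FIELD_2)
--   Count2:      FIELD_2 present, FIELD_1 missing
--   Total_Count: at least one field present  (= Count1 + Count2)
--   None_Count:  both fields missing
SELECT
    SUM(CASE WHEN FIELD_1 IS NOT NULL THEN 1 ELSE 0 END)                        AS Count1,
    SUM(CASE WHEN FIELD_1 IS NULL AND FIELD_2 IS NOT NULL THEN 1 ELSE 0 END)    AS Count2,
    SUM(CASE WHEN FIELD_1 IS NOT NULL OR FIELD_2 IS NOT NULL THEN 1 ELSE 0 END) AS Total_Count,
    SUM(CASE WHEN FIELD_1 IS NULL AND FIELD_2 IS NULL THEN 1 ELSE 0 END)        AS None_Count
FROM TABLE_1
```

That said, the simplification mainly helps readability: a query like this must read every row of TABLE_1 regardless, so if it is slow the cost is almost certainly the scan itself (table size, I/O) rather than the CASE logic.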
I've created a bunch of views to expose a logical model of the underlying database of an application server.
To enforce security control, I've also created a CLR UDF that calls the application server's API for security checks and audit logging.
For example, we have a table, tblSecret, and the view, vwSecret, is:
SELECT Id, ParentId, Description, SecretData
FROM tblSecret
WHERE udfExpensiveApiCall(Id) = 1
The udfExpensiveApiCall will return 1 if the current user is allowed to access the SecretData, else 0. The CLR UDF call is very expensive in terms of execution time and resources.
Currently, there are millions of rows in tblSecret.
My objective is to tune the view such that when the view is JOINed, udfExpensiveApiCall is called the smallest number of times, e.g.:
SELECT vwParent.ParentId, vwSecret.SecretData
FROM vwParent
LEFT JOIN vwSecret ON vwSecret.ParentId = vwParent.ParentId
WHERE vwParent.StartDate > '1/1/2008'
  AND vwSecret.Description LIKE '%WHATEVER%'
Is there any way to specify the execution cost of the CLR UDF, udfExpensiveApiCall, so that the execution plan calls the UDF only when it is absolutely necessary?
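To my knowledge there is no supported way in SQL Server 2005/2008 to declare a scalar UDF's cost to the optimizer, and the optimizer is free to reorder predicates. A common workaround (a sketch, assuming the table and view definitions above) is to stage the cheap, index-friendly filters first and run the expensive UDF only over the survivors:

```sql
-- Step 1: apply only the cheap predicates; keep just the keys so no
-- SecretData is materialized before the security check runs
SELECT s.Id
INTO #candidates
FROM tblSecret s
JOIN vwParent p ON s.ParentId = p.ParentId
WHERE p.StartDate > '20080101'
  AND s.Description LIKE '%WHATEVER%'

-- Step 2: invoke the expensive CLR security UDF over the much smaller key set
SELECT s.ParentId, s.SecretData
FROM tblSecret s
JOIN #candidates c ON c.Id = s.Id
WHERE dbo.udfExpensiveApiCall(s.Id) = 1
```

One side observation: in the example query, the WHERE condition on vwSecret.Description effectively turns the LEFT JOIN into an inner join, which may or may not be intended.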
I want to tune the indexes on my database and I am trying to use the SQL Server Profiler to collect data for the Index tuning wizard to analyze. My question is what do I need to trace with the profiler so that the Index tuning wizard can work? I am looking at the trace properties in Profiler at the Events, Data Columns, and Filters tabs but I have no idea of what I need to capture.
I have a report that runs on a huge table, rpt.AgentMeasures; it holds 10 months' worth of data (150 million records as of today, and it will keep increasing). I have pasted my proc below; the other tables joined to this huge table have no more than 3K records each. The report will be accessed by multiple users (expecting about 20 people). As of now the report runs for 5 minutes if I pull one month's worth of data. Would it be wise to use temp tables?
Hello Everyone,
I have a very complex performance issue with our production database. Here's the scenario. We have a production web server and a development web server. Both are running SQL Server 2000.
I encountered various performance issues on the production server with a particular query. It would take approximately 22 seconds to return 100 rows, that's about 0.22 seconds per row. Note: I ran the query in single-user mode. So I tested the query on the development server by taking a backup (.dmp) of the database and moving it onto the dev server. I ran the same query and found that it ran in less than a second.
I took a look at the query execution plans and found that they were exactly the same in both cases. Then I looked at the various indexes, and again I found no differences in the table indices.
If both databases are identical, I'm assuming the issue is related to some external hardware factor such as disk space or memory, or to OS/software issues such as service packs or SQL Server configurations.
Here's what I've done to rule out some obvious hardware issues on the prod server:
1. Moved all extraneous files to a secondary hard drive to free up space on the primary hard drive. There is 55 GB of free space on the disk.
2. Applied the SQL Server SP4 service pack.
3. Defragmented the primary hard drive.
4. Applied all Windows Server 2003 updates.
Here are the prod server's system specs:
2x Intel Xeon 2.67 GHz
Total Physical Memory 2 GB, Available Physical Memory 815 MB
Windows Server 2003 SE w/ SP1
Here are the dev server's system specs:
2x Intel Xeon 2.80 GHz
2 GB DDR2-SDRAM
Windows Server 2003 SE w/ SP1
I'm not sure what else to do; the query performance is an order of magnitude different and I can't explain it. To me it is a hardware or operating system related issue. Any ideas would help me greatly!
Thanks,
Brian T
We are trying to create some alerts in our SQL Server 2014 BI edition. The issue is that after I choose "Type" as "SQL Server performance condition alert", nothing is listed in the "Object" list box. SQL Server event alerts are working; the issue is only with "SQL Server performance condition alert".
I can't find 'SQL Server: SSIS Pipeline' performance object in performance monitor on a 64-bit SQL Server. I see it on a 32-bit. Does anybody know why?
I set up the collector and specify Run As with my AD account on the Collector Set - Properties - General screen. My AD account is a local admin of the remote server.
However, the collector does not seem to work. Although the collector set is shown as running, the .blg file stays at 64 KB. If I open it, there is nothing inside (no counters at the bottom). What did I miss?
Hi, I am interested if anyone else has come across performance problems with the SQL Server linked servers to SQL Server. I suspect that the OLE DB Provider that I am using perhaps has some performance issues when passed parameters.
I have set the dynamic parameters option on, and use collation compatible.
Does anyone know of a quick checklist (configuration setup) I can use for the server running my SQL Server, to enable best performance? The box is dedicated to SQL Server. The problem is that when running a fairly heavy SP, the box locks up, CPU hits 100%, and I end up staring at a screen trying to load itself for ages.
We are currently in the process of loading large amounts of data into our database. We are running into a situation where we get the message "Waiting for WRITELOG", and this slows our process to a crawl. I do not have a lot of experience on the performance side. I would truly appreciate any help on this matter.
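WRITELOG waits mean sessions are stalled on transaction-log flushes, which happen at every commit; a row-by-row load that auto-commits each INSERT forces one synchronous flush per row. A common mitigation is to commit in larger batches. The table names and the batch size below are illustrative, not from the original post:

```sql
-- Load in explicit transactions of many rows each, rather than letting each
-- INSERT auto-commit and flush the log individually.
-- (TargetTable and the 10,000-row batch size are placeholders.)
DECLARE @i int
SET @i = 0

BEGIN TRANSACTION
WHILE @i < 1000000
BEGIN
    INSERT INTO TargetTable (Id, Payload) VALUES (@i, 'row data')

    SET @i = @i + 1
    IF @i % 10000 = 0              -- commit every 10,000 rows
    BEGIN
        COMMIT TRANSACTION
        BEGIN TRANSACTION
    END
END
COMMIT TRANSACTION
```

For bulk loads, the bigger wins are usually the bulk interfaces (BCP, BULK INSERT) which can be minimally logged under the right recovery model, and putting the transaction log on fast, dedicated disks.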
Hi guys, I have Windows NT Workstation 4.0 and the SQL Server Client Configuration Utility installed on my computer, so I have remote administration of SQL Server. But because Performance Monitor isn't part of NT Workstation, I can't check the behavior of the cache hit ratio and other counters. Maybe you can give me some ideas on how I can do this remotely without Performance Monitor.
My company is contemplating which platform (Oracle or SQL Server) to develop a new imaging application on (large databases, many users). All of our current SQL Server applications are relatively small (100 MB databases). Can anyone give me some idea of what size databases they are running in SQL Server and how many users concurrently access them? What can SQL Server realistically handle?
I have to admin a database that makes almost no use of stored procedures. The C++ frontend uses ad hoc calls to the database (no, this is not my idea of how to do things, but I have no say). Any ideas on the best way to tweak SQL Server to handle this? Thanks
We have an application that uses SQL 2000 server. I am almost certain all the performance issues we are having are due to the SQL server. I really need to confirm this.
Hello Everyone,
Regarding stored procedures and views: I know that stored procedures cause SQL Server to create a cached execution plan. Is the same thing done for views? Also, how bad is the performance hit for a stored procedure that uses one or a few views, as opposed to re-creating the same select statement with the proper joins to the required tables? I know that there are a bunch of variables that affect this stuff; I'm just trying to get a ballpark idea of how it works.
Thanks,
Frank
I am investigating a SQL Server performance issue where the system operates well at times and poorly at others. This SQL Server is connected to a SAN, where I believe the issue lies. I have started some testing using the SQLIOSimx86 utility from Microsoft with the application stopped. My initial results show quite a few errors that indicate "IO requests are outstanding for more than 15 sec." I am next going to look at the IO system as a whole (BIOS, driver versions). Any thoughts? I can't seem to find any documentation on this or on whether it is acceptable.
Thanks,
Scott
We recently created a library module (ASP.NET and SQL Server 2005) where we store files inside the SQL database. We now have several thousand files and the DB is around 25 GB. I think we are starting to see performance problems when selecting files for download. Files under 10 MB seem to download fine, but files over 10 MB give us problems. I was wondering if someone could point me to a good article about this kind of performance issue and what I might do to overcome it.