Optimising SQL Jobs
Apr 30, 2008
Hi all,
I have 20 SQL jobs that are scheduled to run at intervals ranging from every 5 minutes to every hour.
Does anyone know the best way to optimise how these jobs run?
At the moment, once these jobs are running I cannot browse any tables in the DB; I get a "lock request time out period exceeded" error.
Do I need to stagger when the jobs run?
Or make one big job where they all run one after another?
Any help?
Ray..
View 3 Replies
Feb 8, 2001
Hi,
We are using a stored procedure which processes more than 11 million records.
The time that the stored procedure takes to execute is around 15 to 20 days.
This is bad. We are not using any cursors, only DELETE, INSERT and UPDATE statements. There are also some complex WHERE clauses on the deletes and updates.
Our job is to fine-tune the SP. We run into problems like transaction log fill-ups, tempdb full, etc. You can imagine the problems when you look at the record count.
Indexes do not help.
Can anybody recommend ways to fine-tune the proc?
One more thing: we do cross-database updates, inserts and deletes (I mean 2 databases on the same server).
Bye
NAvin
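For what it's worth, the standard way to keep the transaction log and tempdb under control at this row count is to break the DELETE/UPDATE work into batches so each transaction stays small. A rough sketch only (the table name and criteria are made up for illustration; SET ROWCOUNT works for this on SQL Server 2000 and earlier):

SET ROWCOUNT 50000
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.StagingRows WHERE ProcessedFlag = 1   -- hypothetical table and criteria
    IF @@ROWCOUNT = 0 BREAK
    BACKUP LOG MyDatabase WITH TRUNCATE_ONLY   -- keeps the log from filling between batches; note this discards log records
END
SET ROWCOUNT 0

The same loop shape works for batched UPDATEs; each batch commits on its own, so the log never has to hold all 11 million rows' worth of changes at once.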
View 2 Replies
View Related
Jan 8, 2003
Hi,
I'm looking for tips, advice, best practice etc. on optimising a DB with over 300,000 user records that needs to be accessed rapidly via a web interface. Any help would be greatly appreciated - specifically I'm looking at the different methods of DB optimisation: indexing, clustering etc.
View 2 Replies
View Related
Mar 26, 2008
Hi there
Recently our company purchased a product from ip2location.com; a database containing 2.9million IP address ranges, and their approximate cities/countries of registration.
Naturally, I thought - "Hey, wouldn't it be great if we could cross reference this with our IIS logs so we could see where our visitors are from?".
So, I set about doing just that. Our IIS logs are already in SQL.
The trouble is, the ip2location database is so large that executing a query against it to find which range a particular IP address is within takes me 1 second. Multiply that by 1,000,000 log rows, and Houston - we have a problem.
One of the issues is that each record in the ip2location database comprises a FROM_IP and TO_IP range to describe a range of IPs. So to find which IP range a particular IP resides in, I have to join using a BETWEEN statement (or so, I think anyway!).
Does anyone have any suggestions on how to improve this process, or has anyone done anything similar before?
Ideally, I'd like to write a trigger to grab the IP region data (i.e. City/Country) and update the IISLog with that value when the new row is inserted, saving me having to do it later.
I tried this, and the batch import of IIS logs into SQL took so long I got bored and gave up :)
Any help anyone can offer would be appreciated.
Many thanks
Richard.
P.S. Somebody is bound to ask - "Why couldn't you just use Google Analytics?"; my answer is because we want to slice up our log data into chunks, and give it to our customers in semi-real time. Plus the logs report on other services - not just HTTP. ;)
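For reference, the usual trick for this kind of range lookup is to index the lower bound and seek to the nearest range starting at or below the target address, instead of joining with BETWEEN. A rough sketch, assuming the ranges are held in integer columns called FROM_IP and TO_IP with country/city columns alongside (the real ip2location column names may differ):

-- Assumes an index on FROM_IP, and that the address has already been converted to its integer form
DECLARE @ip bigint
SET @ip = 3232235777   -- 192.168.1.1 as an integer, purely as an example

SELECT TOP 1 FROM_IP, TO_IP, Country, City
FROM dbo.ip2location
WHERE FROM_IP <= @ip
ORDER BY FROM_IP DESC
-- Because the ranges do not overlap, the row returned is the only candidate; check TO_IP >= @ip to confirm a hit.

Applied per log row (or wrapped in a function), the single index seek replaces the range scan that a BETWEEN join tends to produce.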
View 18 Replies
View Related
Jul 23, 2005
Hi, I have a problem I would really appreciate help with. I am generating dynamic SQL and need to optimise it. The specific example I am trying to optimise looks like this:

SELECT DISTINCT DataHeaderID FROM TB_DataDetailText T1
WHERE (EntityFieldID IN (31) AND (Data LIKE '12BORE%'))
AND (DataHeaderID = (SELECT DISTINCT DataHeaderID FROM TB_DataDetailText CT2 WHERE T1.DataHeaderID = CT2.DataHeaderID AND (EntityFieldID IN (34) AND (Data LIKE 'SIDE BY SIDE%'))))
AND (DataHeaderID = (SELECT DISTINCT DataHeaderID FROM TB_DataDetailText CCT3 WHERE T1.DataHeaderID = CCT3.DataHeaderID AND ((Data LIKE 'church%'))))

I was OK optimising it with just 2 criteria and changed:

SELECT DISTINCT DataHeaderID FROM TB_DataDetailText T1
WHERE (EntityFieldID IN (31) AND (Data LIKE '12BORE%'))
AND (DataHeaderID = (SELECT DISTINCT DataHeaderID FROM TB_DataDetailText CT2 WHERE T1.DataHeaderID = CT2.DataHeaderID AND ((Data LIKE 'church%'))))

which took 26 seconds, to a version using a derived table

SELECT distinct T1.DataHeaderID FROM TB_DataDetailText as T1
inner join (SELECT distinct DataHeaderID, Data FROM TB_DataDetailText) CT2 on T1.DataHeaderID = CT2.DataHeaderID
WHERE (T1.EntityFieldID IN (31) AND (T1.Data LIKE '12BORE%'))
and ((CT2.Data LIKE 'church%'))

which took 0.03 seconds on the same data.

My problem is I need to write code to generate the SQL for 1 to n criteria and am struggling to write the query for more than 2.

Best regards,
Andrew
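One pattern that generalises cleanly to n criteria is one self-join per criterion, which is essentially what the derived-table version is already doing for the second criterion. A sketch for the three-criteria example above (a generator would simply add one JOIN and one WHERE clause per extra criterion):

SELECT DISTINCT T1.DataHeaderID
FROM TB_DataDetailText AS T1
INNER JOIN TB_DataDetailText AS T2 ON T2.DataHeaderID = T1.DataHeaderID
INNER JOIN TB_DataDetailText AS T3 ON T3.DataHeaderID = T1.DataHeaderID
WHERE (T1.EntityFieldID IN (31) AND T1.Data LIKE '12BORE%')
  AND (T2.EntityFieldID IN (34) AND T2.Data LIKE 'SIDE BY SIDE%')
  AND (T3.Data LIKE 'church%')

This is logically the same as the nested-subquery form (a matching row must exist for each criterion with the same DataHeaderID), so it should return the same DataHeaderIDs.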
View 3 Replies
View Related
Mar 23, 2006
Hi. Maybe I'm just being dim, but I'm struggling to get my head around optimising a query with regard to indexes. If I make a select query, such as the pseudo-example 'select * from bigtable where foo='bar' and (barney>rubble and fred<flintoff)', and the table is indexed on 'foo', how could I make that any better? What indexes could I add, or what could I change in the query?
I know it looks simple, but so am I.
Cheers
Chris Weston
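As a rough illustration only (names taken from the pseudo-query): a composite index that leads with the equality column usually helps most here, because SQL Server can seek on foo and then on one of the range columns, with the remaining range predicate applied as a filter on the rows found:

-- Equality column first, then whichever range column is most selective (barney is an assumption here)
CREATE INDEX IX_bigtable_foo_barney ON bigtable (foo, barney)

SELECT * FROM bigtable
WHERE foo = 'bar' AND barney > rubble AND fred < flintoff

If only a few columns are actually needed, listing them instead of using * also opens the door to an index that covers the whole query.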
View 5 Replies
View Related
Jul 28, 2006
Dear All, please help me optimise the following query and reduce the repeated reads from the tables via SELECT. The tables do not have referential integrity constraints or relations.

CREATE proc Rolex136Sync
as
DECLARE @date varchar(50), @ydate varchar(50)
print CONVERT(char(11),(GETDATE()-1),100)
SET @date = substring(CONVERT(char(11),(GETDATE()),100),5,2) + '-' + substring(CONVERT(char(11),(GETDATE()),100),1,3) + '-' + substring(CONVERT(char(11),(GETDATE()),100),8,4)
SET @ydate = substring(CONVERT(char(11),(GETDATE()-1),100),5,2) + '-' + substring(CONVERT(char(11),(GETDATE()-1),100),1,3) + '-' + substring(CONVERT(char(11),(GETDATE()-1),100),8,4)
Print @date
Print @ydate
insert into biiod.dbo.data_trans_currentday_test
(MobileNo, UA, MessageID, ContentID, Description, MusicLabel, CPID, CPName, ContentType, Category, SubCategory, TransactionDate, Units, Unitprice, Shortcode, Servicecode, OperatorID, CatID, SubCatID, SpecialPackage, Royalties, Operator, Circle, OPGPName)
(select mobileno,
(SELECT CASE ua when 'unknown' then null else ua end) as ua,
(select case remarks when 'unknown' then null else remarks end) as remarks,
contentid,
(select case description when 'unknown' then null else description end) as description,
(select musiclabel from datalogs.dbo.cont_master where contentid = datalogs.dbo.translogs.contentid) as musiclable,
(select cpid from datalogs.dbo.contentprovider where cpname = datalogs.dbo.translogs.cpname) as cpid,
cpname,
contenttype,
(select catname from datalogs.dbo.cont_Catg where catid in (select catid from cont_master where contentid = datalogs.dbo.translogs.contentid)) as category,
(select subcatname from datalogs.dbo.cont_subCatg where subcatid in (select subcatid from cont_master where contentid = datalogs.dbo.translogs.contentid)) as subcategory,
transactiondate,
1 as Units,
price,
(select case servicename when 'AIRTELIVE' then remarks when 'ALCOMBOPACKREG' then remarks when 'HINDI' then remarks when 'NOKIAGAL' then remarks when 'SUDOKU' then remarks when 'SUDOKU_APP' then remarks else NULL end) as SHORTCODE,
servicename,
(select case servicename when 'TSTTNEWS' THEN 600 when 'TSTTWAP' THEN 600 when 'TSTT_MMS' THEN 600 when 'AKTEL' THEN 300 when 'TELEMOVIL' THEN 700 when 'COMCEL' THEN 701 when 'QATAR2900' THEN 1 ELSE (select operatorid from datalogs.dbo.operator where phoneseries = substring(datalogs.dbo.translogs.mobileno,1,len(phoneseries))) end) as operatorid,
(select catid from datalogs.dbo.cont_master where contentid = datalogs.dbo.translogs.contentid) as catid,
(select subcatid from datalogs.dbo.cont_master where contentid = datalogs.dbo.translogs.contentid) as subcatid,
(select specialpackage from datalogs.dbo.cont_master where contentid = datalogs.dbo.translogs.contentid) as specialpackage,
(select Royalties from datalogs.dbo.cont_master where contentid = datalogs.dbo.translogs.contentid) as Royalties,
(select case servicename when 'AKTEL' then 'Aktel' when 'QATAR2900' then 'STAR MULTIMEDIA 2900' when 'TELEMOVIL' then 'TeleMovil' when 'COMCEL' THEN 'COMCEL' when 'TSTTNEWS' then 'TSTT' when 'TSTTWAP' then 'TSTT' when 'TSTT_MMS' then 'TSTT' when 'ALCLICKWIN6464' then 'Airtel' when 'ALMMSPORTAL' then 'Airtel' when 'ALMMSSMSDWN' then 'Airtel' when 'ALMYALBUM646' then 'Airtel' when 'HINDU6397' then substring(remarks,1,PATINDEX('%.6397.%',remarks)-1) else (select OPname from datalogs.dbo.operator where phoneseries = substring(datalogs.dbo.translogs.mobileno,1,len(phoneseries))) end) as Operator,
(select case servicename when 'AKTEL' then 'Bangladesh' when 'QATAR2900' then 'STAR MULTIMEDIA 2900' when 'TELEMOVIL' then 'El Salvador' when 'COMCEL' THEN 'Gautemala' when 'TSTTNEWS' then 'Trinidad' when 'TSTTWAP' then 'Trinidad' when 'TSTT_MMS' then 'Trinidad' when 'HINDU6397' then substring(remarks,PATINDEX('%.6397.%',remarks)+6,len(remarks)-PATINDEX('%-%',remarks)) else (select Circlename from datalogs.dbo.operator where phoneseries = substring(datalogs.dbo.translogs.mobileno,1,len(phoneseries))) end) as Circle,
(select case servicename when 'AKTEL' then 'Aktel' when 'QATAR2900' then 'STAR MULTIMEDIA 2900' when 'TELEMOVIL' then 'TeleMovil' when 'COMCEL' THEN 'COMCEL' when 'TSTTNEWS' then 'TSTT' when 'TSTTWAP' then 'TSTT' when 'TSTT_MMS' then 'TSTT MMS' when 'ALCLICKWIN6464' then 'Airtel Click Win 646' when 'ALMMSPORTAL' then 'Airtel MMS' when 'ALMMSSMSDWN' then 'Airtel MMS SMS' when 'ALMYALBUM646' then 'Airtel My Album' when 'HINDU6397' then 'Hindu 6397' else (select OPname from datalogs.dbo.operator where phoneseries = substring(datalogs.dbo.translogs.mobileno,1,len(phoneseries))) end) as OPGPName
from datalogs.dbo.translogs
where transactiondate >= @ydate and transactiondate < @date
and servicename in ('AIRTELMMS_SUB','ALMYALBUM646','HINDU6397','MTV','QATAR2900','SIFY'))
go
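For what it's worth, the heaviest part of this is that several scalar subqueries hit the same lookup tables (cont_master, contentprovider, cont_Catg, cont_subCatg, operator) once per output column per row. A rough sketch of the usual rewrite, folding those lookups into joins so each table is read once per row (join keys assumed from the proc above; only a few of the output columns are shown, and @date/@ydate are the variables already declared in the proc):

select t.mobileno,
       t.contentid,
       cm.musiclabel,
       cp.cpid,
       cc.catname    as category,
       sc.subcatname as subcategory,
       cm.catid, cm.subcatid, cm.specialpackage, cm.Royalties
from datalogs.dbo.translogs t
left join datalogs.dbo.cont_master      cm on cm.contentid = t.contentid
left join datalogs.dbo.contentprovider  cp on cp.cpname    = t.cpname
left join datalogs.dbo.cont_Catg        cc on cc.catid     = cm.catid
left join datalogs.dbo.cont_subCatg     sc on sc.subcatid  = cm.subcatid
where t.transactiondate >= @ydate and t.transactiondate < @date
  and t.servicename in ('AIRTELMMS_SUB','ALMYALBUM646','HINDU6397','MTV','QATAR2900','SIFY')

The CASE expressions on servicename and the operator lookup by phone-number prefix can stay as they are; they just move into the same select list.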
View 2 Replies
View Related
Dec 14, 2004
Hi all
I have been doing some development work in a large VB6 application. I have updated the search capabilities of the application to allow the user to search on partial addresses as the existing search routine only allowed you to search on the whole line of the address.
Simple change to the stored procedure (this is just an example not the real stored proc):
From:
Select Top 3000 * from TL_ClientAddresses with(nolock) Where strPostCode = 'W1 ABC'
To:
Select Top 3000 * from TL_ClientAddresses with(nolock) Where strPostCode LIKE 'W1%'
Now this is when things went a bit crazy. I know the implications of using with(nolock), but seeing as the code is only using the ID field to get the required row, and the database is a live database with hundreds of users at any one time (some updating), I think a dirty read is OK in this routine, as I don't want SQL to create a shared lock.
Anyway, my problem is this. After the change, the search now creates a shared lock which sometimes locks out some of the live users updating the system. The SELECT is also extremely slow: it took about 5 minutes to search just over a million records (locking the database during the search, and giving my manager good reason to shout abuse at me). So I checked the indexes. I had an index set on:
strAddressLine1, strAddressLine2, strAddressLine3, strAddressLine4, strPostCode.
So I created an index just for the strPostCode (non clustered).
This had no effect on the LIKE select whatsoever. So I am now stuck.
1) Is there another way to search for part of a text field in SQL?
2) Does a LIKE comparison use the index in any way? If so, how do I set this index up?
3) Can I stop a shared lock being created when I do a LIKE select?
4) Do you have any good comebacks I could tell the boss after his next outburst of abuse (please, nothing so bad that he sacks me)?
Any advice truly appreciated.
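On questions 2 and 3, a rough sketch of the index side (table and column names from the post; the ID column is an assumption): LIKE 'W1%' can seek a non-clustered index on strPostCode because the leading characters are fixed, but with SELECT TOP 3000 * every match needs a lookup back to the table, so the optimizer often falls back to scanning. Making the index cover the columns the routine actually needs avoids that (SQL Server 2000 style, a wider key rather than INCLUDE):

CREATE NONCLUSTERED INDEX IX_ClientAddresses_PostCode
    ON TL_ClientAddresses (strPostCode, ID)

SELECT TOP 3000 ID
FROM TL_ClientAddresses WITH (NOLOCK)
WHERE strPostCode LIKE 'W1%'

A LIKE with a leading wildcard ('%W1') cannot use the index at all, and a query run with NOLOCK takes no shared row locks to begin with, so any blocking seen afterwards is more likely the long scan (or the index build itself) than the read.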
View 8 Replies
View Related
Jul 17, 2006
I have an application that reads a monitoring devices that produces 200 digital outputs every second and I would like to store them in a table. This table would get quite big fairly quickly as ultimately I would like to monitor over a hundred of these devices.
I would like to construct queries against each of the individual digital channels or combinations of them.
My first thought is to set up a table with 200 separate columns (plus others for date stamp, device ID etc.). However, I am concerned that a table with 200 boolean (1-bit) fields would be an enormous waste of space if each field takes one to four bytes on disk to store a single bit. On the other hand, this would have the advantage of making the SQL queries more natural.
The other alternative is to create a single 200-bit field and use lots of ANDing and ORing to isolate bits for my queries. This would make my SQL code less readable and may also cause more hassle in the future if the inputs changed, but it would make the file size smaller.
In essence I am asking (hoping) the following: if I create a table with 200 boolean fields, does SQL Server Express automatically optimise the storage to make it more compact? That would mean the server can mess around at the bit level and leave my higher-level SQL code looking cleaner and more logical.
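For what it's worth, SQL Server does pack bit columns: up to 8 bit columns share a single byte, so 200 bit columns cost 25 bytes per row for the flags, not 200+ bytes. A minimal sketch (the names are made up for illustration):

CREATE TABLE dbo.DeviceSample (
    DeviceID   int      NOT NULL,
    SampleTime datetime NOT NULL,
    Channel001 bit      NOT NULL,
    Channel002 bit      NOT NULL,
    Channel003 bit      NOT NULL
    -- ...and so on up to Channel200; the 200 bit columns are stored in 200/8 = 25 bytes per row
)

-- Queries can then address individual channels naturally:
SELECT DeviceID, SampleTime
FROM dbo.DeviceSample
WHERE Channel002 = 1 AND Channel003 = 0

So the per-column layout keeps the queries readable without the storage penalty of a byte (or more) per flag.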
View 5 Replies
View Related
Jan 4, 2007
I need to merge replicate data to two different types of subscribers:
Clients subscribers which will have a very small percentage of the data from the central database. The data on these machines will be managed using dynamic filtering on host_name()
Server subscribers which will manage a copy of all the data from the central database
There will be far fewer server subscribers than client subscribers.
As I see it I have two options for the configuration
1) Use two separate merge publications - one which is filtered and one which isn't
2) Use a single merge publication and setup the filtering so that the server subscribers receive all the rows
Which option is likely to lead to better performance?
With option 1) there would be 2 complete sets of replication metadata which need to be maintained - so I am tending towards option 2. Are there any disadvantages in using a dynamic filter to return a very large number of rows?
View 6 Replies
View Related
Aug 2, 2005
Hi,
It may sound funny but it is true. I have 2 versions of the same procedure, one named "update_aggre" and the other named "update_aggre_2". If I run these procedures using SQL jobs, the first procedure takes 12 seconds and the second takes 2 seconds. I am still surprised by that, since both of them have the same code. Any ideas please?
I also need a lock mechanism within a procedure, so that if someone calls a procedure which has already been called by someone else and is still working, the second caller should not be able to overwrite the existing data until the first one is finished.
TIA
Amit
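On the locking part, one option worth a look is an application lock, which serialises callers of the procedure without locking any data. A rough sketch (the resource name is arbitrary):

BEGIN TRAN
DECLARE @rc int
-- Wait up to 30 seconds for any other caller holding the lock to finish
EXEC @rc = sp_getapplock @Resource = 'update_aggre', @LockMode = 'Exclusive', @LockTimeout = 30000
IF @rc < 0
BEGIN
    ROLLBACK TRAN
    RAISERROR ('update_aggre is already running, try again later', 16, 1)
    RETURN
END
-- ...do the aggregation work here...
EXEC sp_releaseapplock @Resource = 'update_aggre'
COMMIT TRAN

With the default lock owner of Transaction, the lock is also released automatically if the transaction rolls back, so a failed run cannot leave the procedure blocked.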
View 3 Replies
View Related
Mar 20, 2000
We're upgrading to SQL 7 from 6.5 but having problems running jobs with the following SQL statement:
SQLMAINT.EXE -D master -BkUpDB e:dump -BkUpMedia DISK -DelBkUps 2 -Rpt d:salogmaster_dump.log
The error message is: [Microsoft SQL-DMO (ODBC SQLState: 28000)] Error 18456: [Microsoft][ODBC SQL Server Driver][SQL Server]Login failed for
user 'Domain\user'. Process Exit Code 1. The step failed.
The user is a domain account and is already in the local administrators group. It is also the account that starts up both SQL Server and SQL Server Agent. It can run the same job on another server. I have compared the setup and can't find any differences.
Thanks in advance for your help.
View 2 Replies
View Related
Nov 17, 2000
If I've created a job on my SQL Server and then scripted it, how do I run this script so as to create the job on another server?
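One way (assuming the generated script has been saved to a file): open it in Query Analyzer while connected to the other server and run it there, or run it from the command line with osql, e.g.

osql -S OtherServer -E -i D:\Scripts\MyJob.sql

Here OtherServer and the file path are placeholders; -E uses a trusted connection. The script creates the job in that server's msdb.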
View 2 Replies
View Related
Jan 18, 2001
Hi, I have a DTS package that runs correctly when executed manually, but when I schedule a job for the package, the job begins executing and never seems to complete the run. The package runs in less than 1 minute, but the job sticks on "executing" and does not complete, so I have to cancel it. Any ideas?
Thanks
View 1 Replies
View Related
Sep 6, 2000
I have 12 scheduled daily jobs, and all of the jobs depend on one job. I want to run the rest of the jobs if and only if Job #1 succeeds. Right now they all run independently. Is there a way to force the sequence by using SQL?
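One common approach (job names assumed): leave the 11 dependent jobs unscheduled and have Job #1 start them from a final T-SQL step that is only reached when the earlier steps succeed, e.g.

-- Last step of Job #1; the step's "On success"/"On failure" actions control whether it is reached
EXEC msdb.dbo.sp_start_job @job_name = 'Job 2'
EXEC msdb.dbo.sp_start_job @job_name = 'Job 3'
-- ...and so on for the remaining jobs

Note that sp_start_job returns as soon as the job is started; it does not wait for the started job to finish.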
Thanks!
View 4 Replies
View Related
Aug 30, 2000
If we have several jobs running several times daily, do we have to do any clean-up of the logs/job history so that they don't run into any problems later? Will it be OK if we do not clear the history frequently?
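By default SQL Server Agent caps the job history log (the "Maximum job history log size" setting in the Agent properties), so the history does not grow without limit on its own. If you do want to clear it for a particular job, msdb has a procedure for it; the job name below is just a placeholder:

EXEC msdb.dbo.sp_purge_jobhistory @job_name = 'Nightly backup'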
View 2 Replies
View Related
Feb 16, 2001
Hi all,
My firm has about 15 SQL Servers. 5 of them are still 6.5 and the other 10 are 7.0. I have noticed that the jobs I have set up to perform maintenance tasks, backups etc. almost never fail in 7.0. Similar tasks routinely fail on the 6.5 servers.
So I did a test. I deleted a database in 7.0 and ran a backup job against it. Although the log for the job said that it couldn't find the db, it still said the job was successful. In 6.5, if you try to run a job on a non-existent db, it fails. What's going on in 7.0? What has to happen for a SQL Server Agent job to fail? This seems like an awfully dangerous new "feature".
If anyone could shed light on this I would appreciate it.
Thanks
JJ
View 2 Replies
View Related
Mar 5, 2001
Hi,
How do we add DTS packages to a scheduled job?
For example, let's say I have 2 DTS packages and one has to run immediately after the other.
> Now I create a schedule for the main DTS package.
> Once I create a schedule, it goes into Jobs as a scheduled job.
> Now in the properties of the first DTS job I need to add the second DTS package as Step 2, specifying that it should run after the first DTS step.
How do we do this?
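One way to wire this up (server and package names assumed): give the job two Operating System Command (CmdExec) steps, each calling DTSRun, and set Step 1's "On success" action to go to Step 2.

Step 1 (CmdExec):  DTSRun /S MyServer /E /N "First DTS Package"
Step 2 (CmdExec):  DTSRun /S MyServer /E /N "Second DTS Package"

/S is the server, /E uses a trusted connection, and /N is the package name; use /U and /P instead of /E for SQL authentication.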
Thanks in Advance,
Siv
View 1 Replies
View Related
Jul 23, 2001
Does anyone know how I can set up some form of alert that tells me which scheduled jobs are still running?
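One simple option is to poll the Agent from a query window or from another job; for example, this lists the jobs whose current execution status is "executing":

EXEC msdb.dbo.sp_help_job @execution_status = 1

Wrapping that in a job of its own and notifying an operator when something has been running too long gives a basic long-running-job alert.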
View 1 Replies
View Related
Sep 21, 2001
We had some jobs in SQL Server 7.0 which ran together at the same time.
Now we have moved to SQL 2000, and when we run those jobs together they just sit there (keep on going without doing anything) and do nothing. I have monitored the server and there is no blocking or any kind of performance overload.
These jobs run fine when they are executed individually in SQL 2000.
Any kind of help will be appreciated.
Thanks,
James
View 5 Replies
View Related
Sep 29, 2004
Hey guys! Can someone tell me, what are jobs in MS SQL?
And what is a job scheduler? How does it function? What are the processes involved? Any additional information about this stuff would help too... thanks, I really need it.
P.S.
If you're wondering why I asked the questions here: it's because it's really hard to find the meaning of it on the net. I hope you guys can help me. Thanks.
View 2 Replies
View Related
Oct 5, 2004
Thanks for the help with moving jobs from one server to another.
Now that I have moved them, they all fail. I have copied the scripts, saved them as .vbs files, placed them in a folder on the server, and run them successfully there. I think it has something to do with rights, but I don't know what. Could someone shed some light on possible causes?
Thanks,
Lee
View 6 Replies
View Related
Dec 6, 2004
Hi SQL Gurus,
I am trying to run a job on multiple SQL Servers in a domain. I am setting up the job on my system, but the radio button "Target Multiple Servers" is disabled. How do I enable the radio button "Target Multiple Servers"?
Please let me know if this is possible.
Best Regards
SK.
View 1 Replies
View Related
Oct 31, 2005
http://www.dbforums.com/forumdisplay.php?f=246
View 6 Replies
View Related
May 13, 2004
What's the best way to backup jobs in SQL Server 2000?
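Two common options: script each job out from Enterprise Manager (right-click the job, All Tasks, Generate SQL Script), or back up msdb, since that is the database where jobs, steps and schedules are stored. For example (the path is a placeholder):

BACKUP DATABASE msdb TO DISK = 'E:\Backups\msdb.bak' WITH INIT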
View 3 Replies
View Related
Feb 5, 2007
I need some direction on SQL Jobs.
I can handle setting up a SQL job/DTS package.
What I have is a SQL statement I need to run that creates a txt file, renames the file to AUX_Export_020507.txt, then zips that text file and names it AUX_Export_020507.zip, and then moves the zip file to an FTP directory.
I would like to get an email after it runs as well.
The issue I am having is with the part that zips the file, renames the text file and zip file to add the current date on the end as above, and moves it to the FTP directory.
In the old days we would have created a batch file to do the move and rename.
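A rough sketch of one way to do the rename and move from a T-SQL job step via xp_cmdshell (all paths are placeholders, and the zip line assumes a command-line zip utility is installed, since Windows does not ship one):

DECLARE @stamp varchar(6), @cmd varchar(500)
-- Style 10 gives mm-dd-yy, so this produces e.g. '020507'
SET @stamp = REPLACE(CONVERT(varchar(8), GETDATE(), 10), '-', '')

SET @cmd = 'ren C:\Export\AUX_Export.txt AUX_Export_' + @stamp + '.txt'
EXEC master..xp_cmdshell @cmd

SET @cmd = 'zip C:\FTP\AUX_Export_' + @stamp + '.zip C:\Export\AUX_Export_' + @stamp + '.txt'
EXEC master..xp_cmdshell @cmd

For the email, the job itself can notify an operator on completion (job properties, Notifications), which avoids having to script the mail separately.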
Thanks
Michael Webb
View 5 Replies
View Related
Nov 9, 2007
Hi All,
I want to create a job which will fetch information from the system. I mean the job will fetch the following information about the disks available on the system:
1) Drive
2) Total size
3) Available space
4) % free space
Suppose I have 3 drives: D, H and I.
The information should look like this:
Drive  Total size  Available space  % free space
H      2 GB        1 GB             50%
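For the free-space part there is a ready-made call that returns the drive letter and the free megabytes for each fixed drive:

-- One row per fixed drive: drive letter and MB free
EXEC master..xp_fixeddrives

Total size and % free are not included in that output, so they would have to come from somewhere else (for example a WMI or VBScript step in the same job) before the percentages can be calculated.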
Can anyone please help me.
Regards,
Frozen
View 4 Replies
View Related
Nov 29, 2007
I have already learned SQL by reading the "Head First SQL" book (which uses MySQL). I also know how to use Excel (basic functions and creating pivot tables). However, I noticed that many available jobs require Microsoft SQL Server 2005 and also using SQL with Excel. Now I am thinking of learning SQL Server 2005 Express Edition because it is free! I would really appreciate it if anyone who has been working in industry could give me some suggestions about the best way to learn SQL Server 2005. Is it worth learning? Is it hard to learn? Can you recommend a good book or website for learning SQL Server 2005 Express Edition? Thanks!
View 3 Replies
View Related
Jan 9, 2008
How is it possible to schedule a job to run every 40 seconds?
It seems every 1 minute is the shortest interval available in the job properties for scheduling a job.
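One workaround is to schedule the job to start once (or whenever the Agent starts) and have its single T-SQL step loop itself; the procedure name here is a placeholder:

WHILE 1 = 1
BEGIN
    EXEC dbo.MyRepeatingTask   -- the work to repeat every 40 seconds (name assumed)
    WAITFOR DELAY '00:00:40'
END

The job then shows as permanently running, and the 40 seconds is the gap between executions rather than an exact start-to-start interval.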
Thanks
View 3 Replies
View Related
Feb 3, 2008
I have 2 jobs.
In the first job there are 2 steps:
in the 1st step it has to take the backup;
in the 2nd step it has to call the 2nd job.
In the 2nd job it has to restore the backup.
I don't know how to call another job in the second step, and one more thing: the restore is done on another SQL Server instance.
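For the call itself, the second step of the first job can start the other job through msdb. Because the restore job lives on the other instance, that instance needs to be reachable as a linked server with RPC Out enabled; the server and job names below are placeholders:

-- Step 2 of the first job: start the restore job on the other instance
EXEC [OTHERSERVER].msdb.dbo.sp_start_job @job_name = 'Restore latest backup'

The alternative is to give the restore job on the second instance its own schedule, which avoids needing the linked server at all.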
View 3 Replies
View Related
Feb 9, 2008
I have to create 2 jobs. The first job has 2 steps; in the last step of the first job, I have to call the 2nd job. How? Can anybody help me?
View 1 Replies
View Related