I'm trying to figure out why my transaction log backup is taking up to an hour to complete. I'm using the full recovery model, with a full database backup every Sunday, differential backups every Tuesday and Thursday, and log backups every 5 minutes. I would have thought the log backups would run much more quickly, since I'm taking them so often.
Here is my backup statement; I'm hoping I've got a wrong option in there that you can point out to me:
BACKUP LOG [xxxx]
TO [LogFilexxxxBackups]
WITH NOFORMAT, NOINIT, NAME = N'xxxx log backup', SKIP, NOUNLOAD, STATS = 10
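One thing worth checking before blaming the statement itself is whether the slow backups are actually large (index maintenance or bulk loads can inflate the log between backups) or merely slow to write. A minimal sketch against the msdb history tables, available on SQL Server 2005 and later; the database name is a placeholder:

-- Size and duration of the most recent log backups for one database
SELECT TOP (20)
    bs.backup_start_date,
    bs.backup_finish_date,
    DATEDIFF(SECOND, bs.backup_start_date, bs.backup_finish_date) AS duration_sec,
    bs.backup_size / 1048576.0 AS backup_size_mb
FROM msdb.dbo.backupset AS bs
WHERE bs.database_name = N'xxxx'   -- placeholder name
  AND bs.type = 'L'                -- 'L' = transaction log backup
ORDER BY bs.backup_start_date DESC

If the hour-long backups are also huge, the problem is log generation rather than the backup options; if they are small but slow, look at the I/O path to the backup device.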
IF EXISTS (SELECT name FROM sysobjects
           WHERE name = N'Fn_Get_Consensus_Curve_41_Data'
             AND type IN ('P', 'IF', 'TF', 'FN'))
    DROP FUNCTION [dbo].Fn_Get_Consensus_Curve_41_Data
GO

DECLARE @p_ENTITYID INT
DECLARE @p_CUSTOMERID INT
DECLARE @p_Login_Type INT
DECLARE @p_Result_Status INT

SET @p_Login_Type = (SELECT dbo.GET_USER_LOGIN_TYPE_ID(@p_UserID))

IF @p_Login_Type = 1 AND NOT (@p_CustId IS NULL OR @p_CustId = '')
    SET @p_Result_Status = 1
ELSE IF @p_Login_Type > 1
    SET @p_Result_Status = 2
ELSE
    SET @p_Result_Status = 0

IF @p_Result_Status > 0   -- user is valid and supplied enough parameters
BEGIN
    IF @p_Result_Status = 1   -- user is a trader and gave a customer id
    BEGIN
        DECLARE Cur_Fetch_Curve_Cust_Data CURSOR FOR
            SELECT DISTINCT CustomerId
            FROM PricesRR PRR
            WHERE CONVERT(NVARCHAR, MatchDate, 101) = CONVERT(NVARCHAR, @p_Match_Date, 101)
              AND Sector_Id = @p_Sector_Id
              AND Location_Code = @p_Location_Code
              AND CustomerID = @p_CustId
              --CustomerID <> 0
              --CustomerID not in (0, -1, -2, -3, -100, -200)
              AND CustomerId NOT IN (SELECT CustomerId FROM Fn_Get_PricesRR_Not_To_Include_Cust_Id('V'))
              AND ISNULL(PRR.Record_Last_Action, 'N') <> 'D'
              AND Version = dbo.GET_PRICESRR_MAX_VERSION(@p_Location_Code, @p_Sector_Id, @p_Match_Date, PRR.EntityID, @p_CustId, PRR.Date)

        DECLARE Cur_Fetch_Curve_Entity_Data CURSOR FOR
            SELECT DISTINCT EntityID
            FROM PricesRR PRR
            WHERE CONVERT(NVARCHAR, MatchDate, 101) = CONVERT(NVARCHAR, @p_Match_Date, 101)
              AND Sector_Id = @p_Sector_Id
              AND Location_Code = @p_Location_Code
              AND EntityId IN (SELECT DISTINCT Entity_Id
                               FROM Fn_Get_Allowed_Entity_List(@p_Location_Code, @p_Sector_Id, @p_Match_Date, @p_UserID))
              AND ISNULL(PRR.Record_Last_Action, 'N') <> 'D'
              AND Version = dbo.GET_PRICESRR_MAX_VERSION(@p_Location_Code, @p_Sector_Id, @p_Match_Date, PRR.EntityID, @p_CustId, PRR.Date)
    END
    ELSE IF @p_Result_Status = 2   -- user is higher than trader (broker or above)
    BEGIN
        DECLARE Cur_Fetch_Curve_Cust_Data CURSOR FOR
            SELECT DISTINCT CustomerId
            FROM PricesRR PRR
            WHERE CONVERT(NVARCHAR, MatchDate, 101) = CONVERT(NVARCHAR, @p_Match_Date, 101)
              AND Sector_Id = @p_Sector_Id
              AND Location_Code = @p_Location_Code
              --CustomerID <> 0
              --CustomerID not in (0, -1, -2, -3, -100, -200)
              AND CustomerId NOT IN (SELECT CustomerId FROM Fn_Get_PricesRR_Not_To_Include_Cust_Id('V'))
              AND ISNULL(PRR.Record_Last_Action, 'N') <> 'D'
              --AND Version = dbo.GET_PRICESRR_MAX_VERSION(@p_Location_Code, @p_Sector_Id, @p_Match_Date, PRR.EntityID, @p_CustId, PRR.Date)

        DECLARE Cur_Fetch_Curve_Entity_Data CURSOR FOR
            SELECT DISTINCT EntityID
            FROM PricesRR PRR
            WHERE CONVERT(NVARCHAR, MatchDate, 101) = CONVERT(NVARCHAR, @p_Match_Date, 101)
              AND Sector_Id = @p_Sector_Id
              AND Location_Code = @p_Location_Code
              AND ISNULL(PRR.Record_Last_Action, 'N') <> 'D'
              --AND Version = dbo.GET_PRICESRR_MAX_VERSION(@p_Location_Code, @p_Sector_Id, @p_Match_Date, PRR.EntityID, @p_CustId, PRR.Date)
    END

    DELETE FROM @Temp_Curve_Submission_Data

    -----------------------
    OPEN Cur_Fetch_Curve_Cust_Data
    FETCH NEXT FROM Cur_Fetch_Curve_Cust_Data INTO @p_CUSTOMERID
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -----------------------
        OPEN Cur_Fetch_Curve_Entity_Data
        FETCH NEXT FROM Cur_Fetch_Curve_Entity_Data INTO @p_ENTITYID
        WHILE @@FETCH_STATUS = 0
        BEGIN
I have a table with a column whose values look like '123 345 678 143 648'. What I need to do is take each code value and insert it as a new record in another table. If I run 'Select substring(column_name,1,3) from table' it is very fast (a fraction of a second). But since I need to take each code, and the number of codes in each record may vary, I am using a while loop, so I declared a variable @i and my select statement became 'Select substring(column_name,@i,3) from table'. Interestingly, this select statement now takes almost 2 minutes per iteration.
Why is it like this? Is there any way I can reduce the time taken to execute each iteration?
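A hedged sketch of a set-based alternative that avoids the loop entirely, assuming the codes are fixed-width (3 characters separated by single spaces) and using placeholder names (dbo.Numbers, dbo.SourceTable, dbo.TargetTable) for whatever the real objects are called:

-- One-time helper: a numbers table with values 1..2000
SELECT TOP (2000) IDENTITY(INT, 1, 1) AS n
INTO dbo.Numbers
FROM sys.all_objects a CROSS JOIN sys.all_objects b

-- Extract every 3-character code in a single set-based pass
INSERT INTO dbo.TargetTable (code)
SELECT SUBSTRING(s.column_name, n.n, 3)
FROM dbo.SourceTable AS s
JOIN dbo.Numbers AS n
  ON n.n <= LEN(s.column_name)
 AND (n.n - 1) % 4 = 0            -- codes start at positions 1, 5, 9, ...

The join replaces the per-iteration table scan with one scan in total, which is usually where those 2 minutes per iteration go.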
Hi, I have a task at hand to reduce the time taken for a search query to execute. The query fetches records which then have to be sorted by degrees away from the logged-in user. I have a function which calculates the degrees, but using it in the search query slows execution down to about 10 secs, which is unacceptable. Please advise. Your help is much appreciated. For more details please see: http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=97021 Thanks, Isfaar
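Without seeing the schema behind that link, the usual fix for this shape of problem is to stop calling the scalar function once per row inside the search query and instead materialize the degrees once, then join. A hedged sketch, where dbo.Users, dbo.SearchResults, dbo.fn_DegreesAway, and @LoggedInUserId are placeholders for the real names:

-- Compute degrees once per candidate user into a temp table...
SELECT u.UserId,
       dbo.fn_DegreesAway(@LoggedInUserId, u.UserId) AS Degrees
INTO #Degrees
FROM dbo.Users AS u

-- ...then let the search query join and sort on the precomputed value
SELECT r.*, d.Degrees
FROM dbo.SearchResults AS r
JOIN #Degrees AS d ON d.UserId = r.UserId
ORDER BY d.Degrees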
============================
ALTER PROC [dbo].[sp_CalculateMedianTimeInDepartmentMinutes]
    @StartDate date,
    @EndDate date
AS
--== Check if count is even or odd
DECLARE @modulo int
SELECT @modulo = (SELECT COUNT(*) % 2
                  FROM ED_data
                  WHERE AdmitDateTime BETWEEN @StartDate AND @EndDate)
--=== Get Median
[Code] ....
My fellow developer is using this code to calculate medians in many columns (see below). The problem is that it takes about 2 minutes to execute. Is there a way to reduce the execution time?
I am also attaching a sample of the view.
==============
ALTER PROCEDURE [dbo].[sp_ED_Measures]
    @StartDate date,
    @EndDate date,
    @Hospital varchar(5)
AS
BEGIN
    SET NOCOUNT ON;
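Coming back to the median itself: on SQL Server 2005 and later, the even/odd branch (the @modulo check) can be avoided entirely with a windowed rewrite, which also means only one pass over ED_data per measure. A minimal sketch of the idea to use inside the procedure, assuming a DepartDateTime column holds the departure time (the real column name may differ):

SELECT AVG(1.0 * x.MinutesInDept) AS MedianMinutes
FROM (
    SELECT DATEDIFF(MINUTE, AdmitDateTime, DepartDateTime) AS MinutesInDept,
           ROW_NUMBER() OVER (ORDER BY DATEDIFF(MINUTE, AdmitDateTime, DepartDateTime)) AS rn,
           COUNT(*) OVER () AS cnt
    FROM ED_data
    WHERE AdmitDateTime BETWEEN @StartDate AND @EndDate
) AS x
WHERE x.rn IN ((x.cnt + 1) / 2, (x.cnt + 2) / 2)   -- the middle row(s); both expressions hit the same row when cnt is odd

The AVG over one or two middle rows handles the even and odd cases in a single statement.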
I have a VB application which uses SQL Server as the database and Crystal Reports for reporting. We are using a stored procedure to create a report, and we pass an ID from the VB side to run the stored procedure. In Boston the report shows up in 4 seconds, but in California it takes 7 minutes. We have a very good network (T3). Why is it taking so much longer in California? Any ideas?
Dear Experts, I've one table named table11. This particular table has 30 columns and 40,000 rows of data, and it is taking 35 seconds for select * from table11.
It will definitely take even more time when I use it in places like procedures, functions, or views.
Where is the problem? Does it generally take that much time, or is there a problem somewhere?
Guidance please.....
Vinod. Even you learn 1%, learn it with 100% confidence.
Dear All, here I'm posting my query, which is taking 3 minutes to run.
Please suggest how to write it better.
SELECT DISTINCT
    INP.COLUMN001 REPORT_INPUT_ID,
    INP.COLUMN002 REPORT_ID,
    INP.COLUMN003 OPERATION_ID,
    OPER.COLUMN004 OPERATION_CODE,
    OPER.COLUMN005 OPERATION_NAME,
    INP.COLUMN004 ITEM_ID,
    CONVERT(NVARCHAR, INP.COLUMN005, 110) RECEIVED_DATE,
    INP.COLUMN006 LOT_NO,
    INP.COLUMN007 RECEIVED_QTY,
    INP.COLUMN008 CONSUMED_QTY,
    (SELECT CODE FROM view1 WHERE item_id = INP.COLUMN004) my_val,
    (SELECT NAME FROM view1 WHERE item_id = INP.COLUMN004) Item_Name,
    INP.COLUMN009 UOM_ID,
    U.UOM_CODE,
    INP.COLUMN010 BASE_RECEIVED_QTY,
    INP.COLUMN011 BASE_CONSUMED_QTY,
    CASE WHEN INP.COLUMN012 = '1' THEN 'Progress'
         WHEN INP.COLUMN012 = '2' THEN 'Closed' END OPERATION_STATUS,
    CASE WHEN INGDTL.COLUMN006 = '0' THEN 'Ingredient'
         WHEN INGDTL.COLUMN006 = '1' THEN 'Intermediate' END INPUT_TYPE,
    INP.COLUMNB01 COLUMNB01, INP.COLUMNB02 COLUMNB02, INP.COLUMNB03 COLUMNB03,
    INP.COLUMNB04 COLUMNB04, INP.COLUMNB05 COLUMNB05, INP.COLUMNB06 COLUMNB06,
    INP.COLUMNB07 COLUMNB07, INP.COLUMNB08 COLUMNB08, INP.COLUMNB09 COLUMNB09,
    INP.COLUMNB10 COLUMNB10,
    INP.COLUMND01 BRANCHID, INP.COLUMND02 COMPANYID, INP.COLUMND03 CREATEDBY,
    INP.COLUMND04 CREATEDDATE, INP.COLUMND05 LASTUPDATEDBY, INP.COLUMND06 LASTUPDATEDDATE,
    INP.COLUMND07 ROWGUID, INP.COLUMND08 UPDATEDSITE, INP.COLUMND09 LANGID,
    WC.COLUMN009 WIP_WAREHOUSE_ID,
    (SELECT SUM(WIP.COLUMN011) - SUM(wip.column010)
     FROM TABLE066 WIP
     WHERE wip.column008 = INP.column004
       AND WIP.COLUMN005 = '8cd741c7-1ac6-4839-88e7-df85518170f1'
       AND wip.column006 = inp.column003) WIP_Qty,
    WIPM.Column005 WIP_ITEM_ID
FROM TABLE073 INP
LEFT JOIN view1 I ON I.ITEM_ID = INP.COLUMN004
LEFT JOIN view2 U ON U.UOM_ID = INP.COLUMN009
LEFT JOIN TABLE022 OPER ON OPER.COLUMN001 = INP.COLUMN003
LEFT JOIN TABLE066 WIP ON WIP.column008 = INP.column004
LEFT JOIN TABLE015 WC ON WC.COLUMN001 = OPER.COLUMN008
LEFT JOIN TABLE040 INGDTL ON INGDTL.COLUMN002 = INP.COLUMN004 AND WIP.column008 = INGDTL.COLUMN002
LEFT JOIN TABLE065 WIPM ON WIPM.column005 = INP.column004
WHERE INP.COLUMN002 = '057f87aa-7884-43fa-8984-9b74c971da62'
ORDER BY my_val
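One low-risk change worth trying: the two correlated subqueries against view1 (my_val and Item_Name) probe the view once per output row, even though view1 is already LEFT JOINed as alias I on the same key. Assuming view1 returns one row per item_id (if it does not, the DISTINCT results could change), the subqueries can be replaced with the joined alias, sketched here in reduced form:

SELECT INP.COLUMN001 REPORT_INPUT_ID,
       I.CODE my_val,       -- replaces (SELECT CODE FROM view1 WHERE item_id = INP.COLUMN004)
       I.NAME Item_Name     -- replaces (SELECT NAME FROM view1 WHERE item_id = INP.COLUMN004)
FROM TABLE073 INP
LEFT JOIN view1 I ON I.ITEM_ID = INP.COLUMN004
WHERE INP.COLUMN002 = '057f87aa-7884-43fa-8984-9b74c971da62'

The correlated WIP_Qty subquery on TABLE066 is a similar candidate for a pre-aggregated join.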
I am having a serious problem with our SQL Server backup and need some help.
Basically it has started to take ages (as in 48 hrs+), when it should only take about 4 hrs. The database is only 380 GB, and since Monday our backups have not been completing. When I check Activity Monitor I can see that the 'BACKUP DATABASE' process is suspended with a huge wait time, and the wait type is ASYNC_IO_COMPLETION.
I am not sure how to solve this, but I am going to have to!
So if anyone has any ideas, please help me! If you need any other info, please let me know.
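ASYNC_IO_COMPLETION means the backup is waiting on the I/O subsystem, so it is worth watching the backup's progress and estimated finish while it runs. A minimal sketch, assuming SQL Server 2005 or later (the post mentions Activity Monitor):

-- Progress and current wait of any running backup
SELECT r.session_id,
       r.command,
       r.percent_complete,
       r.wait_type,
       r.estimated_completion_time / 60000 AS est_minutes_remaining
FROM sys.dm_exec_requests AS r
WHERE r.command LIKE 'BACKUP%'

If percent_complete is crawling, the bottleneck is almost certainly the disk or network path to the backup destination rather than the database itself.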
Hi, I want to take a backup of the solution created on the reporting server. Do I have to take backups of the individual reports, or can I take a backup of the whole solution? Can you please tell me how to take the backup?
I have a stored procedure that takes 23 seconds when I run it manually. I scheduled a job to run the same stored procedure, and it takes 33 minutes. Any ideas why it takes so much more time?
I have SQL Server 2005 installed on my machine, and I am running the following query to insert 1500 records into a simple table that has just one column.
DECLARE @i int
SET @i = 0
WHILE (@i < 1500)
BEGIN
    INSERT INTO test2 VALUES (@i)
    SET @i = @i + 1
END
Here is the table definition:
CREATE TABLE [dbo].[test2]( [c1] [int] NULL ) ON [PRIMARY]   -- column name [c1] assumed; it was missing from the original statement
Now the problem with this is that on one of my servers this query takes just 500 ms to run, while on my production and other test servers it takes more than 25 seconds. The same is true of updates. I have checked the configurations of both servers and found them to be the same, and there are no indexes defined on either table. I was wondering what the possible reason for this could be. If any of you have pointers, that would be really useful.
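One possible explanation: each single-row INSERT runs in its own autocommit transaction, and every commit forces a synchronous flush of the log to disk, so this loop mostly measures log-disk write latency, which can differ wildly between machines whose configurations are otherwise identical. A quick experiment worth trying is wrapping the loop in one explicit transaction so all 1500 rows share a single commit:

DECLARE @i int
SET @i = 0

BEGIN TRANSACTION   -- one commit, and therefore one synchronous log flush, for all 1500 rows
WHILE (@i < 1500)
BEGIN
    INSERT INTO test2 VALUES (@i)
    SET @i = @i + 1
END
COMMIT TRANSACTION

If the slow servers suddenly match the fast one, the difference is in the disks (for example write caching enabled on one machine but not the others), not in SQL Server.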
I have a query which returns approximately 50,000 records. I am using a linked server to connect to two databases and retrieve data. For some reason it takes a little more than an hour to execute from the application, whereas in an MS SQL Server query window the first results come back after a few minutes, though the query still runs for a long time.
How can I expedite my query execution process?
Environment details
Database: MS SQL Server 2005, 64-bit. MS SQL jar file: sqljdbc_1.2.jar. OS: Windows on both server and client.
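With linked servers, a common culprit is the optimizer pulling entire remote tables across the wire and joining them locally. One hedged thing to try is pushing the remote part of the query to the remote server with OPENQUERY; the linked server, database, and table names below are placeholders:

-- The remote SELECT runs entirely on the linked server; only its result crosses the wire
SELECT l.*, r.*
FROM dbo.LocalTable AS l
JOIN OPENQUERY([RemoteServer],
               'SELECT key_col, other_col FROM remote_db.dbo.RemoteTable') AS r
    ON r.key_col = l.key_col

Comparing the two execution plans (and the row counts shipped over the link) should show whether remote-query pushdown is the issue.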
I'm running a query (see below) on my development server and it's taking around 45 seconds. It hosts 18 user databases ranging from 3 MB to 400 MB. The production server, which is very similar but hosts only one 25 MB user database, runs the query in less than 1 second. Both servers have been running on VMware for almost 1 year with no problems. However, last week I applied SP2 to the development server, and yesterday I applied Critical Update KB934458. The production server is still running SQL Server 2005 Standard SP1. Other than that, both servers are identical and running Windows Server 2003 Standard SP1. I'm not seeing this discrepancy with other queries running against user databases.
use MyDatabase
GO
select db_name(database_id) as 'Database', o.name as 'Table',
s.index_id, index_type_desc, alloc_unit_type_desc, index_level, i.name as 'Index Name',
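The query cuts off here, but the columns (index_type_desc, alloc_unit_type_desc, index_level) look like the output of sys.dm_db_index_physical_stats. If that is what it is built on, the scan mode matters a lot: passing NULL for the database, or using 'DETAILED' mode, makes the function read pages from every index it covers, so its runtime scales with the amount of data, which would fit 18 databases of up to 400 MB taking 45 seconds while a single 25 MB database takes under a second. A hedged reconstruction of how such a query is typically completed, scoped to the current database:

SELECT db_name(s.database_id) AS 'Database', o.name AS 'Table',
       s.index_id, s.index_type_desc, s.alloc_unit_type_desc, s.index_level,
       i.name AS 'Index Name', s.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s   -- 'LIMITED' reads far fewer pages than 'DETAILED'
JOIN sys.objects AS o
    ON o.object_id = s.object_id
JOIN sys.indexes AS i
    ON i.object_id = s.object_id AND i.index_id = s.index_id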
I have a DTS package in SQL Server 2000 where I import some data from a remote server (SQL 2000) to a local server (SQL 2000). This works fine; it takes at most 1 minute to execute the package.
Now I have created the same package in SQL Server 2005 (SSIS), where the source server is SQL Server 2000 and the destination server is 2005, and I created the SSIS package on that server with the same logic I used in SQL 2000. But here it takes almost 10 minutes to execute the package.
Whereas the same package in SQL Server 2000 takes at most 1 minute. Why is this happening? Is there any configuration needed to execute the SSIS package?
I have an SSIS package where a small table of 270 rows is fuzzy-looked-up against a table on another SQL Server and the records are inserted into a temporary table. This takes more than 3 hours in debug mode and never goes beyond this step. I have used an OLE DB destination to insert into the temporary table, and the temporary table never gets a value.
I want to create an index on a hash table (#TEMPJOIN2) to reduce the update query's run time, but I am getting a warning: "The maximum key length is 900 bytes. The index 'R5IDX_TMP' has maximum length of 1013 bytes. For some combination of large values, the insert/update operation will fail." What is the right way to create an index on a temporary table?
The update query runs (without the index) for 6 hours 30 minutes. My aim is to reduce the run time by creating the index.
I am also not sure whether putting more columns in the index will create issues.
I have attached the update query and the index query.
CREATE NONCLUSTERED INDEX [R5IDX_TMP] ON #TEMPJOIN2
(
    [PART] ASC,
    [ORG] ASC,
    [SPLRNAME] ASC,
    [REPITEM] ASC,
    [RFQ] ASC,
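The 900-byte warning means the combined key columns can exceed SQL Server's index key size limit. A common way around it, sketched here with the same column names, is to keep only the columns the update actually seeks on in the key and carry the wide column (SPLRNAME looks like the likely offender) as an INCLUDEd non-key column, since included columns do not count against the 900-byte key limit (SQL Server 2005 and later):

-- Narrow key for seeking; the wide column rides along as a non-key INCLUDE
CREATE NONCLUSTERED INDEX [R5IDX_TMP] ON #TEMPJOIN2
(
    [PART] ASC,
    [ORG] ASC,
    [REPITEM] ASC,
    [RFQ] ASC
)
INCLUDE ([SPLRNAME])

Whether SPLRNAME belongs in the key or the INCLUDE depends on the update's WHERE and JOIN clauses, so treat this as a shape to adapt, not a drop-in replacement.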
In order to take automated backups of all user databases, below is the query. This query eliminates the need for manual backups of user databases; to fully automate it, just create a SQL Agent job, put this query in the job, and forget about taking any manual DB backups.
DECLARE @name VARCHAR(50)      -- database name
DECLARE @path VARCHAR(256)     -- path for backup files
DECLARE @fileName VARCHAR(256) -- filename for backup
DECLARE @fileDate VARCHAR(20)  -- used for file name
SET @path = 'C:\DB_BKPUP\'
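The snippet stops after the declarations. For completeness, here is a hedged sketch of how the rest of such a script typically looks on SQL Server 2005 or later (a cursor over the user databases driving BACKUP DATABASE), in the same style as the declarations above:

SET @fileDate = CONVERT(VARCHAR(20), GETDATE(), 112)   -- yyyymmdd

DECLARE db_cursor CURSOR FOR
    SELECT name FROM sys.databases
    WHERE name NOT IN ('master', 'model', 'msdb', 'tempdb')   -- user databases only

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @name
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @fileName = @path + @name + '_' + @fileDate + '.BAK'
    BACKUP DATABASE @name TO DISK = @fileName
    FETCH NEXT FROM db_cursor INTO @name
END

CLOSE db_cursor
DEALLOCATE db_cursor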
Sometimes it is necessary to stop MSSQLServerOLAPServices to do a full backup of my OLAP Server disks. After the backup has finished I try to start MSSQLServerOLAPServices again, but it takes almost 30 minutes for the service to start. What could be causing that?
I have a stored procedure that normally takes about 5 hours to complete:

DELETE tblX WHERE PROC_DT < dateadd(day, -93, getdate())
tblX has about 55 million records and has an index on PROC_DT.
I have this running as a scheduled task. Over the weekend the task executed, and it is still running 56+ hours later. Does anybody have any ideas as to where I should look for the problem? I am afraid to kill the process because of the rollback time.
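One pattern that avoids both the marathon run and the scary rollback is deleting in small batches, so each chunk commits (and can be interrupted) on its own while the index on PROC_DT keeps each pass cheap. A minimal sketch using SQL Server 2005 syntax (on SQL 2000 the same idea works with SET ROWCOUNT):

-- Delete in 10,000-row chunks; each iteration is its own short transaction
DECLARE @rows INT
SET @rows = 1

WHILE @rows > 0
BEGIN
    DELETE TOP (10000) FROM tblX
    WHERE PROC_DT < DATEADD(DAY, -93, GETDATE())

    SET @rows = @@ROWCOUNT
END

Killing a batched loop only ever rolls back the current 10,000 rows, not 56 hours of work.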
Hi, I have a table with 48 million rows. When I executed the following update query it took 10 HOURS on SQL Server 2000 with SP1, whereas when I executed the same query on SQL Server 7.0 against the same table it took 13 MINUTES. As for the machines, the SQL 2000 server has more processors and more memory than the SQL 7.0 machine. It looks strange, but it is true. Has anyone faced such a problem? Is there a bug in SQL 2000?
Here is the query:
UPDATE cus_pay_jan_dist
SET univ_regdate = b.dayid
FROM cus_pay_jan_dist a WITH (NOLOCK), tm_dayids b WITH (NOLOCK)
WHERE a.univ_regdate = b.dayidnum
  AND a.univ_regdate LIKE '2001%'
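One difference that can bite between 7.0 and 2000 is the plan chosen when LIKE forces a string conversion on what appears to be a numeric column (univ_regdate joins to dayidnum). If the column really holds numeric yyyymmdd-style values, which is an assumption, a sargable range is worth comparing against the LIKE:

-- Assumes univ_regdate holds numeric yyyymmdd values, so a closed range replaces LIKE '2001%'
UPDATE a
SET univ_regdate = b.dayid
FROM cus_pay_jan_dist AS a
JOIN tm_dayids AS b
    ON b.dayidnum = a.univ_regdate
WHERE a.univ_regdate >= 20010101
  AND a.univ_regdate <= 20011231

Comparing the two execution plans on the 2000 box should show whether the conversion is what changed between versions.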
I have one stored procedure and it takes 10 minutes to execute. The stored procedure has 7 input parameters and one temp table (I load the temp table using the input parameters), and I use SET NOCOUNT ON. But if I copy the whole body of the SP and execute it as a regular SQL statement in Query Analyzer, I get the result in 4 seconds. I am really puzzled by this.
What could be the reason the SP takes so much longer than the plain query? Unfortunately I can't post the code here.
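A classic cause of exactly this gap is parameter sniffing: the stored procedure's cached plan was compiled for whatever parameter values it saw first, while the ad-hoc copy compiles fresh for the current values. Without seeing the code, a hedged sketch of the usual workaround is to copy each parameter into a local variable, so the optimizer estimates from average density instead of the sniffed value (the procedure, table, and column names here are hypothetical):

CREATE PROCEDURE dbo.MyProc (@p1 INT)   -- stands in for one of the 7 real parameters
AS
BEGIN
    SET NOCOUNT ON

    DECLARE @local1 INT
    SET @local1 = @p1          -- the optimizer can no longer "sniff" @p1

    SELECT *
    FROM dbo.SomeTable
    WHERE SomeColumn = @local1
END

Running sp_recompile on the procedure is a quick way to test the theory: if the next run is fast, the cached plan was the problem.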
There are two applications running on different servers, say ServerA and ServerB. Both applications use the same database server, SQL Server 2005, on ServerB. Call the applications ApplicationA and ApplicationB, after the servers they run on.
This means that for ServerA the database is remote, while for ServerB the database is local.
Both applications are Java applications and use a datasource to connect to the database. The driver used is the SQL Server 2000 driver (which consists of 3 jars). You might ask why the 2000 driver is used against 2005: the application on ServerA gets a "driver not proper" error when using the SQL Server 2005 driver.
Problem Area:
When ApplicationB (local to the database) is doing some DB operations (a select followed by a batch insert), ApplicationA (remote) tries to insert a record, and that insert takes too long (around 40 sec). This causes a timeout in ApplicationA.
ApplicationA is inserting data into the same table from which ApplicationB is selecting.
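That pattern looks more like lock blocking than anything network-related: ApplicationB's transaction holds locks on the table while ApplicationA's insert waits. A quick way to confirm, run on the SQL Server 2005 box while the 40-second insert is hanging:

-- Who is blocked, by whom, and on what
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,        -- milliseconds spent waiting so far
       r.command
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0

If ApplicationA's session shows up with ApplicationB's session as the blocker, the fix is on the transaction side (shorter transactions or smaller batches in ApplicationB), not in the driver.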
Does the validation time of a DTSX package depend on the amount of data?
I have a data flow task which contains an OLE DB source reading from a SQL Server 2005 view. The data then goes through 3 Lookup transformations before going into another view, also on SQL Server 2005 but in a different database. The purpose of this package is to load fact data, so it has to deal with a few million rows. Before setting DelayValidation to true, it took a few minutes just to open the package in BI Studio. Now, with DelayValidation set to true, I can open the package without any problem, but it takes more than 5 minutes in the Validation, Prepare for Execution, and Pre-execution phases. During that time the memory usage and CPU time on the SQL Server go up significantly, though the CPU doesn't hit 100%. My client machine doesn't show any significant activity.
I have similar packages (OLE DB view -> 3 or 4 Lookup transformations -> OLE DB table) but dealing with dimension data. With DelayValidation set to false, those packages open in BI Studio within a few seconds. They also take only a few seconds in those 3 phases before the actual execution phase starts.
So I have the impression that the validation time depends on the amount of data in the database. Shouldn't it depend only on the metadata?
I have an update statement in my SSIS package that used to take 10 minutes as a SQL 2000 DTS package and now takes 1 hour 15 minutes in SQL 2005.
This is my SQL update statement:

UPDATE WeeklySalesHistory
SET weekendingdate = (SELECT LastTransDateTime FROM ReplicationControl WHERE TableName = 'WEEKHST')
WHERE weekendingdate IS NULL
It uses an OLE DB connection. It updates about 36,000 records.
I have read that OLE DB can be slow and that one should use a staging table. Does that mean that for all updates like this I have to use a staging table and then insert? I didn't have to do this in SQL 2000. Has it changed? Are there any other options?
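The staging-table advice is mostly aimed at the OLE DB Command transformation, which fires its statement once per row in the pipeline; a single set-based UPDATE in an Execute SQL Task should behave much like the old DTS step did. Since this statement assigns the same single value to every NULL row anyway, a hedged sketch of the set-based form (same tables, no per-row work):

-- One set-based statement instead of a row-by-row component
DECLARE @lastTrans DATETIME

SELECT @lastTrans = LastTransDateTime
FROM ReplicationControl
WHERE TableName = 'WEEKHST'

UPDATE WeeklySalesHistory
SET weekendingdate = @lastTrans
WHERE weekendingdate IS NULL

If the update must happen mid-data-flow, then yes, the usual pattern is to land the rows in a staging table with an OLE DB destination and run one UPDATE ... FROM join afterwards.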
I'm using an OLE DB Command component to perform an update (a SQL statement), but the process takes about 9 hours. I changed it to a stored procedure, but it was the same. I need to update about a million rows, and the package is very simple.
How can I improve the time? Can I use another component or strategy?