I made a website in ASP.NET using SQL Server 2005 as the database. Some of the processing works on big data and takes a long time (about 20 minutes). It works fine on my dev box, but when I put it on shared hosting and some people access it, it crashes and the website can no longer be accessed.
Hosting support told me maybe I need to reprogram my code.
So does anybody have a solution for this problem? Should I create a new thread?
I had a database of electronic resources which had 28,000 records earlier and was working fine. Now we have added a whole bunch more to make it 800K records, which has increased the search time to 14-22 seconds; that is not acceptable. I have all the tables indexed.
Please help me solve this problem. Let me know what other information I should put up here to make my problem understandable. Thanks in advance, Archana
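One thing that would make a question like this answerable is the query itself plus its measured cost. STATISTICS IO/TIME gives both; the table and search below are hypothetical stand-ins for the real ones:

SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- hypothetical search; substitute the real query
SELECT resource_id, title
FROM dbo.resources
WHERE title LIKE 'data%';    -- 'data%' can use an index seek; '%data%' cannot

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;

If the search uses a leading-wildcard LIKE (or wraps the indexed column in a function), the indexes cannot seek and all 800K rows get scanned, which would explain the 14-22 seconds.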
I'm trying to rebuild my master database for SQL Server 2000. The rebuild process started fine, but it is almost 4 hours since it got started. I am performing it on a test system. I got doubtful and started the same on another test system; the issue is the same, and it has been almost 2 hours. The DB size is less than 100 MB in both cases. Is this normal? I've tried the same on SQL Server 2005 and it finished in a couple of minutes. Please advise.
Hi, I have two linked SQL Servers and I am trying to get remote writes working correctly (fast). I have configured the DB link on both machines to point at each other's DB. I have security set up to map each other's server logins, and the Server Options Collation Compatible, Data Access, RPC, RPC Out, and Use Remote Collation are all checked. My problem is that when an SP performs

Begin Transaction
Update Local Table
Update Remote Table
Commit Tran

it takes several seconds to complete (about 7 seconds, not acceptable to us). This is due to the remote update. How can I improve the response time? Here is an example of a stored procedure that takes time, where ACSMSM is a remote (linked) SQL Server:

create procedure [psm].ams_Update_VFE
    @strResult varchar(8) = 'Failure' output,
    @strErrorDesc varchar(512) = 'SP Not Executed' output,
    @strVFEID varchar(16),
    @strDescription varchar(64),
    @strVFEVirtualRoot varchar(255),
    @strVFEPhysicalRoot varchar(255),
    @strAuditPath varchar(255),
    @strDefaultBranding varchar(16),
    @strIPAddress varchar(23)
as
declare @strStep varchar(32)
declare @trancount int
declare @err int

set XACT_ABORT on
set @trancount = @@trancount
set @strStep = 'Start of Stored Proc'

if (@trancount = 0)
    begin transaction mytran
else
    save tran mytran

/* start insert sp code here */
set @strStep = 'Write VFE to MSM'

update ACSMSM.msmprim.msm.VFECONFIG
set DESCRIPTION = @strDescription,
    VFEVIRTUALROOT = @strVFEVirtualRoot,
    VFEPHYSICALROOT = @strVFEPhysicalRoot,
    AUDITPATH = @strAuditPath,
    DEFAULTBRANDING = @strDefaultBranding,
    IPADDRESS = @strIPAddress
where VFEID = @strVFEID;

set @strStep = 'Write VFE to PSM'

update ACSPSM.psmprim.psm.VFECONFIG
set DESCRIPTION = @strDescription,
    VFEVIRTUALROOT = @strVFEVirtualRoot,
    VFEPHYSICALROOT = @strVFEPhysicalRoot,
    AUDITPATH = @strAuditPath,
    DEFAULTBRANDING = @strDefaultBranding,
    IPADDRESS = @strIPAddress
where VFEID = @strVFEID
/* end insert sp code here */

set @err = @@error  -- capture @@error immediately; it is reset by the next statement

if (@err <> 0)
begin
    rollback tran mytran
    set @strResult = 'Failure'
    set @strErrorDesc = 'Fail @ Step : ' + @strStep + ' Error : ' + cast(@err as varchar(10))
    return -1969
end
else
begin
    set @strResult = 'Success'
    set @strErrorDesc = ''
end

-- commit tran if we started it
if (@trancount = 0)
    commit tran
return 0
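A common suggestion for this pattern (a sketch, not a confirmed fix for this configuration): a distributed UPDATE through a linked server can be much slower than an RPC call, because the local server may pull rows across the wire to evaluate the WHERE clause. Wrapping the remote write in a small stored procedure on ACSMSM and calling it remotely pushes the whole update to the remote side. The procedure name ams_Update_VFE_MSM below is hypothetical:

-- on the remote server ACSMSM (hypothetical procedure)
create procedure msm.ams_Update_VFE_MSM
    @strVFEID varchar(16),
    @strDescription varchar(64),
    @strVFEVirtualRoot varchar(255),
    @strVFEPhysicalRoot varchar(255),
    @strAuditPath varchar(255),
    @strDefaultBranding varchar(16),
    @strIPAddress varchar(23)
as
update VFECONFIG
set DESCRIPTION = @strDescription,
    VFEVIRTUALROOT = @strVFEVirtualRoot,
    VFEPHYSICALROOT = @strVFEPhysicalRoot,
    AUDITPATH = @strAuditPath,
    DEFAULTBRANDING = @strDefaultBranding,
    IPADDRESS = @strIPAddress
where VFEID = @strVFEID

-- on the local server: one RPC call replaces the distributed UPDATE
-- (RPC Out is already enabled on the link, so this should be allowed)
exec ACSMSM.msmprim.msm.ams_Update_VFE_MSM
    @strVFEID, @strDescription, @strVFEVirtualRoot, @strVFEPhysicalRoot,
    @strAuditPath, @strDefaultBranding, @strIPAddress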
I am using VS2005 (VB) to develop a PPC WM5.0 program, and I am using SQLCE 3.0. My PPC hardware runs at 400 MHz.
The question is why, each time the program is started, the first record inserted into the sdf database takes a long time. Does anyone know why, and how can I fix it?
I load the whole database into a dataset when the program starts, do all the Insert/Update/Delete operations in this dataset, and fill the changes back into the database after each action.
cn.Open()
sda = New SqlCeDataAdapter(SQL, cn)  ' SQL = "Select * From Table"
scb = New SqlCeCommandBuilder(sda)
sda.Update(dataset)
cn.Close()
I checked sda.Update(); it normally takes about 0.08 s to fill one record into the database. But:
1. Start the PPC Program
2. Load DB into dataset
3. Create ONE new record in the dataset
4. Fill back to DB
Whenever I take these four steps, the fill time is almost 1 s or even more!
Actually, 0.08 s is just the normal case. Sometimes it still takes over 1 s to fill back a dataset into which only one record was inserted while the program is running. (Even when all inserted records are exactly the same in data, just different in the integer key.)
However, when I give up the dataset and use the following code:
cn.Open()
Dim cmd As New SqlCeCommand(SQL, cn)  ' the insert SQL was built beforehand: Insert Into Table Values (... all fields)
cmd.ExecuteNonQuery()  ' (completion assumed: execute the insert and close)
cn.Close()
I found that the first insert still takes more time, but only about 0.2 s, and the normal insert time is around 0.02 s. It is 4 times faster!!!
I have an MS Time Series model using a database of over a thousand products, each of which has hundreds of cases. It amazingly takes only a few minutes to finish processing the model, but when I click Mining Model Viewer to view the models, it takes many hours to show up. Once the window is open, I can choose the model for different products almost instantly. Is this normal?
I am looking at building multiple SSIS packages. There will be some similarities, but flexibility is of highest importance. The main packages will need to connect to SQL Server1 as a source and SQL Server2 as a destination to transfer over dimension data from multiple databases. (Other SSIS packages may need to use SQL Server2 as a source and SQL Server1 as a destination.)
For a single dimension table containing column dim_id on the target server (SQLServer2), I need to take the results of the following SQL and insert them into SQLServer2.database.dim_table:
select dim_id from SQLServer1.database08.dim_table
union
select dim_id from SQLServer1.database07.dim_table
union
select dim_id from SQLServer1.database06.dim_table
Next year the names of the databases on SQLServer1 will be database09, database08, and database07!
So far my best thought is creating views in my destination SQL Server, so I need some way of dropping and recreating the views. Previously in DTS I would expect to see a SQL Server connection that I could use as both source and destination; now I can see a SQL Server destination but not a source? Also, how do I just use SSIS to run some SQL, i.e. execute a stored procedure or drop and create views?
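For the "just run some SQL" part, an Execute SQL Task in SSIS can run an arbitrary batch against the destination connection. A sketch, assuming SQLServer1 is set up as a linked server on the destination; the view name vw_dim_ids and the dbo schema are assumptions, and EXEC() is used because CREATE VIEW must otherwise be the only statement in its batch:

IF OBJECT_ID('dbo.vw_dim_ids', 'V') IS NOT NULL
    DROP VIEW dbo.vw_dim_ids;

EXEC('CREATE VIEW dbo.vw_dim_ids AS
    SELECT dim_id FROM SQLServer1.database08.dbo.dim_table
    UNION
    SELECT dim_id FROM SQLServer1.database07.dbo.dim_table
    UNION
    SELECT dim_id FROM SQLServer1.database06.dbo.dim_table');

When the database names roll over next year, only this one batch needs editing, or the statement can be assembled from an SSIS variable via an expression. The same task type runs stored procedures too.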
Many thanks, Ells. P.S. Flexibility is the key; in the last three months all the IP addresses and server names have changed more than once, so this needs to be as flexible as possible.
I'm populating a new table based on information in an existing table. The new table is a list of all "items" and contains a primary key. The old table is a database of receipts where items can appear many times in any order.
I have put together the off-the-shelf components to do this, using a lookup transformation to see if the item is already in the new table. Problem is, because there's so much repetition in the old table I need to process the old table one row at a time. Batch processing is generating errors because the lookup doesn't detect duplicates within the buffer.
I tried setting the "DefaultBufferMaxRows" property of the task to 1, but that doesn't seem to have any effect.
To get the data from the old table, I'm using an OLE DB source. To get the data into the new table, I'm using the OLE DB Command transformation with parameters to execute an INSERT statement.
This is a job I have to do exactly once, so I don't care if I have to run it overnight. I'd rather have a simple, easy to understand but inefficient script so I understand what it's doing completely.
Any help on forcing SSIS to process one row at a time?
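Given that this is a run-once job where clarity beats speed, a set-based alternative may sidestep the one-row-at-a-time problem entirely: let the database dedup in a single INSERT. A sketch; every table and column name here is a hypothetical stand-in for the real schema:

-- dbo.items: the new list of unique items (primary key generated there)
-- dbo.receipts: the old table where items repeat freely
INSERT INTO dbo.items (item_name)
SELECT DISTINCT r.item_name
FROM dbo.receipts AS r
WHERE NOT EXISTS (SELECT 1 FROM dbo.items AS i
                  WHERE i.item_name = r.item_name);

DISTINCT collapses the repetition inside the source, and NOT EXISTS skips anything already loaded, so the statement is safe to rerun; an Execute SQL Task can carry it if it has to live inside the package.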
I am verifying my reports' processing times. I get the information from the Reporting Services DB, the [ExecutionLogs] table. I have the following information:
[TimeEnd] - time that report generation ends.
[TimeStart] - time that report generation starts.
[TimeDataRetrieval] - amount of time spent running the data sources.
[TimeProcessing] - time spent processing the report.
[TimeRendering] - time spent generating the output format.
If this information is correct, the following statement should be true: (TimeEnd - TimeStart) = TimeDataRetrieval + TimeProcessing + TimeRendering.
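A quick way to test that statement against the log itself; a sketch, assuming the three component columns are stored in milliseconds (the shipped view is usually named ExecutionLog, so the name below may need adjusting to match the [ExecutionLogs] table above):

SELECT TimeStart,
       TimeEnd,
       DATEDIFF(ms, TimeStart, TimeEnd) AS ElapsedMs,
       TimeDataRetrieval + TimeProcessing + TimeRendering AS ComponentSumMs
FROM dbo.ExecutionLog;

Rows where ElapsedMs and ComponentSumMs diverge badly would show where the premise breaks down.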
Is there anybody out there who can help me find the processing time taken for one transaction using Query Analyzer?
1) For example, I want to run an update in the Analyzer and I would like to know the time taken to do this update (see the timing sketch after these questions).
2) How can I reduce the processing time of stored procedures that use cursors? I have added some COMMIT statements to my update statement. Are there any other ways?
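For question 1, Query Analyzer will report per-statement timings if statistics are switched on first; a minimal sketch with a hypothetical table and filter:

SET STATISTICS TIME ON;

UPDATE dbo.MyTable          -- hypothetical; substitute the real update
SET some_col = some_col + 1
WHERE some_key = 42;

SET STATISTICS TIME OFF;

The Messages tab then prints "SQL Server Execution Times: CPU time = ... ms, elapsed time = ... ms" for the statement.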
For a particular report, execution sometimes fails and sometimes succeeds. When querying the execution log it is apparent that the processing time is exceeding the timeout period settings on the report server. However, what is not clear is why processing is taking all the time and none of it is data retrieval; the report just displays data from a stored procedure. Can someone interpret the following data from the execution log table:
Hi, cube processing is taking more time on a new server while the same cubes take less time on another server. The cubes are processed through a DTS package. Can anybody help find the possible reasons for this? Regards, Naseem
I am running into a barrier and need to understand the average length of time that a fully optimized data cube should take to process.
We are currently averaging 15 to 20 minutes per cube, with an average of 2000 aggregations, a 25% performance increase, and approximately 2 million rows, with around 40 dimensions and 30 measures.
I personally think this is a pretty good processing time. However, I am being challenged to reduce this time frame; in theory I can't possibly see it getting below where we currently are. So I am reaching out to the group of gurus...
What is your average length of time to process your data cubes? Please respond to me at ken.kolk@medcor.com; I would greatly appreciate it and need the averages from the field.
I have a stored procedure that is called from a VB.NET application and takes an enormously long time to execute. In Query Analyzer it takes only 10 seconds, but from the application it takes ages. The stored procedure is as follows:
The procedure name is SPTOPTWENTYUSERS:
SELECT TOP 20 STRUSERNAME, SUM(INTBYTESRECVD) AS INTDOWNLOAD
FROM TBLISAWEBLOGS
WHERE DTELOGDATE BETWEEN @BEGINDATE AND @ENDDATE
GROUP BY STRUSERNAME
ORDER BY INTDOWNLOAD DESC
The code that runs it is as follows:
sSQLString = "SPTOPTWENTYUSERS"  ' the procedure name must be a string literal
Using cnn As New SqlConnection(GetPath)
    Try
        Dim cmd As New SqlCommand(sSQLString, cnn)
        Dim dr As SqlDataReader

        With cmd
            .CommandType = CommandType.StoredProcedure
            .CommandTimeout = 0
            .Parameters.Add("@BEGINDATE", SqlDbType.DateTime)
            .Parameters.Add("@ENDDATE", SqlDbType.DateTime)
            .Parameters("@BEGINDATE").Value = dtpStartDate.Value
            .Parameters("@ENDDATE").Value = dtpEndDate.Value
        End With
        cnn.Open()
        dr = cmd.ExecuteReader()
        ' ... reader consumed here (snippet truncated in the original post) ...
    Catch ex As Exception
        Throw  ' (completion assumed: Try requires a Catch or Finally)
    End Try
End Using
Any help on why this happens would be much appreciated.
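"Fast in Query Analyzer, slow from the application" is the classic symptom of parameter sniffing (or of differing SET options such as ARITHABORT between the two connections). A hedged sketch of the usual local-variable workaround, assuming sniffing is the cause here:

ALTER PROCEDURE SPTOPTWENTYUSERS
    @BEGINDATE datetime,
    @ENDDATE datetime
AS
    -- copying the parameters into local variables hides them from the
    -- optimizer, so the plan is built for the general case rather than
    -- whatever values happened to be sniffed on first execution
    DECLARE @StartDt datetime, @EndDt datetime
    SET @StartDt = @BEGINDATE
    SET @EndDt = @ENDDATE

    SELECT TOP 20 STRUSERNAME, SUM(INTBYTESRECVD) AS INTDOWNLOAD
    FROM TBLISAWEBLOGS
    WHERE DTELOGDATE BETWEEN @StartDt AND @EndDt
    GROUP BY STRUSERNAME
    ORDER BY INTDOWNLOAD DESC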
Hi there, we have developed an application in VB connected to SQL Server 6.5, with some stored procedures that bring the data from SQL Server 6.5. This application has been running for some months. It usually takes only one minute to generate the report, but since a couple of days ago it has been taking 25 minutes; even when I run the stored procedure at the back end in Query Analyzer on the server, it takes 15-20 minutes to give the result. Can anyone help me identify the problem? What are all the things I need to check to identify it?
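A sudden slowdown with no code change often points at stale optimizer statistics or a stale cached plan rather than the application. A hedged first check on SQL Server 6.5 (the table and procedure names below are hypothetical):

-- refresh distribution statistics so the optimizer sees current data volumes
UPDATE STATISTICS tblReportData

-- force the report procedure to compile a fresh plan on its next execution
EXEC sp_recompile 'usp_GenerateReport'

If the report speeds back up afterwards, scheduling regular UPDATE STATISTICS runs is the usual follow-up.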
Problem: I schedule a job that calls a stored procedure which loads around 1.5 million records. The job takes 19 hours to complete. However, if I run that stored procedure manually in Query Analyser, it takes only 45 minutes.
Has anyone faced this problem? Is this a known problem? Any suggestions/recommendations?
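While the job is crawling, it is worth checking whether its session is blocked or waiting rather than working; a minimal sketch (the SPID is a placeholder to fill in from the first result):

EXEC sp_who2
-- find the job's SPID in the output; the BlkBy column shows who is blocking it
-- then confirm what that session is actually running:
-- DBCC INPUTBUFFER(<spid>)

The job also runs under the SQL Agent service account, so a different execution context or default database from your Query Analyser session is another classic thing to compare.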
I have a CTE query that is used to fill in NULLs on a history table. The WITH statement executes just fine, sub 2 seconds on 974 records; however, the main query is what's turning the whole thing into a turtle. I know the looping it does there causes the slowdown, but I'm just not sure how to fix it. I've tried inserting into a temp table and refactored the code a hundred times, but nothing seems to work.
Code is below and the execution plan is attached. Server Version: 12.0.2342.0 Enterprise: 64bit
;WITH BuildTable AS (
    SELECT [GEGTH].[ID]
         , [GEGTH].[Changed By]
         , CAST([dbo].[GetWeekStarting]([GEGTH].[Changed Date], 2) AS DATE) AS WeekOf
         , [GEGT].[Title]
When I log in using QA to my SQL Server database, it takes 15-20 seconds to establish a connection and open a query window. Drilling into a database via Enterprise Manager is similar. Once the connection is established, the server runs plenty fast, however. Can someone tell me why it could take a long time for a connection to be established? This behavior occurs when I am local on the box. Thanks, John
Hi, I've a strange problem with an INSERT query: it's taking a long time to execute. The format is like this:

INSERT INTO table1
SELECT ..
FROM table2

Executing the SELECT .. FROM table2 takes 30 seconds. The result is nothing: no records are selected. When I include the INSERT part it takes 12 hours to complete:

INSERT INTO table1
SELECT ..
FROM table2

There is an index on the table, and when I delete it, the problem is still there. Keh? Greetz, Hennie
I have a query which returns approximately 50,000 records; I am using a linked server to connect to two databases and retrieve data. For some reason it is taking a little more than an hour to execute the query from the application, but in the MS SQL Server query window results come back after a few minutes, though the query still runs for a long time.
How can I expedite my query execution?
Environment details:

Database: MS SQL Server 2005 64-bit
MS SQL jar file: sqljdbc_1.2.jar
OS: Windows on both server and client
I'm running a query (see below) on my development server and it's taking around 45 seconds. The server hosts 18 user databases ranging from 3 MB to 400 MB. The production server, which is very similar but with only one 25 MB user database, runs the query in less than 1 second. Both servers have been running on VMware for almost a year with no problems. However, last week I applied SP2 to the development server, and yesterday I applied Critical Update KB934458. The production server is still running SQL Server 2005 Standard SP1. Other than that, both servers are identical and run Windows 2003 Server Standard SP1. I'm not seeing this discrepancy with other queries running against user databases.
use MyDatabase
GO
select db_name(database_id) as 'Database', o.name as 'Table',
    s.index_id, index_type_desc, alloc_unit_type_desc, index_level, i.name as 'Index Name'
-- the post cuts off here; these columns come from sys.dm_db_index_physical_stats,
-- so a plausible completion (an assumption) is:
from sys.dm_db_index_physical_stats(db_id(), NULL, NULL, NULL, 'DETAILED') as s
join sys.objects as o on o.object_id = s.object_id
join sys.indexes as i on i.object_id = s.object_id and i.index_id = s.index_id
If I use the following query for a dataset, the execution takes a few seconds to show results:
SELECT *
FROM dbo.ICParameter
WHERE (PatientID = '99010200101') AND (LogTime > DATEADD(day, -1, GETDATE()))
ORDER BY LogTime
If I replace '99010200101' with @ID and enter '99010200101' when prompted for the ID, the execution takes forever. Actually I have never gotten any results, even after waiting for 10 minutes.
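"Fast with a literal, slow with a parameter" is the textbook parameter-sniffing symptom, and an implicit type conversion between @ID and the PatientID column can cause the same thing. A hedged sketch of one fix, assuming SQL Server 2005 or later; OPTION (RECOMPILE) compiles a fresh plan for each actual @ID value:

SELECT *
FROM dbo.ICParameter
WHERE (PatientID = @ID) AND (LogTime > DATEADD(day, -1, GETDATE()))
ORDER BY LogTime
OPTION (RECOMPILE);

It is also worth declaring @ID with exactly the column's type (e.g. varchar vs nvarchar), since a mismatch can silently turn an index seek into a scan.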
It seems inserting records takes a relatively long time. My guess is that it needs time to allocate disk space for the extra room needed. Assuming this is true, are there any DB settings that allow auto space allocation in bigger chunks? I am looking for something like a "DB growth factor" or "table growth factor".
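SQL Server has no per-table growth factor; space is allocated at the database-file level, and that is where the knob lives. A sketch with hypothetical database and logical file names:

-- grow the data file in 500 MB chunks instead of small increments
ALTER DATABASE MyDatabase
MODIFY FILE (NAME = MyDatabase_Data, FILEGROWTH = 500MB);

Pre-sizing the file with SIZE = ... ahead of a large load avoids growth pauses altogether.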
Dear friends, our package at the time of normal execution takes 2-2:30 minutes to fetch data over VPN with some SELECT queries, but sometimes its job runs for hours and hours. What could be the exact reason behind it? I guess its queries get stuck somewhere, but not when we run it from BIDS or run the job manually. Please help. Thanks.
I am designing a project using C#, SQL Express and VSTO.
Every time I change something in the database XSD designer, i.e. add a query, create a new table adapter, configure an existing query, or just move the position of an existing table adapter, Visual Studio locks itself into a never-ending loop with 100% processor activity (or 50% with dual processors). It stays in this state for about 4-5 hours and then comes back to life.
Is there an option that I need to enable/disable to stop the program from doing this? I would appreciate your help.
SELECT @WantedDate = CAST(CAST(YEAR(GETDATE()) AS nvarchar) + '-' + CAST(1 AS nvarchar) + '-' + CAST(@myNum AS nvarchar) AS SMALLDATETIME) -- @myNum is the user-supplied parameter value
Sometimes it is necessary to stop MSSQLServerOLAPService to do a full backup of my OLAP Server disks. After the backup has finished I try to start MSSQLServerOLAPService again, but it takes almost 30 minutes for the service to start. What could be causing that?
I have a stored procedure that normally takes about 5 hours to complete:

DELETE tblX WHERE PROC_DT < DATEADD(day, -93, GETDATE())
tblX has about 55 million records and has an index on PROC_DT.
I have this running as a scheduled task. Over the weekend, the task executed and it is still running 56+ hours later. Does anybody have any ideas as to where I should look for the problem? I am afraid to kill the process because of the rollback time.
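For what it is worth, the usual way to keep a purge like this manageable is deleting in small batches, so each transaction (and any rollback) stays short. A sketch assuming SQL Server 2005 or later; on 2000 the same idea is expressed with SET ROWCOUNT instead of DELETE TOP:

-- purge in batches; 10000 is an assumed batch size to tune
WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM tblX
    WHERE PROC_DT < DATEADD(day, -93, GETDATE());

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to purge
END

Each batch commits on its own, so stopping the job mid-run only rolls back the current batch.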
Hi, I have a table with 48 million rows. When I executed the following update query it took 10 HOURS on SQL Server 2000 with SP1, whereas when I executed the same query against the same table on SQL Server 7.0 it took 13 MINUTES. As for the machines, the SQL 2000 server has more processors and more memory than the SQL 7.0 machine. It looks strange but it is true. Has anyone faced such a problem? Is there a bug in SQL 2000?
Here is the query:
update cus_pay_jan_dist
set univ_regdate = b.dayid
from cus_pay_jan_dist a with (nolock), tm_dayids b with (nolock)
where a.univ_regdate = b.dayidnum
  and a.univ_regdate like '2001%'
I use MS Access with tables linked to MS SQL Server tables. There are indexes defined (they are visible in MS Access too). Sometimes DLookup functions take a long time. Why?