Sp_fetch Executed A Lot Of Times, Decreasing Performance - ASP + SQL 2005
May 23, 2008
Hey,
In an ASP application, which used Access and is now being migrated to SQL 2005, I have a medium-sized query with many inner joins. I optimized the tables with indexes, etc., and if I run the query in SSMS it returns about 55 rows in less than a second. On the other hand, when the ASP page passes the query to SQL 2005, it takes 10 seconds to get the data. Using the Profiler I found that sp_fetch is being executed a lot of times. It looks like the query is decomposed into smaller pieces and the rows are selected one by one. I'm using OleDb.
How do I make ADO (the culprit, in my opinion) execute that query at once?
Thank you.
View 3 Replies
Jul 4, 2006
Hi,
We are processing 6,000,000 rows (a 2 GB flat file) and loading them into database tables using OLE DB Destination components. In the data pipeline of an SSIS package we have 1 flat file source reader, 7 lookup components (full cache mode), 1 multicast component and 2 OLE DB destinations with the fast load option.
We have observed that the first 1,000,000 rows are processed and loaded into the target tables in just 4 minutes. The second set of 1,000,000 rows is processed in 15 minutes. After this, SSIS takes approximately 8-10 minutes to process each 100,000 rows. We are not able to identify the reason for this unexpected behaviour of SSIS.
We thought that because the input file is 2 GB, SSIS was not able to manage it and was slowing down over the course of execution. We split the big input file into 60 small files of about 37 MB each. Then we modified the package by adding a For-Each Loop task to process all 60 small files and load them into the database server sequentially. Even with this approach, data loading slowed down drastically after processing 13 files.
To verify whether the problem was with reading the source file or with the transformations, we replaced the OLE DB destination components with Flat File destinations. With flat file destinations the processing rate is very constant: the package processes 1,000,000 rows every 8 minutes and writes them to the destination files. So there is no problem with either the lookup components or the flat file source reader.
We are sure that the target database server is in the same state/condition from the start to the end of package execution. The client box on which we are running the package has 1 GB of RAM. During package execution the CPU usage is at 30% and PF usage is 580 MB. SP1 is also installed on both client and server.
Does anyone have a clue what is causing the data load to slow down over the course of package execution?
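One hedged way to narrow this down is to watch, from the database side, what the loading session is waiting on while the package runs. If the wait type points at the transaction log (WRITELOG) or at index/constraint maintenance on the target tables, that usually explains a load that starts fast and then degrades. A minimal diagnostic sketch using the 2005 DMVs, run on the destination server during the load:
-- Show what each active user request is doing and waiting on
SELECT r.session_id, s.program_name, r.command,
       r.wait_type, r.wait_time, r.last_wait_type, r.writes
FROM sys.dm_exec_requests AS r
JOIN sys.dm_exec_sessions AS s ON s.session_id = r.session_id
WHERE s.is_user_process = 1;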
View 3 Replies
View Related
Feb 21, 2006
Using dm_exec_query_stats in 2005, I know I can get the number of executions for a particular sql_handle, but is it possible to get the number of executions for a SQL statement in version 7 or 2000? Also, is it possible to get reads/writes/etc. in these earlier versions? Thanks.
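For reference, a hedged sketch of the 2005 side of this. In SQL 7/2000 there is no equivalent DMV, so the usual fallback there is a SQL Trace/Profiler capture of SQL:BatchCompleted / SP:Completed events, which do include reads and writes per event:
-- SQL 2005: execution counts and I/O for each cached statement
SELECT TOP 20
       qs.execution_count,
       qs.total_logical_reads,
       qs.total_logical_writes,
       qs.total_worker_time,
       st.text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.execution_count DESC;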
View 1 Replies
View Related
Dec 15, 1999
One particular SQL stored procedure executes 25 times more slowly when invoked by a SQL Agent job than when executed directly. Any suggestions?
View 4 Replies
View Related
Sep 18, 2007
I have also posted this in microsoft.public.sqlserver.programming.
I have a query which, depending on where I run it from, will either take 10 milliseconds or 10 seconds.
The query works perfectly when run in SQL Server Management Studio... in my database of around 70,000 items it returns the results in around 10ms. It uses all my indexes and indexed views correctly.
However when I run the identical query from my ASP.NET application, it takes around 10 seconds... 1000 times longer.
Looking at it in SQL Server Profiler I can't see any difference in the query, except that from ASP.NET it needs 62,531 reads and from SSMS it needs only 318 reads. If I copy the slow-running ASP.NET query from the Profiler into SSMS, it runs quickly again. The results returned are the same.
I have provided more details of the query below, but I guess my real question is: What is the best way to debug this? I'm not an expert with SQL Server, so any pointers on where I should start looking to find the difference in how the query is being executed would be a great help.
The query is of the form:
WITH RowPost AS
(
SELECT
ROW_NUMBER() OVER(ORDER BY DateCreated DESC) AS Row,
ItemId,
Title,
....
FROM
Items_View WITH(NOEXPAND)
WHERE ItemX >= @minX AND ItemX <= @maxX AND ItemY >= @minY AND ItemY <= @maxY
)
SELECT
*,
(SELECT Count(*) FROM RowPost) AS [Count]
FROM RowPost
WHERE Row >= @minRow AND Row < @maxRow
Where Items_View is an indexed view, and WITH(NOEXPAND) is being used to force it to use the indexed view (this is optimal). The line beginning "SELECT Count(*)" is to get the total number of results (without having to run the inner query a second time).
This is running against SQL Server Developer Edition.
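A minimal debugging sketch for this symptom, on the assumption (common for SSMS-fast/ASP.NET-slow cases) that the two connections use different SET options, most often ARITHABORT, and therefore get different cached plans for the same parameterized query. The session IDs below are placeholders for the ones seen in Profiler:
-- Compare the SET options of the SSMS session and the ASP.NET session
SELECT session_id, program_name, arithabort, ansi_nulls, quoted_identifier
FROM sys.dm_exec_sessions
WHERE session_id IN (51, 52);

-- To reproduce the slow behaviour inside SSMS, match the client's options:
SET ARITHABORT OFF;   -- ADO.NET connections default to OFF; SSMS defaults to ON
-- ...then re-run the query and compare reads and the actual execution plan.
If the plans do differ, parameter sniffing on @minX/@maxX etc. is the usual root cause, and adding OPTION (RECOMPILE) to the statement is one way to test that theory.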
View 5 Replies
View Related
Oct 26, 2007
Hi all,
In my project, I have a website and through that, I run my reports. But the reports take a lot of time to render. When I checked the profiler, it showed that the SP for the report is run around 4-5 times. Due to this, the report rendering takes a lot of time.
When I ran the SP with the same set of parameters in Query Analyzer, it ran in around 18 seconds. But when I ran the report from the web interface, it took around 3 minutes to completely show the data. And the SP was run 5 times.
I am having serious problems with the report's performance because of this. Many times the report just times out. I have set the timeout to 10 minutes. And because the SP is run 5 times, the report times out if there is a huge amount of data.
Any help would be appreciated.
Thanks in advance.
Swati
View 5 Replies
View Related
Feb 13, 2006
Is there any information around what the SSIS packages are doing in the first 5-10 seconds of execution, and ways to speed this process up?
View 3 Replies
View Related
Mar 13, 2002
I created a DTS package a while ago and placed it in a job to run once a day (it worked fine for 3 months).
Two days ago I changed the sa password and now the job fails with the error (Login failed for user 'sa'.), but it runs fine from DTS!
1. My DTS package was created with the domain account Domain\SVCSQL2000 (sa rights and local admin)
2. The SQL service uses Domain\SVCSQL2000 to run
3. The SQL Agent service uses Domain\SVCSQL2000 to run
4. The DTS package uses 'osql -E'
Where should I look for the reference to sa?
Executed as user: MONTREAL\svcsql2000. DTSRun: Loading... Error: -2147217843 (80040E4D); Provider Error: 18456 (4818) Error string: Login failed for user 'sa'. Error source: Microsoft OLE DB Provider for SQL Server Help file: Help context: 0. Process Exit Code 1. The step failed.
View 5 Replies
View Related
Nov 27, 2001
How does one make tempdb smaller? When I try to reduce the allocated size it won't let me. It ballooned during a process; I have tried to shrink the database in Enterprise Manager and tried DBCC SHRINKDATABASE in Query Analyzer, but to no avail!
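A minimal sketch of the usual approach, assuming the default logical file names tempdev and templog and arbitrary target sizes. Note that shrinking tempdb only works reliably when nothing else is actively using it, and that restarting the SQL Server service recreates tempdb at its configured initial size anyway:
USE tempdb
GO
DBCC SHRINKFILE (tempdev, 100)   -- target data file size in MB
DBCC SHRINKFILE (templog, 50)    -- target log file size in MB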
View 1 Replies
View Related
Sep 25, 2007
We have a large number of clients attempting to replicate two publications on 2005 Express databases (2 publications subscribed to the one subscriber database) with our 2005 Server (9.00.3042.00 SP2 Standard Edition) and experiencing two significant problems:
1) Users experience the following message:
The Merge Agent failed after detecting that retention-based metadata cleanup has deleted metadata at the Subscriber for changes not yet sent to the Publisher. You must reinitialize the subscription (without upload).
This problem should not apparently occur with SQL Server 2005 (or 2005 Express) instances with SP2 applied. All clients experiencing this problem have SP2 installed as does our Server and the retention period is 30 days. The subscribers have been replicating well under that.
2) Replications never succeed after appearing to replicate/loop around for hours
This issue is the most critical, as we have clients who have been installed and re-installed with new instances of SQL Server 2005 Express, new empty databases (on the subscriber before snapshot extraction), and fresh snapshots (less than a few hours old), and who still cannot successfully replicate.
Interestingly, there is at least one case where several computers are subscribed to and successfully replicating the same database as another computer on which replication refuses to succeed.
To test we have taken a republished database from another 2005 Server which is working fine and restored it to the same server as the one holding the database with which we are experiencing problems and subscribed to it. This test worked fine and replication of both publications went through fast and repeatedly without showing any signs of problem.
This indicates that the problem is perhaps data related as it appears localised to that database.
Below are two screenshots which may assist.
Screenshot 1 shows that on the server side the replication attempts look like they are succeeding, despite the fact that the subscriber end does not indicate success. Also, the history indicates that the subscription has spent all its time initialising and not merging any changes.
Screenshot 2 shows a rogue process which appears on many of the problem-child subscribers. It shows a process running with no end time, even though the job indicates failure in the message and even though other replication attempts appear to have succeeded after it. This process stays in the history showing that it is running even when I can find no corresponding process for it.
Can anyone suggest a further course of action/further testing/further information required which may assist?
This is extremely urgent and any assistance would be greatly appreciated!
Thanks in advance!
Scott
View 5 Replies
View Related
Sep 12, 2005
I have a large database currently using up 5 GB of space. I deleted some records from a table and the free space increased. So the question is: if I delete some of the data from this table, will the database size decrease? Is there anything else I can do to reduce the size of the database?
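Deleting rows frees pages inside the data file, but the file itself does not get smaller on its own; the free space has to be released explicitly. A hedged sketch (the database name is a placeholder):
-- See how much of the allocated space is actually used vs. free
EXEC sp_spaceused;

-- Return the free space to the operating system, leaving about 10% free
DBCC SHRINKDATABASE (MyDatabase, 10);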
Thank you so very much for any help with this!
Regards Mr. Mumin
Thanks All,
View 5 Replies
View Related
Jul 20, 2005
Hi,
If we consider the following scenario: a school with 500 students. On average a student takes [x] courses per year, and [y] tests per course.
Table of grades:
+------------+-----------+---------+-------+
| student_id | course_id | term_id | grade |
+------------+-----------+---------+-------+
|            |           |         |       |
+------------+-----------+---------+-------+
|            |           |         |       |
+------------+-----------+---------+-------+
On average, x = 20 and y = 6, so the number of records = 500 * 20 * 6 = 60,000 records per year.
Is this design correct considering the number of records generated per year? Is it important to reduce the "number of records" or not?
Thank you.
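A minimal sketch of the table as described (column types are assumptions, and with several tests per course a test identifier or date column would be needed to make rows unique). At tens of thousands of rows per year this is a small table for SQL Server, so the row count by itself is not a concern as long as the common lookups are indexed:
CREATE TABLE grades (
    student_id int          NOT NULL,
    course_id  int          NOT NULL,
    term_id    int          NOT NULL,
    grade      decimal(5,2) NOT NULL
);

-- Index the most common lookup pattern, e.g. all grades for one student
CREATE INDEX IX_grades_student ON grades (student_id, course_id);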
View 1 Replies
View Related
May 26, 2008
I just wonder whether there is any indicator or system parameter that can tell whether stored procedure A is being executed from Query Analyzer or from the application itself, so that if execution comes from Query Analyzer I can block it from being executed or from retrieving sensitive data.
What I want to do is block someone from executing the stored procedure in Query Analyzer and retrieving its sensitive results.
Stored procedure A has been granted execute permission for the public user, but inside the application an access-denied message is shown if a particular user has no rights to use the system, even if they know the public user name and password, because there is a second layer of user validation inside the application.
However, inside Query Analyzer there is no way to control execution of stored procedure A if the user knows the public user name and password.
Looking forward to replies from the experts here. Thanks in advance.
Note: Hope my explanation here clearly describes my current problem.
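For what it's worth, a common approach is to check APP_NAME() inside the procedure and refuse to run when the call does not come from the expected application. This is only a speed bump, not real security, because the application name is supplied by the client connection string and can be spoofed; the procedure and application names below are placeholders:
CREATE PROCEDURE dbo.ProcedureA
AS
BEGIN
    -- Reject calls that do not come from the expected application
    IF APP_NAME() <> N'MySystemApplication'
    BEGIN
        RAISERROR('Access denied.', 16, 1)
        RETURN
    END

    -- ...the original sensitive SELECT would go here...
    SELECT 1 AS placeholder
END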
View 4 Replies
View Related
Jul 27, 2004
I have a table in my database with almost 45 columns, and the row size is 10,468 bytes. Most of the columns have varchar datatypes, and I think that because of poor knowledge of the data most of the varchar columns were given a larger length than needed. Now I want to decrease the size of those columns so that the row size comes down to around 8K bytes. If I do this now, does it affect table performance much? In fact, can I do this at all, since there is a lot of data (almost 2 million rows) in the table? If it is possible, is there anything to take care of before changing the column lengths?
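A hedged sketch of the check and the change itself (table and column names are placeholders). The ALTER will fail rather than truncate if any existing value is longer than the new length, so checking the real maximum first tells you how far each column can safely be reduced:
-- Longest value actually stored in the column
SELECT MAX(LEN(SomeColumn)) AS max_len FROM dbo.BigTable;

-- Shrink the declared length (state NULL/NOT NULL explicitly so it is not changed by accident)
ALTER TABLE dbo.BigTable ALTER COLUMN SomeColumn varchar(100) NULL;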
Thanks.
View 2 Replies
View Related
Apr 4, 2007
Hi friend!
I want to schedule a query or procedure to run regularly at a given time, e.g. at 12 AM daily. Please tell me how to do that in SQL Server 2005.
We use SQL Server 2005 Developer Edition.
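The usual tool for this is a SQL Server Agent job. A minimal sketch using the msdb scheduling procedures (job, database and procedure names are placeholders); the New Job dialog in Management Studio builds the same thing interactively:
USE msdb;
GO
EXEC sp_add_job         @job_name = N'Nightly proc';
EXEC sp_add_jobstep     @job_name = N'Nightly proc',
                        @step_name = N'Run procedure',
                        @subsystem = N'TSQL',
                        @database_name = N'MyDatabase',
                        @command = N'EXEC dbo.MyNightlyProc;';
EXEC sp_add_jobschedule @job_name = N'Nightly proc',
                        @name = N'Daily at midnight',
                        @freq_type = 4,           -- daily
                        @freq_interval = 1,       -- every 1 day
                        @active_start_time = 0;   -- 00:00:00 (12 AM)
EXEC sp_add_jobserver   @job_name = N'Nightly proc';  -- run on the local server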
Thanks in advance.
View 1 Replies
View Related
Feb 14, 2008
Hi all,
I have a Matrix report (SQL 2005 SP2) which uses a stored procedure to retrieve the result set. When I preview the report or view it in IE, the SP gets executed three times instead of just once. Does anybody know about this? Is this a bug in Reporting Services?
It is slowing down the report considerably.
Any help would be appreciated!
Thanks
SS
- I used the SQL Profiler SP:Starting event to track this
View 2 Replies
View Related
Aug 19, 2007
Hi,
I installed VS 2005, but the optional component SQL Express is failing. The log says "SQL Express installation stopped due to unexpected errors". Can someone help me work out how to resolve this issue?
Regards,
-Aazad.
View 3 Replies
View Related
Mar 16, 2008
This is my 10th attempt. I've uninstalled all the .net components and then reinstalled .net 2.0 and then various versions of SQL Server 2005 Express. Here's what all 10 error logs said:
Microsoft SQL Server 2005 Setup beginning at Sun Mar 16 18:26:46 2008
Process ID : 3808
c:\fc0a616bb7e15d80b498fce92a\setup.exe Version: 2005.90.3042.0
Running: LoadResourcesAction at: 2008/2/16 18:26:45
Complete: LoadResourcesAction at: 2008/2/16 18:26:45, returned true
Running: ParseBootstrapOptionsAction at: 2008/2/16 18:26:45
Loaded DLL: c:\fc0a616bb7e15d80b498fce92a\xmlrw.dll Version: 2.0.3609.0
Complete: ParseBootstrapOptionsAction at: 2008/2/16 18:26:46, returned false
Error: Action "ParseBootstrapOptionsAction" failed during execution. Error information reported during run:
Could not parse command line due to datastore exception.
Source File Name: utillib\persisthelpers.cpp
Compiler Timestamp: Wed Jun 14 16:30:14 2006
Function Name: writeEncryptedString
Source Line Number: 124
----------------------------------------------------------
writeEncryptedString() failed
Source File Name: utillib\persisthelpers.cpp
Compiler Timestamp: Wed Jun 14 16:30:14 2006
Function Name: writeEncryptedString
Source Line Number: 123
----------------------------------------------------------
Error Code: 0x80070002 (2)
Windows Error Text: The system cannot find the file specified.
Source File Name: cryptohelper\cryptsameusersamemachine.cpp
Compiler Timestamp: Wed Jun 14 16:28:04 2006
Function Name: sqls::CryptSameUserSameMachine::ProtectData
Source Line Number: 50
2
Could not skip Component update due to datastore exception.
Source File Name: datastore\cachedpropertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:27:59 2006
Function Name: CachedPropertyCollection::findProperty
Source Line Number: 130
----------------------------------------------------------
Failed to find property "InstallMediaPath" {"SetupBootstrapOptionsScope", "", "3808"} in cache
Source File Name: datastore\propertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:28:01 2006
Function Name: SetupBootstrapOptionsScope.InstallMediaPath
Source Line Number: 44
----------------------------------------------------------
No collector registered for scope: "SetupBootstrapOptionsScope"
Running: ValidateWinNTAction at: 2008/2/16 18:26:46
Complete: ValidateWinNTAction at: 2008/2/16 18:26:46, returned true
Running: ValidateMinOSAction at: 2008/2/16 18:26:46
Complete: ValidateMinOSAction at: 2008/2/16 18:26:46, returned true
Running: PerformSCCAction at: 2008/2/16 18:26:46
Complete: PerformSCCAction at: 2008/2/16 18:26:46, returned true
Running: ActivateLoggingAction at: 2008/2/16 18:26:46
Error: Action "ActivateLoggingAction" threw an exception during execution. Error information reported during run:
Datastore exception while trying to write logging properties.
Source File Name: datastore\cachedpropertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:27:59 2006
Function Name: CachedPropertyCollection::findProperty
Source Line Number: 130
----------------------------------------------------------
Failed to find property "primaryLogFiles" {"SetupStateScope", "", ""} in cache
Source File Name: datastore\propertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:28:01 2006
Function Name: SetupStateScope.primaryLogFiles
Source Line Number: 44
----------------------------------------------------------
No collector registered for scope: "SetupStateScope"
00DFCFC0 Unable to proceed with setup, there was a command line parsing error. : 2
Error Code: 0x80070002 (2)
Windows Error Text: The system cannot find the file specified.
Source File Name: datastore\propertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:28:01 2006
Function Name: SetupBootstrapOptionsScope.InstallMediaPath
Source Line Number: 44
Class not registered.
Failed to create CAB file due to datastore exception
Source File Name: datastore\cachedpropertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:27:59 2006
Function Name: CachedPropertyCollection::findProperty
Source Line Number: 130
----------------------------------------------------------
Failed to find property "HostSetup" {"SetupBootstrapOptionsScope", "", "3808"} in cache
Source File Name: datastore\propertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:28:01 2006
Function Name: SetupBootstrapOptionsScope.HostSetup
Source Line Number: 44
----------------------------------------------------------
No collector registered for scope: "SetupBootstrapOptionsScope"
Message pump returning: 2
I'm a student and need this to work for projects. Any help will be greatly appreciated.
Bruce
View 6 Replies
View Related
Jan 9, 2008
I have a cell in one of my reports that displays both a begin time and an end time. The problem is that the times are stored in military time and I need to display them in AM/PM format.
Currently I am doing the following:
CASE
WHEN DATEPART(HH, start_time) < 13 THEN RIGHT(CAST(100 + DATEPART(HH, start_time) AS CHAR(3)), 2)
ELSE CAST(DATEPART(HH, start_time) - 12 AS CHAR(2)) END + ':' + RIGHT(CAST(100 + DATEPART(MI, start_time) AS CHAR(3)), 2) +
CASE
WHEN DATEPART(HH, start_time) < 13 THEN ' AM' ELSE ' PM'
END
+ ' - ' +
CASE
WHEN DATEPART(HH, end_time) < 13 THEN RIGHT(CAST(100 + DATEPART(HH, end_time) AS CHAR(3)), 2)
ELSE CAST(DATEPART(HH, end_time) - 12 AS CHAR(2)) END + ':' + RIGHT(CAST(100 + DATEPART(MI, end_time) AS CHAR(3)), 2) +
CASE
WHEN DATEPART(HH, end_time) < 13 THEN ' AM' ELSE ' PM' END AS 'TIMES',
in order to present the two times in one cell in AM/PM format. While this works, I'm wondering if there is a way in Reporting Services layout mode to give a date or time format mode to a cell that will work for two times with a dash in between them, rather than simply for one time. Then I could pull the times in the format that they are stored on the database and use Reporting Services to format them. Does anyone have any suggestions?
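On the T-SQL side there is a shorter route to the same string: CONVERT style 100 already renders the time portion as h:mmAM/PM, and it also gets the 12 AM / 12 PM boundary right, which the DATEPART(HH, ...) < 13 tests above do not for noon. A hedged sketch, with the table name as a placeholder:
SELECT LTRIM(RIGHT(CONVERT(varchar(20), start_time, 100), 7)) + ' - ' +
       LTRIM(RIGHT(CONVERT(varchar(20), end_time, 100), 7)) AS [TIMES]
FROM dbo.Schedule;
As for doing it in the layout instead: a format string on a single textbox only applies to one date value, so combining two times with a dash generally still means either an expression that formats both values or a query that returns the combined string.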
View 4 Replies
View Related
Apr 5, 2008
Log shows this:
Product : OWC11
Error : Error 1311. Source file not found: C:\DOCUME~1\TOMCUR~1\LOCALS~1\Temp\IXP000.TMP\SKU0A4.CAB. Verify that the file exists and that you can access it.
--------------------------------------------------------------------------------
Machine : D7NSJK71
Product : Microsoft Office 2003 Web Components
Product Version : 11.0.8003.0
Install : Failed
Log File : c:\Program Files\Microsoft SQL Server\90\Setup Bootstrap\LOG\Files\SQLSetup0005_D7NSJK71_OWC11_1.log
Last Action : InstallExecute
Error String : Source file not found (cabinet): C:\DOCUME~1\TOMCUR~1\LOCALS~1\Temp\IXP000.TMP\SKU0A4.CAB. Verify that the file exists and that you can access it.
Error Number : 1311
When the dialog box appeared indicating the file couldn't be found, there was no option to look elsewhere for it; the only options were retry or cancel, and after 5 or 6 retries I cancelled. Everything else appears to have installed correctly for this instance.
Also, earlier attempts to install have left services for named instances that I cannot install to, such as SQLEXPRESS, SQLEXPRESS1 and CURRIEREXPRESS. I long ago had a beta version installed and have attempted numerous methods to completely wipe these out, using the uninstallbetas tool from Microsoft as well as a number of suggestions from some posts.
Please help. The most significant reason I'm installing this is to practice creating SQL Reporting Services reports.
View 3 Replies
View Related
Sep 22, 2007
I am getting following error when trying to install SQL express 2005 on XPSP2.
TITLE: Microsoft SQL Server 2005 Setup
------------------------------
The SQL Server System Configuration Checker cannot be executed due to WMI configuration on the machine SIGMA-805539A79 Error:2147944122 (0x800706ba).
For help, click: http://go.microsoft.com/fwlink?LinkID=20476&ProdName=Microsoft+SQL+Server&ProdVer=9.00.1399.06&EvtSrc=setup.rll&EvtID=70342
I tried re-installing WMI using the http://blogs.msdn.com/jpapiez/archive/2004/12/09/279041.aspx link but could not get it working.
Do I need IIS installed? It's not installed on this box...
Please suggest something... I am stuck...
Thanks,
View 3 Replies
View Related
Jun 28, 2007
Not sure if this is the right Forum for this question...
I have a DTS job running against SQL 2005. When this job executes one particular step (running a stored procedure), it takes a long time to run. If I try that same stored procedure in Query Analyzer, it runs really fast. How can I fix this?
(Migrating to SSIS has been considered, but we're not going there yet.)
I have tried running the DTS job with different logins, on different machines, inside DTS Designer and from a command line. Every time, it is slow in DTS but fast in QA.
More Details:
This DTS job hasn't always been slow. Several months ago it was fast and then slowed down. For days and days it was slow. Then, someone installed SP2 for SQL. After that, the DTS job became lightning fast again. Then, today, it's back to the slowness.
Putting the SP through the tuning wizard only shows that we could achieve minor improvements in performance with a bunch of new indexes.
Anyone have an idea why this SP would be slow in DTS, but not in QA?
View 3 Replies
View Related
Feb 24, 2008
Hi,
I have installed two SQL instances for two different applications, and Task Manager in Windows shows that I am using 3.5 GB of RAM (I have 4 GB of RAM on my server).
Can somebody tell me the steps to monitor the RAM usage and how to get the best performance from SQL 2005?
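A hedged sketch of the usual first step when two instances share one box: cap each instance's memory so that together they leave room for the OS (the 1536 MB figure is an arbitrary example, not a recommendation), and then check what each instance is really using:
-- Cap this instance's memory
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 1536;
RECONFIGURE;

-- How much memory is the instance actually using right now?
SELECT cntr_value / 1024 AS total_server_memory_mb
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Total Server Memory (KB)';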
I appreciate your help.
Thanks
View 4 Replies
View Related
Mar 10, 2008
Hello! I have a table with a very simple structure: id | data, where "data" is a varchar(100). This table would contain a lot of rows (~500,000,000) and I want to select all "id" where data = @data. Is it realistic that SQL Server could serve this request on a normal web server within 1 or 2 seconds? Thanks!
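With an index whose leading column is "data", each such lookup is an index seek and should come back in milliseconds even at that row count; without one it is a full scan of 500 million rows and will not fit in 1-2 seconds. A sketch, with table and index names as assumptions:
CREATE NONCLUSTERED INDEX IX_MyTable_data
ON dbo.MyTable (data)
INCLUDE (id);   -- covers: SELECT id FROM dbo.MyTable WHERE data = @data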
View 1 Replies
View Related
Sep 11, 2006
Hi, I want to know if anyone have any clue about the reason why this happens.
I have a table on SQL Server 7 with 320 thousand records, and when I execute a SELECT * on it, it takes about 6 seconds to return an answer. But the same table on SQL Server 2005 Enterprise takes about 16 seconds. Is that normal?
View 3 Replies
View Related
Apr 3, 2008
I was really impressed with the speed of sql server 2005, but when I upgraded to 2008, used the exact same database, and ran the exact same sql scripts, I noticed the performance was slower.
Is there a structural difference in 2008?
View 5 Replies
View Related
Aug 8, 2007
Hi all,
I am migrating a data warehouse from SQL 2000 to 2005. So far, I have been able to convert all the DTS packages on the old server and most tables and users. I am having problems with some of my views, though. A view which involves over 5 tables, and some sub-views of those tables, runs perfectly on SQL 2000, but on 2005 I get a "query timed out" message. A typical run of this view can return 200-1000 records. My guess is that it gets stuck somewhere in the sub-views it has to run. So I wonder, what are the limitations of SQL 2005 concerning queries and sub-queries (how many sub-queries can a query have without timing out)? I mean, I would expect 2005 to have more processing capacity than SQL 2000 (on which this query runs perfectly). I have run some queries which don't run on 2000 but do run on 2005 and return over 4000 records.
Or is there some setting I haven't adjusted, like the time it takes for a query to time out? How would I adjust this, then?
Many thanks for the info.
View 3 Replies
View Related
Jun 5, 2006
I have recently upgraded from SQL Server 2000 to SQL Server 2005, and now all my queries run infinitely more slowly.
Here is the scenario: I run an extract of an MS SQL Server database at a client site, then recreate the database on our in-house server, but without indexes etc. Then I run various queries in order to create data files that will be used for importing into a global system. When I was running Server 2000, most of the queries ran in less than 10 seconds each, but under Server 2005 they take 3 minutes or more! Does anybody know of any parameters that I need to adjust to fix this problem?
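One low-risk thing to try after moving a database from 2000 to 2005 (a hedged suggestion, not a guaranteed fix): statistics and usage counters carried over from the old server are often stale or inaccurate on the restored copy, and refreshing them frequently restores plan quality. The database name below is a placeholder:
USE MyRestoredDb;
GO
EXEC sp_updatestats;          -- refresh statistics on every table
DBCC UPDATEUSAGE (0);         -- correct page/row counts carried over from SQL 2000
EXEC sp_dbcmptlevel 'MyRestoredDb', 90;   -- let the database use the 2005 optimizer behaviour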
View 5 Replies
View Related
Jan 27, 2008
Hello All,
Here is the SQL 2005 encryption environment:
1. Clustered SQL 2005 (enterprise edition) on windows 2003. HP (quad processor) with CPU affinity set to all processors.
2. The table structure where encrypted data will be stored has two varbinary(max) columns to store the encrypted data. The columns are varbinary(max) because the data size could be more than 8K.
3. Encryption using AES (tried both 128/256) algorithm with symmetric keys.
When inserting data into these columns, CPU stays at 50%. Any ideas why this would be happening? Any suggestions on improving performance are appreciated.
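For reference, a minimal sketch of the symmetric-key insert pattern being described (key, certificate, table and column names are placeholders). Opening the key once per batch and encrypting in a single set-based INSERT, rather than opening it and encrypting row by row, is one of the usual levers for the CPU cost of AES on large varbinary(max) values:
OPEN SYMMETRIC KEY MyAesKey DECRYPTION BY CERTIFICATE MyCert;

INSERT INTO dbo.SecureData (Col1Encrypted, Col2Encrypted)
SELECT EncryptByKey(Key_GUID('MyAesKey'), s.Col1),
       EncryptByKey(Key_GUID('MyAesKey'), s.Col2)
FROM dbo.StagingData AS s;    -- one set-based insert instead of row-by-row

CLOSE SYMMETRIC KEY MyAesKey;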
Thanks..
View 7 Replies
View Related
Jun 22, 2007
I have SQL 2000 running with a client database that serves about 200 people per day. A VB front end runs it. I have some problems with performance. Would upgrading to SQL 2005 improve my database performance?
View 5 Replies
View Related
Oct 25, 2007
Greetings.
What do I now have:
A directory with Access databases; around 20 databases, all dynamically created;
Each Database has on average 300 tables inside, all equally structured, all created by software;
Each table has two DateTime fields, 4 double fields and 4 long int fields;
Each table has around 10000 records, average.
The Directory is shared in a Windows 2003 Enterprise server.
Around 20 users access the databases simultaneously, adding, retrieving and deleting data, over 100MBits LAN.
Here's the catch:
As fast as possible, the program needs to retrieve one single record matching a single date from a given table in a given database. All databases work together. It needs to get literally thousands of individual records in order to work properly, per user. That means thousands of requests, but not much data in each request. That's its core job.
A small percentage of requests write the record back, that is, update it. Maybe 2% of requests.
If I were to reproduce this situation in SQL Server 2005, what would be the expected time for, let's say, 50,000 requests?
Or should I stick to Access?
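A rough sketch of how this usually looks when consolidated into SQL Server 2005 (table and column names are assumptions): one table replaces the 20 x 300 Access tables, the old database/table identity becomes two columns, and a clustered index matches the lookup pattern. Each request is then a single index seek, and 50,000 seeks of this kind are typically a matter of seconds for the server itself, with the LAN round-trips being the larger cost:
CREATE TABLE dbo.Readings (
    SourceDb    int      NOT NULL,    -- which of the ~20 former databases
    SourceTable int      NOT NULL,    -- which of the ~300 former tables
    ReadingDate datetime NOT NULL,
    Date2       datetime NULL,
    Val1 float NULL, Val2 float NULL, Val3 float NULL, Val4 float NULL,
    Int1 int   NULL, Int2 int   NULL, Int3 int   NULL, Int4 int   NULL,
    CONSTRAINT PK_Readings PRIMARY KEY CLUSTERED (SourceDb, SourceTable, ReadingDate)
);

-- The core request ("one record matching a single date from a given table
-- in a given database") becomes a single seek:
-- SELECT * FROM dbo.Readings
-- WHERE SourceDb = @db AND SourceTable = @tbl AND ReadingDate = @date;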
Thanks,
Any response will be appreciated.
Pedro Ramos
View 1 Replies
View Related
Mar 1, 2007
Hello,
Is the performance of a web application (ASP.NET + SQL Server 2005 Workgroup Edition + Windows Server 2003 Web Edition) running on a server with one Core Duo / 4 CPUs generally comparable to the performance of the same application running on the same server with 2/4 physical CPUs?
Thank you for your ideas!
Jan
View 1 Replies
View Related
Nov 6, 2007
Hi, I have Microsoft SQL Server 2005 Enterprise installed on Windows Server 2003, and I am developing a web application for 500 clients. Will I have any performance issues if I put data for all 500 clients into one 'Articles' table and then filter it on client ID, or should I make 500 'Articles' tables, one per client with a different name, and then change the SqlDataSource for the GridView depending on which client is working? Besides the 'Articles' table I will have another 10 tables, which means 5,500 tables total if I use the second approach; with the first I will have only 11 tables. So I am asking: is it better to have more tables with less data, or fewer tables with more data? And what are the pros and cons of both approaches? Thanks a lot!
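A brief sketch of the single-table approach (column names and types are assumptions): one Articles table with a ClientID column and an index leading on ClientID, so each client's page is an index seek rather than a scan. The single-table design is the conventional choice here, since 500 per-client tables multiply maintenance and make every schema change 500 times more expensive:
CREATE TABLE dbo.Articles (
    ArticleID int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    ClientID  int            NOT NULL,
    Title     nvarchar(200)  NOT NULL,
    Body      nvarchar(max)  NULL,
    CreatedOn datetime       NOT NULL DEFAULT GETDATE()
);

CREATE INDEX IX_Articles_Client ON dbo.Articles (ClientID, CreatedOn);

-- The GridView's SqlDataSource then just filters:
-- SELECT ArticleID, Title, CreatedOn FROM dbo.Articles WHERE ClientID = @ClientID;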
View 1 Replies
View Related