The default ASP page generates an HTML response that includes several IFRAMEs. It takes a long time to load this page, and I believe a lot of that is round trips to load the IFRAMEs. Couldn't include files be used to improve performance, so everything is done in the initial call instead of multiple round trips?
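A minimal sketch of the include-file approach (the file names here are placeholders, not the actual pages): instead of an IFRAME per section, the content is merged server-side in one pass.

```asp
<%@ Language=VBScript %>
<html>
<body>
  <!-- Each #include is merged into this page before it is sent,
       so the browser makes one request instead of one per IFRAME. -->
  <!--#include file="menu.asp"-->
  <!--#include file="detail.asp"-->
  <!--#include file="totals.asp"-->
</body>
</html>
```

One caveat: includes are processed at parse time, so they cannot take querystring parameters the way an IFRAME src can; any per-frame parameters would have to become variables set before the include.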
Okay, I'm doing a project for someone in which random questions are to be pulled and posted so that the user can complete a test/quiz/exam for a grade.
Well, his host apparently doesn't have any database stuff, but I can probably get him to move to another if it's absolutely necessary.
Personally, I'd prefer to stick with files, just because that's what I'm most accustomed to. But if there's a very compelling reason (performance or reliability?) to move to a database, I suppose I could.
Here's a sample of the type of data I'll be using. The format is as follows: Question, Correct Answer, First Answer, Second Answer, Third Answer (If Any), Fourth Answer (If Any). A question is on each line (in this browser window, it may appear that a question is spanning several lines, but it truly isn't). Code:
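A sketch of pulling a random question from such a file, assuming a comma-delimited file named questions.txt with the field order described above (both the file name and a pipe-free data set are assumptions):

```asp
<%
Dim fso, ts, lines, fields, idx
Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set ts = fso.OpenTextFile(Server.MapPath("questions.txt"), 1) ' 1 = ForReading

' Read every line into an array (one question per line)
lines = Split(ts.ReadAll, vbCrLf)
ts.Close

' Pick a random question
Randomize
idx = Int(Rnd * (UBound(lines) + 1))

' Fields: question, correct answer, then the answer choices
fields = Split(lines(idx), ",")
Response.Write "<p>" & Server.HTMLEncode(fields(0)) & "</p>"
%>
```

Note this breaks if a question itself contains a comma, so a rarer delimiter such as a pipe (|) may be a safer format.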
I would like to know the pros and cons of using ASP vs. Visual Basic for an intranet, in terms of development time and customization. The project we will use it for is a basic Information Management System that could be used offline and online using IIS and MySQL.
We have a Web page that at load time will execute and build: a data-driven DHTML Menu, an Iframe with detail data, an Iframe to display progress information, and another Iframe with totals for the detail-data Iframe. Everything works fine except during the initial load. If we click anywhere on the page during this loading sequence, the process stops and we end up with Iframes with incomplete information and empty calculations. We need to refresh and avoid using the mouse and keyboard during load time to get a decent start. Is anyone familiar with this behavior? Any ideas to solve this issue without disabling the mouse?
I have a web account to which I have pointed two domain names. Since I have two different sites on the same account, I want to redirect the browser to the correct starting page for a given domain name. Code:
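A minimal sketch of branching on the requested host name (the domain names and start pages are placeholders):

```asp
<%
Dim host
host = LCase(Request.ServerVariables("SERVER_NAME"))

' Send each domain to its own start page
If InStr(host, "firstdomain.com") > 0 Then
    Response.Redirect "/site1/default.asp"
ElseIf InStr(host, "seconddomain.com") > 0 Then
    Response.Redirect "/site2/default.asp"
Else
    Response.Write "Unknown host: " & Server.HTMLEncode(host)
End If
%>
```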
I'm working on developing (yet another) shopping cart for my work. I was wondering if anybody knows which approach leaves a smaller footprint in server memory: an ADODB Recordset, or one-dimensional arrays stored inside a two-dimensional array?
We are currently hosting our web site on a shared machine at Verio. The cart I have to build needs separate carts because a customer's products can come from different locations. Each location will have a different cart. On the cart page itself, I would like to display each cart in a separate location.
Logically, it makes the most sense (to me, anyway) to use an ADODB Recordset to manage each individual cart rather than to write the code to manage the arrays myself. What are the disadvantages of using an ADODB Recordset instead of a two-dimensional array in this case?
Which approach would be easier to manage and manipulate? Are there any reasons why I shouldn't use a two-dimensional array? The nice thing about ADODB is that I will not need to program certain features and have access to FILTERS.
Any suggestions about how each approach would scale as site traffic increases? Will one approach bog down the server more than the other? I'm also open to suggestions about a possible, yet elusive, third way.
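A sketch of one cart as a disconnected, fabricated recordset (field names here are my own invention), which gives sorting and Filter for free:

```asp
<%
Const adVarChar = 200, adInteger = 3, adCurrency = 6

Dim cart
Set cart = Server.CreateObject("ADODB.Recordset")

' Fabricate the recordset entirely in memory - no connection needed
cart.Fields.Append "ProductID", adInteger
cart.Fields.Append "LocationID", adInteger
cart.Fields.Append "Qty", adInteger
cart.Fields.Append "Price", adCurrency
cart.Open

cart.AddNew
cart("ProductID") = 101
cart("LocationID") = 2
cart("Qty") = 3
cart("Price") = 9.99
cart.Update

' Show only the items coming from location 2
cart.Filter = "LocationID = 2"
Response.Write "Items for location 2: " & cart.RecordCount
%>
```

On the scaling question: a fabricated recordset can be stored in Session like an array can, but each one carries COM object overhead that a plain VBScript array does not, so with many concurrent sessions the array is the lighter option and the recordset is the more convenient one.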
I need to take a set number of images and have them display in an iframe on my page. I want to have Previous and Next buttons at the bottom of the iframe that will take my users to the corresponding page of the document. I was told that an array would be the best route to take for this. Here is what I have so far... and as you can guess there is something wrong with the code:
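For comparison, a sketch of the array approach, paging through image paths with a querystring index (the file names and viewer.asp are placeholders):

```asp
<%
Dim images, idx
images = Array("page1.gif", "page2.gif", "page3.gif", "page4.gif")

' Current page comes in on the querystring; default to the first image
idx = 0
If IsNumeric(Request.QueryString("idx")) Then idx = CLng(Request.QueryString("idx"))
If idx < 0 Then idx = 0
If idx > UBound(images) Then idx = UBound(images)

Response.Write "<img src=""" & images(idx) & """><br>"

' Only show the links that make sense at the ends of the array
If idx > 0 Then
    Response.Write "<a href=""viewer.asp?idx=" & (idx - 1) & """>Previous</a> "
End If
If idx < UBound(images) Then
    Response.Write "<a href=""viewer.asp?idx=" & (idx + 1) & """>Next</a>"
End If
%>
```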
I would like to prevent a page from breaking out of its iframe when indexed by a search engine. I would like to use ASP (which I am just starting out in), and I know it is somewhere along the lines of "yourframespage.asp?id=pageid".
Can anyone give me a rundown of how I do this exactly because I cannot seem to work out how to do it.
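One common pattern (the page names below are placeholders, and this is a sketch rather than the only way): the frames page takes the content page's id on the querystring and rebuilds the frameset around it, and each content page redirects to the frames page when it finds itself outside a frame.

```asp
<%
' yourframespage.asp - rebuilds the frameset around a given content page
Dim pageid
pageid = Request.QueryString("id")
If pageid = "" Then pageid = "home"

' Only allow known ids, so the querystring can't inject arbitrary URLs
Select Case pageid
    Case "home", "about", "products"
        ' recognized - keep it
    Case Else
        pageid = "home"
End Select
%>
<frameset rows="80,*">
    <frame src="banner.asp">
    <frame src="<%= pageid %>.asp">
</frameset>
```

Each content page then carries a small client-side check (if top == self) that sends the browser to yourframespage.asp?id=itsownid, so a search-engine visitor landing on the bare page gets the frameset back.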
Big.asp includes an iframe called small.asp. Small.asp has a form. How can I make that form submit to big.asp? Because if I simply point the form at big.asp, it opens big.asp inside the original big.asp.
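A sketch of the usual fix: give the form a target of _parent, so the submission replaces the outer page rather than the iframe's content.

```asp
<%
' small.asp - lives inside the iframe on big.asp
%>
<form action="big.asp" method="post" target="_parent">
    <!-- target="_parent" makes the response load in the page
         that contains this iframe, not in the iframe itself -->
    <input type="text" name="query">
    <input type="submit" value="Go">
</form>
```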
I have a parent window (default.asp) and on this page I have an IFrame that displays the rest of the menu requests for the site. The url always shows the base url, never other pages because they appear in the IFrame - this is by design.
The content in the IFrame is dynamic and may scroll forever depending on how much data is retrieved from the db. If the user scrolls down the IFrame to view the contents, the top of the default.asp page scrolls with it until it reaches the top of the IFrame.
The problem is that when navigating through the site I can't seem to figure out how to get it to jump to the top of the default.asp page when the page in the IFrame changes. This causes the user to have to scroll up and down throughout the site making it unappealing for navigation reasons.
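One sketch of a fix: have each page that loads into the IFrame emit a small script that scrolls the parent back to the top. This assumes the IFrame pages are same-origin with default.asp, which they are here since everything is served from the one site.

```asp
<%
' Included at the top of every page that loads inside the IFrame
%>
<script type="text/javascript">
    // Scroll the outer default.asp window back to the top
    // whenever new content finishes loading in the IFrame
    window.onload = function () {
        parent.scrollTo(0, 0);
    };
</script>
```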
I'm using MSXML2.ServerXMLHTTP to retrieve a webpage. It works, but is unacceptably slow (~20s) when retrieving remote webpages (anything outside our network). Is there any way to improve the performance?
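A sketch that sets explicit timeouts via the documented setTimeouts call so failures surface quickly; separately, slow remote fetches from ServerXMLHTTP are often a proxy or DNS issue on the server, so checking the WinHTTP proxy configuration (the proxycfg.exe utility) is worth trying too. The URL below is a placeholder.

```asp
<%
Dim http
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")

' resolve, connect, send, receive timeouts in milliseconds -
' fail fast instead of hanging for the defaults
http.setTimeouts 5000, 5000, 10000, 10000

http.open "GET", "http://www.example.com/", False
http.send

If http.status = 200 Then
    Response.Write Server.HTMLEncode(Left(http.responseText, 200))
End If
%>
```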
I'm trying to build a custom shopping cart for a client.
What I know about shopping carts is the main idea, which is an array stored in the client's session.
I also know that this point (the session) may cause some bad performance, and here is my point:
A cart usually contains just the product id and the number of items, or something like that (some people say a data dictionary is good here).
The cart I'm going to build contains a lot of data, because when the client chooses to add an item to his cart, he must be redirected to a page with a form to add many details about his needs for the product, so the session here will carry a lot of data.
So I thought of two ideas:
1- an array in the session 2- a database table that carries all of the temporary data, with a unique primary key for every session. I need to know the best solution for the best performance.
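A sketch of the database-table idea, keying the temporary rows on Session.SessionID (the table and column names, and the Application-level connection string, are assumptions):

```asp
<%
Const adVarChar = 200, adInteger = 3, adParamInput = 1, adExecuteNoRecords = 128

Dim conn, cmd
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open Application("ConnectionString") ' assumed shared connection string

Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = conn
cmd.CommandText = "INSERT INTO TempCart (SessionKey, ProductID, Qty, Details) VALUES (?, ?, ?, ?)"

' Parameterized values - avoids quoting problems and SQL injection
cmd.Parameters.Append cmd.CreateParameter("SessionKey", adVarChar, adParamInput, 50, CStr(Session.SessionID))
cmd.Parameters.Append cmd.CreateParameter("ProductID", adInteger, adParamInput, , CLng(Request.Form("productid")))
cmd.Parameters.Append cmd.CreateParameter("Qty", adInteger, adParamInput, , CLng(Request.Form("qty")))
cmd.Parameters.Append cmd.CreateParameter("Details", adVarChar, adParamInput, 2000, Request.Form("details"))
cmd.Execute , , adExecuteNoRecords

conn.Close
%>
```

Since sessions get abandoned, this approach needs a cleanup job that deletes stale TempCart rows periodically; in exchange, the session itself stays tiny.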
If I have a loop that displays a list of records on a page, let's say a list of contacts (say 200), the question I have is: does it affect performance if I add the following inside the loop:
Can the execution efficiency of ASP be improved by centralizing the often-used code into header files and then including these header files in ASP pages?
I have a 500-page ASP program that takes 1-2 seconds to execute. For some strange reason it takes 20-30 seconds before the first line executes. How can I make this program behave better?
I have a list box with about 500 users in it. I have to populate that on a site. It causes extreme slowness. I am using classic ASP with MS SQL. What is happening is the users are stored as a number (a user_id) and that number is looked up. Then that number is passed through a simple function to turn the number into a name. The names are stored in the DB in first_name, last_name format. The company wants the list box in first_name last_name format so the function is needed. Should I look into database caching, caching the list box somehow, making another table of the user_id's and names in the correct form, or some other option. Once again, I am not using ASP.NET, but classic ASP instead.
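One sketch: build the option list once, cache the finished HTML in an Application variable shared by all sessions, and rebuild it only when it is cleared (the variable, table, and column names are assumptions):

```asp
<%
Dim optionsHtml
optionsHtml = Application("UserOptionsHtml")

If optionsHtml = "" Then
    ' Cache miss: build the list once and share it across all sessions
    Dim conn, rs
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open Application("ConnectionString")
    Set rs = conn.Execute( _
        "SELECT user_id, first_name, last_name FROM users ORDER BY first_name, last_name")

    Do While Not rs.EOF
        optionsHtml = optionsHtml & "<option value=""" & rs("user_id") & """>" & _
            Server.HTMLEncode(rs("first_name") & " " & rs("last_name")) & "</option>"
        rs.MoveNext
    Loop
    rs.Close
    conn.Close

    Application.Lock
    Application("UserOptionsHtml") = optionsHtml
    Application.Unlock
End If
%>
<select name="user"><%= optionsHtml %></select>
```

This also removes the per-row lookup function entirely: the name formatting happens once when the cache is built instead of 500 times per page view. When a user is added or renamed, clearing Application("UserOptionsHtml") forces a rebuild on the next request.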
I am running the WebService / CurrentConnections performance counter. It works fine for one site I'm looking at, but not for another. For the one it doesn't work for, the polling vertical red line doesn't even move. Is there a setting somewhere that controls whether a web site can be 'polled' or not? I am using Windows Server 2003 Standard Edition.
I'm maintaining an ASP Classic file that has HTML within Response.Write calls, such as <% Response.Write "<table><tr><td>" : Response.Write testVariable : Response.Write "</td></tr></table>" %>
Would there be a performance hit if I were to write this instead?
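The alternative being asked about is presumably dropping out of script into literal HTML; a sketch of both forms side by side:

```asp
<%
Dim testVariable
testVariable = "hello"

' Form 1: everything through Response.Write
Response.Write "<table><tr><td>"
Response.Write testVariable
Response.Write "</td></tr></table>"
%>

<!-- Form 2: literal HTML with an inline expression -->
<table><tr><td><%= testVariable %></td></tr></table>
```

In practice the difference between the two is tiny for a block this size; what tends to hurt is switching between many small <% %> blocks inside a large loop, where batching output into fewer writes (with response buffering on, the default in IIS 5 and later) matters more.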
I have a database with 300,000 records. I have two DATE columns and I need to calculate the difference and display the number of days in one of the reports. I was wondering whether this calculation of days should be done on the fly, or is it OK to have a "Difference Date" column [containing the number of days] and retrieve that instead?
There will be no more than 50 concurrent users accessing it. However, in the next 6 months the record count is expected to reach 800,000. The database will be hosted on shared hosting on the internet.
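A sketch of the on-the-fly approach, pushing the calculation into the query so it costs one expression per row rather than a stored column that can drift out of sync (the table and column names are assumptions; DATEDIFF is the SQL Server spelling):

```asp
<%
Dim conn, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open Application("ConnectionString")

' Let the database compute the difference per row
Set rs = conn.Execute( _
    "SELECT start_date, end_date, DATEDIFF(d, start_date, end_date) AS days_between " & _
    "FROM records")

Do While Not rs.EOF
    Response.Write rs("days_between") & " days<br>"
    rs.MoveNext
Loop
rs.Close
conn.Close
%>
```

At 50 concurrent users and 800,000 rows, a simple date subtraction in the report query is unlikely to be the bottleneck; the precomputed column mainly buys something if the difference is filtered or sorted on and can be indexed.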
How do I display the current connection value on an ASP page? I want to know how to read this value in an ASP page as opposed to going to Performance Monitor.
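One possible route (worth verifying on your server, since not every install exposes it the same way) is WMI, which publishes the Web Service counters as the Win32_PerfFormattedData_W3SVC_WebService class:

```asp
<%
Dim wmi, items, item
Set wmi = GetObject("winmgmts:\\.\root\cimv2")
Set items = wmi.ExecQuery( _
    "SELECT Name, CurrentConnections FROM Win32_PerfFormattedData_W3SVC_WebService")

' One row per site, plus a _Total aggregate
For Each item In items
    Response.Write Server.HTMLEncode(item.Name) & ": " & _
        item.CurrentConnections & " connections<br>"
Next
%>
```

The account the page runs as (typically the anonymous IIS user) needs WMI read access for this to work.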
Does anyone know of any resource that provides Active Server Pages performance figures on different server platforms? I am interested in the pages/sec and concurrent users (on average; this of course depends on other factors) for Dell PowerEdge servers.
I recall that 40 pages/sec on an 800 MHz server a few years ago was really good. What can I expect from a dual 3 GHz Xeon PE1750?
I have an ASP site that of late has had a very, very slow response in the production environment; it is taking 6-7 minutes to do the basic operations. We have come to know that the performance deteriorated after the recent release.
Going through the thread "IIS 6.0 slowing down", one of the suggestions listed was to make sure one destroys the objects the code creates, which I have taken care of in all the pages. Actually, all the new changes involved working with session or local variables, so no COM objects were touched.
We are using Windows 2000 Server with SP4 in all environments. Could someone give me some suggestions on how to identify the problem? This is a problem faced by many at the client's end, and it occurs even after a server restart. Please let me know if any additional details are needed.
I've been trying to transition an application from an Access db to an Oracle db. I've got the base functionality down, but it's slow. I'm pretty sure that the 8-processor IIS and Oracle servers are more than capable of handling the low traffic my code creates.
My question is, what's the most efficient way to do this? I've been and still am doing research on cursors, locktypes, etc... But, I haven't come across something that is similar, so I thought I'd see if anyone here has had the same experience.
Currently my connections/recordsets look like this: Code:
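Since the original snippet isn't shown here, a sketch of the combination that is usually cheapest over ADO: a forward-only, read-only recordset ("firehose" mode) opened over one connection (the provider string, DSN, and table are assumptions):

```asp
<%
Const adOpenForwardOnly = 0, adLockReadOnly = 1, adCmdText = 1

Dim conn, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=OraOLEDB.Oracle;Data Source=MYDB;User Id=app;Password=secret;"

Set rs = Server.CreateObject("ADODB.Recordset")
' Forward-only + read-only is the lightest cursor/lock combination;
' keyset or dynamic cursors with pessimistic locking are far slower
rs.Open "SELECT id, name FROM products", conn, adOpenForwardOnly, adLockReadOnly, adCmdText

Do While Not rs.EOF
    Response.Write Server.HTMLEncode(rs("name")) & "<br>"
    rs.MoveNext
Loop

rs.Close
conn.Close
%>
```

Access code that relied on updatable, scrollable recordsets often gets carried over to Oracle with the same cursor and lock types, and that is a common source of the slowdown you describe.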
I have a simple data entry form that has text boxes and some drop-down menus per row. My problem is the performance of the page.
I give the user the option of displaying 25, 50 or 100 rows at a time. Each row has 12 text boxes and 4 drop-down menus. The OPTIONS for the drop-down menus are being populated with data from our MS SQL database.
I'm using VIEWS to get the data and place it in recordsets, but how can I increase performance? It's taking way too long for the page to load, and if the user clicks submit, that takes a long time too.
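One sketch: fetch each OPTION list once per request with GetRows, build the option HTML a single time, and reuse that string for all 25-100 rows instead of touching the recordset per row (the view and column names are assumptions):

```asp
<%
Function BuildOptions(conn, sql)
    ' Runs the query once and returns the finished <option> HTML
    Dim rs, data, i, html
    Set rs = conn.Execute(sql)
    If Not rs.EOF Then data = rs.GetRows() ' pull everything into a 2-D array
    rs.Close

    If IsArray(data) Then
        For i = 0 To UBound(data, 2)
            html = html & "<option value=""" & data(0, i) & """>" & _
                Server.HTMLEncode(data(1, i)) & "</option>"
        Next
    End If
    BuildOptions = html
End Function

Dim conn, colorOptions, row
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open Application("ConnectionString")

colorOptions = BuildOptions(conn, "SELECT color_id, color_name FROM vw_colors")
conn.Close

' The same pre-built string serves every row in the form
For row = 1 To 25
    Response.Write "<select name=""color" & row & """>" & colorOptions & "</select>"
Next
%>
```

If the option lists rarely change, the built strings can go one step further into Application variables so they survive across requests as well.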
I am writing a page to go into a really large database, pull out the most recent entry, and post it to a web page in an ASP document. I think I have the query right, but it is so slow that it isn't a good solution. The code I have to do the query is as follows, but it is too slow to be usable.
I have another table which is much smaller, where I can find the unique ID values (each occurs only once, unlike in the other table). Can I go to the last record of the larger table and search backwards from there, stopping when I find each occurrence of each ID from the smaller table?
Any ideas how I might go about it? I have searched and searched but haven't been able to come up with something that works yet. Code:
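Since the posted query isn't shown here, a sketch of the shape that usually makes this fast (the table and column names are assumptions): let the database find the latest row per ID in one set-based query rather than scanning backwards in ASP, and index the date column.

```asp
<%
Dim conn, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open Application("ConnectionString")

' One latest row per unique ID, computed entirely in the database;
' an index on (item_id, entry_date) turns this into quick lookups
Set rs = conn.Execute( _
    "SELECT b.item_id, b.entry_date, b.value " & _
    "FROM big_table b " & _
    "INNER JOIN (SELECT item_id, MAX(entry_date) AS latest " & _
    "            FROM big_table GROUP BY item_id) m " & _
    "  ON b.item_id = m.item_id AND b.entry_date = m.latest")

Do While Not rs.EOF
    Response.Write rs("item_id") & ": " & rs("value") & "<br>"
    rs.MoveNext
Loop
rs.Close
conn.Close
%>
```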
I have a dynamic DataGrid in a web application; initially it is populated with search-result data. The first or last column has an Edit button for each row. When the user clicks Edit, the columns of that row are shown as a dropdownlist, textbox, etc., as defined in the DataGrid (or TemplateColumn); after the user makes selections in those fields, the values are validated and saved to the database. My question is about when to populate the DataGrid's inner controls (the dropdownlists, for example): should they be populated with data when the user clicks Edit, or when the grid is loaded with the search results? Which would be more efficient in this case? I intend to hold off populating the DropDownList until the user clicks the row, but I wonder whether that approach will work, and how/when to add values to the DropDownList.Items collection.
I'm working on a rather big website project (a movie database you can visit here: URL), so I consider myself an experienced developer in ASP / MSSQL 2000, and recently I've experienced some nasty problems with simple pieces of code that shouldn't have caused any problems at all.
What I can add to my intro is that all the pieces of code work, both on the ASP side and the SQL side; the problem is a dramatic performance loss when everything tries to work together.
So, the thing is pretty simple: I have to make some lists of movies and such, based on a few parameters, filter parameters and sorting parameters.
The whole thing is built in one quite huge stored procedure, in order to avoid having MSSQL compile on the fly every different query that can come from every combination of parameters. The stored procedure is very basic, a few "if then else" branches and the queries, and it works perfectly. Code:
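Since the snippet isn't shown here, a sketch of calling such a stored procedure from ASP with ADODB.Command (the procedure and parameter names are assumptions):

```asp
<%
Const adCmdStoredProc = 4, adInteger = 3, adVarChar = 200, adParamInput = 1

Dim conn, cmd, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open Application("ConnectionString")

Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = conn
cmd.CommandText = "usp_GetMovieList"
cmd.CommandType = adCmdStoredProc

' Filter and sort choices go in as typed parameters
cmd.Parameters.Append cmd.CreateParameter("@GenreID", adInteger, adParamInput, , 7)
cmd.Parameters.Append cmd.CreateParameter("@SortBy", adVarChar, adParamInput, 20, "title")

Set rs = cmd.Execute

Do While Not rs.EOF
    Response.Write Server.HTMLEncode(rs("title")) & "<br>"
    rs.MoveNext
Loop
rs.Close
conn.Close
%>
```

A note on the "works alone, slow together" symptom: a big multi-branch procedure gets a single cached plan shaped by the first parameters it sees, which can be terrible for other branches; creating the procedure WITH RECOMPILE, or splitting the branches into separate sub-procedures, is worth testing.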