Faster Return Of Large Recordset
I remember reading somewhere that there is a faster way to retrieve a large recordset and display it in a table. Could anyone point me in the right direction?
I have an ASP page which users can call with various parameters to generate data for reports.
However, some users who specify very few parameters, and so generate a large result page, are getting challenged for their network credentials after approximately 30 seconds. When they enter their credentials for the third time, the page returns with a 401 "You are not authorized to use this page". Yet if they specify more parameters (and so get a smaller result page), the page returns normally.
I've checked the query in the database and it runs fine with no errors, so I suspect I'm hitting some sort of buffer limit within IIS?
There is an Access database and an ASP page. The ASP page opens a connection to that database and opens a recordset.
Is there any way that this opened recordset can be returned to the VB6 application which requested that ASP page to open the connection and recordset and return some result?
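One pattern sometimes used for this (a hedged sketch, worth verifying; the page name, connection string, and table are hypothetical) is to have the ASP page persist the recordset to the Response with Recordset.Save, and have the VB6 client fetch the bytes and reopen them as a disconnected recordset. Code:
' --- getdata.asp (server side) ---
<%
Const adPersistXML = 1
Dim conn, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & Server.MapPath("data.mdb")
Set rs = Server.CreateObject("ADODB.Recordset")
rs.CursorLocation = 3                ' adUseClient, so the recordset can be disconnected
rs.Open "SELECT * FROM Customers", conn, 3, 4    ' adOpenStatic, adLockBatchOptimistic
Response.ContentType = "text/xml"
rs.Save Response, adPersistXML       ' stream the persisted recordset to the caller
rs.Close : conn.Close
%>

' --- VB6 client ---
Dim http As Object, stm As Object, rs As Object
Set http = CreateObject("MSXML2.ServerXMLHTTP")
http.Open "GET", "http://myserver/getdata.asp", False
http.Send
Set stm = CreateObject("ADODB.Stream")
stm.Type = 1                         ' adTypeBinary
stm.Open
stm.Write http.responseBody
stm.Position = 0
Set rs = CreateObject("ADODB.Recordset")
rs.Open stm                          ' reload the persisted recordset, now disconnected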
I am running a query that can return over 16000 records in the recordset. The problem is that whenever the return is greater than 16000, the recordset is null instead of having data. I have been looking around the internet; I'm not sure if this is an administrative option. I am using IIS with an Oracle DB, and ASP/XSL for the pages. Anyone know why the system seems incapable of returning over 16000 records? I can return 16000+ with simpler queries. Running the same query in SQL*Plus works perfectly.
Here is the query I am running.
SELECT * FROM (
    SELECT STATUS, ADDRESS, NAME, POSITION, GROUP, POINTS, GROUP_STATUS, USER_ID, USER_DESCRIPTION,
           ROW_NUMBER() OVER (ORDER BY GROUP, NAME, ADDRESS) AS RANK
    FROM USERS U, GROUPS G, USER_GROUPS UG
    WHERE G.ID = UG.GROUP_ID
      AND U.ID = UG.USER_ID
      *** Additional user filtering done here ***
) WHERE RANK BETWEEN 0 AND 50
I am returning 50 users per page. This all works fine until I hit the 16000 record limit, at which point there is no data in the recordset.
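For reference, a hedged sketch of how the paging bounds are usually computed on the ASP side (the querystring name and the sqlBase variable holding the inner query above are assumptions). Code:
Dim pageNum, pageSize, lowRank, highRank, sql
pageSize = 50
pageNum = CLng(Request.QueryString("page"))      ' hypothetical querystring parameter
If pageNum < 1 Then pageNum = 1
lowRank  = (pageNum - 1) * pageSize + 1          ' 1, 51, 101, ...
highRank = pageNum * pageSize                    ' 50, 100, 150, ...
sql = sqlBase & " WHERE RANK BETWEEN " & lowRank & " AND " & highRank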
I've got a SQL Server database and am pulling records from it via ASP. Everything works fine apart from the field which is stored as the 'text' data type in SQL Server. If I change this data type to 'varchar' it works fine, but I need more than 8000 characters, so really I need the data type to be 'text'. Can anyone suggest why this might be happening, and how it could be resolved?
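A hedged sketch of the workaround usually suggested for SQL Server text columns in ADO: use a forward-only, read-only cursor, list the text column last in the SELECT, and read it once per row into a local variable (table, column, and the open connection conn are made-up names). Code:
<%
Dim rs, bodyText
Set rs = Server.CreateObject("ADODB.Recordset")
rs.Open "SELECT ID, Title, Body FROM Articles", conn, 0, 1   ' 0 = adOpenForwardOnly, 1 = adLockReadOnly; Body is the text column, listed last
Do While Not rs.EOF
    bodyText = rs("Body").Value        ' read the text field exactly once per row
    Response.Write "<h3>" & rs("Title") & "</h3><p>" & bodyText & "</p>"
    rs.MoveNext
Loop
rs.Close
%>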
Does anyone know how ASP calls a stored procedure in Oracle? I want the stored procedure to return a recordset back to the ASP environment.
Can a stored procedure in Oracle return a recordset to ASP? If so, can you direct me to the necessary syntax for this?
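One commonly cited pattern is a packaged procedure with a REF CURSOR OUT parameter, combined with the provider's PLSQLRSet command property; a hedged sketch assuming the Microsoft OLE DB Provider for Oracle (MSDAORA), with package, table, and connection details all hypothetical. Code:
-- PL/SQL: a packaged procedure that opens a REF CURSOR
CREATE OR REPLACE PACKAGE my_pkg AS
  TYPE t_cursor IS REF CURSOR;
  PROCEDURE get_users(p_rc OUT t_cursor);
END my_pkg;
/
CREATE OR REPLACE PACKAGE BODY my_pkg AS
  PROCEDURE get_users(p_rc OUT t_cursor) IS
  BEGIN
    OPEN p_rc FOR SELECT * FROM users;
  END;
END my_pkg;
/

' ASP side: the REF CURSOR parameter itself is not bound
Dim conn, cmd, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=MSDAORA;Data Source=mydb;User Id=scott;Password=tiger"
Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = conn
cmd.CommandText = "{call my_pkg.get_users}"
cmd.Properties("PLSQLRSet") = True     ' tell the provider a result set comes back
Set rs = cmd.Execute
' ... use rs here ...
cmd.Properties("PLSQLRSet") = False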
I have an interesting problem and am looking for some advice. I am hoping to build an ASP page whereby records are pulled from SQL Server. These records will be merely file locations for thumbnail images I hope to display in a grid.
Rather than having the grid be constructed row by row, I was wondering if it is at all possible to dynamically construct this grid as a table of thumbnails whereby each cell (perhaps 5-6 columns across) represents one record? I have no idea how one would construct a loop that spans columns AND rows rather than just rows.
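A hedged sketch of the usual single loop with a column counter (table, column, and the open connection db are made-up names): close the row every N cells instead of looping twice. Code:
Dim rs, col, columnsPerRow
columnsPerRow = 5
Set rs = db.Execute("SELECT ThumbPath FROM Thumbnails")   ' db = open ADODB.Connection
col = 0
Response.Write "<table><tr>"
Do While Not rs.EOF
    Response.Write "<td><img src=""" & rs("ThumbPath") & """></td>"
    col = col + 1
    If col Mod columnsPerRow = 0 Then Response.Write "</tr><tr>"   ' start a new row every 5 cells
    rs.MoveNext
Loop
Response.Write "</tr></table>"
rs.Close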
I have a page that does a fairly simple "Select * From Table". I then go through and assign each column to a variable, print that variable out in a nice neat table, do an rs.MoveNext, then Loop.
This method worked GREAT with my old DB, which was 600 rows or so, but now I am doing this against a much larger DB (5 to 6 thousand rows) and the performance is very bad. In fact it now takes several minutes for the whole page to render.
There HAS to be a better way to do this. What do you guys do if you have a DB with 5k or so rows and you want to put all that in a table? Should I look into using JavaScript and abandoning ASP/HTML?
I am making a Purchase order system for a person who has been using an XLS for several years. She wants to be able to scroll up and down to see all of the POs, which I can do with a smaller DB.
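One thing that often helps in classic ASP is letting ADO build the markup in a single call with GetString, instead of writing field by field inside a MoveNext loop; a hedged sketch (query, columns, and the open connection db are assumptions, and the trailing empty cell it leaves can be trimmed). Code:
Const adClipString = 2
Dim rs, html
Set rs = db.Execute("SELECT PONumber, Vendor, Amount FROM PurchaseOrders")   ' hypothetical query
Response.Write "<table><tr><td>"
If Not rs.EOF Then
    ' one string for the whole result set: columns joined by </td><td>, rows by </td></tr><tr><td>
    html = rs.GetString(adClipString, , "</td><td>", "</td></tr><tr><td>", "&nbsp;")
    Response.Write html
End If
Response.Write "</td></tr></table>"
rs.Close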
Just curious, perhaps it's just a habit thing, but I'm wondering if one is faster than the other? Code:
myVar = "blah"
If myVar2 = "go" Then myVar = "not"
OR
Code:
If myVar2 = "go" Then
myVar = "not"
Else
myVar = "blah"
End If
If I separate my search form asp and my results asp into two pages, will that make the search faster?
I currently have the following code to extract data and display in a table.
--- Sample code (the same pattern repeats for fields 1 to 205; field #20 is shown below) ---
Dim ticket20, sql20
Set ticket20 = Server.CreateObject("ADODB.Recordset")
sql20 = "SELECT * FROM routingdata WHERE ddate = #" & Session("today") & "# ;"
ticket20.Open sql20, db, adOpenForwardOnly, adLockOptimistic
While Not ticket20.EOF
Response.Write "<font size=2>" & Ticket20("20")
ticket20.MoveNext
Wend
ticket20.Close
There are 200-plus cells that it needs to display, so it takes a long time to display the data. Is there better, faster code to extract all this data?
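Rather than opening 200+ separate recordsets (ticket1 ... ticket205) against the same table, it is usually much faster to open the query once and walk the Fields collection; a hedged sketch reusing the poster's own query and db connection. Code:
Dim rs, sql, f
Set rs = Server.CreateObject("ADODB.Recordset")
sql = "SELECT * FROM routingdata WHERE ddate = #" & Session("today") & "# ;"
rs.Open sql, db, 0, 1                     ' adOpenForwardOnly, adLockReadOnly - one round trip
While Not rs.EOF
    For Each f In rs.Fields               ' fields "1" .. "205" come back in this single pass
        Response.Write "<font size=2>" & f.Value & "</font> "
    Next
    rs.MoveNext
Wend
rs.Close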
I'm using XMLHTTP to get info from 10 different sites. One site's info comes back in about 3 seconds, but when I use 10 sites it takes about 30 seconds. How can I make it faster? Is there any solution or different method you can offer me?
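One hedged option is to fire all the requests asynchronously with MSXML2.ServerXMLHTTP and only then wait on each, so the total time is closer to the slowest site rather than the sum of all ten (the URLs below are placeholders). Code:
Dim urls, reqs(), i
urls = Array("http://site1.example.com/info.asp", "http://site2.example.com/info.asp")  ' ... up to 10 hypothetical URLs
ReDim reqs(UBound(urls))
For i = 0 To UBound(urls)
    Set reqs(i) = Server.CreateObject("MSXML2.ServerXMLHTTP")
    reqs(i).open "GET", urls(i), True          ' async = True, so all requests start immediately
    reqs(i).send
Next
For i = 0 To UBound(urls)
    If reqs(i).waitForResponse(15) Then        ' wait up to 15 seconds for this response
        Response.Write reqs(i).responseText
    End If
Next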
What is a faster/better coding practice?
Method 1:
Code:
Sub myFunction(byRef x)
x = x + 1 ' do something with x
End Sub
x = 7
Call myFunction(x)  ' use Call or drop the parentheses (myFunction x); a bare myFunction(x) would pass a copy of x
response.write x ' shows 8
Method 2:
Code:
Function myFunction(x)
myFunction = x + 1 ' do something with x
End Function
x = 7
response.write myFunction(x) ' shows 8
Also, discuss considerations when there is more than one variable that needs to be changed.
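For the multi-variable case, a hedged sketch of the two usual options (the names here are made up): a Sub with several ByRef parameters, or a Function that returns an Array the caller unpacks. Code:
' Option A: a Sub hands back several values through ByRef parameters
Sub Adjust(ByRef total, ByRef count)
    total = total + 10
    count = count + 1
End Sub

' Option B: a Function returns an Array that the caller unpacks
Function Adjusted(ByVal total, ByVal count)
    Adjusted = Array(total + 10, count + 1)
End Function

Dim t, c, result
t = 5 : c = 1
Adjust t, c                        ' no parentheses, so t and c really are ByRef
result = Adjusted(t, c)
Response.Write t & " " & c & " / " & result(0) & " " & result(1)   ' shows 15 2 / 25 3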
The following is the error I keep getting while running a web application.
Tools used: HTML, ASP, DLLs (written in Delphi).
Application Error: dllhost.exe - Application Error
---------------------------
The instruction at "0x00000000" referenced memory at "0x00000000". The
memory could not be "read".
Click on OK to terminate the program
I am updating over 100 fields in 7 tables with SQL in VB 6.0. The values will mostly come from check boxes. Do I need a VB variable to hold each value for my SQL query?
Also, what is the syntax for skipping an optional field? Do I just skip it and use commas? I will never know which values are checked off, so I cannot write code that will only insert my true values.
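A hedged sketch of one way to build the statement so that unchecked boxes are simply left out of the SET list (column, form-field, and table names are made up; shown as classic ASP/VBScript, but the same string-building works in VB6). Code:
Dim fields, f, setList, sql
fields = Array("IsApproved", "IsShipped", "IsPaid")   ' hypothetical column names matching the checkbox names
setList = ""
For Each f In fields
    If Request.Form(f) = "on" Then                    ' an unvalued checkbox posts "on" when ticked
        If setList <> "" Then setList = setList & ", "
        setList = setList & f & " = 1"
    End If
Next
If setList <> "" Then
    sql = "UPDATE Orders SET " & setList & " WHERE OrderID = " & CLng(Request.Form("OrderID"))
    db.Execute sql                                    ' db = an open ADODB.Connection
End If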
We are rebuilding a large website and we want to make sure it doesn't lose its good ranking within the search engines. So we were thinking of using 301 redirects to tell Google where the pages have moved to.
The problem is I'm not really sure where to start to implement something like this. We have a website that has 26,200 indexed pages, so I need to come up with some kind of solution that would work across the board.
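For reference, a minimal classic ASP sketch of a single permanent redirect (the target URL is hypothetical); for 26,200 pages the old-to-new mapping would more likely live in a database table or include file than be hard-coded per page. Code:
<%
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/new-section/new-page.asp"
Response.End
%>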
I am a PHP programmer and trying to do a site in asp but one feature I can't figure out how to do in ASP is large variables.
In PHP it is
$fullpage = <<<EOT
THE ENTIRE PAGE OF HTML
EOT;
EOT can be anything; it is just used as a starting and ending point, so I can have quotes, apostrophes, variables etc. inside the page. Is there any way at all to do this in ASP?
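There is no heredoc in VBScript, but two common stand-ins are building the string with doubled quotes and line continuations, or dropping out of ASP for the big block of HTML; a small sketch (the variable names are made up). Code:
<%
Dim userName, fullpage
userName = "Sam"
fullpage = "<html><body>" & vbCrLf & _
           "<h1>Welcome, " & userName & "</h1>" & vbCrLf & _
           "<p>She said ""hello"" - quotes are escaped by doubling them.</p>" & vbCrLf & _
           "</body></html>"
Response.Write fullpage
%>
<!-- Or leave ASP entirely for the big block of HTML and interpolate values inline: -->
<p>Welcome back, <%= userName %></p>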
I need to populate a select form input (combo box) with about 22,000 rows of data. This is taking an unacceptable amount of time to load. And this data is only going to grow in the future.
I'm using a stored procedure to get the data. Not sure if that's the most efficient. I'm using classic asp. I'm open to any suggestions because I'm not even sure where to look to get options.
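One thing that usually helps in classic ASP, whatever the data source, is pulling the rows into an array with GetRows and writing the options from the array rather than cursoring through the recordset; a hedged sketch (the procedure name, column order, and open connection db are assumptions). With 22,000 options, though, filtering or paging the list server-side is often the bigger win than raw rendering speed. Code:
Dim rs, data, i
Set rs = db.Execute("EXEC GetLookupValues")      ' hypothetical stored procedure
If Not rs.EOF Then data = rs.GetRows()           ' whole result set into a 2-D array in one call
rs.Close
Response.Write "<select name=""item"">"
If IsArray(data) Then
    For i = 0 To UBound(data, 2)                 ' second dimension = rows
        Response.Write "<option value=""" & data(0, i) & """>" & data(1, i) & "</option>"
    Next
End If
Response.Write "</select>"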
I am trying to upload large images ( around 4 mb) from the server to show on
the client. Currently I'm using an http handler to do it and breaking it
into chunks sending it 1 mb at a time. Sometimes I'm getting errors, like
the page won't load. Any solutions on how this is done right?
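If the sending side can be done in classic ASP (the poster's handler may well be ASP.NET), a hedged sketch of the chunked approach, stopping when the client disconnects; the content type, chunk size, and file path are assumptions. Code:
<%
Const adTypeBinary = 1
Dim stm, chunk
Response.Buffer = False
Response.ContentType = "image/jpeg"
Set stm = Server.CreateObject("ADODB.Stream")
stm.Type = adTypeBinary
stm.Open
stm.LoadFromFile Server.MapPath("images/big-photo.jpg")   ' hypothetical path
Do While Not stm.EOS
    chunk = stm.Read(262144)                              ' 256 KB at a time
    Response.BinaryWrite chunk
    If Not Response.IsClientConnected Then Exit Do        ' stop if the browser went away
Loop
stm.Close
%>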
I am trying to do some calculations on large numbers (i.e. 7,768,489,957,892,578,474,792,094 / 12,280) and no matter what I do it doesn't get it quite right. It's always somewhere between 10 and 5000 out. :(
I have a suspicion it could be down to one of the number functions I am using along the way, but I'm not sure.
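For what it's worth, VBScript's default numeric type is a Double, which only carries roughly 15 significant digits, so a 25-digit dividend loses precision on about the scale described. One hedged thing to try is the Decimal subtype via CDec, which holds up to about 28-29 significant digits (worth verifying that the full digit string survives the conversion). Code:
Dim big, divisor, quotient
big = CDec("7768489957892578474792094")   ' Decimal subtype, ~28-29 significant digits
divisor = CDec(12280)
quotient = big / divisor
Response.Write quotient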
I am using forms authentication to protect all content as described in the
kb article below.
http://support.microsoft.com/defaul...kb;en-us;893662
This works fine except for files that are larger than about 40 or 50mb at
which point the user gets a 404 error and the httperr log indicates
connection_dropped status.
This is a w2k3 server with SP1 and all security patches installed.
Has anyone seen this before? I have been reading various posts online but
none seem to fit this symptom. I have also tried tuning various metabase properties to no avail.
It seems to me that I generally use two types of Functions:
Type #1-Ones that any page on my site might use
Type #2-Ones that only a single page would ever use
Logically, it seems that I should put the Type #1 functions in the GLOBAL.ASA file and the Type #2 functions in the pages that use them. I would like to, however, just go ahead and include ALL of my functions in the GLOBAL.ASA file.
Comments on this? Is this a good idea?
When creating a webpage that must display a large result set, the page load time is unbearable. Unfortunately, there are a lot of records and they all need to be displayed, at once, on this one page.
To reduce file size, I have minimized the amount of HTML tags used to display these records. My only other idea is to write a JavaScript matrix to hold the data, and then use the DOM to create the page on the client side. Are there any tried and true ways of speeding up page loads?
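One tried-and-true classic ASP option is to keep buffering on but flush the output every few hundred rows, so the browser can start rendering before the whole page has been generated; a hedged sketch (the query, the open connection db, and the batch size are assumptions). Code:
<%
Response.Buffer = True
Dim rs, rowCount
Set rs = db.Execute("SELECT Col1, Col2 FROM BigTable")   ' db = open ADODB.Connection
rowCount = 0
Response.Write "<table>"
Do While Not rs.EOF
    Response.Write "<tr><td>" & rs(0) & "</td><td>" & rs(1) & "</td></tr>"
    rowCount = rowCount + 1
    If rowCount Mod 500 = 0 Then Response.Flush          ' push what has been built so far
    rs.MoveNext
Loop
Response.Write "</table>"
rs.Close
%>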
I am attempting to use the technique in KB 812406
(http://support.microsoft.com/?kbid=812406) to transfer
large files via Response.OutputStream.Write.
It works GREAT in debug mode. But whenever I set debug="false" in the web.config file, clients get cut off after about 1.5 minutes.
There is no error raised; the client simply gets an incomplete file. I've tried playing around with various timeout settings in web.config and in IIS with no luck.
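For what it's worth, the symptom matches ASP.NET's executionTimeout, which defaults to roughly 90-110 seconds and is only enforced when debug="false" (with debug="true" it is ignored, which would explain why debug builds work). A hedged web.config sketch raising it: Code:
<configuration>
  <system.web>
    <compilation debug="false" />
    <!-- executionTimeout is in seconds and only enforced when debug="false" -->
    <httpRuntime executionTimeout="3600" />
  </system.web>
</configuration>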
I am currently developing a download centre for a hardware manufacturer which has a lot of firmware and other software available to its clients. Most of these files are less than 10 MB, and with their internet connection I'm thinking a browser upload through the admin tools will be fine. However, there are files on the current site of up to 100 MB; what would be the best way to handle uploading these?
I was thinking maybe an FTP component would handle bigger files better than a normal HTTP upload? Does anyone know of any components that allow some kind of background uploading, so they can continue to use the admin for other tasks while this continues?
Or just any other suggestions full stop! Maybe I would need to come up with some way of taking them out of the system and loading a windows ftp with the correct directories etc.. predefined?
I want to count the number of records available, depending on the conditions, in a large database. The table has approximately 300,000 (3 lakh) rows, there are multiple tables to be searched, and the records need to be displayed in ASP pages.
Kindly suggest the best scripting method, as I am always getting the "Script Timed Out" error in Active Server Pages (ASP).
Please suggest how to avoid this, or whether in SQL Server I can create some predefined script and just call it from the ASP pages.
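A hedged sketch of pushing the counting down to the database so ASP never loops over 300,000 rows (the table, filter, and open connection db are made up), with the script timeout raised as a safety net while the query is tuned. Code:
<%
Server.ScriptTimeout = 300            ' extra headroom while the query is being tuned
Dim rs, total
Set rs = db.Execute("SELECT COUNT(*) AS Total FROM Orders WHERE Status = 'OPEN'")
total = rs("Total")
rs.Close
Response.Write "Matching records: " & total
%>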
I have a large database that I am working with. The problem is that right now it is full of sample data from all the testing. I want to clear the database and reset the AutoNumber so that it starts at 1 and goes up by one. I know that it's possible in Access and I've done it before; the problem is that the tables have relationships, and lots of them.
The way I know to reset the AutoNumber requires me to break the relationships, but it would take a very long time to recreate them again and I might not get them the same at the end. Is there a way to reset the AutoNumber without having to break the relationships?
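One hedged option, assuming Jet 4.0 and that the statements are run through ADO (test on a copy first; the table and column names are made up): empty the tables, then reseed the AutoNumber with ALTER COLUMN ... COUNTER, which does not require dropping the relationships. Child tables must be emptied before their parents if referential integrity is enforced; alternatively, a Compact & Repair after deleting the rows is often reported to reset the seed as well. Code:
Dim conn
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & Server.MapPath("data.mdb")
conn.Execute "DELETE FROM OrderDetails"      ' empty child tables first
conn.Execute "DELETE FROM Orders"
conn.Execute "ALTER TABLE Orders ALTER COLUMN OrderID COUNTER(1,1)"   ' restart the AutoNumber at 1, step 1
conn.Close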
I am creating a dynamic website using ASP and an MS Access backend DB. I am a little confused about how to go about things and am now facing the following problem. I want a large volume of text to be loaded dynamically on my page, but don't really understand how I should be storing this text, as the database fields will only hold a maximum of 255 characters.
Obviously this is not large enough to store all of the text I may need, so where should I be storing it? The only idea I had was storing the text in an external text file which I point to with the database field (i.e. the database holds the path to the .txt instead of the actual text itself). Is this possible? Or, more to the point, is there a better way? How would this normally be accomplished?
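The 255-character cap applies to Access Text fields; a Memo field holds far longer text and is the usual answer. That said, the external-file idea works too; a hedged sketch of reading a file whose name is stored in the table (the field, folder, and recordset names are made up). Code:
<%
Dim fso, ts, filePath, articleText
filePath = Server.MapPath("content/" & rs("TextFile"))    ' rs("TextFile") holds e.g. "about-us.txt"
Set fso = Server.CreateObject("Scripting.FileSystemObject")
If fso.FileExists(filePath) Then
    Set ts = fso.OpenTextFile(filePath, 1)                ' 1 = ForReading
    articleText = ts.ReadAll
    ts.Close
End If
Response.Write articleText
%>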
I believe we can only upload files of limited size (<2 MB) through ASP scripts. I need to write an ASP script that can handle up to 100 MB of file uploading. I have written one script (with a progress bar ;-)), as my hosting won't allow me any third-party upload components. Is there any way for ASP scripts to make larger file uploads?
I have an ASP page with a form which has a lot of fields, 30 to 40. Is there an easier way to loop through all the fields submitted to the insert page, or is the only way to do it the standard Request.Form for each field and then an INSERT INTO Reports (ReportID, ReceivedDa...), which gets very lengthy for this number of fields? Perhaps there is a way of looping through some sort of collection?
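There is such a collection: Request.Form can be enumerated with For Each. A hedged sketch that builds the column and value lists from whatever was posted, assuming the form-field names match the column names (escaping is shown only minimally, so validate the input before executing anything like this). Code:
<%
Dim item, cols, vals, sql
cols = "" : vals = ""
For Each item In Request.Form
    If cols <> "" Then cols = cols & ", " : vals = vals & ", "
    cols = cols & item
    vals = vals & "'" & Replace(Request.Form(item), "'", "''") & "'"
Next
sql = "INSERT INTO Reports (" & cols & ") VALUES (" & vals & ")"
' db.Execute sql   ' db = open ADODB.Connection
%>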
I have a site that I upload files to that are no bigger than 100 MB. We upload about once a day and use FTP; we would like to be able to do it from the site itself.
Any idea where I can get a script? I can't register components with my hosting company, I don't think; I go through godaddy.com.
I need a good free ASP upload script and can't find any that work. Huge ASP Upload doesn't even work on the demo page, let alone on my own page.
I'm getting the following error in my event log:
Event Type:Error
Event Source:Active Server Pages
Event Category:None
Event ID:5
Date:2007/03/09
Time:11:50:48 AM
User:N/A
Computer:ZEUS
Description:
Error: File /index.asp Data size too large. Size of data being sent
in the request is over the allowed limit..
For more information, see Help and Support Center at
http://go.microsoft.com/fwlink/events.asp.
The problem is that when I get this error, all websites in IIS6 become completely unresponsive, and I have to run an iisreset.
We are using Windows Server 2003 along with IIS 6. When trying to browse to an ASP page, the following error is returned:
Active Server Pages error 'ASP 0107'
Data size too large.
Size of data being sent in the request is over the allowed limit. The ASP page in question is about 5 KB in size, and we also increased the AspBufferingLimit setting in Metabase.xml to over 16 MB, to no avail.
Similar ASP pages are all working, it is only this one, and there is nothing special or different about this one.
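For what it's worth, the 0107 error refers to the request entity (the data being posted to the page), so the IIS 6 metabase property most often pointed at for it is AspMaxRequestEntityAllowed (default around 200 KB) rather than AspBufferingLimit; a hedged example of raising it with adsutil.vbs (the value is in bytes and the script path may vary), restarting the ASP application afterwards if the change doesn't take effect. Code:
cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs SET w3svc/AspMaxRequestEntityAllowed 1048576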