Up to this point, I've been able to extract the search results from a site: a Form -> Results chain.
However, on a new project I'm a little stuck, and not sure whether it is possible at all. The page I'm trying to get info from sits at the end of a chain of three pages, so Form -> Middle -> Results.
I've tried inputting my own form data, but I get the page info from the Middle page, and not the Results page. Any ideas how I can go on to the Results page?
At the moment I don't think it's possible, but I haven't been in this game for long.
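A hedged sketch of one way through (the URLs, field name, and pattern below are all assumptions): the Middle page usually carries a second form, often with hidden fields, and those values have to be scraped out and re-posted to reach the Results page.

Code:
<%
' Sketch: follow a Form -> Middle -> Results chain by re-posting the
' Middle page's form. All URLs and field names here are hypothetical.
Dim http, middleHtml, token
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")

' Step 1: post the search form; this returns the Middle page
http.open "POST", "http://example.com/search.asp", False
http.setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
http.send "query=widgets"
middleHtml = http.responseText

' Step 2: scrape out any hidden field the Results page expects
' (a hidden input named "session_id" is assumed here)
Dim re, matches
Set re = New RegExp
re.Pattern = "name=""session_id"" value=""([^""]+)"""
Set matches = re.Execute(middleHtml)
If matches.Count > 0 Then token = matches(0).SubMatches(0)

' Step 3: post to the Middle form's action to reach the Results page
http.open "POST", "http://example.com/results.asp", False
http.setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
http.send "session_id=" & Server.URLEncode(token)
Response.Write http.responseText
%>

If the site tracks its state in cookies rather than hidden fields, the Set-Cookie header has to be carried across as well (see the cookie example further down this page).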
One of the limitations of ASP is the lack of a dynamic SSI, in which you can include files using variables. So far I know of only two ways around that: using the FileSystemObject, or using the XMLHTTP object. Which of the two is less of a burden for IIS? Or are there other ways that are much better?
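For comparison, a minimal sketch of both, assuming the fragment name arrives in a variable sName; the paths are placeholders.

Code:
<%
' sName is assumed to hold the name of the fragment to include.

' (a) FileSystemObject: reads the file straight off disk, so it is
'     cheap, but any ASP code inside the fragment is NOT executed.
Dim fso, ts
Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set ts = fso.OpenTextFile(Server.MapPath("snippets/" & sName & ".html"))
Response.Write ts.ReadAll
ts.Close

' (b) ServerXMLHTTP: a full HTTP round-trip back through IIS, so the
'     fragment IS executed as a page, at the cost of a second request.
Dim http
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")
http.open "GET", "http://localhost/snippets/" & sName & ".asp", False
http.send
Response.Write http.responseText
%>

For static fragments, the FSO route is generally the lighter burden on IIS, since it never opens a second connection; the XMLHTTP route only earns its cost when the included file has to run as ASP.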
I'm getting information with XMLHTTP from three sites at the same time, but it is so slow. How can I make it faster? Any information or any document about it would help, from anyone who knows about it. It is also a bandwidth question: how can I increase XMLHTTP performance?
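Two things usually help. First, cap how long any one site can stall the page; a sketch with setTimeouts (the URL is a placeholder, and the four arguments are the resolve, connect, send, and receive timeouts in milliseconds):

Code:
<%
Dim http
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")
http.setTimeouts 5000, 5000, 5000, 10000   ' fail fast on a slow host
http.open "GET", "http://example.com/feed.xml", False
On Error Resume Next
http.send
If Err.Number <> 0 Then
    Response.Write "Fetch failed or timed out"
ElseIf http.status <> 200 Then
    Response.Write "Remote site returned " & http.status
End If
On Error GoTo 0
%>

Second, if the three sites' content does not change on every request, cache the fetched text in an Application variable and refresh it on a timer, so most page views skip the remote round-trips entirely.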
I got a great little example of how to use XMLHTTP to query an external web site (I want to build my own localised search engine by extracting the content and putting it into an Access DB), but it only appears to extract from the page you point it at, i.e. the main default page is extracted when you put in something like www.hp.com.
I want the utility to extract content from all of the pages found on the site, like a spider, I suppose. Has anybody done this, or any ideas how to get it to go past the initial page?
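A hedged sketch of the next step, assuming absolute links and leaving out depth limits, de-duplication, and relative-URL handling: fetch the start page, pull the href targets out with a RegExp, then fetch each of those in turn.

Code:
<%
Dim http, html, re, m
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")
http.open "GET", "http://www.example.com/", False
http.send
html = http.responseText

Set re = New RegExp
re.Pattern = "href=""(http://[^""]+)"""
re.Global = True
re.IgnoreCase = True

' Follow each absolute link found on the start page
For Each m In re.Execute(html)
    http.open "GET", m.SubMatches(0), False
    http.send
    ' ... insert http.responseText into the Access DB here ...
Next
%>

A real spider also needs a list of already-visited URLs (or a keyed table in the Access DB) so it doesn't fetch the same page twice or loop forever.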
I need an ASP/XML page that sends a request to the SharePoint Portal Server (WebDAV) using XMLHTTP, and I also need to parse the response XML into HTML. If anybody knows about this, please help.
It would contain mainly three files: one is the search page where the user can type his requirements, the second sends the XMLHTTP request to the SharePoint Portal Server, and the third parses the XML response into HTML format.
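A hedged sketch of the second and third pieces, with the portal URL, request body, and stylesheet name all assumed: WebDAV searches go out with the SEARCH verb, and the XML that comes back can be turned into HTML with an XSL stylesheet via transformNode.

Code:
<%
Dim http, xsl
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")
http.open "SEARCH", "http://portal/searchresults", False
http.setRequestHeader "Content-Type", "text/xml"
http.send Request.Form("searchxml")   ' query XML built by the search page

Set xsl = Server.CreateObject("MSXML2.DOMDocument")
xsl.async = False
xsl.load Server.MapPath("results.xsl")

' Third piece: XML response -> HTML via the stylesheet
Response.Write http.responseXML.transformNode(xsl)
%>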
I use MSXML2.XMLHTTP to receive HTTP responses from a remote server. Unfortunately, it uses cookie-based authentication, which I am unable to pass due to my inability to store cookies. How can I retrieve cookies from the headers of the HTTP response, and how can I add them to my request at the next step?
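A hedged sketch (the URLs and credentials are placeholders): read the Set-Cookie header off the login response and echo it back as a Cookie header on the next request. Note that the WinInet-based Microsoft.XMLHTTP hides Set-Cookie from script, so ServerXMLHTTP is the one to use here, and if the server sets several cookies they may have to be parsed out of getAllResponseHeaders instead.

Code:
<%
Dim http, sCookie
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")
http.open "POST", "http://example.com/login.asp", False
http.setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
http.send "user=me&pass=secret"
sCookie = http.getResponseHeader("Set-Cookie")   ' e.g. "ASPSESSIONID=...; path=/"

' Hand the cookie back on the follow-up request
http.open "GET", "http://example.com/private.asp", False
http.setRequestHeader "Cookie", sCookie
http.send
%>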
I'm using XMLHTTP to perform a simple screen-scraping job. When I try to have it "scrape" lines with embedded CSS declarations, it renders them incorrectly, stripping the leading period, so my CSS doesn't work. Here's how I'm calling it, but I don't think my implementation is the problem.
Dim srvXmlHttp
Set srvXmlHttp = Server.CreateObject("MSXML2.ServerXMLHTTP")
srvXmlHttp.open "GET", TearURL, False
srvXmlHttp.send
If srvXmlHttp.status <> 200 Then
    Response.Write "No Server Response"
    Response.End
End If
Dim strRetval5
strRetval5 = srvXmlHttp.responseText
I use Server.CreateObject("MSXML2.ServerXMLHTTP") in order to get a connection from one server to another.
On my personal host (IIS) the above method works perfectly, and I can 'include' an HTTP stream from a remote URL in my own page. However, the method doesn't work on another web server (also IIS), although the MSXML2.ServerXMLHTTP component is installed (I checked it).
The error message I get there is: msxml3.dll error '80072ee2' The operation timed out
This error only appears if I try to get data from a remote URL; if I try to get data from http://localhost/foo, the method works. I believe the web server is behind a proxy. Would this cause any problems? And if so, how can it be solved?
Also, do you know of any other solution for 'emulating' the PHP include command in ASP without using the MSXML2.ServerXMLHTTP method?
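ServerXMLHTTP rides on WinHTTP, which ignores IE's proxy settings, so on a box behind a proxy every remote request times out while http://localhost keeps working, exactly as described. On MSXML 3 the usual fix is to run proxycfg -p "myproxy:8080" "<local>" once at the server's command prompt; with MSXML 4 installed, the proxy can also be set per request, as in this sketch (the proxy address is a placeholder):

Code:
<%
Dim http
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP.4.0")
http.setProxy 2, "myproxy:8080", ""   ' 2 = SXH_PROXY_SET_PROXY
http.open "GET", "http://www.example.com/", False
http.send
Response.Write http.responseText
%>

As for emulating PHP's include: for files on the same server, the FileSystemObject route avoids the HTTP hop entirely, with the caveat that any ASP inside the included file won't be executed.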
Shown below is an ASP code sample that tries to access a method, CreateUser, on the server "http://smallbutsmart.basis.com.au" using XMLHTTP. Can you explain to me why this code does not work, and show a correct code sample?
<%
dim objXMLHTTP
set objXMLHTTP = Server.CreateObject("Microsoft.XMLHTTP")
objXMLHTTP.Open "Post", "http://smallbutsmart.basis.com.au", false
'objXMLHTTP.SetRequestHeader "Content-type", "text/html"
'objXMLHTTP.CreateUser "abc","123","Scriven","1","001","qms"
objXMLHTTP.Send
%>
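The short answer, hedged: XMLHTTP is only a transport, so it has no CreateUser method and cannot invoke one on the remote server; the commented-out line can never work. The arguments have to travel in the request itself, to a page on that server which actually performs the call. A sketch, in which the createuser.asp endpoint and the parameter names are assumptions about what the remote side expects:

Code:
<%
dim objXMLHTTP, sBody
set objXMLHTTP = Server.CreateObject("Microsoft.XMLHTTP")

' Arguments go in a form-encoded body, not as a method call
sBody = "user=abc&pass=123&name=Scriven&level=1&code=001&dept=qms"

objXMLHTTP.Open "POST", "http://smallbutsmart.basis.com.au/createuser.asp", false
objXMLHTTP.setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
objXMLHTTP.Send sBody

Response.Write objXMLHTTP.responseText
%>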
I've been searching everywhere online for an alternative to Microsoft.XMLHTTP (as it freezes the server up a lot!!) but with no luck at all.
I am using server-side ASP, and some people said to use Microsoft.ServerXMLHTTP instead. However, I have tried that as well and it still freezes up the whole thing (i.e. the site just keeps loading forever).
I tried an "On Error Resume Next" clause to catch the error, but it still doesn't stop the page from freezing up.. :(
I saw someone here say not to use XMLHTTP in ASP as it is not thread-safe (is that why it is freezing up??), and suggest using MSXML2.ServerXMLHTTP.3.0 instead.
So does anyone know if MSXML2.ServerXMLHTTP.3.0 will help, i.e. stop the page freezing up?
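It should, with one caveat: Microsoft.XMLHTTP is built on WinInet, which is neither thread-safe nor supported in server code, while MSXML2.ServerXMLHTTP.3.0 uses WinHTTP and is meant for ASP. The freezes are typically the request blocking forever rather than raising an error, which is why On Error Resume Next never fires; explicit timeouts turn the hang into a trappable error. A sketch (the URL is a placeholder):

Code:
<%
Dim http
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP.3.0")
http.setTimeouts 5000, 5000, 10000, 10000  ' resolve/connect/send/receive (ms)
On Error Resume Next
http.open "GET", "http://www.example.com/", False
http.send
If Err.Number <> 0 Then
    Response.Write "Request failed: " & Err.Description
ElseIf http.status = 200 Then
    Response.Write http.responseText
End If
On Error GoTo 0
%>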
When we get info from a URL with XMLHTTP or ServerXMLHTTP, how does it behave? How much bandwidth does it use, and how does it adjust the bandwidth it uses? How can it use all of the bandwidth? Is there any program or Windows adjustment for this? Are there possible solutions? For example, how do Flash-style downloaders use all the bandwidth by separating a whole file into lots of parts?
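Neither object throttles itself; a single synchronous GET just reads one TCP stream as fast as the connection allows. Download managers of the kind described get their speed by asking for several byte ranges of the same file in parallel. A hedged sketch of one ranged request, assuming the remote server supports ranges at all (the URL is a placeholder):

Code:
<%
Dim http
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")
http.open "GET", "http://example.com/bigfile.zip", False
http.setRequestHeader "Range", "bytes=0-49999"   ' first 50,000 bytes only
http.send
Response.Write http.status   ' 206 (Partial Content) if the range was honoured
%>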
I have a question regarding async mode when calling the Microsoft.XMLHTTP object.
Microsoft.XMLHTTP hangs IE once in a while, suddenly, but it will work again after half an hour or so without my doing anything. I have searched the Internet, and it seems the reason it hangs the browser is that XMLHTTP limits you to two concurrent HTTP connections to each remote host; so if more than two concurrent connections hit the script that is calling XMLHTTP, it will hang. Is that true?
If that's the case, can I change async mode to true (async=true) so that it will only take one connection at a time, while other concurrent connections loop and wait until XMLHTTP is ready to process data again? Will that work?
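A hedged answer: the two-connections-per-host cap is WinInet behaviour, and async=true does not lift it; async only hands control back to your script while the request runs. On the server, MSXML2.ServerXMLHTTP sidesteps WinInet altogether, and its waitForResponse method gives async a clean shape, as in this sketch (the URL is a placeholder):

Code:
<%
Dim http
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")
http.open "GET", "http://example.com/data.xml", True   ' async = true
http.send
If http.waitForResponse(10) Then   ' wait up to 10 seconds
    Response.Write http.responseText
Else
    Response.Write "Timed out waiting for the remote host"
End If
%>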
I have a machine running W2K Server with the .NET Framework 1.1.
Yesterday, I was able to browse through my .NET web pages OK.
Then, I installed Visual Studio 6.0 and Visual Studio .NET 2003.
Now, when my .NET web pages attempt to perform an XMLHTTP.Send(), I get a status of 500 (i.e. "Server Error"). Calling XMLHTTP.statusText, I get a response string of "Server Error".
I have checked my IE settings and they have not been changed.
Does anyone have a clue as to why XMLHTTP quit working after I did the installs yesterday?
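Installing Visual Studio 6.0 on top of the Framework can break ASP.NET's IIS script mappings, which surfaces exactly like this: a 500 on every managed request. A hedged first step (the path assumes a default W2K install of Framework 1.1) is to re-register ASP.NET and then restart IIS:

Code:
C:\WINNT\Microsoft.NET\Framework\v1.1.4322\aspnet_regiis.exe -i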
I'm trying to get some data from an external web site using XMLHTTP, but my problem is that the data contains special Danish characters (ÆØÅ), which are being replaced with "?" due to UTF-8. I've tried to specify the charset using Response.Charset, but that doesn't seem to have any effect.
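responseText decodes the bytes with whatever charset MSXML guesses, which is where the "?" substitutions come from; Response.Charset only labels your own output. The usual workaround is to take the raw responseBody bytes and decode them yourself through an ADODB.Stream. A sketch, assuming the Danish site actually serves iso-8859-1 (the URL is a placeholder):

Code:
<%
Dim http, stm, sHtml
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")
http.open "GET", "http://example.dk/page.asp", False
http.send

Set stm = Server.CreateObject("ADODB.Stream")
stm.Type = 1               ' adTypeBinary
stm.Open
stm.Write http.responseBody
stm.Position = 0
stm.Type = 2               ' adTypeText
stm.Charset = "iso-8859-1" ' the charset the remote page really uses
sHtml = stm.ReadText
stm.Close
%>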
I am trying to access information present in a page (e.g. info.dll?about) using XMLHTTP on the server side. I get the result in a variable and parse through the content using string functions in VBScript. This was working absolutely fine until yesterday; today I get the result of the XMLHTTP request as "MZ?". This issue is seen only when requesting one or two DLLs; other DLLs work fine. Does anyone know what the issue could be here? This is the code snippet. Code:
Why isn't this ASP code working? I have the following page (one.asp),
Code:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
<title> Test </title>
</head>
<body>
<div> é á í ó ú Ç » ¢ </div>
</body>
</html>
I already know I can use the XMLHttp object to log in to a page on a remote site, but can I then use it to post data to a page after the login page?
I would have thought that wouldn't work, as the XMLHttp object runs on the server and, I'd guess, wouldn't maintain session state because of this. Is this something anyone's come across?
I have a form where I want to be able to validate a field against a DB *before* the form has been submitted. After doing a bit of homework, it seems there are two general approaches: 1) Remote Scripting [using RSExecute/RSGetASPObject]; 2) using the XMLHTTP/XMLDOM objects. I have no experience with either, so I thought I'd see what my peers are using.
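The XMLHTTP route is the one that became standard practice. The server-side half is just a small ASP page that takes the field value and answers with a flag the form page can read out of responseText; a sketch, with the page name, table, column, and DSN all as placeholders:

Code:
<%
' checkvalue.asp - answers "taken" or "available" for ?value=...
Dim cn, rs
Set cn = Server.CreateObject("ADODB.Connection")
cn.Open "DSN=mydb"
Set rs = cn.Execute( _
    "SELECT COUNT(*) FROM Users WHERE UserName = '" & _
    Replace(Request.QueryString("value"), "'", "''") & "'")
If rs(0) > 0 Then
    Response.Write "taken"
Else
    Response.Write "available"
End If
cn.Close
%>

The form page then requests checkvalue.asp?value=... through an XMLHTTP object before submitting and checks whether responseText came back as "taken".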
I use an affiliate link on my site, and I did not like it because it did not have my site's banners on it, so I asked the permission of the other site to change the display and they agreed.
The affiliate site has 4 pages of forms linking to each other, so it goes something like form.asp > results.asp > results_details.asp > confirm_details.asp >>>>>> proceed to payment.
I wanted to grab each page using the XMLHTTP object and then post the form to local pages on my site. The local pages would grab the Request.Form string, using code something like below.
On my page quote123.asp, I have <%=request.form%>, and I never see the string "select=All" - why is this? Code:
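Whatever the missing snippet looks like, one common cause of Request.Form never showing the posted value is re-posting through XMLHTTP without a Content-Type header: unless the body is labelled application/x-www-form-urlencoded, ASP leaves the Form collection empty. A minimal sketch against the page named above (the host is a placeholder):

Code:
<%
Dim http
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")
http.open "POST", "http://www.mysite.com/quote123.asp", False
' Without this header, quote123.asp's Request.Form stays empty
http.setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
http.send "select=All"
Response.Write http.responseText
%>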
I'm using XMLHTTP in ASP to grab some Google results. On my local server the results are exactly the same as if I went and searched on Google itself, but the results I get back when I try it on my web host at discountasp.net are different, both in the result-range number and in the results themselves. I thought it might just be that the two are hitting two different Google servers that aren't currently updated the same, but the results never change for either: the outcome on the local server always matches a real Google search, and my web host's server is always different. The links also come back different on my web host compared to my localhost.
I can't find out how to get this to work... I got this bit of code:
--------------------------------------------
Code:
dim objXMLHTTP
dim URL

URL = "http://www.yahoo.com"  'the url that you want to pull html from

Set objXMLHTTP = Server.CreateObject("Microsoft.XMLHTTP")  'create the xmlhttp object
objXMLHTTP.Open "GET", URL, false  'use the open command to get the url
objXMLHTTP.Send

'TO GET HEADERS
'response.write objXMLHTTP.getAllResponseHeaders

Response.Write "<hr>"
Response.Write "<h4>HTML Code for "&URL&"</h4>"
Response.Write "<textarea rows=30 cols=120>"
Response.Write objXMLHTTP.responseText  'output the html that was pulled from the page
Response.Write "</textarea>"
Set objXMLHTTP = Nothing
-----------------------------------------------
and it works... it fetches the page, but I want it to save the fetched information into an HTML file:
set strm1 = createobject("adodb.stream")
With strm1
    .type = 1   ' adTypeBinary
    .open
    .write objXMLHTTP.responsebody
    ' sFile must be a full physical path (e.g. built with Server.MapPath)
    ' to a folder the IIS anonymous account has write permission on
    .savetofile sFile, 2   ' adSaveCreateOverWrite
    .close
End With
-------------------------
but it doesn't want to write to the file.
i get the following error ADODB.Stream error '800a0bbc'
Write to file failed.
/bytesect/data/test2.asp, line 15
Can anyone please help? I am really struggling with this... I simply want a generated ASP page to be saved into an HTML page on the server through a script. I tried the FileSystemObject method, but it would be insane to insert an FSO.WriteLine call for every single line of HTML. This XMLHTTP component is promising, but I want to save the file!
I am retrieving a response from another site, but certain special characters such as ' (single quote), " (double quote) and £ (pound sign) are being returned as ?. The code I am using to get the page is as follows: