I am using MSXML 4.0 ServerXMLHTTP to communicate with a remote server. I use the POST method of ServerXMLHTTP to post an XML string, but after calling the send method I get "HTTP 1.1 405 Method Not Allowed" in the response status text.
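For reference, the basic POST pattern being described looks roughly like the sketch below (strURL and strXML are placeholders for the actual endpoint and payload). A 405 generally means the resource at that URL does not accept POST at all, so it is worth confirming the URL points at the handler rather than at a directory or static page.

Code:
Dim xmlhttp
Set xmlhttp = Server.CreateObject("MSXML2.ServerXMLHTTP.4.0")
xmlhttp.open "POST", strURL, False           ' strURL: the remote endpoint (placeholder)
xmlhttp.setRequestHeader "Content-Type", "text/xml"
xmlhttp.send strXML                          ' strXML: the XML string to post (placeholder)
Response.Write xmlhttp.status & " " & xmlhttp.statusText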
When strLocalComputer contains the local computer name, we get the requested page. But if we use the IP address of the local computer, we get the error:
msxml4.dll (0x80072EE7) The server name or address could not be resolved.
We can open the URL with the IP address from a browser on the server, but not from a browser on the local computer. Any ideas?
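For what it's worth, ServerXMLHTTP resolves names through the WinHTTP proxy settings (configured with proxycfg) rather than the browser's WinInet settings, so an address that works in the browser can still fail from script. A sketch of forcing a direct connection per request with MSXML 4.0's setProxy (the IP-based URL is illustrative; proxycfg -d is the machine-wide equivalent):

Code:
Set srv = Server.CreateObject("MSXML2.ServerXMLHTTP.4.0")
srv.setProxy 1, "", ""            ' 1 = SXH_PROXY_SET_DIRECT: bypass any WinHTTP proxy
srv.open "GET", "http://10.0.0.5/somepage.asp", False    ' illustrative IP-based URL
srv.send
Response.Write srv.status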
I have two Win2000 Advanced Servers (ServerA and ServerB), both part of an Active Directory. The Active Directory server is named ServerC.
On ServerA I've published an asp page PageA.asp inside a virtual directory VirtuaA. On ServerB I've published an asp page PageB.asp inside a virtual directory VirtuaB.
On both servers I've installed Microsoft XML Parser 4.0 SP2.
Both virtual directories are set to Windows Integrated Authentication.
PageA.asp has to get PageB.asp using ServerXmlHttp object and to show its contents.
When I try to get PageA.asp from any client that is part of the same Active Directory, I get an error: I'm not authorized to get PageB.asp.
I've tried every solution found in previous posts:
- proxycfg -d -p " " "*" on ServerA
- flag "trust for delegation" on Active Directory Control Panel of ServerC
- ASP instruction .setProxy 2, " ", "*" inside PageA.asp
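One workaround that sidesteps delegation entirely, if a dedicated account is acceptable, is to pass explicit credentials on the open call instead of relying on the caller's token being forwarded; a rough sketch, where DOMAIN\svc_pagereader and its password are placeholders for a dedicated account:

Code:
Set srv = Server.CreateObject("MSXML2.ServerXMLHTTP.4.0")
' Credentials below are placeholders for an account with read access to VirtuaB
srv.open "GET", "http://ServerB/VirtuaB/PageB.asp", False, "DOMAIN\svc_pagereader", "password"
srv.send
Response.Write srv.responseText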
I've been experimenting with both ServerObjects' AspHTTP and Microsoft's ServerXMLHTTP to access IMail. They both work fine, except for one weird thing.
To log in to IMail via login.cgi, it takes ServerXMLHTTP about 30 seconds; it takes AspHTTP about 1 second. Accessing any other IMail page, the two are virtually identical, but the login page is unacceptably slow with ServerXMLHTTP.
I have been reading these posts and thought I might be able to use ServerXMLHTTP for my current problem.
My company requires all of its divisions to use the same template for intranet sites. They (corporate IT) change this template often. I can use ServerXMLHTTP to read their site and display everything. My problem is the .js and .css include files they use. How can I get this info into the correct format in my page, and change their relative image links so that I can use their images?
I will have a few minor edits for links and one image change; I assume that can be done through Replace().
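A rough sketch of the Replace() approach, assuming the corporate template is fetched into strHTML and lives at http://corpintranet/ (both the URL and the folder names are illustrative): relative src/href values are rewritten to absolute URLs so the images, .js and .css files still come from their server.

Code:
Dim http, strHTML
Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")
http.open "GET", "http://corpintranet/template.asp", False    ' illustrative URL
http.send
strHTML = http.responseText

' Point relative references back at the corporate server
strHTML = Replace(strHTML, "src=""images/",  "src=""http://corpintranet/images/")
strHTML = Replace(strHTML, "href=""styles/", "href=""http://corpintranet/styles/")
strHTML = Replace(strHTML, "src=""scripts/", "src=""http://corpintranet/scripts/")

Response.Write strHTML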
I want to use HEAD instead of GET in order to save bandwidth. I don't want to retrieve the full html of a page, just the HTTP header to know that a file exists:
HTTP/1.0 200 OK, or that a file is not found: HTTP/1.0 404 Not Found
This is for a link checker that I am writing. Does anyone know what I am doing wrong or does this object not recognise the HEAD parameter?
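A minimal sketch of a HEAD-based check, assuming strURL holds the link to test; ServerXMLHTTP passes the verb through to the server, so a 405 here would mean the target server itself is refusing HEAD.

Code:
Dim check
Set check = Server.CreateObject("MSXML2.ServerXMLHTTP")
check.open "HEAD", strURL, False     ' strURL: the link being checked (placeholder)
check.send
If check.status = 200 Then
    Response.Write "Link OK"
Else
    Response.Write "Problem: " & check.status & " " & check.statusText
End If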
I'm trying to use a ServerXMLHTTP connection in async mode to request Page2.asp from Page1.asp. The two pages are in two different virtual directories, both with Windows Integrated Authentication, but the user who requests Page2.asp is different from the user who requested Page1.asp: the user requesting Page2.asp is IWAM_machine_name. If I use ServerXMLHTTP in synchronous mode there is no problem.
This code works with MSXML 4, but I found out the server it will ultimately sit on only has version 3, and they won't update. So this code fails on the marked line and I can't figure out why.
Code:
Set httpReq = Server.CreateObject("MSXML2.ServerXMLHTTP")
Set myXmlDoc = Server.CreateObject("MSXML2.DOMDocument")
httpReq.Open "post", webServiceUrl, False
httpReq.setRequestHeader "Content-Type", "text/xml"
httpReq.setRequestHeader "SOAPAction", "soapserver/soap:CreatePreview#CreateGraphics"
httpReq.Send soapEnv    'fails here
returnSoapEnv = httpReq.responseText
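If it is a version issue, one thing worth trying on the MSXML 3 box is binding explicitly to the 3.0 ProgIDs, so there is no ambiguity about which parser handles the call; a sketch using the same webServiceUrl and soapEnv as above:

Code:
Set httpReq = Server.CreateObject("MSXML2.ServerXMLHTTP.3.0")
Set myXmlDoc = Server.CreateObject("MSXML2.DOMDocument.3.0")
httpReq.Open "POST", webServiceUrl, False
httpReq.setRequestHeader "Content-Type", "text/xml"
httpReq.setRequestHeader "SOAPAction", "soapserver/soap:CreatePreview#CreateGraphics"
httpReq.Send soapEnv
returnSoapEnv = httpReq.responseText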
I'm trying to use ServerXMLHTTP on an ASP (not ASP.NET) page to retrieve large binary data from a remote server. When the request is large (more than a few megabytes), the ServerXMLHTTP page jumps to nearly 100% CPU utilization for an unusually long time. The remote server needs a few seconds to prepare the request, during which time the CPU seems OK. It seems that as soon as the data is ready to retrieve, the CPU usage jumps and remains that way until the data has all been copied to the requesting server. That takes way too long - about 35 seconds when requesting a 12 MB file over a gigabit Ethernet.
I use ServerXMLHTTP hundreds of thousands of times daily on this same system on the same network, with absolutely no problem - but for smaller requests. There's something about the size of the request that makes it blow up.
I saw some reports of older systems with this problem (Windows 2000), but I'm running IIS 6 on Windows Server 2003, SP1.
I have been using the following code to access a remote URL, which works fine, but if the remote "geturl" does a redirect (as the page in this code does), I have no idea what the redirected URL is. The page still displays (including the HTML source code), but I cannot determine what the base href is; that is, I don't know the URL of the page being displayed, since it is a redirected page.
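One way to find out where the redirect goes, rather than just following it, is to make the request with the WinHttp object (which ServerXMLHTTP wraps), turn off automatic redirects, and read the Location header yourself; a sketch, assuming getURL holds the original address and with option index 6 taken to be WinHttpRequestOption_EnableRedirects:

Code:
Dim req, redirectedURL
Set req = Server.CreateObject("WinHttp.WinHttpRequest.5.1")
req.Option(6) = False                 ' 6 = WinHttpRequestOption_EnableRedirects
req.Open "GET", getURL, False         ' getURL: the original address (placeholder)
req.Send
If req.Status = 301 Or req.Status = 302 Then
    redirectedURL = req.GetResponseHeader("Location")   ' where the server is sending us
End If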
I've used MSXML2.ServerXMLHTTP in ASP to write a link checker. It checks a list of URLs stored in a db and flags any websites/webpages that are down. Since I am only interested in whether the page is up or not, and not in the actual HTML, I use HEAD instead of GET.
Code:
Set getPage = Server.CreateObject("MSXML2.ServerXMLHTTP")
getPage.Open "HEAD", URL, False

where getPage is my object and URL is the URL to check. I can also do

getPage.Open "HEAD", URL, False, strUsername, strPassword
which checks the URL using the username and password supplied. Everything works as expected but my problem lies in the fact that the password is in plaintext in the ASP page. It is not good security practice to have an ASP page contain a username/password combo in plaintext. This means that if someone had file access to the ASP page (intranet) then they could see the username/password pair which we want to try and avoid.
Does the HTTP standard only allow for username/passwords to be sent in plaintext?
I'm trying to integrate portions of a page on one domain into another domain. The goal is to use a rate calculator on the remote site to produce quotes on the main site. I have permission to do this. I'm using MSXML2.ServerXMLHTTP and it's working quite well except that I am unable to set a session cookie that seems to be required to generate a PDF on the remote site.
I've tried passing the session cookie through MSXML2.ServerXMLHTTP, but that doesn't work. I've tried setting a cookie using ASP, but it seems that PDF generation on the remote site requires the session cookie from the remote domain. I've tried setting a cookie with the domain property set to the remote site, but the cookie is never written. I think this is a security feature in the HTTP session state handling.
Is it possible to set a session cookie in MSXML2.ServerXMLHTTP with a remote host/domain? Can ASP be used to set a cookie with a domain other than the requesting domain? There seems to be a property to do this but I haven't been able to get it to work. Any other suggestions?
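Since ServerXMLHTTP does not keep a cookie jar between requests, the workaround that usually gets suggested is to capture the remote site's Set-Cookie header from the first response and replay it as a Cookie header on the next request, rather than trying to write a cookie for the other domain into the visitor's browser. A sketch with illustrative URLs (note that getResponseHeader may only return the first Set-Cookie value if the server sends several):

Code:
Dim http1, http2, remoteCookie
Set http1 = Server.CreateObject("MSXML2.ServerXMLHTTP")
http1.open "GET", "http://remote.example.com/calculator.asp", False   ' illustrative URL
http1.send
remoteCookie = http1.getResponseHeader("Set-Cookie")     ' e.g. "SESSIONID=abc123; path=/"

Set http2 = Server.CreateObject("MSXML2.ServerXMLHTTP")
http2.open "POST", "http://remote.example.com/makepdf.asp", False     ' illustrative URL
http2.setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
If Len(remoteCookie) > 0 Then
    http2.setRequestHeader "Cookie", Split(remoteCookie, ";")(0)      ' send name=value back
End If
http2.send "quote=123"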
On server A I've a web application WA1. Inside WA1 there are a virtual directory VD1 and an ASP page named page1.asp. Inside VD1 there is another ASP page named page2.asp.
page1.asp makes a ServerXMLHTTP request for page2.asp.
If the debugging flags on WA1 are enabled, the request seems to be blocked; if those flags are disabled, everything seems to be OK.
I am trying to use ServerXMLHTTP to post data containing Japanese characters, but the data posts as question marks, boxes, or just random ASCII characters. Here is the code I am using:
Consider the following simple function to get the contents of a remote URL:
Function GetURL(str_URL)
    Set obj_XMLHTTP = Server.CreateObject("MSXML2.ServerXMLHTTP.3.0")
    obj_XMLHTTP.Open "GET", str_URL, False, "", ""
    obj_XMLHTTP.Send
    GetURL = obj_XMLHTTP.ResponseText
    Set obj_XMLHTTP = Nothing
End Function
Is there any way that I can use a remote proxy server to make the request?
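If MSXML 4.0 is available, the setProxy / setProxyCredentials calls can point an individual request at a proxy from within the script (with the 3.0 version used above, the proxy instead has to be set machine-wide with proxycfg); a sketch with an illustrative proxy address:

Code:
Function GetURLViaProxy(str_URL)
    Dim obj_XMLHTTP
    Set obj_XMLHTTP = Server.CreateObject("MSXML2.ServerXMLHTTP.4.0")
    obj_XMLHTTP.setProxy 2, "myproxy.example.com:8080", ""    ' 2 = SXH_PROXY_SET_PROXY (illustrative host)
    'obj_XMLHTTP.setProxyCredentials "proxyuser", "proxypass" ' only if the proxy needs a logon
    obj_XMLHTTP.Open "GET", str_URL, False
    obj_XMLHTTP.Send
    GetURLViaProxy = obj_XMLHTTP.ResponseText
    Set obj_XMLHTTP = Nothing
End Function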
I am trying to use ServerXMLHTTP in an ASP page to return a binary file download to the browser. It works just fine with small files (under 1 MB) but seems to fail with large files (4 MB and 11 MB in tests). A success would be that the browser kicks off the "Save As" file dialog. The failures are not always the same. Sometimes the browser tries to download the ASP file itself. Sometimes the file seems to download successfully, but, for example, only 1.6 MB of a 4 MB file is actually downloaded, and it doesn't seem to be a simple truncation. Sometimes I get "internal server error 500". Below is my ASP code. "Project1.exe" is small enough to be successful. If you substitute "OmniViewProSetup.exe" (4 MB) it will fail.....
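For large files the pattern that tends to get recommended is to turn buffering off, copy responseBody into an ADODB.Stream, and push it to the client in chunks while checking the connection; a rough sketch, assuming httpReq already holds the completed ServerXMLHTTP response and strFileName is the download name (both placeholders):

Code:
Response.Buffer = False
Response.ContentType = "application/octet-stream"
Response.AddHeader "Content-Disposition", "attachment; filename=" & strFileName

Dim oStream
Set oStream = Server.CreateObject("ADODB.Stream")
oStream.Type = 1                      ' adTypeBinary
oStream.Open
oStream.Write httpReq.responseBody    ' copy the downloaded bytes into the stream
oStream.Position = 0

Do While Not oStream.EOS
    Response.BinaryWrite oStream.Read(262144)   ' send 256 KB at a time
    If Not Response.IsClientConnected Then Exit Do
Loop
oStream.Close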
I'm including the output of an .asp page hosted on another domain using ServerXMLHTTP. Is it possible to share the values of some variables from the external file?
The following shows what happens when I call it locally. If you look at the URL that has http://localhost, that is the one I use when I call the ASP page from the same local server (self-contained).
When I want to call the remote server (the one that fails), I use the other URL, the one with http://remote.
I hope this helps..
Because the web service is the same on both machines and I'm calling both machines the same way, what would cause the remote machine to return that the request format is not recognized?
Having just "discovered" it myself, I thought I'd draw everyone's attention to the fact that the WinHTTP 5.x object (which is used behind the scenes by our good friend the ServerXMLHTTP object) can be used directly in scripts.
This is useful because the ServerXMLHTTP object encapsulates XML-related functionality which is unnecessary for performing most simple HTTP requests, and thus by using the WinHTTP object directly you achieve higher performance, scalability, and reduced memory consumption... AND WinHTTP offers quite a few features that ServerXMLHTTP does not expose - including the ability to specify a proxy (and an exclusion list) from within the script (or to acquire them from Internet Explorer's settings), IPv6 support, and HTTPS/SSL support!
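A minimal sketch of using it directly from ASP; the ProgID is WinHttp.WinHttpRequest.5.1, and the proxy and URL values here are only illustrative:

Code:
Dim whr
Set whr = Server.CreateObject("WinHttp.WinHttpRequest.5.1")
whr.SetTimeouts 5000, 5000, 10000, 30000                 ' resolve, connect, send, receive (ms)
whr.SetProxy 2, "myproxy.example.com:8080", "<local>"    ' 2 = HTTPREQUEST_PROXYSETTING_PROXY (optional)
whr.Open "GET", "http://www.example.com/", False         ' illustrative URL
whr.Send
Response.Write whr.Status & " " & whr.StatusText
Response.Write whr.ResponseText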
That's better than an internal server error. This is what came back, which is confusing because when it's called locally it comes back fine. Am I missing something? ....
I am trying to access a web service to post some values via MSXML2.ServerXMLHTTP. The OS used is Win2003 on both machines.
When I access the local machine (A), which has the same service, the call works great. When I access the other machine (B), which is on the same subnet, I get a status of 500. Code:
Invalid ProgID. For additional information specific to this message please visit the Microsoft Online Support site located at: http://www.microsoft.com/contentredirect.asp. /CoxAxis/adminEditPage.asp, line 6
My code:
<%
dim self, pid, i, c
self = Request.ServerVariables("URL")
pid = Request.Querystring("pid")
set Session("pageContent") = Server.CreateObject("Scripting.Dictionary")
Set custObj = Server.CreateObject("NFIFunctions.ValidateField")    ' Line 6
set psi = Session("pageContent")
set errDict = Server.CreateObject("Scripting.Dictionary")
i = 1
Let me start by saying I'm fairly new to ASP coding. That said...
My ISP only uses AspSmartMail. I've created an online form that users fill out, which is then e-mailed to the collector of the information and CC'd to the person who submitted the information.
The error I'm receiving is this: aspSmartMail.SendMail : Error 28 error '8004001a' 504 Invalid Username or Password
In my script, I've dimensioned several items, as you'll see below, passing the authenticating username/password to the SMTP server, but it's not working. I tried not passing the information and instead entering the actual info directly, rather than using the dimensioned variables. That didn't work either. I have of course verified that the username/password I'm using is correct.
Can someone please tell me why I can't authenticate? I would really appreciate any help that might be out there.
Relevant ASP code below:
-----------------------------------------
Dim smtpserver, youremail, yourpassword, yourusername, remoteemail