I am trying to use ServerXMLHTTP in an ASP page to return a binary file download to the browser. It works just fine with small files (under 1 MB) but seems to fail with large files (4 MB and 11 MB in my tests). Success means the browser kicks off the "Save As" file dialog. The failures are not always the same: sometimes the browser tries to download the ASP file itself; sometimes the file seems to download successfully but, for example, only 1.6 MB of a 4 MB file actually arrives, and it doesn't seem to be a simple truncation; sometimes I get an internal server error 500. Below is my ASP code. "Project1.exe" is small enough to succeed; if you substitute "OmniViewProSetup.exe" (4 MB) it will fail.
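The original ASP code was not preserved above; a minimal sketch of the usual pattern for this job, with placeholder URLs and filenames. Turning buffering off and writing in fixed-size chunks avoids the ASP buffering limit (4 MB by default on IIS 6, which would match the failure point reported here):

Code:
<%
Response.Buffer = False
Response.ContentType = "application/octet-stream"
Response.AddHeader "Content-Disposition", "attachment; filename=OmniViewProSetup.exe"

Set oHttp = Server.CreateObject("MSXML2.ServerXMLHTTP")
oHttp.Open "GET", "http://fileserver/OmniViewProSetup.exe", False ' placeholder URL
oHttp.Send

' Push the bytes through an ADODB.Stream in 64 KB chunks
Set oStream = Server.CreateObject("ADODB.Stream")
oStream.Type = 1 ' adTypeBinary
oStream.Open
oStream.Write oHttp.responseBody
oStream.Position = 0
Do While Not oStream.EOS
    Response.BinaryWrite oStream.Read(65536)
    If Not Response.IsClientConnected Then Exit Do
Loop
oStream.Close
%>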
I am currently developing a download centre for a hardware manufacturer which has a lot of firmware and other software available to its clients. Most of these files are under 10 MB, and given their internet connection I'm thinking a browser upload through the admin tools will be fine. However, there are files on the current site of up to 100 MB; what would be the best way to handle uploading these?
I was thinking maybe an FTP component would handle bigger files better than a normal HTTP upload? Does anyone know of any components that allow some kind of background uploading, so the admins can continue using the tools for other tasks while the upload runs?
Or just any other suggestions, full stop! Maybe I would need to come up with some way of taking the files out of the system and launching a Windows FTP client with the correct directories etc. predefined?
My understanding is that we can only upload files of limited size (under 2 MB or so) through ASP scripts. I need to write an ASP script that can handle uploads of up to 100 MB. I have written one script (with a progress bar ;-)), as my host won't allow me any third-party upload components. Is there any way for ASP scripts to accept larger file uploads?
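For what it's worth, a minimal sketch of a pure-script receiver that reads the request body in chunks instead of in one huge read. On IIS 6 the AspMaxRequestEntityAllowed metabase setting must also be raised, and the saved file is still the raw multipart body, which needs boundary parsing afterwards:

Code:
<%
Const CHUNK_SIZE = 65536

Set oStream = Server.CreateObject("ADODB.Stream")
oStream.Type = 1 ' adTypeBinary
oStream.Open

' Read the posted body in 64 KB chunks rather than one giant BinaryRead
lngBytesLeft = Request.TotalBytes
Do While lngBytesLeft > 0
    lngRead = CHUNK_SIZE
    If lngBytesLeft < lngRead Then lngRead = lngBytesLeft
    oStream.Write Request.BinaryRead(lngRead)
    lngBytesLeft = lngBytesLeft - lngRead
Loop

' Saves the raw multipart/form-data body; boundary parsing still required
oStream.SaveToFile Server.MapPath("upload.raw"), 2 ' adSaveCreateOverWrite
oStream.Close
%>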
I'm using Microsoft's ASPFileUpload routines and all works fine as long as my files are smaller than about 200 KB. For anything larger than about 210 KB, I get the following error:
Error Type: Request object, ASP 0104 (0x80004005) Operation not Allowed
On the .Upload method. Has anyone else experienced this problem with ASPFileUpload?
I have code that uploads files in ASP. It seems to work; however, on files over 200 KB it bombs and I don't know why. Does anyone have any idea why this occurs and what I can do to fix it?
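Both of the ~200 KB ceilings above are consistent with IIS 6's default AspMaxRequestEntityAllowed of 204,800 bytes, which caps what Request can read. A hedged sketch of raising it through the IIS ADSI provider (run as an administrator; the 100 MB figure is arbitrary):

Code:
' Raise the ASP request-entity limit for the whole web service (IIS 6 metabase)
Set oWebService = GetObject("IIS://localhost/W3SVC")
oWebService.Put "AspMaxRequestEntityAllowed", 104857600 ' 100 MB, arbitrary
oWebService.SetInfo
WScript.Echo "AspMaxRequestEntityAllowed raised."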
I am working on an app that needs to stream binary files (such as PDF or Word docs) to the browser. I cannot just pass a URL pointing to the file directly. I've tried the following in an ASP file: Code:
where strData contains the binary content of the Word file in a string variable (for PDF, I would set the content type to "application/pdf"). However, the browser displays the raw data itself, rather than hosting the document in the appropriate browser applet.
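The snippet itself was not preserved above, but the usual missing pieces for this symptom are a Content-Disposition header and Response.BinaryWrite with a byte array (Response.Write with a string tends to mangle binary data). A minimal sketch, assuming the document can be obtained as a byte array; bytData is a placeholder name:

Code:
<%
Response.ContentType = "application/msword"
' "attachment" forces the Save As dialog; "inline" asks the browser to host the viewer
Response.AddHeader "Content-Disposition", "attachment; filename=document.doc"
Response.BinaryWrite bytData ' placeholder: the document as a byte array
Response.End
%>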
I currently have my database set up so that when a user uploads a file, it is written in binary into the database. I know the benefits of having a separate file server and storing the path, but unfortunately I cannot do it that way.
I know how to reference the file in, let's say, a hyperlink so that any user can click on the file to view it. But I would like to know how to include the file (mainly pictures) directly in the page as an <img src>. Any suggestions?
P.S. Once the information is in the database, these are the commands I use to output the file to a hyperlink. The page this code lives in is "file.asp?Id=", where the Id is taken from the URL and used to obtain the correct file. I left out all the connection/command strings intentionally, just to show the relevant information:
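The snippet itself was not preserved, but a minimal sketch of the kind of handler described (table and column names are hypothetical) also answers the <img> question, since any URL that returns image bytes with the right content type works as an image source.

Code:
<%
lngId = CLng(Request.QueryString("Id"))
' Hypothetical schema: Files(Id, ContentType, FileData); conn is the omitted connection
Set rs = conn.Execute("SELECT ContentType, FileData FROM Files WHERE Id = " & lngId)
Response.ContentType = rs("ContentType") ' e.g. "image/jpeg"
Response.BinaryWrite rs("FileData")
Response.End
%>

Embedding the picture in a page is then just <img src="file.asp?Id=123">.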
I am using the following code to generate an .xls file via the content type. When the user opens the file on his PC, it takes a long time to open if the number of records in the file is large. Does the use of HTML tags slow down the process of opening the file in Excel? Code:
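The snippet was not preserved; for context, a minimal sketch of the technique being described, with a placeholder recordset. Excel does have to parse all of the HTML on open, so for large record counts the markup genuinely adds work compared with a leaner format such as tab-separated text:

Code:
<%
Response.ContentType = "application/vnd.ms-excel"
Response.AddHeader "Content-Disposition", "attachment; filename=report.xls"
Response.Write "<table>"
' rs is a placeholder recordset; every row becomes a spreadsheet row
Do While Not rs.EOF
    Response.Write "<tr><td>" & rs("Field1") & "</td><td>" & rs("Field2") & "</td></tr>"
    rs.MoveNext
Loop
Response.Write "</table>"
%>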
I have an application which calls 2 database stored procedures in SQL Server. The first stored procedure takes about 20 minutes to execute; the second runs in a split second. The application works fine on my machine. However, when I deploy it to another server (Windows Server 2003) and run the application, a few minutes after the first stored procedure starts, the browser redirects to a "Page not found" error page in Internet Explorer. According to SQL Profiler, the stored procedure continues to run to completion, and the next stored procedure also runs.
I looked into the HTTPERR logs. The error I am getting is "1 connection_dropped defaultapplicationpool". Does anyone know what the heck is going on and what configuration changes I need to make to the server to get around this problem?
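One hedged place to start looking: a 20-minute synchronous call can outlive both the ASP script timeout (90 seconds by default) and the ADO command timeout (30 seconds by default), either of which drops the request while SQL Server carries on running, which would match the Profiler observation. A sketch of raising both; the procedure name and connection are placeholders:

Code:
<%
Server.ScriptTimeout = 1800 ' seconds; ASP default is 90

Set cmd = Server.CreateObject("ADODB.Command")
cmd.ActiveConnection = conn ' placeholder connection
cmd.CommandText = "usp_LongRunningProc" ' placeholder procedure name
cmd.CommandType = 4 ' adCmdStoredProc
cmd.CommandTimeout = 1800 ' seconds; ADO default is 30
cmd.Execute
%>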
I have an ASP page hosted on a Windows 2003 server. This page was working fine until Saturday. Now the server doesn't serve it and gives a 404 Page Not Found error. The file DEFINITELY exists in the correct folder (it worked fine up until now). I have tested the page on our in-house server, which is a Windows 2000 server, and the page works fine. I have exported the necessary database data and tables to the 2000 server and again it works fine, so I don't think it is anything in the database causing it. We serve many other sites off this server, and ASP works fine on all of them, and throughout the rest of this particular site that is causing the problems.
I am using MSXML 4.0's ServerXMLHTTP to communicate with a remote server. I use the POST method to post an XML string, but after calling the send method I get "HTTP 1.1 405 Method Not Allowed" in the response status text.
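For reference, a minimal sketch of a ServerXMLHTTP POST with a placeholder URL. A 405 usually means the target URL itself does not accept POST (a static file or a directory, for example) rather than anything in the calling code, so it is worth checking what that URL actually serves:

Code:
Set oHttp = Server.CreateObject("MSXML2.ServerXMLHTTP.4.0")
oHttp.open "POST", "http://remotehost/receive.asp", False ' placeholder URL
oHttp.setRequestHeader "Content-Type", "text/xml"
oHttp.send strXml ' the XML string being posted
Response.Write oHttp.status & " " & oHttp.statusText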
When strLocalComputer contains the local computer name, we get the requested page. But if we use the IP address of the local computer, we get this error:
msxml4.dll (0x80072EE7) The server name or address could not be resolved.
We manage to load the URL with the IP address from a browser on the server, but not from a browser on the local computer. Any ideas?
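Worth remembering that ServerXMLHTTP resolves names through the WinHTTP proxy settings (configured with proxycfg), not the browser's WinINET settings, so an address that works in a browser can still fail server-side. One hedged thing to try is forcing a direct connection for the request:

Code:
Const SXH_PROXY_SET_DIRECT = 1
Set oHttp = Server.CreateObject("MSXML2.ServerXMLHTTP.4.0")
oHttp.setProxy SXH_PROXY_SET_DIRECT, "", "" ' bypass any WinHTTP proxy for this request
oHttp.open "GET", "http://192.168.0.10/page.asp", False ' placeholder IP address
oHttp.send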
I've two Windows 2000 Advanced Servers (ServerA and ServerB), both part of an Active Directory domain. The Active Directory server is named ServerC.
On ServerA I've published an asp page PageA.asp inside a virtual directory VirtuaA. On ServerB I've published an asp page PageB.asp inside a virtual directory VirtuaB.
On both servers I've installed Microsoft XML Parser 4.0 SP2.
Both virtual directories are set to use Windows Integrated Authentication.
PageA.asp has to get PageB.asp using ServerXmlHttp object and to show its contents.
When I try to get PageA.asp from any client in the same Active Directory, I get an error: I'm not authorized to get PageB.asp.
I've tried every solution found in previous posts:
- proxycfg -d -p " " "*" on ServerA
- the "trust for delegation" flag in the Active Directory console on ServerC
- ASP instruction .setProxy 2, " ", "*" inside PageA.asp
I've been experimenting with both ServerObjects' AspHTTP and Microsoft's ServerXMLHTTP to access IMail. They both work fine, except for one weird thing.
Logging in to IMail via login.cgi takes ServerXMLHTTP about 30 seconds; it takes AspHTTP about 1 second. Accessing any other IMail page, the two are virtually identical, but the login request is unacceptably slow with ServerXMLHTTP.
I have been reading these posts and I thought I may be able to use serverXMLhttp for my current problem.
My company requires all of its divisions to use the same template for intranet sites. Corporate IT changes this template often. I can use ServerXMLHTTP to read their site and display everything. My problem is the .js and .css include files they use: how can I get this info into the correct format in my page, and change their relative image links so that I can use their images?
I will have a few minor edits for links and one image change; I assume that can be done through Replace().
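Replace() should indeed cover it. A minimal sketch of fetching the template and rewriting root-relative references to absolute ones; the host name and the GetURL helper (a small ServerXMLHTTP wrapper returning responseText) are assumptions:

Code:
strHost = "http://intranet.corp.example.com" ' hypothetical corporate host
strHtml = GetURL(strHost & "/template.asp") ' assumed ServerXMLHTTP wrapper

' Point root-relative src/href references back at the corporate server
strHtml = Replace(strHtml, "src=""/", "src=""" & strHost & "/")
strHtml = Replace(strHtml, "href=""/", "href=""" & strHost & "/")

Response.Write strHtml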
I want to use HEAD instead of GET in order to save bandwidth. I don't want to retrieve the full HTML of a page, just the HTTP header, to know that a file exists:
HTTP/1.0 200 OK
or that it is not found:
HTTP/1.0 404 Not Found
This is for a link checker that I am writing. Does anyone know what I am doing wrong, or does this object not recognise the HEAD method?
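For comparison, a minimal sketch of a HEAD request with this object; the verb is passed as an ordinary method string, and the URL here is a placeholder:

Code:
Set oHttp = Server.CreateObject("MSXML2.ServerXMLHTTP")
oHttp.open "HEAD", "http://example.com/file.zip", False ' placeholder URL
oHttp.send
If oHttp.status = 200 Then
    Response.Write "File exists"
ElseIf oHttp.status = 404 Then
    Response.Write "Not found"
End If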
I'm trying to use a ServerXMLHTTP connection in async mode to request Page2.asp from Page1.asp. The two pages are in two different virtual directories with Windows Integrated Authentication, but the user who requests Page2.asp is different from the user who requested Page1.asp: the user requesting Page2.asp is IWAM_machine_name. If I use ServerXMLHTTP in standard (synchronous) mode there is no problem.
This code works with MSXML 4, but I found out the server it will ultimately sit on only has version 3, and they won't update. So this code fails on the line marked below, and I can't figure out why.
Code:
Set httpReq = Server.CreateObject("MSXML2.ServerXMLHTTP")
Set myXmlDoc = Server.CreateObject("MSXML2.DOMDocument")
httpReq.Open "post", webServiceUrl, False
httpReq.setRequestHeader "Content-Type", "text/xml"
httpReq.setRequestHeader "SOAPAction", "soapserver/soap:CreatePreview#CreateGraphics"
httpReq.Send soapEnv ' fails here
returnSoapEnv = httpReq.responseText
I'm trying to use ServerXMLHTTP on an ASP (not ASP.NET) page to retrieve large binary data from a remote server. When the request is large (more than a few megabytes), the page using ServerXMLHTTP jumps to nearly 100% CPU utilization for an unusually long time. The remote server needs a few seconds to prepare the response, during which time the CPU seems OK. It seems that as soon as the data is ready to retrieve, the CPU usage jumps and stays there until the data has all been copied to the requesting server. That takes far too long: about 35 seconds for a 12 MB file over gigabit Ethernet.
I use ServerXMLHTTP hundreds of thousands of times daily on this same system on the same network, with absolutely no problem - but for smaller requests. There's something about the size of the request that makes it blow up.
I saw some reports of older systems with this problem (Windows 2000), but I'm running IIS 6 on Windows Server 2003, SP1.
I have been using the following code to access a remote URL, which works fine. But if the remote "geturl" does a redirect (as the page in this code does), I have no idea what the redirected URL is. The page still displays (including the HTML source code), but I cannot determine what the base href should be; that is, I don't know the URL of the page being displayed, since it is a redirected page.
I've used MSXML2.ServerXMLHTTP in ASP to write a link checker. It checks a list of URLs stored in a db and flags any websites/webpages that are down. Since I am only interested in whether the page is up, and not in the actual HTML, I use HEAD instead of GET.
Code:
Set getPage = Server.CreateObject("MSXML2.ServerXMLHTTP")
getPage.Open "HEAD", URL, False

where getPage is my object and URL is the URL to check. I can also do
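The snippet itself was not preserved; from the description that follows, it is presumably the five-argument form of Open:

Code:
getPage.Open "HEAD", URL, False, strUsername, strPassword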
which checks the URL using the username and password supplied. Everything works as expected, but my problem lies in the fact that the password is in plaintext in the ASP page. It is not good security practice for an ASP page to contain a username/password combo in plaintext: if someone had file access to the ASP page (this is an intranet), they could see the username/password pair, which we want to avoid.
Does the HTTP standard only allow for username/passwords to be sent in plaintext?
I'm trying to integrate portions of a page on one domain into another domain. The goal is to use a rate calculator on the remote site to produce quotes on the main site. I have permission to do this. I'm using MSXML2.ServerXMLHTTP and it's working quite well except that I am unable to set a session cookie that seems to be required to generate a PDF on the remote site.
I've tried passing the session cookie through MSXML2.ServerXMLHTTP, but that doesn't work. I've tried setting a cookie using ASP, but it seems that PDF generation on the remote site requires the session cookie from the remote domain. I've tried setting a cookie with the domain property set to the remote site, but the cookie is never written; I think this is a security feature of HTTP session state.
Is it possible to set a session cookie in MSXML2.ServerXMLHTTP with a remote host/domain? Can ASP be used to set a cookie with a domain other than the requesting domain? There seems to be a property to do this but I haven't been able to get it to work. Any other suggestions?
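One hedged approach that sometimes covers this: capture the Set-Cookie header from the first response and echo it back by hand on the follow-up request, so the remote session is reused without any browser cookie being involved. A sketch with placeholder URLs; note that getResponseHeader returns the raw header value, which may need its attributes (path, expires) trimmed off:

Code:
Set oHttp = Server.CreateObject("MSXML2.ServerXMLHTTP")
oHttp.open "GET", "http://remote.example.com/calculator.asp", False ' placeholder URL
oHttp.send
strCookie = oHttp.getResponseHeader("Set-Cookie") ' the remote session cookie

Set oHttp2 = Server.CreateObject("MSXML2.ServerXMLHTTP")
oHttp2.open "GET", "http://remote.example.com/quote.pdf", False ' placeholder URL
oHttp2.setRequestHeader "Cookie", strCookie ' replay the session cookie manually
oHttp2.send
Response.ContentType = "application/pdf"
Response.BinaryWrite oHttp2.responseBody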
On server A I have a web application WA1. Inside WA1 there is a virtual directory VD1 and an ASP page named page1.asp. Inside VD1 there is another ASP page named page2.asp.
page1.asp makes a ServerXMLHTTP request for page2.asp.
If the debugging flags on WA1 are enabled, the request seems to be blocked; if they are disabled, everything seems to be fine.
I am trying to use ServerXMLHTTP to post data containing Japanese characters, but the data posts as question marks, boxes or just random ASCII characters. Here is the code I am using:
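The code itself was not preserved; for context, a sketch of the usual fix, which is to encode the string to UTF-8 bytes explicitly (via ADODB.Stream) and declare the charset, rather than sending the raw VBScript string. The URL and variable names are placeholders:

Code:
' Encode the Unicode string as UTF-8 bytes before sending
Set oStream = Server.CreateObject("ADODB.Stream")
oStream.Type = 2 ' adTypeText
oStream.Charset = "UTF-8"
oStream.Open
oStream.WriteText strData ' placeholder: the string containing Japanese text
oStream.Position = 0
oStream.Type = 1 ' switch to adTypeBinary to read the raw bytes
oStream.Position = 3 ' skip the 3-byte UTF-8 BOM the stream writes
bytUtf8 = oStream.Read
oStream.Close

Set oHttp = Server.CreateObject("MSXML2.ServerXMLHTTP")
oHttp.open "POST", "http://remotehost/receive.asp", False ' placeholder URL
oHttp.setRequestHeader "Content-Type", "text/xml; charset=UTF-8"
oHttp.send bytUtf8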
Consider the following simple function to get the contents of a remote URL:
Code:
Function GetURL(str_URL)
    Set obj_XMLHTTP = Server.CreateObject("MSXML2.ServerXMLHTTP.3.0")
    obj_XMLHTTP.Open "GET", str_URL, False, "", ""
    obj_XMLHTTP.Send
    GetURL = obj_XMLHTTP.ResponseText
    Set obj_XMLHTTP = Nothing
End Function
Is there any way that I can use a remote proxy server to make the request?
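MSXML 4.0's ServerXMLHTTP exposes a setProxy method for exactly this; it is not part of the 3.0 interface used above, where the machine-wide proxycfg settings apply instead. A sketch with a hypothetical proxy address:

Code:
Const SXH_PROXY_SET_PROXY = 2
Set obj_XMLHTTP = Server.CreateObject("MSXML2.ServerXMLHTTP.4.0")
obj_XMLHTTP.setProxy SXH_PROXY_SET_PROXY, "myproxy.example.com:8080", "" ' hypothetical proxy
obj_XMLHTTP.Open "GET", str_URL, False
obj_XMLHTTP.Send
strResult = obj_XMLHTTP.ResponseText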