Hoping someone out there has experience with allowing large file downloads from their site(s)? One of the pages of my site is going to allow a download of a 'demo' app install. I have allowed many small file downloads using FileStream objects with no issues, but never a file this size. The install file is ~110 MB, and when I tested using a FileStream to download the file from the server, the download times out around 60 MB every time. So I switched the code to FTP (see the main part of the code below), only now I have two issues:

- How do I control where on the user's machine the file downloads?
- How can I report progress, since the download will take several minutes (or more)?

```vbnet
myFtpWebRequest = WebRequest.Create(strFullPathAndFileName)
myFtpWebRequest.Credentials = New NetworkCredential("<userid>", "<password>")
myFtpWebRequest.Method = WebRequestMethods.Ftp.DownloadFile
myFtpWebRequest.UseBinary = True

myFtpWebResponse = myFtpWebRequest.GetResponse()

myStreamWriter = New StreamWriter(Server.MapPath(sFileName))
myStreamWriter.Write(New StreamReader(myFtpWebResponse.GetResponseStream()).ReadToEnd())
myStreamWriter.Close()
myFtpWebResponse.Close()
```
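For what it's worth, the original FileStream timeout is often caused by buffering the whole file in server memory before sending it. A common approach (sketched below under assumptions: an ASP.NET Web Forms page, a hypothetical file path, and a 64 KB chunk size) is to stream the file to the client in chunks with buffering off. Sending a Content-Length header also lets the browser show its own progress bar, which addresses the progress question without any server-side reporting:

```vbnet
' Sketch only: chunked download from an ASP.NET page.
' The path "~/downloads/DemoSetup.exe" and the chunk size are assumptions.
Protected Sub SendDemoFile()
    Dim filePath As String = Server.MapPath("~/downloads/DemoSetup.exe") ' assumed location
    Dim info As New System.IO.FileInfo(filePath)

    Response.Clear()
    Response.Buffer = False ' don't hold the whole 110 MB in memory
    Response.ContentType = "application/octet-stream"
    ' Content-Disposition: attachment triggers the browser's Save As dialog
    Response.AddHeader("Content-Disposition", _
        "attachment; filename=""" & info.Name & """")
    ' Content-Length lets the browser display download progress
    Response.AddHeader("Content-Length", info.Length.ToString())

    Const chunkSize As Integer = 65536 ' 64 KB per write
    Using fs As New System.IO.FileStream(filePath, _
            System.IO.FileMode.Open, System.IO.FileAccess.Read)
        Dim buffer(chunkSize - 1) As Byte
        Dim bytesRead As Integer = fs.Read(buffer, 0, chunkSize)
        While bytesRead > 0 AndAlso Response.IsClientConnected
            Response.OutputStream.Write(buffer, 0, bytesRead)
            Response.Flush() ' push each chunk out so nothing accumulates
            bytesRead = fs.Read(buffer, 0, chunkSize)
        End While
    End Using
End Sub
```

Checking `Response.IsClientConnected` each pass stops the loop cleanly if the user cancels mid-download.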
You shouldn't make that decision for the user. Think about how much that would suck from a user's perspective...hunting for downloaded files wherever the provider decided to drop them. Not good.
I agree 100%. Ideally I would like it to pop up a Save As... window, as it does when using a FileStream. But with my code, nothing appears; it just saves the file in the same place I am running the code from locally. I can easily prompt for the location; my issue is telling the StreamWriter object where to send the file.
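One possible explanation, offered as a sketch rather than a definitive fix: the StreamWriter in the posted code writes to a path on the *server* (that's what `Server.MapPath` resolves), so the file never reaches the user's machine at all. You can't tell a server-side StreamWriter where to put a file on the client; instead, the usual pattern is to relay the FTP response stream to the browser with a `Content-Disposition: attachment` header, which is what makes the Save As dialog appear. Note also that wrapping a binary install file in StreamReader/StreamWriter (text classes) will corrupt it, so the relay should copy raw bytes. The server name, file name, and credentials below are placeholders:

```vbnet
' Sketch: relay the FTP download to the browser as raw bytes.
' The browser's own Save As dialog then lets the user pick the location.
Dim ftpRequest As System.Net.FtpWebRequest = _
    CType(System.Net.WebRequest.Create("ftp://ftp.example.com/DemoSetup.exe"), _
          System.Net.FtpWebRequest) ' placeholder URI
ftpRequest.Credentials = New System.Net.NetworkCredential("<userid>", "<password>")
ftpRequest.Method = System.Net.WebRequestMethods.Ftp.DownloadFile
ftpRequest.UseBinary = True

Response.Clear()
Response.Buffer = False
Response.ContentType = "application/octet-stream"
Response.AddHeader("Content-Disposition", "attachment; filename=""DemoSetup.exe""")

Using ftpResponse As System.Net.FtpWebResponse = _
        CType(ftpRequest.GetResponse(), System.Net.FtpWebResponse)
    Using ftpStream As System.IO.Stream = ftpResponse.GetResponseStream()
        Dim buffer(65535) As Byte ' copy in 64 KB chunks, bytes not text
        Dim bytesRead As Integer = ftpStream.Read(buffer, 0, buffer.Length)
        While bytesRead > 0 AndAlso Response.IsClientConnected
            Response.OutputStream.Write(buffer, 0, bytesRead)
            Response.Flush()
            bytesRead = ftpStream.Read(buffer, 0, buffer.Length)
        End While
    End Using
End Using
```

With this shape the server is just a pass-through; the user, not your code, decides where the file lands.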