I am trying to implement a file download feature in an ASP.NET application. The application will be used by roughly 200 users at the same time to download various files. It will be hosted on IIS 7. I do not want the application server to crash when many requests arrive simultaneously.
My assumption is that calling Context.Response.Flush() inside the loop pushes the file data read so far out to the client, so the application's memory usage stays flat. What other optimizations can be made to the current code, or what other approach would be better suited to this scenario?
Requests will be for different files, and file sizes can be from 100 KB to 10 MB.
My current code is as follows:
FileStream inStr = null;
byte[] buffer = new byte[1024];
String fileName = @"C:\DwnldTest\test.doc";
long byteCount;

inStr = File.OpenRead(fileName);
Response.AddHeader("content-disposition", "attachment;filename=test.doc");

while ((byteCount = inStr.Read(buffer, 0, buffer.Length)) > 0)
{
    if (Context.Response.IsClientConnected)
    {
        Context.Response.ContentType = "application/msword";
        // send the chunk just read and flush it to the client
        Context.Response.OutputStream.Write(buffer, 0, (int)byteCount);
        Context.Response.Flush();
    }
}
inStr.Close();
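For comparison, I have also seen Response.TransmitFile suggested, where IIS streams the file itself instead of my code reading it into a buffer in a loop. Below is a minimal sketch of how I imagine using it; the path, file name, and content type are just the sample values from my code above, and I am not sure whether this actually holds up better with around 200 concurrent downloads.

// Sketch only, not my current code: TransmitFile hands the file to IIS,
// which streams it to the client without buffering it in the worker
// process's managed memory.
string fileName = @"C:\DwnldTest\test.doc";   // same sample path as above
Response.ContentType = "application/msword";  // headers must be set before any output
Response.AddHeader("content-disposition", "attachment;filename=test.doc");
Response.TransmitFile(fileName);              // IIS streams the file; no manual read/Flush loop
Response.End();                               // stop further page processing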