I have a web application that opens a TCP connection to an external server, reads a binary document over a custom protocol, and writes that document to its HTTP response object. In other words, it fetches the file from the external server and relays it to the client over HTTP.
The server sends a status code and a MIME type, both of which I read successfully, then writes the contents of the file and closes the socket. That side appears to work correctly.
The client (a C# web application) reads the data like this:
private NetworkStream stream_;

public void WriteDocument(HttpResponse response)
{
    while (stream_.DataAvailable)
    {
        const int bufsize = 4 * 1024;
        byte[] buffer = new byte[bufsize];
        int nbytes = stream_.Read(buffer, 0, bufsize);
        if (nbytes > 0)
        {
            // Trim the buffer on a short read so we don't write garbage bytes.
            if (nbytes < bufsize)
                Array.Resize<byte>(ref buffer, nbytes);
            response.BinaryWrite(buffer);
        }
    }
    response.End();
}
It always seems to exit the read loop before all of the data has been received. What am I doing wrong?