Response.flushBuffer() not working

I am trying to implement a servlet for streaming large objects:

oracle.sql.BLOB blob = rs.getBLOB("obj");
InputStream in = blob.getBinaryStream();
int bufferSize = 1024;
byte[] buffer = new byte[bufferSize];
ServletOutputStream out = response.getOutputStream();
int counter = 0;
int length;
while ((length = in.read(buffer)) != -1) {
    out.write(buffer, 0, length);
    counter++;
    if (counter % 10 == 0) {
        counter = 0;
        response.flushBuffer();
    }
}

This code is supposed to send the data to the client in chunks. However, when a large object (around 100 MB) is streamed, memory usage climbs, and the server sometimes dies when there are several parallel downloads/threads.

Why doesn't flushBuffer() send the data to the client? The browser only shows the open/save dialog for the file after the response has been closed.

+6
2 answers

Before writing any data you must set the Content-Length header; otherwise the server is forced to buffer all the data until the stream is closed, at which point it can calculate the value itself, write the header, and then send everything at once. So once you have obtained the output stream, set the content length before writing to it:

 response.setHeader("Content-Length", String.valueOf(blob.length())); 

Most servers are smart enough to flush the buffer themselves at this point, so you probably don't even need to call flushBuffer(), although it doesn't hurt.
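
If it helps, here is a minimal sketch of how the whole download method could look with the header set before the first write. The JDBC setup, the openResultSet() helper, and the "obj" column name are placeholders carried over from your question, and it uses the standard java.sql.Blob API rather than the Oracle-specific class:

import java.io.IOException;
import java.io.InputStream;
import java.sql.Blob;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.servlet.ServletException;
import javax.servlet.ServletOutputStream;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class BlobDownloadServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        try {
            // Placeholder for the JDBC code from the question that produces the ResultSet.
            ResultSet rs = openResultSet(request);
            Blob blob = rs.getBlob("obj");

            // Set the length (and a content type) before the first write, so the
            // container does not have to buffer the whole response to compute it.
            response.setHeader("Content-Length", String.valueOf(blob.length()));
            response.setContentType("application/octet-stream");

            ServletOutputStream out = response.getOutputStream();
            try (InputStream in = blob.getBinaryStream()) {
                byte[] buffer = new byte[1024];
                int length;
                while ((length = in.read(buffer)) != -1) {
                    out.write(buffer, 0, length);
                }
            }
            out.flush();
        } catch (SQLException e) {
            throw new ServletException(e);
        }
    }

    // Hypothetical helper standing in for the JDBC plumbing not shown in the question.
    private ResultSet openResultSet(HttpServletRequest request) throws SQLException {
        throw new UnsupportedOperationException("obtain the ResultSet as in your own code");
    }
}

With the length known up front, the container can start sending bytes as soon as its buffer fills, and the explicit flushBuffer() calls become unnecessary.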

+3

First of all, you need to set the Content-Length header on the servlet response so that the container knows after how many bytes the response ends:

 response.setHeader("Content-Length", String.valueOf(blob.length())); 
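
A small aside, assuming your container supports Servlet 3.1 or newer: ServletResponse also offers setContentLengthLong(), which takes the long returned by Blob.length() directly and avoids the String conversion (setContentLength(int) would overflow for objects larger than 2 GB):

 // Servlet 3.1+ equivalent of the setHeader() call above
 response.setContentLengthLong(blob.length());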
+1

Source: https://habr.com/ru/post/915504/

