I just came across this exact scenario in some code I was asked to work on. The programmer created a temporary file, wrote the input stream to it, deleted the temporary file, and then called renderBinary. It seemed to work just fine even for very large files, into the gigabytes.
I was surprised by this and am still looking for documentation that explains why it works.
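My best guess at why it works: on POSIX systems, deleting a file only removes its directory entry; the underlying data stays alive as long as some process holds an open file descriptor to it, so a stream opened before the delete can still read everything. Here is a minimal Java sketch of that behavior (class and method names are mine, not from the original code; note this would fail on Windows, where deleting an open file is refused):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class DeletedFileRead {

    // Write data to a temp file, open a stream on it, delete the file,
    // then read the data back through the still-open stream.
    static String readAfterDelete(String data) throws IOException {
        File tmp = File.createTempFile("upload-", ".bin");
        try (FileOutputStream out = new FileOutputStream(tmp)) {
            out.write(data.getBytes("UTF-8"));
        }

        InputStream in = new FileInputStream(tmp); // descriptor now open
        boolean deleted = tmp.delete();            // removes the directory entry only

        byte[] buf = in.readAllBytes();            // data still reachable via the fd
        in.close();

        if (!deleted) {
            throw new IOException("delete failed (expected on Windows)");
        }
        return new String(buf, "UTF-8");
    }

    public static void main(String[] args) throws IOException {
        System.out.println(readAfterDelete("hello"));
    }
}
```

This presumably is what renderBinary relies on when handed a stream over a deleted file: the servlet container keeps reading from the open descriptor until the response finishes.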
UPDATE: We finally came across a file that made this approach bomb; I think it was over 3 GB. At that point it became necessary NOT to delete the file while the render was still in progress. I ended up using the Amazon SQS queue service: a message is queued for each file, and a scheduled job later retrieves the messages and deletes the files. This works well, even with clustered servers behind a load balancer.