Upload very large files (> 5 GB)

I need your help. I want to create an upload script using HTML, jQuery and PHP. Is it possible to write a script that can upload very large files (> 5 GB)?

I tried it with FileReader, FormData and Blobs, but even with them I can’t upload large files (my browser crashes after selecting a large file).

PS: I want to write it myself, so please don't just post ready-made scripts.


+4
5 answers

Yes. More than a year ago I wrote PHP code that uploads files of exactly 5 GB.

FileReader, FormData and Blobs will fail because they all require the file to be read and processed in JavaScript before the upload starts.

But you can easily upload a large file with a plain XMLHttpRequest.

var xhr = new XMLHttpRequest();
xhr.open("POST", "upload.php");                     // "upload.php" is just a placeholder for your server-side handler
xhr.send(document.forms[0]['fileinput'].files[0]);  // send the selected File object directly

This is not supported by older browsers, but Chrome and Firefox handle it. Note, however, that it sends the contents of the file as-is, not as multipart/form-data and not as a regular HTML form post, so you will need to set your own HTTP headers to pass along extra information such as the file name.

var xhr = new XMLHttpRequest(),
    fileInput = document.forms[0]['fileinput'],
    file = fileInput.files[0];                       // the File object the user selected
xhr.open("POST", "upload.php");                      // placeholder URL, same as above
xhr.setRequestHeader("X-File-Name", encodeURIComponent(file.name));
xhr.setRequestHeader("X-File-Size", String(file.size));
xhr.send(file);

PS: well, actually it was not pure PHP; it was a mix of PHP and a Java Servlet.

+4

Look into "chunking", possibly with a plugin like AX Ajax multi uploader, which should help with file size limits on both the client side and the server side.
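
To make the chunking idea concrete, here is a minimal client-side sketch. The endpoint name "upload_chunk.php", the custom headers and the 5 MB chunk size are assumptions made up for this example, not part of the plugin mentioned above. The file is cut into pieces with Blob.slice(), and each piece is posted only after the previous one has finished, so the browser never has to hold the whole file in memory.

// Minimal chunked-upload sketch; the endpoint and headers are illustrative assumptions
function uploadInChunks(file, chunkSize) {
    chunkSize = chunkSize || 5 * 1024 * 1024;                // 5 MB per chunk (arbitrary example value)
    var offset = 0;

    function sendNextChunk() {
        if (offset >= file.size) { return; }                 // all chunks sent
        var chunk = file.slice(offset, offset + chunkSize);  // Blob.slice does not read the data into memory
        var xhr = new XMLHttpRequest();
        xhr.open("POST", "upload_chunk.php");                // hypothetical server-side handler
        xhr.setRequestHeader("X-File-Name", encodeURIComponent(file.name));
        xhr.setRequestHeader("X-Chunk-Offset", String(offset)); // tells the server where to append this piece
        xhr.onload = function () {
            offset += chunkSize;
            sendNextChunk();                                 // continue with the next piece
        };
        xhr.send(chunk);
    }

    sendNextChunk();
}

// usage: uploadInChunks(document.forms[0]['fileinput'].files[0]);

On the server side the handler would simply append each piece to the target file at the given offset; a ready-made uploader plugin does this bookkeeping (and the error and retry handling) for you.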

+1

Keep in mind that it is important to set the php.ini directive max_execution_time = xxxx (the maximum execution time of each script, in seconds) so that your script does not time out, because uploading large files takes, as you know, a long time. Also check max_input_time = xxxx, the maximum amount of time each script may spend parsing request data. It is a good idea to keep this limit low on production servers to cut off unexpectedly long-running scripts, but in your case you may need to increase it. Consider also changing the variables memory_limit, upload_max_filesize and post_max_size, as shown below.
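
For illustration only, the relevant php.ini lines could look something like this; the concrete values are assumptions that depend entirely on your server and on the file sizes you expect, not recommendations from this thread.

max_execution_time = 3600     ; example: allow up to an hour of script run time
max_input_time = 3600         ; example: allow up to an hour for reading the request data
memory_limit = 256M           ; example value
upload_max_filesize = 6G      ; should be at least as large as the biggest file you expect
post_max_size = 6G            ; should be at least upload_max_filesize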

+1

The problem is that this is not very practical. First of all, you have to restart the upload from the beginning if anything goes wrong in the browser, and that is quite likely to happen while transferring a very large file.

Here is another solution with Ajax:

php upload large files
AX-jQuery Uploader

+1

I don't think web uploads were ever designed with files of 5 GB and more in mind, nor that the browser will happily push that much data around. File system limitations can also be a problem. You should reconsider or rethink how the files are transferred depending on the usage scenario: does it really have to go through a web page? FTP, streaming or a remote dump is probably a better solution, one that will not tie up your web server and web page while the transfer is running. HTTP is not the best protocol for this.

Remember that the browser, PHP and Apache all have limited memory. My antivirus alerts me when Chrome uses more than 250 MB for one page (which is not considered normal). PHP has a 128 MB memory limit by default, and imagine 100 concurrent Apache users each uploading a 5 GB file. That's why FTP was invented.

Why do you think these limits exist in PHP, Apache and so on? Because huge requests are an easy attack vector: a security problem and a way to tie up the server that anyone can exploit.

+1

Source: https://habr.com/ru/post/1442745/

