I have a Python script that converts a PDF file. It is called via PHP in my Laravel application:
$command = escapeshellcmd("python /home/forge/default/pdf.py " . $id);
$output = shell_exec($command);
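For diagnostics, the only thing I capture today is the return value of shell_exec. One thing I am considering is redirecting stderr and logging the exit code, roughly like this (the 2>&1 redirect and the exec() call are just a sketch of what I might try, not something I have actually run yet):

$command = escapeshellcmd("python /home/forge/default/pdf.py " . $id);
// Sketch: merge stderr into stdout and also capture the exit code.
exec($command . " 2>&1", $outputLines, $exitCode);
\Log::info("pdf.py exited with code {$exitCode}", ['output' => $outputLines]);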
This works fine with any PDF up to about 250 MB, but fails with larger files, such as a 500 MB PDF.
If I call the Python script directly from the command line, it works fine and finishes after about 5 minutes. It only fails when invoked via shell_exec.
This happens in a job on the Laravel queue, which as far as I know does not go through HTTP / PHP-FPM but runs on the command line, so it should not have a timeout?
The Laravel queue worker runs with its timeout set to 0 (no timeout).
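For reference, the worker is started along these lines (the exact flags are approximate, but the timeout is definitely 0):

php artisan queue:work --timeout=0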
Is there anything else in the PHP CLI settings that could cause this to fail? Does anyone know where errors would be recorded? There is nothing in the failed_jobs table, nothing in laravel.log, and my Bugsnag integration did not catch anything.
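So far the only CLI settings I have thought to check are the usual suspects, along these lines (just standard ini_get lookups, nothing exotic):

php -r 'echo ini_get("max_execution_time"), PHP_EOL, ini_get("memory_limit"), PHP_EOL;'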
Since it runs fine from the command line, I assume this is not a Python problem but something related to calling it from PHP.
The server has 60 GB of RAM, and watching the process through htop, it never uses more than about 3% of the RAM. Could there be some other hard-coded limit somewhere?
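If a per-process limit is involved, I could also log what limits the PHP worker process actually sees, e.g. (again just an idea, not output I have collected yet):

$limits = shell_exec('ulimit -a 2>&1');
\Log::info('Worker process limits', ['ulimit' => $limits]);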
I am using Laravel 5.4 on an Ubuntu server, with Python 2.7.12.