Why does a long-running Python script called from PHP fail?

I have a Python script that converts a PDF file. It is called via PHP in my Laravel application:

$command = escapeshellcmd("python /home/forge/default/pdf.py " . $id);
$output = shell_exec($command);
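One way to see why the call fails silently (not part of the original post) is to capture the exit code and stderr of the Python process instead of only its stdout. A minimal diagnostic sketch, assuming the same script path and that $id comes from the job; the logging call and the 2>&1 redirect are additions for illustration:

$command = "python /home/forge/default/pdf.py " . escapeshellarg($id) . " 2>&1";

$outputLines = [];
$exitCode = null;
exec($command, $outputLines, $exitCode);

if ($exitCode !== 0) {
    // A non-zero exit code means the script crashed or was killed
    // (e.g. exit code 137 often indicates SIGKILL, such as from the OOM killer).
    \Log::error('pdf.py failed', ['exit' => $exitCode, 'output' => $outputLines]);
}

With shell_exec alone, a killed or crashing child process just returns an empty or partial string, which would explain why nothing shows up in the logs.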

This works fine with any PDF up to about 250 MB, but fails with larger files, such as a 500 MB PDF.

If I call the Python script directly from the command line, it works fine and exits after about 5 minutes. It only fails when called via shell_exec.

This happens in a Laravel queue job, which as far as I know does not go through HTTP / PHP-FPM but runs on the command line, so it shouldn't be subject to a timeout?

The Laravel queue worker runs with its timeout set to 0 (no timeout).
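For reference (not from the original post), the queue driver's retry_after setting also matters for long jobs: if a job runs longer than retry_after, another worker can pick it up again while the first attempt is still running. A sketch of the relevant part of config/queue.php, with an illustrative value; whether this is related to the failure here is an assumption:

// config/queue.php (excerpt) -- illustrative values only
'connections' => [
    'database' => [
        'driver' => 'database',
        'table' => 'jobs',
        'queue' => 'default',
        // Should be longer than the slowest job (here roughly 5+ minutes
        // for a 500 MB PDF), otherwise the job may be released and retried
        // while the Python process is still running.
        'retry_after' => 900,
    ],
],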

Is there anything else in the PHP CLI settings that could cause this to fail? Does anyone know where errors would be recorded? There is nothing in the failed_jobs table, nothing in laravel.log, and my Bugsnag integration did not catch anything.

Since it runs fine from the command line, I assume this is not a Python problem but something related to calling it from PHP.

The server has 60 GB of RAM, and monitoring the process through htop shows it never uses more than 3% of RAM. Could there be some other hard-coded limit somewhere?

I am using Laravel 5.4, Ubuntu server, Python 2.7.12.

1 answer

Check the php.ini that applies when the script runs. By default max_execution_time is 30 seconds and memory_limit is 128M:

; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
; Note: This directive is hardcoded to 0 for the CLI SAPI
max_execution_time = 300

; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
memory_limit = 1280M
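Since the CLI SAPI uses its own php.ini (and, per the comment above, max_execution_time is forced to 0 there), it can help to confirm which ini file and which limits the queue worker actually sees. A minimal check, assuming it is run from inside the worker or job process:

// Hypothetical check run inside the queue worker / job to confirm
// which php.ini is loaded and what the effective limits are.
var_dump(php_ini_loaded_file());          // e.g. the CLI php.ini path
var_dump(ini_get('max_execution_time'));  // "0" for the CLI SAPI by default
var_dump(ini_get('memory_limit'));        // e.g. "128M"
var_dump(ini_get('disable_functions'));   // shell_exec must not be listed here

Note that memory_limit applies to the PHP process itself, not to the Python child process it spawns.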

Source: https://habr.com/ru/post/1677431/
