json_decode and memory not being freed

I have a list of large JSON files (the smallest are 500 KB and the largest are 100 MB).

I need to process each file independently. My problem is that memory usage keeps growing after each file, even though I release everything.

Example:

foreach ($files as $file) {
    // Decode and immediately discard the result.
    json_decode(file_get_contents($file->getRealpath()), true);

    // Report PHP's real allocated memory in human-readable units.
    $memory = memory_get_usage(true);
    $unit = (int) floor(log($memory, 1024));
    echo 'Memory: '.round($memory / pow(1024, $unit), 2).' '.['b', 'kb', 'mb', 'gb', 'tb', 'pb'][$unit]."\n";

    gc_collect_cycles();
}

Result:

Memory: 6 mb
(...)
Memory: 6 mb
Memory: 6 mb
Memory: 10 mb
Memory: 10 mb
Memory: 10 mb
(...)
Memory: 12 mb
Memory: 12 mb
Memory: 12 mb
(...)
Memory: 490 mb
Memory: 490 mb
Memory: 490 mb
(...)
Memory: 946 mb
Memory: 944 mb
Memory: 944 mb
(...)

The memory keeps growing until PHP reports that it cannot allocate any more. As you can see, this example does nothing except call json_decode(): no variable is assigned, nothing else happens. So why does memory grow so much, and how can I free it?

1 answer

Check the size of the file you are trying to decode: a bigger file simply needs more memory, for both the raw string and the decoded structure.

or

You can also check how large the JSON string itself is with strlen() before decoding it, and drop the variable holding it as soon as you no longer need it.
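A minimal sketch of both checks, assuming the same SplFileInfo-style $file objects as in the question (the 50 MB limit is an arbitrary example value):

$maxBytes = 50 * 1024 * 1024; // arbitrary example limit

foreach ($files as $file) {
    $path = $file->getRealpath();

    // Cheap check before reading the file at all.
    if (filesize($path) > $maxBytes) {
        echo 'Skipping oversized file: '.$path."\n";
        continue;
    }

    $json = file_get_contents($path);
    echo 'Payload is '.strlen($json)." bytes\n";
}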

Also, unset the decoded data when you are done with it:

unset($decoded_data);

or

$var = null;

There is a subtle difference between the two: unset() removes the variable from the symbol table, but the memory it pointed to is only reclaimed when the garbage collector decides to run, whereas assigning null overwrites the variable's data in place, which usually releases memory sooner at the cost of a little extra CPU.
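Applied to the loop from the question, that looks roughly like this (a sketch; the processing step is a placeholder):

foreach ($files as $file) {
    $data = json_decode(file_get_contents($file->getRealpath()), true);

    // ... process $data here ...

    // Overwrite the data first, then drop the variable, so the
    // decoded structure can be reclaimed before the next iteration.
    $data = null;
    unset($data);

    gc_collect_cycles();
}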

https://github.com/salsify/jsonstreamingparser

This library is a streaming JSON parser. Instead of decoding the whole document at once, it walks through the JSON and emits events as it encounters each piece, so the entire JSON never has to exist as one PHP structure in memory.
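A minimal sketch following the library's documented Parser/Listener pattern (class and method names are taken from its README and may differ between versions, so treat this as an outline):

require 'vendor/autoload.php';

$stream = fopen($file->getRealpath(), 'r');
$listener = new \JsonStreamingParser\Listener\InMemoryListener();

try {
    // The parser reads the stream chunk by chunk and forwards
    // events (keys, values, object boundaries) to the listener.
    $parser = new \JsonStreamingParser\Parser($stream, $listener);
    $parser->parse();
} finally {
    fclose($stream);
}

$data = $listener->getJson();

Note that InMemoryListener still assembles the complete document; to actually cap memory usage you would write your own listener that handles each value as it arrives and then discards it.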
