Memory leak while executing a Doctrine query in a loop

I am having trouble finding the cause of a memory leak in my script. I have a simple repository method that increments the "count" column of my entity by a given amount:

public function incrementCount($id, $amount)
{
    $query = $this->createQueryBuilder('e')
        ->update('MyEntity', 'e')
        ->set('e.count', 'e.count + :amount')
        ->where('e.id = :id')
        ->setParameter('id', $id)
        ->setParameter('amount', $amount)
        ->getQuery();

    $query->execute();
}

The problem is that if I call this in a loop, memory usage grows with every iteration:

$doctrineManager = $this->getContainer()->get('doctrine')->getManager();
$myRepository = $doctrineManager->getRepository('MyEntity');

while (true) {
    $myRepository->incrementCount("123", 5);
    $doctrineManager->clear();
    gc_collect_cycles();
}

What am I missing here? I tried ->clear(), as recommended in the Doctrine documentation on batch processing. I even tried gc_collect_cycles(), but the problem remains.

I am running Doctrine 2.4.6 on PHP 5.5.

+5
php memory-leaks symfony doctrine doctrine2
Oct 28 '14 at 19:27
5 answers

I solved this by adding --no-debug to my command. It turns out that in debug mode, the profiler was storing information about every query in memory.

+9
Nov 05 '14 at 23:46

You are losing memory on every iteration. It would be better to prepare the query once and just rebind the arguments. For example:

class MyEntityRepository extends EntityRepository
{
    private $updateQuery = null;

    public function incrementCount($id, $amount)
    {
        if ($this->updateQuery === null) {
            $this->updateQuery = $this->createQueryBuilder('e')
                ->update('MyEntity', 'e')
                ->set('e.count', 'e.count + :amount')
                ->where('e.id = :id')
                ->getQuery();
        }

        $this->updateQuery
            ->setParameter('id', $id)
            ->setParameter('amount', $amount)
            ->execute();
    }
}

As you mentioned, you could also use batch processing here, but try this first and see how well (if at all) it performs...
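
For illustration, a minimal usage sketch reusing the wiring from the question ($doctrineManager and the MyEntity mapping are assumptions carried over from there): the DQL is parsed and cached on the first call, and each later pass only rebinds the parameters.

$myRepository = $doctrineManager->getRepository('MyEntity');

for ($i = 0; $i < 10000; $i++) {
    // the cached Query object is reused; only :id and :amount are rebound
    $myRepository->incrementCount("123", 5);
}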

+4
Oct 28 '14 at 19:42

I just ran into the same problem; this is what fixed it for me:

--no-debug

As the OP points out in their own answer, passing --no-debug (e.g. php app/console <my_command> --no-debug) is critical for performance and memory in Symfony console commands. This is especially true when using Doctrine, because without it Doctrine falls into debug mode, which consumes a huge amount of additional memory (and grows with each iteration). See the Symfony docs here and here for more information.

--env=prod

You should also always specify the environment. By default, Symfony uses the dev environment for console commands. The dev environment is typically not optimized for memory, speed, CPU, etc. If you want to iterate over thousands of elements, you should probably use the prod environment (e.g. php app/console <my_command> --env=prod). See here and here for more information.

Tip: I created an environment called console, which I configured specifically for running console commands. Here is how to create additional Symfony environments.

php -d memory_limit=YOUR_LIMIT

If you are running a large update, you should probably decide up front how much memory it is acceptable for it to consume. This is especially important if you suspect a leak. You can specify the memory limit for the command with php -d memory_limit=x (e.g. php -d memory_limit=256M). Note: you can set the limit to -1 (usually the default for the PHP CLI) so that the command runs with no memory limit at all, but that is obviously dangerous.

Well-formed console command for batch processing

A well-formed console command to run a big update using the tips above looks like this:

php -d memory_limit=256M app/console <acme>:<your_command> --env=prod --no-debug

Use Doctrine IterableResult

Another great tip when using the Doctrine ORM in a loop is to use Doctrine's IterableResult (see the Doctrine Batch Processing docs). It will not help in the example given here, but this kind of processing usually iterates over a query result, as sketched below.
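
A minimal sketch of that pattern, loosely following the batch processing chapter ($em is assumed to be the entity manager, and the SELECT on MyEntity is only illustrative): iterate the result row by row and detach each entity so the unit of work does not keep growing.

$q = $em->createQuery('SELECT e FROM MyEntity e');

foreach ($q->iterate() as $row) {
    $entity = $row[0];

    // ... process the entity here ...

    $em->detach($entity); // drop it from the identity map so memory stays flat
}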

Print memory usage while running.

It can be very useful to keep track of how much memory your command consumes while it runs. You can do this by printing the value returned by PHP's built-in memory_get_usage() function.
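
A minimal sketch, reusing the loop from the question: log the figure every few hundred iterations, so a leak shows up as a steadily climbing number.

$i = 0;
while (true) {
    $myRepository->incrementCount("123", 5);

    if (++$i % 500 === 0) {
        // true = report the real amount of memory allocated from the system
        printf("iteration %d: %.2f MB\n", $i, memory_get_usage(true) / 1024 / 1024);
    }
}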

Good luck

+4
Nov 19 '16 at 23:01

Doctrine logs every query you run. If you make a lot of queries (which usually happens in loops), this can cause a huge memory leak.

You need to disable the Doctrine SQL logger to work around this.

I recommend disabling it only around the loop itself.

Before the loop, grab the current logger:

$sqlLogger = $em->getConnection()->getConfiguration()->getSQLLogger();

Then disable the SQL logger:

$em->getConnection()->getConfiguration()->setSQLLogger(null);

Run your loop here: foreach() / while() / for()

After the loop finishes, restore the logger:

$em->getConnection()->getConfiguration()->setSQLLogger($sqlLogger);
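
Put together, a minimal sketch of the whole pattern applied to the loop from the question ($em and $myRepository are assumed to be wired as there; the bounded for loop is only for illustration):

$config = $em->getConnection()->getConfiguration();
$sqlLogger = $config->getSQLLogger();   // remember the current logger
$config->setSQLLogger(null);            // stop accumulating one log entry per executed query

for ($i = 0; $i < 100000; $i++) {
    $myRepository->incrementCount("123", 5);
}

$config->setSQLLogger($sqlLogger);      // restore logging afterwards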

+2
May 11 '17 at 12:57

For me, the fix was clearing Doctrine or, as the documentation puts it, detaching all entities:

 $this->em->clear(); //Here em is the entity manager. 

So, inside my loop, I clear every 1000 iterations and detach all entities (I don't need them anymore):

foreach ($reader->getRecords() as $position => $value) {
    $this->processValue($value, $position);

    if ($position % 1000 === 0) {
        $this->em->flush();
        $this->em->clear();
    }

    $this->progress->advance();
}

Hope this helps.

PS: here is the documentation .

0
Nov 25 '17 at 17:27


