The usual approach is to use `iterate()`:

```php
$q = $this->getDefaultEntityManager()
    ->createQuery('SELECT c FROM AppBundle:Contractor c');
$iterableResult = $q->iterate();
foreach ($iterableResult as $row) {
    $contractor = $row[0]; // iterate() wraps each entity in an array
    // do something
}
```
However, as stated in the Doctrine documentation, this can lead to problems:

> Results may be fully buffered by the database client/connection, allocating additional memory that is not visible to the PHP process. For large result sets, this can easily kill the process for no apparent reason.
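If you do stick with `iterate()`, the Doctrine batch-processing docs recommend clearing the EntityManager periodically so hydrated entities don't pile up in the identity map. A minimal sketch, assuming the same `AppBundle:Contractor` entity and the `getDefaultEntityManager()` helper used in this answer (the batch size of 100 is arbitrary):

```php
$em = $this->getDefaultEntityManager();
$q = $em->createQuery('SELECT c FROM AppBundle:Contractor c');

$batchSize = 100; // flush/clear interval, tune to your workload
$i = 0;

foreach ($q->iterate() as $row) {
    $contractor = $row[0];
    // ... process $contractor ...

    if (++$i % $batchSize === 0) {
        $em->flush(); // push any pending changes to the database
        $em->clear(); // detach all managed entities to free memory
    }
}

// Handle the last, partial batch
$em->flush();
$em->clear();
```

Note that this only addresses PHP-side memory; it does not help with the client-side result buffering the quote above warns about.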
The easiest workaround is to run small queries with limits and offsets:

```php
$em = $this->getDefaultEntityManager();

// Get the total count of the whole query first
$qb = $em->createQueryBuilder();
$qb->select('COUNT(c)')->from('AppBundle:Contractor', 'c');
$count = $qb->getQuery()->getSingleScalarResult();

// Go in steps of 1000 to keep memory usage flat
$limit = 1000;
$offset = 0;

// Loop: fetch the next 1000 rows, process them, repeat
while ($offset < $count) {
    $qb = $em->createQueryBuilder();
    $qb->select('c')
        ->from('AppBundle:Contractor', 'c')
        ->setMaxResults($limit)
        ->setFirstResult($offset);
    $result = $qb->getQuery()->getResult();

    foreach ($result as $contractor) {
        // do something
    }

    // Detach the hydrated entities; otherwise the identity map
    // keeps growing and the memory saving is lost
    $em->clear();

    $offset += $limit;
}
```
With heavy data sets like this, the script is likely to exceed PHP's maximum execution time (30 seconds by default). So do not forget to raise `max_execution_time` in `php.ini`, or call `set_time_limit()` in the script. If you just want to apply the same change to every row, consider writing one big bulk update query instead of looping over the results and editing them in PHP.
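For example, setting the same flag on every contractor can be done in a single DQL UPDATE statement; a sketch, where the `active` field is hypothetical and not part of the original entity:

```php
$updated = $this->getDefaultEntityManager()
    ->createQuery('UPDATE AppBundle:Contractor c SET c.active = :active')
    ->setParameter('active', true)
    ->execute(); // returns the number of affected rows
```

Be aware that bulk DQL updates go straight to the database: they bypass lifecycle callbacks and leave already-loaded entities in the identity map out of date.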