I have a query that returns ~50k rows.
It seems like Doctrine loads the whole result set into memory, which exceeds the memory limit (128M).
The only thing I found that saves some memory is
$result->execute(array(), Doctrine_Core::HYDRATE_NONE);
but it still exceeds the limit.
Is there any way to read one row at a time with Doctrine?
Doctrine Documentation - 13. Batch Processing
Update:
For 1.2 check out this page:
http://docs.doctrine-project.org/projects/doctrine1/en/latest/en/manual/data-hydrators.html
Under the "On Demand" heading you will find the answer.
Batch processing helps!
It took me a long time to figure out what actually helps. My impression is that as long as you stay in the same PHP process you cannot really free the memory. So none of the following helped for me (see the sketch after this list for the typical free()-in-a-loop attempt):
- $object->free(true)
- unset($object)
- sfConfig::set('sf_debug', false)
- gc_collect_cycles() / gc_enable()
- Doctrine_Core::HYDRATE_ON_DEMAND
- sfCommandApplicationTask::runTask()
- sfDoctrinePager
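For reference, this is roughly the "process in chunks and free()" pattern from the batch processing chapter that the list above alludes to; in my setup even this kept accumulating memory within a single PHP process (query and model names are illustrative):
// Illustrative sketch of the pattern that did NOT solve it here.
$items = Doctrine_Query::create()
    ->from('Item i')
    ->limit(100)
    ->execute();

foreach ($items as $item)
{
    // ...do something with $item
    $item->free(true); // deep-free the record's references
}

unset($items);
gc_collect_cycles(); // explicit GC pass; memory stayed high regardless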
What really helped was a hint from Rich Sage, who suggested spawning subprocesses to execute portions of the work. Limit and offset for the query are passed as parameters between the processes. To process row by row, set $options['limit'] to 1, but 50 should be a good value too:
daddyTask.class.php
[..]
$total = ItemTable::getInstance()->createQuery('i')->count();
for ($offset = 0; $offset < $total; $offset += $options['limit'])
{
    passthru(
        sprintf('%s %s/symfony doTask --limit=%s --offset=%s', sfToolkit::getPhpCli(), sfConfig::get('sf_root_dir'), $options['limit'], $offset),
        $returnVar
    );
}
doTask.class.php
$items = ItemTable::getInstance()->createQuery('i')
    ->limit($options['limit'])
    ->offset($options['offset'])
    ->execute();

foreach ($items as $item)
{
    // ...do something with the item
    $this->log('INFO: '.memory_get_peak_usage(true).' memory in use.');
}
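For completeness, the child task has to declare the two options so that they actually show up in $options; a minimal sketch of the configure() method for a symfony 1.4 task (the help texts and defaults shown are assumptions):
// Hypothetical sketch of the option declarations in doTask.class.php,
// so --limit and --offset reach execute() via $options.
protected function configure()
{
    $this->name = 'doTask';

    $this->addOptions(array(
        new sfCommandOption('limit',  null, sfCommandOption::PARAMETER_REQUIRED, 'Items per batch', 50),
        new sfCommandOption('offset', null, sfCommandOption::PARAMETER_REQUIRED, 'Offset into the result set', 0),
    ));
}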
(I'm using Doctrine 1.2 in the Symfony 1.4 framework on PHP 5.3.10)
Any comments or critique on this topic are greatly appreciated, as it's a difficult one for me!
Is there a reason why you don't use ->limit() and ->offset()?