php/symfony/doctrine memory leak?

Published 2019-01-07 08:45

Question:

I'm having problems with a batch insertion of objects into a database using symfony 1.4 and doctrine 1.2.

My model has a certain kind of object called "Sector", each of which has several objects of type "Cupo" (usually ranging from 50 up to 200000). These objects are pretty small: just a short identifier string and one or two integers. Whenever a group of Sectors is created by the user, I need to automatically add all these instances of "Cupo" to the database. In case anything goes wrong, I'm using a Doctrine transaction to roll everything back. The problem is that I can only create around 2000 instances before PHP runs out of memory. It currently has a 128MB limit, which should be more than enough for handling objects that use less than 100 bytes each. I've tried increasing the memory limit up to 512MB, but PHP still crashes, so that doesn't solve the problem. Am I doing the batch insertion correctly, or is there a better way?

Here's the error:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 71 bytes) in /Users/yo/Sites/grifoo/lib/vendor/symfony/lib/log/sfVarLogger.class.php on line 170

And here's the code:

public function save($conn = null)
{
    $conn = $conn ? $conn : Doctrine_Manager::connection();

    $conn->beginTransaction();

    try {
        $evento = $this->object;

        foreach ($evento->getSectores() as $s) {
            for ($j = 0; $j < $s->getCapacity(); $j++) {
                $cupo = new Cupo();
                $cupo->setActivo($s->getActivo());
                $cupo->setEventoId($s->getEventoId());
                $cupo->setNombre($j);
                $cupo->setSector($s);

                $cupo->save();
            }
        }

        $conn->commit();
        return;
    }
    catch (Exception $e) {
        $conn->rollback();
        throw $e;
    }
}

Once again, this code works fine for fewer than 1000 objects, but anything bigger than 1500 fails. Thanks for the help.

Answer 1:

I tried doing

$cupo->save();
$cupo->free();
$cupo = null;

(but substituting my own code), and I'm still getting memory overflows. Any other ideas, SO?

Update:

I created a new environment in my databases.yml that looks like this:

all:
  doctrine:
    class: sfDoctrineDatabase
    param:
      dsn: 'mysql:host=localhost;dbname=.......'
      username: .....
      password: .....
      profiler: false

The profiler: false entry disables Doctrine's query logging, which normally keeps a copy of every query you make. It didn't stop the memory leak, but I was able to get about twice as far through my data import as I did without it.

Update 2

I added

Doctrine_Manager::connection()->setAttribute(Doctrine_Core::ATTR_AUTO_FREE_QUERY_OBJECTS, true ); 

before running my queries, and changed

$cupo = null;

to

unset($cupo);

And now my script has been churning away happily. I'm pretty sure it will finish without running out of RAM this time.

Update 3

Yup. That's the winning combo.
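
For reference, here is a minimal sketch of that combination applied to the loop from the question (with profiler: false already set in databases.yml as in the first update):

$conn = Doctrine_Manager::connection();

// Stop Doctrine from keeping every hydrated query object around.
$conn->setAttribute(Doctrine_Core::ATTR_AUTO_FREE_QUERY_OBJECTS, true);

foreach ($evento->getSectores() as $s) {
    for ($j = 0; $j < $s->getCapacity(); $j++) {
        $cupo = new Cupo();
        $cupo->setActivo($s->getActivo());
        $cupo->setEventoId($s->getEventoId());
        $cupo->setNombre($j);
        $cupo->setSector($s);

        $cupo->save();
        $cupo->free();  // break Doctrine's internal circular references
        unset($cupo);   // drop the last PHP reference so it can be collected
    }
}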



Answer 2:

I just wrote a "daemonized" script with symfony 1.4, and setting the following stopped the memory hogging:

sfConfig::set('sf_debug', false);
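
A quick sketch of where that call sits in a task (the class name importDataTask is hypothetical; the database manager line is standard symfony 1.4 task boilerplate):

class importDataTask extends sfBaseTask
{
    protected function execute($arguments = array(), $options = array())
    {
        // Disable debug mode so symfony and Doctrine stop accumulating
        // logging/debugging data for every query.
        sfConfig::set('sf_debug', false);

        $databaseManager = new sfDatabaseManager($this->configuration);

        // ... long-running batch work goes here ...
    }
}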


Answer 3:

For a symfony task, I also faced this issue and did the following things. It worked for me.

  • Disable debug mode. Add the following before the db connection is initialized:

    sfConfig::set('sf_debug', false);
    
  • Set the auto-free-query-objects attribute on the db connection:

    $connection->setAttribute(Doctrine_Core::ATTR_AUTO_FREE_QUERY_OBJECTS, true );
    
  • Free every object after use:

    $object_name->free()
    
  • Unset all arrays after use: unset($array_name)

  • Check all Doctrine queries used in the task and free each query after use with $q->free(). (This is good practice whenever you use queries; see the sketch after this list.)
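
A short sketch of that last point (the query itself is hypothetical, and $eventoId stands in for a real id):

$q = Doctrine_Query::create()
    ->from('Cupo c')
    ->where('c.evento_id = ?', $eventoId);

$cupos = $q->execute();

// ... work with $cupos ...

$q->free();    // release the query object's internal references
unset($cupos); // drop the hydrated collection when done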

That's all. I hope it helps someone.



Answer 4:

Doctrine leaks, and there's not much you can do about it. Make sure you use $q->free() whenever applicable to minimize the effect. Doctrine is not meant for maintenance scripts. The only way to work around this problem is to break your script into parts that each perform part of the task. One way to do that is to add a start parameter to your script; after a certain number of objects have been processed, the script redirects to itself with a higher start value. This works well for me, although it makes writing maintenance scripts more cumbersome.
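
A rough sketch of that restart pattern (the script name, parameter name, and batch size are all hypothetical):

// batch.php?start=0
$start = isset($_GET['start']) ? (int) $_GET['start'] : 0;
$batchSize = 1000;

$items = Doctrine_Query::create()
    ->from('Cupo c')
    ->limit($batchSize)
    ->offset($start)
    ->execute();

foreach ($items as $item) {
    // ... process one object ...
    $item->free();
}

if (count($items) === $batchSize) {
    // More rows may remain: hand off to a fresh PHP process with a clean heap.
    header('Location: batch.php?start=' . ($start + $batchSize));
    exit;
}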



Answer 5:

Try unset($cupo); after every save. This should help. Another option is to split the script and do some batch processing.



Answer 6:

Try to break the circular references that usually cause memory leaks:

$cupo->save();

$cupo->free(); // this call breaks the circular references

as described in the Doctrine manual.



Answer 7:

For me, I just initialized the task like this:

// initialize the database connection
$databaseManager = new sfDatabaseManager($this->configuration);
$connection = $databaseManager->getDatabase($options['connection'])->getConnection();
$config = ProjectConfiguration::getApplicationConfiguration('frontend', 'prod', true);
sfContext::createInstance($config);

(with the prod configuration) and used free() after each save() on the Doctrine objects.

The memory usage stays stable at around 25 MB:

memory_get_usage = 26.884071350098 MB

with PHP 5.3 on Debian Squeeze.



Answer 8:

Periodically close and re-open the connection; I'm not sure why, but it seems PDO retains references.
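
A sketch of that recycling idea (the loop variable $sectores and the batch size are hypothetical, and the explicit reconnect is an assumption; do this between transactions, not inside one):

$conn = Doctrine_Manager::connection();
$i = 0;

foreach ($sectores as $s) {
    // ... save objects as usual ...

    if (++$i % 1000 === 0) {
        $conn->close();   // drop the PDO handle and whatever it retains
        $conn->connect(); // re-establish the connection
    }
}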



Answer 9:

What works for me is calling the free method like this:

$cupo->save();
$cupo->free(true); // free also the related components
unset($cupo);