Yii2: Query::batch - memory limit

Created on 31 Jul 2015 · 25 comments · Source: yiisoft/yii2

Hello, I need to iterate over a 200k-record table, but when I use foreach (Item::find()->each() as $item) it fails with a memory error. The loop doesn't execute at all.
I found only one topic about this with no solution.

It works only in the development environment (Mac, MAMP 3.0.6, 128 MB memory limit); on the deployment servers it throws an error.
Could you suggest where I should look for the difference?
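
For reference, a minimal sketch of the two batched-iteration patterns being discussed, assuming an Item ActiveRecord class (names are illustrative):

foreach (Item::find()->each(100) as $item) {
    // one Item model per iteration, fetched from the server in chunks of 100
}

foreach (Item::find()->batch(100) as $items) {
    // $items is an array of up to 100 Item models per iteration
}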

docs

Most helpful comment

As a note: this will now happen all the time on a default install of PHP 7, because mysqlnd adds the result set's memory usage to the process's own:
http://php.net/manual/en/mysqlinfo.concepts.buffering.php

When using libmysqlclient as library PHP's memory limit won't count the memory used for result sets unless the data is fetched into PHP variables. With mysqlnd the memory accounted for will include the full result set.

And mysqlnd is now the recommended library for the PDO extension, while libmysqlclient is "not recommended".

All 25 comments

Please provide:

  1. PHP version.
  2. PDO version.
  3. PDO settings about emulating prepared statements.
I have the exact same problem and already tried the suggestion in #8420 (PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false) but it didn't help. My code is basically this:

$query = (new \yii\db\Query())
    ->select('*')
    ->from('bigtable')
    ->where('active=1');

foreach ($query->batch(200) as $batch) {
    // this line is never executed due to a memory exhaustion exception
}

'yii\base\ErrorException' with message 'Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)' in /var/www/vendor/yiisoft/yii2/db/Command.php:837

@samdark Your link above goes to a Russian site, so I'm not sure if I missed some essential information. Anything else we could try? It seems like batch() doesn't really behave as promised by default.

Is debug turned off?

No, should it be?

Yes, because Yii keeps debug data in memory and it grows.
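
For example, in a standard entry script the constants can be set before the application is created, so the logger does not keep per-query profiling data in memory (the paths below are the usual template defaults, not taken from this thread):

// web/index.php or a console entry script
defined('YII_DEBUG') or define('YII_DEBUG', false);
defined('YII_ENV') or define('YII_ENV', 'prod');

require __DIR__ . '/../vendor/autoload.php';
require __DIR__ . '/../vendor/yiisoft/yii2/Yii.php';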

I usually disable all log targets when processing large amounts of data in dev environment. Logs get spammed with query profiling info and I couldn't figure out how to force them to flush.
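
One possible way to do that at runtime, assuming the standard 'log' component is configured (yii\log\Target exposes an enabled property):

foreach (Yii::$app->log->targets as $target) {
    $target->enabled = false; // stop collecting messages for this target
}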

I've tried with YII_DEBUG == false and YII_ENV = 'prod' and still got the same error. I also tried both console and web application; it happens in both.

@samdark BTW your tip should still be documented somewhere. It's not really obvious.
@nineinchnick Also tried that, didn't help.

On a sidenote, try this to flush logs immediately:

'log' => [
    'flushInterval' => 1,
    'targets' => [
        [
            // ...
            'exportInterval' => 1,
        ],
    ],
],
I'm out of ideas.

@samdark Sorry, I've confused my DB connections (I have 2 in my app). The above PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false really helps. But now I got this:

ERROR: SQLSTATE[HY000]: General error: 2014 Cannot execute queries while other unbuffered queries are active. Consider using PDOStatement::fetchAll(). Alternatively, if your code is only ever going to run against mysql, you may enable query buffering by setting the PDO::MYSQL_ATTR_USE_BUFFERED_QUERY attribute.

I know this is probably a limitation of MySQL PDO. Any help is still welcome, as this basically makes moving data from one DB to another impossible if you have large datasets.

Are you trying to run multiple queries?

I select a batch of rows through one DB connection, then call Command::batchInsert() to insert them into another DB. I didn't expect the above to happen with 2 different database connections. And the second statement is actually not a query but an insert statement.
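
A rough sketch of that kind of migration, assuming two configured connection components ('db' as source, 'db2' as target) and illustrative table names:

$query = (new \yii\db\Query())
    ->select('*')
    ->from('bigtable')
    ->where(['active' => 1]);

// read in batches from the source, write each batch to the target
foreach ($query->batch(200, Yii::$app->db) as $rows) {
    Yii::$app->db2->createCommand()
        ->batchInsert('bigtable_copy', array_keys($rows[0]), $rows)
        ->execute();
}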

OK, I've figured it out. There really _was_ another query: I have some code that creates my second DB connection for me. The configuration for that connection is also stored in the primary DB, so the primary DB gets queried each time I need that other DB connection. I've now made sure that the connection is created before the actual data migration.

Everything seems to work fine now.
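
For illustration only: forcing the secondary connection open before the unbuffered batch query starts avoids its lazy setup firing a query on the primary connection mid-iteration ('db2' is an assumed component name; the original code builds the connection differently, and $query is the one from the snippet above):

Yii::$app->db2->open(); // establish the target connection up front

foreach ($query->batch(200) as $rows) {
    // ... migrate $rows via Yii::$app->db2 ...
}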

Sorry for the late answer, but in my case updating the PHP and PDO modules resolved this issue. I didn't change any parameters.

unable to reproduce the issue, see #11152

As a note: this will now happen all the time on a default install of PHP 7, because mysqlnd adds the result set's memory usage to the process's own:
http://php.net/manual/en/mysqlinfo.concepts.buffering.php

When using libmysqlclient as library PHP's memory limit won't count the memory used for result sets unless the data is fetched into PHP variables. With mysqlnd the memory accounted for will include the full result set.

And mysqlnd is now the recommended library for the PDO extension, while libmysqlclient is "not recommended".
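
A quick, hedged way to check which client library a connection actually uses (with mysqlnd the client-version string contains "mysqlnd"; the 'db' component name is an assumption):

Yii::$app->db->open();
$client = Yii::$app->db->pdo->getAttribute(PDO::ATTR_CLIENT_VERSION);
echo strpos($client, 'mysqlnd') !== false
    ? "mysqlnd: result sets count against memory_limit\n"
    : "libmysqlclient: result sets buffered outside memory_limit\n";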

https://github.com/yiisoft/yii2/pull/11393#issuecomment-253957000

@cebe @aivanouski Any update? Will this be fixed in milestone 2.0.11 or 2.0.10?

In the meantime I made this: https://github.com/Sammaye/yii2-pq

As an additional note, unbuffered queries suck.

Duplicates #8420

About Sammaye's class: the method each($batchSize = 100, $db = null) now becomes each($batchSize = 100, $page = true, $db = null).

This problem may still appear even on the latest Yii2 version.
It's not a framework problem; it's in the DB connection configuration.

I've solved this by adding a PDO attribute to my connection config:
'attributes' => [PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false]

So afterwards it became:

'db'=> [
    'class'=> 'yii\db\Connection',
    'dsn'=> 'mysql:host=127.0.0.1;port=3306;dbname=DBNAME',
    'username'=> 'root',
    'password'=> 'root',
    'charset'=> 'utf8',
    'attributes'=> [
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY=> false
    ]
]

The memory limit error is gone on PHP 7.2.8.

I don't use Yii anymore but as said in my repo:

"Added to that, my own observations that unbuffered queries suck meant that I created this."

Honestly, unbuffered queries are not good; they cannot even be used like normal queries.
