It seems like some data is being cached somewhere, and Laravel is doing something with that cached data on teardown. When creating large tables with hundreds of thousands of rows, this behavior is not desirable.
Create test table
<?php

use Illuminate\Support\Facades\Schema;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Database\Migrations\Migration;

class CreateTestTable extends Migration
{
    /**
     * Run the migrations.
     *
     * @return void
     */
    public function up()
    {
        Schema::create('test_table', function (Blueprint $table) {
            $table->increments('id');
            $table->string('column0', 50)->nullable();
            $table->string('column1', 50)->nullable();
            $table->string('column2', 50)->nullable();
            $table->string('column3', 50)->nullable();
            $table->string('column4', 50)->nullable();
            $table->string('column5', 50)->nullable();
            $table->string('column6', 50)->nullable();
            $table->string('column7', 50)->nullable();
            $table->string('column8', 50)->nullable();
            $table->string('column9', 50)->nullable();
        });
    }

    /**
     * Reverse the migrations.
     *
     * @return void
     */
    public function down()
    {
        Schema::dropIfExists('test_table');
    }
}
Create test command to insert rows
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;

class TestCode extends Command
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $signature = 'test:code';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Easy way to test code.';

    /**
     * Create a new command instance.
     *
     * @return void
     */
    public function __construct()
    {
        parent::__construct();
    }

    /**
     * Execute the console command.
     *
     * @return void
     */
    public function handle()
    {
        // Memory consumption insert test
        $pdo = DB::getPdo();
        $rowValues = [];
        $valuesToInsert = [];
        $placeHolderString = '(' . implode(', ', array_fill(0, 10, '?')) . ')';
        $placeHolders = [];

        for ($i = 0; $i < 10; $i++) {
            $rowValues['column' . $i] = uniqid() . uniqid();
        }

        $columns = array_keys($rowValues);
        $counter = 0;

        while ($counter < 100000) {
            foreach ($rowValues as $value) {
                $valuesToInsert[] = $value;
            }
            $placeHolders[] = $placeHolderString;
            $counter++;

            // Flush a batch of exactly 1000 rows (the final batch included).
            if ($counter % 1000 === 0) {
                $query = "INSERT INTO `test_table` (" . implode(', ', $columns) . ") VALUES " . implode(', ', $placeHolders);

                // Using the query builder, memory consumption continually rises
                DB::statement($query, $valuesToInsert);

                // Using the PDO connection, memory consumption levels out and stays constant
                // $pdo->prepare($query)
                //     ->execute($valuesToInsert);

                $valuesToInsert = [];
                $placeHolders = [];

                $this->info('Memory usage: ' . memory_get_usage(true));
            }
        }
    }
}
After executing the test:code command, the output in the shell looks like this:
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 23068672
Memory usage: 23068672
Memory usage: 23068672
Memory usage: 25165824
Memory usage: 25165824
Memory usage: 25165824
Memory usage: 27262976
Memory usage: 27262976
Memory usage: 27262976
Memory usage: 29360128
Memory usage: 29360128
Memory usage: 29360128
Memory usage: 31457280
Memory usage: 31457280
Memory usage: 31457280
Memory usage: 33554432
Memory usage: 33554432
Memory usage: 33554432
Memory usage: 35651584
...
When bypassing the query builder and using the PDO connection, the output looks like this:
Memory usage: 18874368
Memory usage: 18874368
Memory usage: 18874368
Memory usage: 18874368
Memory usage: 18874368
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
Memory usage: 20971520
...
FYI, the same behavior occurs when using DB::table()->insert() to insert the data instead of DB::statement().
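For reference, a minimal sketch of that query-builder variant (assuming the rows are accumulated as associative arrays instead of the flat $valuesToInsert list used above):

use Illuminate\Support\Facades\DB;

// $rows holds up to 1000 entries of the form
// ['column0' => ..., ..., 'column9' => ...], collected per batch.
DB::table('test_table')->insert($rows);
$rows = []; // reset for the next batch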
If you're saying the same thing doesn't happen with plain PDO, can you post the equivalent PDO code?
It's already in the original post. Just comment out the DB::statement() call and uncomment the $pdo->prepare() code.
I have tested your code. My environment:
PHP 7.3.2
MariaDB 10
Laravel 5.7.26
The results (using DB::statement()) were:
Memory usage: 14680064
Memory usage: 14680064
Memory usage: 14680064
Memory usage: 14680064
Memory usage: 14680064
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
Memory usage: 16777216
...
Memory usage: 16777216
Can you test again using PHP 7.3? It doesn't seem to be a Laravel issue.
Thanks @KennedyTedesco for testing it out. After reading your response I thought it might have something to do with Telescope, and after I added test:code to the ignore_commands list in the Telescope config the memory leak went away.
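For reference, the relevant setting in config/telescope.php looks roughly like this (the surrounding options vary by Telescope version):

// config/telescope.php
return [
    // ...

    'ignore_commands' => [
        'test:code',
    ],

    // ...
];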
So is it a bug in Telescope causing the memory increases, or is it a symptom of the logging that Telescope is doing?
@laurencei I'm not sure if this is a bug or expected behavior for Telescope. Maybe one of the Laravel devs could answer that.
ok - in that case this should probably be moved to the Telescope repo, but I don't have access there, so someone else will have to do it. @driesvints @themsaid.
It's best that you open up a new issue in the Telescope repo because we can't move issues.
@driesvints - FYI - you should be able to, as you are a member of both?
https://help.github.com/articles/transferring-an-issue-to-another-repository/
@laurencei only admins can transfer issues. I'm only a member of this repo.
I encountered this while seeding the database in Laravel 5.8.19. A discussion in Telescope (https://github.com/laravel/telescope/issues/173) hinted at this approach, which seemed to work for me:
use Laravel\Telescope\Telescope;

// ...

Telescope::stopRecording(); // cuz Telescope holds on to stuff after an insert

// ... all of your inserts ...

Telescope::startRecording();
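A slightly safer variant (my addition, not from the linked discussion) wraps the inserts in try/finally so recording resumes even if an insert throws:

use Laravel\Telescope\Telescope;

Telescope::stopRecording();

try {
    // ... all of your inserts ...
} finally {
    Telescope::startRecording(); // always resume recording, even on failure
}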
For anyone landing on this page after a Google search, I had the same symptoms in a data import script. It would work when batching 100 rows per insert but would fail when batching 1000 rows per insert. I profiled my code as shown above and found:
Rows Inserted: 1000 Memory usage: 37752832
Rows Inserted: 2000 Memory usage: 39854080
Rows Inserted: 3000 Memory usage: 44052480
Rows Inserted: 4000 Memory usage: 48250880
Rows Inserted: 5000 Memory usage: 50352128
Rows Inserted: 6000 Memory usage: 54550528
Rows Inserted: 7000 Memory usage: 58748928
Rows Inserted: 8000 Memory usage: 60850176
Rows Inserted: 9000 Memory usage: 65048576
Rows Inserted: 10000 Memory usage: 69246976
Rows Inserted: 11000 Memory usage: 71348224
Rows Inserted: 12000 Memory usage: 75546624
Rows Inserted: 13000 Memory usage: 79745024
Rows Inserted: 14000 Memory usage: 81846272
Rows Inserted: 15000 Memory usage: 86044672
Rows Inserted: 16000 Memory usage: 90243072
Rows Inserted: 17000 Memory usage: 94441472
Rows Inserted: 18000 Memory usage: 96542720
Rows Inserted: 19000 Memory usage: 100741120
Rows Inserted: 20000 Memory usage: 104939520
Rows Inserted: 21000 Memory usage: 107040768
Rows Inserted: 22000 Memory usage: 111239168
Rows Inserted: 23000 Memory usage: 115437568
Rows Inserted: 24000 Memory usage: 117538816
Rows Inserted: 25000 Memory usage: 121737216
Rows Inserted: 26000 Memory usage: 125935616
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 716800 bytes)
I traced through the Connection class and noted that, even with query logging disabled, it was still raising a QueryExecuted event for every statement. I updated the code to call DB::connection()->unsetEventDispatcher(), which disabled the events, and since then the code runs without leaking over the full data set (2.64m records).
Rows Inserted: 1000 Memory usage: 35651584
Rows Inserted: 2000 Memory usage: 35651584
Rows Inserted: 3000 Memory usage: 35651584
Rows Inserted: 4000 Memory usage: 35651584
Rows Inserted: 5000 Memory usage: 35651584
Rows Inserted: 6000 Memory usage: 37748736
Rows Inserted: 7000 Memory usage: 37748736
Rows Inserted: 8000 Memory usage: 37748736
Rows Inserted: 9000 Memory usage: 37748736
Rows Inserted: 10000 Memory usage: 37748736
Rows Inserted: 11000 Memory usage: 37748736
Rows Inserted: 12000 Memory usage: 37748736
Rows Inserted: 13000 Memory usage: 37748736
Rows Inserted: 14000 Memory usage: 37748736
Rows Inserted: 15000 Memory usage: 37748736
Rows Inserted: 16000 Memory usage: 37748736
Rows Inserted: 17000 Memory usage: 37748736
...
Rows Inserted: 2640000 Memory usage: 37748736
Rows Inserted: 2641000 Memory usage: 37748736
Rows Inserted: 2642000 Memory usage: 37748736
Rows Inserted: 2643000 Memory usage: 37748736
I hope that helps someone.
PHP 7.4.1
MariaDB 10.4.11
Laravel 6.6.0
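A minimal sketch of that workaround; the restore step via setEventDispatcher() is my addition, so events work normally again after the import:

use Illuminate\Support\Facades\DB;

$connection = DB::connection();
$dispatcher = $connection->getEventDispatcher(); // keep a reference for later

$connection->unsetEventDispatcher(); // stop QueryExecuted events during the bulk insert

// ... run the bulk inserts ...

$connection->setEventDispatcher($dispatcher); // restore normal event handling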
Nice find, I'm testing it right now and will come back to report the results.
I get the same error as yours, thrown from the Connection class. Every time I run my insert script for a large number of rows, I hit the same error at some point. So below I have logged the memory usage for both scenarios.
The first scenario, without unsetEventDispatcher():
1 > Peak memory usage: 21,831,048 bytes
2 > Peak memory usage: 91,028,296 bytes
3 > Peak memory usage: 109,026,736 bytes
4 > Peak memory usage: 126,431,200 bytes
5 > Peak memory usage: 145,720,368 bytes
6 > Peak memory usage: 160,210,456 bytes
7 > Peak memory usage: 176,807,440 bytes
8 > Peak memory usage: 193,412,584 bytes
9 > Peak memory usage: 210,476,376 bytes
10 > Peak memory usage: 227,466,368 bytes
11 > Peak memory usage: 244,590,936 bytes
12 > Peak memory usage: 259,457,168 bytes
13 > Peak memory usage: 275,795,112 bytes
14 > Peak memory usage: 295,889,880 bytes
15 > Peak memory usage: 311,263,272 bytes
At run number 60 I got a FatalError caused by memory exhaustion (memory_limit is set to 1 GB).
The second scenario, with unsetEventDispatcher() called at the beginning of the script:
1 > Peak memory usage: 22,616,032 bytes
2 > Peak memory usage: 91,662,040 bytes
3 > Peak memory usage: 92,010,248 bytes
4 > Peak memory usage: 92,442,544 bytes
5 > Peak memory usage: 92,847,632 bytes
6 > Peak memory usage: 95,805,800 bytes
7 > Peak memory usage: 95,805,800 bytes
8 > Peak memory usage: 95,805,800 bytes
9 > Peak memory usage: 95,805,800 bytes
10 > Peak memory usage: 95,805,800 bytes
11 > Peak memory usage: 95,805,800 bytes
12 > Peak memory usage: 95,805,800 bytes
13 > Peak memory usage: 96,214,688 bytes
14 > Peak memory usage: 96,214,688 bytes
15 > Peak memory usage: 96,215,024 bytes
16 > Peak memory usage: 96,215,024 bytes
As you can see from the stats, the script's peak memory usage no longer skyrockets like in the first scenario; it grows bit by bit, tracking the script's real usage. :D
So I can confirm that calling unsetEventDispatcher() on the connection, and of course disabling the query log, really helps avoid the FatalError caused by the memory leak.
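A minimal sketch of the setup described above, placed at the start of the import script (assuming it runs on the default connection):

use Illuminate\Support\Facades\DB;

$connection = DB::connection();
$connection->disableQueryLog();      // don't accumulate executed queries in memory
$connection->unsetEventDispatcher(); // don't fire QueryExecuted events per statement

// ... run the large insert loop ...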