Is there an easy way to write to a sheet in batches or something?
I have the following, which was working fine for the first 10-20k records:
$res = Excel::create('UserExport', function($excel) {
    $excel->sheet('Users', function($sheet) {
        $users = User::select('id','name','email','created_at','dob','attending','country','postcode','accesscode','arrival_time','phone','gender','character','type')->get();
        $sheet->fromModel($users);
    });
})->store('xlsx', storage_path('excel/exports'), true);
At 40k, even setting the memory_limit to 1GB is not enough to keep this going, so I guess I need to read from the database and write to the sheet in batches; I'm just not sure how to do that from the docs.
I am facing the same problem: trying to export around 15K records with the memory limit set to 1GB, but the export was unsuccessful. I could not find any solution yet.
You can re-open the sheet with ::load(), and fromModel() (https://github.com/Maatwebsite/Laravel-Excel/blob/2.1/src/Maatwebsite/Excel/Classes/LaravelExcelWorksheet.php#L450) accepts a startCell. So by setting the start cell for each batch, it will append the new rows instead of overwriting them.
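As a rough sketch of that idea (not from this thread), assuming Laravel Excel 2.x: chunk the query and pass fromModel() a start cell one row below what has already been written, so each batch is appended. The chunk size, the shortened column list and the heading handling are placeholders; check the fromModel() signature linked above before relying on the fifth (heading generation) argument.

Excel::create('UserExport', function ($excel) {
    $excel->sheet('Users', function ($sheet) {
        $nextRow = 1; // next free row on the sheet

        User::select('id', 'name', 'email', 'created_at')
            ->orderBy('id')
            ->chunk(5000, function ($users) use ($sheet, &$nextRow) {
                if ($nextRow === 1) {
                    // First batch: let fromModel() generate the heading row as well.
                    $sheet->fromModel($users, null, 'A1', false);
                    $nextRow = count($users) + 2; // heading row + data rows, plus one
                } else {
                    // Later batches: append below the existing rows, without headings.
                    $sheet->fromModel($users, null, 'A' . $nextRow, false, false);
                    $nextRow += count($users);
                }
            });
    });
})->store('xlsx', storage_path('excel/exports'), true);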
You could also try chunking your query:
$users = User::select('id','name','email','created_at','dob','attending','country','postcode','accesscode','arrival_time','phone','gender','character','type');

$users->chunk(100, function($chunk) use ($sheet) {
    $sheet->rows($chunk->toArray());
});
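For context, here is a rough sketch (with a placeholder chunk size and a shortened column list) of how that chunked query could sit inside the original Excel::create() call; rows() appends an array of row arrays, so each chunk is converted with toArray() first.

Excel::create('UserExport', function ($excel) {
    $excel->sheet('Users', function ($sheet) {
        // Heading row first, then data in batches.
        $sheet->appendRow(array('id', 'name', 'email', 'created_at'));

        // Read the users in small batches instead of loading the whole table with ->get().
        User::select('id', 'name', 'email', 'created_at')
            ->orderBy('id')
            ->chunk(500, function ($users) use ($sheet) {
                // Append this batch below whatever is already on the sheet.
                $sheet->rows($users->toArray());
            });
    });
})->store('xlsx', storage_path('excel/exports'), true);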
This is an example that worked for me.
$orders = Order::withTrashed()->where('status', 'done');

Excel::create('Report', function($excel) use ($orders) {
    $excel->sheet('report', function($sheet) use ($orders) {
        $sheet->appendRow(array(
            'id', 'landing', 'departure', 'phone_id'
        ));

        $orders->chunk(100, function($rows) use ($sheet) {
            foreach ($rows as $row) {
                $sheet->appendRow(array(
                    $row->id, $row->landing, $row->departure, $row->phone_id
                ));
            }
        });
    });
})->download('xlsx');
Looks good @narek-king :)
I have tried this solution, but it does not work for me.
I have 60 columns and 30,00 rows. :smile:
Allowed memory size of 134217728 bytes exhausted (tried to allocate 16384 bytes)
If it is related to memory_limit, you can decrease the chunk size. Try something lower than 100.
@narek-king, I have tried with chunk sizes of 100 and 50, and increased the memory_limit and request timeout, but it still does not work.
$records = CallDataRecord::query();

Excel::create('Call Data Records', function ($excel) use ($records) {
    $excel->sheet('Sheet', function ($sheet) use ($records) {
        $sheet->appendRow(array(
            'i_account',
            'i_protocol',
            'duration',
            'cli',
            'cld',
            'cld_in',
            'description',
            'cli_in',
            'billed_duration',
            'p_asserted_id',
            'remote_ip',
            'connect_time',
            'disconnect_time'
        ));

        foreach ($records->cursor() as $row) {
            $sheet->appendRow(array(
                $row->i_account,
                $row->i_protocol,
                $row->duration,
                $row->cli,
                $row->cld,
                $row->cld_in,
                $row->description,
                $row->cli_in,
                $row->billed_duration,
                $row->p_asserted_id,
                $row->remote_ip,
                $row->connect_time,
                $row->disconnect_time,
            ));
        }
    });
})->download('csv');
This might help you out: you can use the cursor() method to reduce memory allocation. But I would like to know what to do for 10K-100K records; it takes so much time to process the data.
I have 2 million entries. How should I do it?
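For exports of that size, one option worth sketching (not specific to this library) is to move the work out of the HTTP request into a queued job that stores the file on disk, so the request timeout no longer matters and the worker process can run with its own memory and time limits. This is a rough sketch assuming Laravel's standard queue setup; the job name, column list and chunk size are placeholders. Note that queueing does not by itself reduce memory: PHPExcel still builds the whole workbook in memory, so splitting the data into several files may still be necessary at millions of rows.

<?php

namespace App\Jobs;

use App\User;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Maatwebsite\Excel\Facades\Excel;

class ExportUsers implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    public function handle()
    {
        Excel::create('UserExport', function ($excel) {
            $excel->sheet('Users', function ($sheet) {
                $sheet->appendRow(array('id', 'name', 'email'));

                // Read the table in batches so the database side stays cheap;
                // the workbook itself is still held in memory by PHPExcel.
                User::select('id', 'name', 'email')
                    ->orderBy('id')
                    ->chunk(1000, function ($users) use ($sheet) {
                        foreach ($users as $user) {
                            $sheet->appendRow(array($user->id, $user->name, $user->email));
                        }
                    });
            });
        })->store('csv', storage_path('excel/exports'));
    }
}

// Dispatch from a controller; the finished file can then be downloaded separately:
// dispatch(new \App\Jobs\ExportUsers());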