Laravel 5.2, Maatwebsite Excel 2.1
Memory footprint on load should be within reason.
I'm currently working on a project where I upload a 38 MB file, and it uses around 9 GB of system memory just to open the file. This seems a bit extreme. Are there any ways I can limit the memory usage? I am literally just uploading the data into the database. Chunking results in exactly the same usage.
Excel::load($filePath, function($reader) {
// uses 9GB just to get here
});
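For reference, the chunked import mentioned above would look roughly like this in Laravel Excel 2.x (a minimal sketch; the batch size of 250 and $filePath are placeholders), and per the report it showed the same memory usage:

Excel::filter('chunk')->load($filePath)->chunk(250, function ($results) {
    // $results is one batch of rows
    foreach ($results as $row) {
        // insert $row into the database here
    }
});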
Also wanted to note: the file has no calculations or formatting; it's just numerical and textual data.
As suggested to others, try using raw PHP to read your file (given that it's a CSV), or use http://csv.thephpleague.com/examples/
That's the only thing I can offer, as it's the solution I resorted to myself; hope it helps.
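If the file really is a CSV, a minimal sketch of streaming it with league/csv (assuming version 9.x; $filePath stands in for the uploaded file):

use League\Csv\Reader;

$csv = Reader::createFromPath($filePath, 'r');
$csv->setHeaderOffset(0); // treat the first row as column names

foreach ($csv->getRecords() as $record) {
    // $record is an associative array for one row; insert it into the database here
}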
We're having the same issue: we're running out of memory exporting 8K contacts. The system has 16 GB and PHP has access to all of it, so that's rather unexpected.
@NmExHunTeRz unfortunately, .xlsx is a requirement in this case.
From the little testing I did, it looks like PHPExcel loads an entire workbook into memory.
Might want to try something like box/spout. It will stream the contents of a workbook, using as little as 2-8 MiB of RAM. Keep in mind though that the larger the workbook, the more CPU-intensive the process becomes.
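A minimal sketch of streaming an .xlsx with box/spout (assuming Spout 2.x, where each row comes back as a plain array; $filePath is a placeholder):

use Box\Spout\Reader\ReaderFactory;
use Box\Spout\Common\Type;

$reader = ReaderFactory::create(Type::XLSX);
$reader->open($filePath);

foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        // $row is an array of cell values; insert it into the database here
    }
}

$reader->close();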
@kherge thanks for the link to Spout, it looks like a nice library, although it's a shame that it doesn't seem to support column formatting. I guess loading a template might be an option.
@garygreen this is open source! If you want it to support column formatting, make it support column formatting :)
@mcblum Sorry, I don't use Laravel-Excel anymore. It's an unneeded abstraction with lots of problems, so I just use PHPExcel directly now. You're welcome to add the column formatting though, if you like :)
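For anyone going the plain PHPExcel route, a hedged sketch of its memory-reduction options when you only need cell values (cell caching to php://temp plus read-data-only mode; $filePath and the 64MB cache size are placeholders, and values are still parsed eagerly, so this reduces rather than eliminates the footprint):

// Cache cell objects in php://temp instead of keeping them all in memory
PHPExcel_Settings::setCacheStorageMethod(
    PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp,
    array('memoryCacheSize' => '64MB')
);

$reader = PHPExcel_IOFactory::createReader('Excel2007'); // .xlsx reader
$reader->setReadDataOnly(true); // skip styles and formatting, keep cell values only

$workbook = $reader->load($filePath);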