Laravel-excel: [QUESTION] Queueing big csv imports using SQS "413 Request Entity Too Large"

Created on 25 Nov 2020 · 5 Comments · Source: Maatwebsite/Laravel-Excel

Prerequisites

Versions

  • PHP version: 7.4
  • Laravel version: 7.2
  • Package version: 3.1

Description

When using WithChunkReading & ShouldQueue on SQS, importing large CSV files results in a "413 Request Entity Too Large" error from AWS. This seems to be because the entire spreadsheet is being passed into the queued job. Is there a way to not queue the initial import and only queue the chunks? Alternatively, is there a way for the import to fetch the spreadsheet from its path when the initial job is handled, so that nothing is passed in the job's payload?

question


All 5 comments

We don't pass the entire spreadsheet into the jobs; we reopen the file in each chunk.

Are you perhaps getting that error while trying to upload it?

No, the file has already been uploaded by that point; it's when I execute the initial import, shown here, that I get the error.

Excel::import(new UsersImport($project), 'tmp/spreadsheet.csv');

I have this set on my import:

use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\OnEachRow;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class UsersImport implements OnEachRow, WithHeadingRow, WithChunkReading, ShouldQueue
{
    ...

    public function chunkSize(): int
    {
        return 500;
    }
}

I don't use SQS, so I can't debug it for you. If you are able to find the root cause and how we could fix it, feel free to open a PR with a fix. If you need help on a commercial support basis, send us an e-mail (https://laravel-excel.com/commercial-support).

It was the chunk size: all of the chunk jobs are created in one post to SQS. Increasing the chunk size reduces the payload sent to SQS, which has a 256 KB limit.

Thanks for the quick responses.
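
A minimal sketch of the fix described above, assuming the conventional App\Imports namespace and an import shaped like the class posted earlier; the onRow body and the 5000 chunk size are illustrative and should be tuned to the file's row count and available worker memory:

<?php

namespace App\Imports;

use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\OnEachRow;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use Maatwebsite\Excel\Row;

class UsersImport implements OnEachRow, WithHeadingRow, WithChunkReading, ShouldQueue
{
    private $project;

    public function __construct($project)
    {
        $this->project = $project;
    }

    public function onRow(Row $row)
    {
        // WithHeadingRow keys each row array by its column headings.
        $data = $row->toArray();

        // ... persist $data against $this->project
    }

    // Fewer, larger chunks mean fewer chunk jobs serialized into the
    // queue payload, which keeps the message under the SQS 256 KB limit.
    public function chunkSize(): int
    {
        return 5000;
    }
}

The dispatch call from earlier in the thread stays the same; only the chunk size changes.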

That explains it, glad you figured it out!
