I am running the Directus app and API on Docker using the latest images. I have created my own Dockerfile to copy in custom PHP settings that allow large file uploads (instead of the default 2MB); a sketch of that setup follows the settings below.
Modified PHP settings:
upload_max_filesize=500M
post_max_size=500M
memory_limit=500M
max_execution_time=1000
Modified NGINX conf:
client_max_body_size 500M;
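For reference, a minimal sketch of how I'm baking these overrides into the image (the base image tag and file paths here are assumptions for illustration, not my exact Dockerfile):

FROM directus/api:latest

# Hypothetical override file containing the upload_max_filesize, post_max_size,
# memory_limit and max_execution_time values listed above
COPY php-uploads.ini /usr/local/etc/php/conf.d/uploads.ini

# Hypothetical nginx config containing client_max_body_size 500M;
COPY default.conf /etc/nginx/conf.d/default.conf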
However, with the above settings I can upload a 20MB video file, but any file just over 100MB in size always returns an error (even though it should allow up to 500MB).
I cannot seem to find any log information to determine the cause.
A file that is >100MB but less than 500MB should be uploaded successfully to the file library.
The large file returns a server error (500) once the upload has completed. See attached image.

@wolfulus @WellingGuzman — is this an API, Docker, or server issue?
@benhaynes I can't tell where the issue is.
Hey @shartley76, can you tell us if there are any logs in the API logs directory? Also, in your developer tools, what's the response on that failed request?
Hi, I've attached a screenshot from developer tools. The first error, I think, has to do with the thumbnail of the video; there is a CORS error, which is strange since I'm able to upload a smaller file with no issues. I've expanded out the final object error. Looks like a timeout of some sort.

Hey @shartley76, can you go to the network tab, find the failed request (the one with status 500), and screenshot or copy-paste the response?
Also look in the logs directory of the API to see if there's any error being logged.
@shartley76 Sorry, I didn't explain: that's a log from the application, and I can't tell what's really happening on the server side that throws these errors. I need to know what's in the logs directory and the response of the request in the network tab.
Hi @WellingGuzman, on the network tab I don't have a response, just an internal server error 500 listed on the request. The response tab is empty, no data.
The logs directory on the container at /var/www/html/logs is empty.
The request doesn't seem to be getting to the API. Are there any errors in your nginx logs?
I can't see anything in the logs, they're empty
For accessing the files resource there are two errors. The first is:
Access to XMLHttpRequest at 'http://localhost:7000/_/files' from origin 'http://localhost:8000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
I'm not sure why I'm getting that error when I can successfully upload a smaller file to files?
The second error is:
Error: Network Error at e.exports (http://localhost:8000/js/chunk-vendors.9bc4762b.js:25:62397) at XMLHttpRequest.p.onerror (http://localhost:8000/js/chunk-vendors.9bc4762b.js:74:20534)
Also @WellingGuzman, checking the network timing on the files request (the 500 server error response), I can see that it does start to receive a response:

Hi @WellingGuzman, were you able to upload a large file (100MB plus) with a Docker deployment, or standalone? If so, would you mind sharing your Dockerfile and/or config settings? Then I can just verify I have everything set up correctly. Thanks.
Hey @shartley76, it was hard for me to know what the error was without seeing the logs in the API.
I created a dummy 100MB file, and I am now getting an error on bigger files; I don't know why I wasn't before. The issue is the code trying to verify that a huge string is not a URL.
I am working on a fix right now. We would need to replace this function: https://github.com/directus/api/blob/master/src/helpers/file.php#L299 with something that doesn't need to read the whole thing, or doesn't break trying to.
Either way, filter_var must be replaced, because it's limited to ASCII, so not all URLs will be recognized as valid.
From Docs: http://php.net/manual/en/filter.filters.validate.php
Beware a valid URL may not specify the HTTP protocol http:// so further validation may be required to determine the URL uses an expected protocol, e.g. ssh:// or mailto:. Note that the function will only find ASCII URLs to be valid; internationalized domain names (containing non-ASCII characters) will fail.
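To illustrate that ASCII limitation, a quick standalone sketch (my own example, not code from the Directus repo):

<?php
// An ordinary ASCII URL is accepted...
var_dump(filter_var('http://example.com', FILTER_VALIDATE_URL)); // string(18) "http://example.com"
// ...but an internationalized domain name is rejected outright
var_dump(filter_var('http://münchen.de', FILTER_VALIDATE_URL));  // bool(false)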
Hey @shartley76, I believe I have fixed what was triggering the 500 internal error. On my end I am no longer experiencing this issue with big files.
I based the validation on regex instead of filter_var. I ported the patterns/logic from this JS library: https://github.com/segmentio/is-url/blob/master/index.js.
Can you confirm it's working for you on master branch?
Ref: https://github.com/directus/api/commit/81582e182e7422c8190010cf448fe7109b928dec
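For anyone following along, here is a rough PHP sketch of a regex-based check in the spirit of is-url (illustrative only, not necessarily the exact code in that commit; the early length cutoff is my own assumption):

<?php
function is_valid_url($value)
{
    if (!is_string($value)) {
        return false;
    }

    // A base64-encoded file body can be hundreds of MB and is never a URL,
    // so bail out early instead of running regexes over the whole payload.
    if (strlen($value) > 2083) {
        return false;
    }

    // protocol://rest (protocol optional, as in segmentio/is-url)
    if (!preg_match('#^(?:\w+:)?//(\S+)$#', $value, $matches)) {
        return false;
    }

    $afterProtocol = $matches[1];

    // localhost[:port][/path] or host.tld[/path]
    return preg_match('/^localhost[:?\d]*(?:[^:?\d]\S*)?$/', $afterProtocol) === 1
        || preg_match('/^[^\s.]+\.\S{2,}$/', $afterProtocol) === 1;
}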
Hey @WellingGuzman I tried with that patch and I seem to still be getting a 500 when uploading this test file: http://ipv4.download.thinkbroadband.com/200MB.zip.
@computerwizjared can you show me the full log? I am going to try using the 200MB file and see if I can reproduce it.
@computerwizjared Did you try uploading the file via URL? I found an issue uploading non-image files via URL; uploading files via URL only works with images at the moment. That error message should be friendlier.
@WellingGuzman I am uploading the file using the built-in Directus App method... not sure if that is via URL or not. And after switching to the development environment I am getting a 200 OK, but the UI still shows a red error like before. I do have the PHP memory_limit set to 450M in my Dockerfile, as well as the other upload size settings.
Log Message:
[11-Feb-2019 21:46:47] WARNING: [pool www] child 230 said into stderr: "[11-Feb-2019 16:46:47 America/New_York] PHP Fatal error: Allowed memory size of 471859200 bytes exhausted (tried to allocate 279620300 bytes) in /var/www/html/src/endpoints/Files.php on line 79"
Maybe this is a different issue pertaining to my configuration? It's odd that I'm trying to upload a 200MB file and it is giving me an out of memory error. I feel like it shouldn't take more memory than the size of the file, but I may be wrong. Thank you!
I am going to properly fix this bug. I didn't think it was going to be a problem, but I am going to reduce passing the file content around and encoding/decoding the base64 content.
It's not related to your configuration, but to the API handling big files poorly, keeping this huge data in memory instead of saving it to disk as soon as possible to avoid the high memory usage.
I will let you know as soon as I've got it running as expected.
Thanks for the log.
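As a back-of-the-envelope check of why the numbers in that fatal error add up (my own math, not taken from the Directus code):

<?php
// Assumption: at some point the API holds both the raw bytes and a base64
// copy of the upload in memory at the same time.
$fileSize    = 200 * 1024 * 1024;             // 209,715,200 bytes (the 200MB.zip)
$base64Size  = (int) ceil($fileSize / 3) * 4; // 279,620,268 bytes once encoded -- roughly the 279,620,300 in the log
$memoryLimit = 450 * 1024 * 1024;             // 471,859,200 bytes (memory_limit = 450M)

var_dump($fileSize + $base64Size > $memoryLimit); // bool(true): both copies together blow the limit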
@WellingGuzman are you able to push any changes for the API to the directus/api Docker registry? I'm running Docker (Linux containers) on Windows locally. Many thanks!
@shartley76 I couldn't push any changes, as I am not working with the docker image. @WoLfulus may be able to help you.
I will be working on a proper fix to avoid huge files crashing the server.
Hey @shartley76, I ran into trouble trying to test Directus changes with Docker, and the best way I found was to just grab the modified files and use the Dockerfile ADD command to put them into the image, then build and run it.
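Something along these lines (a sketch only; which files you ADD depends on what you patched, and the paths below are just the ones mentioned earlier in this thread):

FROM directus/api:latest

# Overwrite the stock files with the patched versions from master
ADD src/helpers/file.php /var/www/html/src/helpers/file.php
ADD src/endpoints/Files.php /var/www/html/src/endpoints/Files.php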
Thanks @computerwizjared, good idea, will do.
HI @WellingGuzman, any update on this one? Many thanks, Simon
@shartley76 FYI, Welling left the project. For now we solved this by upping the RAM on the VM that runs Directus. It seems the issue is that a file upload uses double or more the file's size in memory.
Thanks @computerwizjared, I'll grab the latest from master and retry as you say.
Hi @computerwizjared, I tried running the latest API code from master in a Docker container with 1GB memory. I'm trying to upload a 144MB video file, but I'm still getting an error when uploading.
I have the following config set in php.ini and default.conf:
upload_max_filesize=500M
post_max_size=1G
memory_limit=500M
max_execution_time=1000
Is there something I'm missing in terms of config?
@shartley76 you're right, I'm getting a 502 gateway timeout. Upping the memory limit helped with the one error I was getting before (with a slightly smaller file), but now it's a different one.
@computerwizjared do you think this bug will be able to get assigned to someone soon? It has been in the high-priority bug triage for a while now and is a blocker for us at the moment.
@shartley76 I don't have any idea. I'm not affiliated with Directus, just another user. We're having this issue too, and all we're doing right now is uploading to the server directly and then editing the database manually. Sorry!
@shartley76 and @computerwizjared — we'll try to get this resolved asap, no specific timing. The more info we have the better/faster.
@hemratna + @theharshin — maybe we can look into this one?
Sure @benhaynes. Will look into the issue this weekend. Thanks for the inputs @shartley76 @computerwizjared 🙂
Still getting this error with API version 2.1.1. Is this issue officially fixed or still in the pipeline?
https://www.loom.com/share/b9c1f5f1d58e40a4984bca2d8da2a149
I could "fix" it for now by increasing the settings like suggested by @shartley76 and it works now for somewhat bigger files. But it would be amazing if there is a more stable version. So I ask myself it this issue still exists?
upload_max_filesize=500M
post_max_size=1G
memory_limit=500M
max_execution_time=1000
I believe that this issue is caused by the extra "processing" being done using up more memory than the file itself. Ideally Directus should allow for uploads up to the php.ini settings, but clearly that's not the case and users need to "pad" them higher for larger files.
@bjgajjar @hemratna — any thoughts on where Directus might be eating up more memory, or why we need to increase the values so much higher than the actual uploaded file size? Can we at least isolate which of these settings is the important one?
Hi @benhaynes, are there any updates on this? Should I re-test with the latest API, 2.0.21? It looks like there are still some issues as per your last post on this. Currently, with the above settings, the video file limit seems to be about 110MB. I can re-test with v2.0.21 to see if this has now been resolved.
There's never a problem with testing something again @shartley76, feel free to give it a go and see.
@shartley76
In the latest version, this issue is not reproducible. I am not able to replicate it on my end.
Below are my configurations:
The php.ini file has memory_limit set to 100M.
upload_max_filesize and post_max_size are set to 1000M via the .htaccess file (to avoid the validation from the app).
I tried to upload a file with a size of 800MB and was able to upload it.
May I have more details on your configuration?
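For reference, the .htaccess overrides would look something like this (this assumes Apache with mod_php; under php-fpm the values would have to go into php.ini or the pool config instead):

php_value upload_max_filesize 1000M
php_value post_max_size 1000M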
@shartley76 keep in mind that the latest version of the API is 2.2.2, not 2.0.21 🙂
Side note, we are using Docker and the last available release for that was 2.0.18. We have been manually uploading any files over a certain size and editing the database since we are stuck on 2.0.18.
The Docker refactor is in progress now so that you can get on the latest version; @WoLfulus should have an update soon!