A file with a size of 9 GB should be downloadable.
Instead, a 0-byte file is created; there is no error message in the UI, but an error is logged.
Operating system:
Ubuntu 16.04
Web server:
Apache
Database:
MySQL
PHP version:
PHP 7
Nextcloud version: (see Nextcloud admin page)
Nextcloud 11.0.1
Updated from an older Nextcloud/ownCloud or fresh install:
Updated from ownCloud 9 -> Nextcloud 10 -> Nextcloud 11 alpha
Where did you install Nextcloud from:
Git
Signing status:
List of activated apps:
The content of config/config.php:
Are you using external storage, if yes which one: local/smb/sftp/...
local
Are you using encryption: yes/no
no
Are you using an external user-backend, if yes which one: LDAP/ActiveDirectory/Webdav/...
no
Browser: Chrome
Operating system: MacOS 10.12
Nextcloud log
{"reqId":"He91ouATEMBQ7rkEw65e","remoteAddr":"127.0.0.1","app":"PHP","message":"stream_copy_to_stream() expects parameter 3 to be integer, string given at \/media\/data\/www\/nextcloud\/3rdparty\/sabre\/http\/lib\/Sapi.php#78","level":3,"time":"October 11, 2016 20:58:34","method":"GET","url":"\/remote.php\/webdav\/Media-Fernsehaufnahmen\/arte_HD\/arte_HD_Der_Tag_der_Kr%C3%A4hen20160615_135900.ts","user":"name","version":"9.2.0.4"}
{"reqId":"DKWW6ajns8HKX8pETQS+","remoteAddr":"127.0.0.1","app":"PHP","message":"stream_copy_to_stream() expects parameter 3 to be integer, string given at \/media\/data\/www\/nextcloud\/3rdparty\/sabre\/http\/lib\/Sapi.php#78","level":3,"time":"October 11, 2016 20:58:58","method":"GET","url":"\/remote.php\/webdav\/Media-Fernsehaufnahmen\/arte_HD\/arte_HD_Die_wahre_Macht_des_Vatikan_(2_2)20160207_144900.ts","user":"name","version":"9.2.0.4"}
Update: According to https://github.com/fruux/sabre-http/pull/61, the issue is located somewhere else: the file size seems to be invalid (NaN), which leads to this error. If I apply that patch, it works, but as noted in the comments it does not solve the root cause.
Update 2: The file breaks at exactly 4 GB (the real size is 9.1 GB). So there must be an issue somewhere.
Can anybody please test large file downloads in a 32-bit environment with PHP 7? Thanks!
Well, you said you use 32-bit; the max int there is 2147483647.
The full range from min to max covers 2^32 = 4294967296 values, which is exactly 4 GiB.
So I'd say switch to 64-bit if you want to handle such huge files.
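The arithmetic behind this can be checked quickly; here is a small Python sketch of the 32-bit limits mentioned above (illustration only, not Nextcloud code):

```python
# 32-bit signed integer limits, as reported by PHP_INT_MAX on a 32-bit PHP build
INT32_MAX = 2**31 - 1       # 2147483647
INT32_MIN = -(2**31)        # -2147483648

# the total number of representable values covers exactly 4 GiB of byte offsets
span = INT32_MAX - INT32_MIN + 1
print(span)                 # 4294967296
print(span == 4 * 1024**3)  # True: exactly 4 GiB
```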
I guess the problem is that, before your cast to int (the intval from the patch), the size is an invalid string. Casting to int seems to clamp to the max int when the value is too high.
Note that the following three numeric strings all result in the same integer:
https://3v4l.org/OMMWp
var_dump((int) '9223372036854775807');
var_dump((int) '9223372036854775808');
var_dump((int) '92233720368547758081231');
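The clamping described above can be modeled in a few lines. This is a rough Python sketch mimicking the observed behavior from the 3v4l output, not PHP's exact cast semantics, and `int_cast_clamped` is a hypothetical name:

```python
def int_cast_clamped(s: str, int_max: int = 2**63 - 1) -> int:
    """Model of PHP's (int) cast on oversized numeric strings:
    values beyond the platform max clamp to that max."""
    return min(int(s), int_max)

for s in ('9223372036854775807',
          '9223372036854775808',
          '92233720368547758081231'):
    print(int_cast_clamped(s))   # all three print 9223372036854775807
```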
So not sure if there is anything we can do.
There seems to be a general issue with files bigger than 2GB on 32-bit systems. :see_no_evil:
There has been a general discussion about that topic, stating that either floats may be used as an alternative (which are essentially 64 bits wide, even on the 32-bit ARM platform) or numeric strings are an option:
https://github.com/fruux/sabre-dav/issues/408
var_dump((int) '9223372036854775807');
var_dump((int) '9223372036854775808');
var_dump((float) '92233720368547758081231');
results in:
int(9223372036854775807)
int(9223372036854775807)
float(9.2233720368548E+22)
... but as far as I understood, floats are not an option for sabre/dav, so it possibly won't change. Nextcloud should catch the error and show a warning before the download instead, I guess.
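For context on the float route: an IEEE 754 double keeps integers exact up to 2^53, so any realistic file size, including the 9 GB file from the log above, survives the round trip. A quick Python sketch:

```python
# doubles carry 53 bits of exact integer precision, even on 32-bit hardware
size = 9085604216             # the 9 GB file size from the debug log
assert float(size) == size    # represented exactly as a double
print(2**53)                  # 9007199254740992: exact integers up to ~9 PB
```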
Can you try, whether https://github.com/nextcloud/server/pull/1740 fixes the issue?
And by the way, you should not run any Nextcloud 11 alpha version yet. We didn't make any release for that Oo
No, that does not seem to solve it. The download is still 0 bytes. Log:
{"reqId":"cIm0o8yJYIzbld34vhUX","remoteAddr":"127.0.0.1","app":"PHP","message":"stream_copy_to_stream() expects parameter 3 to be integer, string given at \/media\/data\/www\/nextcloud\/3rdparty\/sabre\/http\/lib\/Sapi.php#78","level":3,"time":"October 15, 2016 11:24:15","method":"GET","url":"\/remote.php\/webdav\/Media-Fernsehaufnahmen\/arte_HD\/arte_HD_Der_Tag_der_Kr%C3%A4hen20160615_135900.ts","user":"me","version":"9.2.0.4"}
-> I will investigate and test further by the end of tomorrow.
The proposed fix unfortunately did not help. So is there really no chance to download files > 4 GB in a 32-bit environment? I am not into the internals of NC, but I think a download is "streamed" in some kind of chunks, so the file size should not matter for the download itself, correct? Is it necessary to know the actual file size (as a variable) before downloading?
Well, that seems to be exactly the problem. Before streaming, the size is given. But in your case the size is not a number but a string (because it is bigger than the max int), and therefore the error occurs.
The problem is that the log is not really helpful. Let me check where this comes from and whether you can add some debug calls for us that help us find the real issue.
Can you make the following changes, try a download, then revert the changes again (otherwise your log will grow very quickly), and post the log here?
File 3rdparty/sabre/dav/lib/DAV/CorePlugin.php
find (line 795):
$propFind->handle('{DAV:}getcontentlength', [$node, 'getSize']);
before add:
\OC::$server->getLogger()->error('Debugging propFind(): ' . json_encode($node->getSize()));
File apps/dav/lib/Connector/Sabre/Node.php
find (line 201):
return $this->info->getSize();
before add:
\OC::$server->getLogger()->error('Debugging getSize(): ' . json_encode($this->info->getSize()));
File lib/private/Files/FileInfo.php
find:
return isset($this->data['size']) ? $this->data['size'] : 0;
before add:
\OC::$server->getLogger()->error('Debugging FileInfo::getSize(): ' . json_encode($this->data['size']));
Here are the Results:
File 1, exact file size is 9,085,604,216 bytes:
"message":"Debugging propFind(): 9085604216"
"message":"Debugging FileInfo::getSize(): 9085604216"
"message":"Debugging getSize(): 9085604216"
"message":"Debugging FileInfo::getSize(): 9085604216"
--> is correct !
File 2, exact file size is 9,585,365,368 bytes:
"message":"Debugging propFind(): 9585365368"
"message":"Debugging FileInfo::getSize(): 9585365368"
"message":"Debugging getSize(): 9585365368"
"message":"Debugging FileInfo::getSize(): 9585365368"
--> is correct !
Can you try whether https://github.com/nextcloud/server/pull/1902/files helps?
Sorry, neither #1890 nor #1902 solves this.
PHP_INT_SIZE is 4; I double-checked this, just in case ...
Just for your information: the same issue occurs on the latest stable version of ownCloud as well as on the daily build (which includes the mentioned patches).
I did some more tests on 11.0.1 now.
First of all, the problem still exists. It may be okay if this issue cannot be solved for a 32-bit OS, but this should be documented somewhere.
Now, if I download a very big folder (65 GB), the download seems to work because of the .tar archive that is created.
So, can't you implement an option like this:
IF (32Bit-Machine) AND (File > 4 GB)
THEN
provide file download as zip/tar archive (just like you do with folders)
ELSE
provide file download uncompressed
END
-> As an option, configurable in the config.php.
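The proposed rule could be sketched like this (hypothetical helper names; `needs_archive` is not a Nextcloud API):

```python
import sys

FOUR_GIB = 4 * 1024**3

def needs_archive(file_size: int,
                  is_32bit: bool = (sys.maxsize == 2**31 - 1)) -> bool:
    """Mirror the 'IF (32-bit machine) AND (file > 4 GB)' rule above:
    wrap the download in an archive only when the size cannot be
    represented as a platform integer."""
    return is_32bit and file_size > FOUR_GIB

# on a 32-bit build, a 9 GB file would be wrapped, a 1 GB file would not
print(needs_archive(9 * 1024**3, is_32bit=True))   # True
print(needs_archive(1 * 1024**3, is_32bit=True))   # False
```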
If this is not possible, please close and state "won't fix". Thanks!
What do you think ? @LukasReschke
-> As an option, configurable in the config.php.
If we do this it's definitely not an option.
IF (32Bit-Machine) AND (File > 4 GB)
Maybe the easiest approach is to do this completely in the frontend, because otherwise it would also affect WebDAV, and then we would run into issues: you upload a file in plain format on one device, and on another device the file is downloaded as a tar archive -> that would be super confusing.
On the other side: how does WebDAV work on 32-bit systems if the file is bigger than 4 GB?
@MorrisJobke: You are right. I tried downloading the file via WebDAV; that does not work either.
So to recap: if the server is 32-bit, no files above 4 GB can be handled directly, only if they are compressed transparently. I will try again on a 64-bit server as soon as I find some time.
I think the compressed file has to be smaller than 4 GB after compression, too.
A further approach is to split the archive into parts smaller than 4 GB, each containing only a valid set of files. That way the download could be completed.
A further question: is there a PHP memory limitation when creating that zip file?
I do not know what the repercussions are, but for me it worked:
In Sapi.php I removed the $contentLength argument, changing
stream_copy_to_stream($body, $output,$contentLength);
to
stream_copy_to_stream($body, $output);
found in this thread -> https://github.com/owncloud/core/issues/23788
I'm on a Raspberry Pi 3 with PHP 7.1 and Nextcloud 11.0.2.
I had this problem:
stream_copy_to_stream() expects parameter 3 to be integer, string given at /home/things/3rdparty/sabre/http/lib/Sapi.php#78
@alfael In the head revision of sabre, there is a typecast that avoids this warning. However, as the cast is to int, the file is capped at 2 GB.
If I remove the parameter as you proposed, the file is broken at exactly 4 GB, so this does not resolve the issue.
--> On a 32-bit system, big files are not possible, and as far as I understood, this won't be fixed (see above).
Got it working with following patch:
/www/nextcloud/3rdparty/sabre/http/lib/Sapi.php@78:
while (!feof($body)) {
// stream_copy_to_stream($body, $output, $contentLength);
fwrite($output, fread($body, 8192));
}
Instead of copying the entire file at once, it copies chunks of 8 KB (8192 bytes).
A bit crude, but the concept works :)
Based on: http://php.net/manual/en/function.stream-copy-to-stream.php#98119
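The patch above amounts to a length-free chunked copy. The same idea, sketched in Python rather than the actual Sapi.php code:

```python
import io

def copy_stream(src, dst, chunk_size=8192):
    """Copy until EOF in fixed-size chunks, like the fread/fwrite loop
    in the patched Sapi.php: no total length is passed, so 32-bit
    integer limits on the length argument never come into play."""
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)

src = io.BytesIO(b'x' * 20000)
dst = io.BytesIO()
copy_stream(src, dst)
print(len(dst.getvalue()))   # 20000: copied completely, in three chunks
```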
Perfect @rikmeijer @nickvergessen, this fixed the issue. I will :+1: the change, as I do not see any side effect. Please vote for this change in order to get the 32-bit download right.
Filed a php bug https://bugs.php.net/bug.php?id=74395
Oh nice work @rikmeijer
Why do you use 32bit php in the first place?
Most SBCs such as Raspberry Pi and Odroid devices still rely on 32-bit, so this change is a really big thing for those.
YEP! Just stumbled into this right now. Got really sad :(
just released https://github.com/fruux/sabre-http/releases/tag/4.2.3 which contains a workaround for the 32bit file size problems
I just checked on daily: big downloads on 32-bit work perfectly now (a 6.8 GB file in my test case), the md5 sum matches, and network speed did not slow down (tested in my local LAN).
Thank you - let's consider this fixed with Nextcloud 13 :100: :+1:
I use Nextcloud 12.0.3 on a 32-bit Ubuntu 16.04 (from a web hoster) with PHP 7.0.22.
I am facing the issue that the zip generated by the download button crashes at around 4 GB.
I also tried the sabre-http fix (copied the lib to nextcloud/3rdparty/sabre/http/lib) without success :cry:
What did I do wrong?!