After upgrading from NC11 to NC12, I noticed incorrect folder and file sizes. All files larger than 2 GB show as "pending". I checked the oc_filecache table and the sizes look fine there. I truncated oc_filecache and executed files:scan --all again; it was refilled with the exact sizes, but the Files app still renders them incorrectly. There are no errors in the NC logs, and none in the Apache logs either.
Attach an external storage (local folder) with big files in it. Browse it in the web UI and the folder size will be incorrect or pending. Navigate to a subfolder that contains more than 2 GB of data, or at least one file larger than 2 GB, and you will see "pending" in the file size column.
In my case a 3.9 GB file (4143738495 bytes) is shown with a "pending" size. The parent folder also shows a "pending" size. The same file has size 4143738495 in oc_filecache, so it is indexed correctly.
Should see 3.9G
See "pending"
Operating system: OSMC (Debian 8.8)
Web server: Apache/2.4.10 (Debian)
Database: mysql Ver 14.14 Distrib 5.5.55, for debian-linux-gnu (armv7l) using readline 6.3
PHP version: php5 with fpm (PHP 5.6.30-0+deb8u1)
Nextcloud version: NC 12.0.0
Updated from an older Nextcloud/ownCloud or fresh install: Updated from NC 11.0.3
Where did you install Nextcloud from: from the official website
Signing status:
No errors have been found.
List of activated apps:
Enabled:
Nextcloud configuration:
{
"system": {
"instanceid": "ocdhcp0sm80x",
"passwordsalt": "REMOVED SENSITIVE VALUE",
"secret": "REMOVED SENSITIVE VALUE",
"trusted_domains": [
"XXX.XXX.X.XXX",
"X.XX.XXX.XX",
"XXXXXXXXX.XX"
],
"preview_libreoffice_path": "\/usr\/lib\/libreoffice\/program\/soffice.bin",
"datadirectory": "\/var\/www\/NCData",
"overwrite.cli.url": "https:\/\/XXX.XXX.X.XXX",
"dbtype": "mysql",
"version": "12.0.0.28",
"dbname": "NextCloud",
"dbhost": "127.0.0.1",
"dbtableprefix": "oc_",
"dbuser": "REMOVED SENSITIVE VALUE",
"dbpassword": "REMOVED SENSITIVE VALUE",
"logtimezone": "UTC",
"installed": true,
"memcache.local": "\OC\Memcache\APCu",
"allow_user_to_change_display_name": true,
"enable_avatars": false,
"quota_include_external_storage": false,
"updater.release.channel": "beta",
"appstore.experimental.enabled": true,
"mail_from_address": "test",
"mail_smtpmode": "php",
"mail_domain": "xxxxxxxxx.xx",
"mail_smtpsecure": "ssl",
"mail_smtpauthtype": "LOGIN",
"mail_smtpauth": 1,
"mail_smtphost": "xxxxl.xxx",
"mail_smtpport": "465 ",
"mail_smtpname": "REMOVED SENSITIVE VALUE",
"mail_smtppassword": "REMOVED SENSITIVE VALUE",
"loglevel": 0,
"maintenance": false,
"theme": "",
"htaccess.RewriteBase": "\/",
"debug": true,
"updater.secret": "REMOVED SENSITIVE VALUE"
}
}
Are you using external storage, if yes which one: local
Are you using encryption: no
Are you using an external user-backend, if yes which one: No
Browser: 58.0.3029.110 (64-bit)
Operating system: Windows 7
Web server error log
no errors
Nextcloud log
no errors
Browser log
no errors
We can reproduce this bug on Nextcloud 12.0.0 with a .zip file that is 3.15 GB.
The file was uploaded correctly but is listed with size "pending" in the web interface, and continues to show as syncing in Windows File Explorer. Other devices are stuck on "file 1 of 1" trying to sync the file through the desktop client, with no movement in the progress bar after hours, but the file can be downloaded completely through the browser.
Update: the sync client across multiple machines eventually stopped syncing the file with the following error message: "Broken webserver returning empty content length for non-empty file on resume"
Confirmed. I also have this issue and on NC11 everything was fine.
I'm running NC12 on a Raspberry Pi with PostgreSQL. This kind of reminds me of the PHP problem with retrieving file sizes greater than 2 GB when running on 32-bit architectures.
Confirmed! Same problem here!
And it also shows a wrong amount of available space in the client (Windows 10).
Same issue here, Linux client and Raspberry Pi server.
Same issue here. A fresh install on Raspberry Pi 3.
Same here. Raspberry Pi 3, PHP 7, MariaDB, Apache2.
The same happens to me when a folder contains more than 2GB of files.
While the size is reported as "Pending" in the file list, in the right panel the size is reported as < 1 KB, and as a negative value in the tooltip you get when you hover over it.
The negative folder size seems to arise from the issue @gbmaster referred to.
For example I have a folder:
Summed size of content: 2,835,430,391 bytes
Tooltip size: -1,459,536,905 bytes
32-bit signed integer range: -2,147,483,648 to 2,147,483,647
Summed size exceeding the range: 687,946,744
Wrap around from -2,147,483,648 and you end up at -1,459,536,905
Example 2:
Summed size of content: 4,261,493,373
Tooltip size: -33,473,923
Summed size exceeding the range: 2,114,009,726
Wrap around from -2,147,483,648 and you end up at -33,473,923
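The wrap-around described above is exactly what happens when a byte count is forced through a 32-bit signed integer. A minimal sketch (plain Node.js, not Nextcloud code) reproduces both tooltip values, using the fact that JavaScript bitwise operators coerce their operands to 32-bit signed two's complement:

```javascript
// Coerce a byte count into a 32-bit signed integer, the way a 32-bit
// PHP build (or an (int) cast on such a build) would. Bitwise OR in
// JavaScript truncates its operands to 32-bit signed two's complement.
function toInt32(bytes) {
  return bytes | 0;
}

console.log(toInt32(2835430391)); // -1459536905, matches the first tooltip
console.log(toInt32(4261493373)); // -33473923, matches the second tooltip
```

In other words, the tooltip shows the real size modulo 2^32, reinterpreted as signed, which is strong evidence for a 32-bit truncation somewhere in the pipeline.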
same here.
Debian Jessie, NGINX, PHP7, MySQL
Same here on a Raspberry Pi 3 running nextcloud 12.0.0
A pending size with a negative number in the details shows up for a 2.15 GB ZIP file. The sync client is stuck on the file for some time, ending with "Server version downloaded, copied changed local file into conflict file".
can confirm on Cubieboard 4 (CC-80)
Architecture: armv7l - debian/ubuntu armhf
upgraded from 11.0.3 to 12.0 (both official downloads)
All checks passed and no errors (upgrade and/or webinterface)
Apache 2.4, PHP 7, MySQL
Maybe this only affects armv7l? Judging from the comments here, everyone who reported their architecture was on ARM.
Have same issue on 32-bit operating system. Is everyone having this issue using 32-bit hardware? Or is this issue occurring with both 64-bit and 32-bit operating systems?
Architecture is not arm.
Ah! Interesting to note. It seems all the people reporting here are noticing this on a 32-bit OS.
The Raspberry Pi 3, which I use, has a 64-bit processor, but you generally still install a 32-bit OS on it because of its low memory.
Yep... I'm having the same issue running LAMP on an RPi 2... No solution yet? Even the Nextcloud client shows 0 B of 43 GB in use... I have 50 GB, so it's wrongly assuming that I have consumed 7 GB.
I'm having the same issue, system: Raspberry Pi 3, PHP 7, MariaDB, Apache2 and Debian 9
The problem, as far as I can see, is due to the 32-bit version of PHP. Apparently, until NC11, the developers had a workaround for this issue.
The console (occ) is still working, so the problem is not in PHP itself; it is in the NC12 source. NC11 also works.
Is there a way to raise this problem to the NC developers?
I will try in #nextcloud-dev
I would assume this to be a problem with parseInt(file.size, 10)
in lines 100, 128 and 175 of /apps/files/js/filesummary.js.
However, some of you are suggesting this error is new in NC12, and I could not find any relevant/detrimental commit. (The last patch to these lines was by @DeepDiver1975 in 2015...)
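As a quick sanity check (plain Node.js, values taken from this thread, not from the Nextcloud code), parseInt itself is not inherently limited to 32 bits, since JavaScript numbers are 64-bit floats:

```javascript
// JavaScript numbers are IEEE-754 doubles: integers are represented
// exactly up to Number.MAX_SAFE_INTEGER (2^53 - 1), far above the
// 2 GB (2^31) boundary, so parseInt on a size string does not wrap.
console.log(parseInt('4143738495', 10));            // 4143738495
console.log(4143738495 <= Number.MAX_SAFE_INTEGER); // true
```

So if the value arriving in filesummary.js is already truncated, the damage must have been done earlier, most likely on the server side.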
On NC11 everything worked fine; I only updated to NC12 and did not change or update anything else (RPi3, PHP, MySQL, etc. are still the same).
My desktop client on Win10 shows wrong values for available and used space. For example: it now shows 91 MB used of 56 GB; correct would be 64 GB used of 120 GB.
In the browser, some folders show that the size is still being calculated ("Ausstehend" in German); on NC11 everything worked fine...
I tried to rebuild everything and did a clean install of NC12 too! But during the upload of the files it suddenly showed the wrong sizes.
Maybe it is not the file summary but the calculation for single files (I don't know where to find that code).
I have the same problem. The size of a folder larger than 7 GB in the root directory is shown as pending. Opening the folder, however, shows the exact size of 7.1 GB at the bottom (all files and subfolders are smaller than 2 GB). This all started right after the upgrade from Nextcloud 11.0.3 to 12.0. Also, on the personal page the used space is shown as 0 B, while the maximum available space is shown as MaxQuota minus UsedSpace.
Nextcloud 12.0 on Raspbian (Raspberry Pi 3), updated from version 11.0.3
MySQL 5.5.54
PHP 5.6.30
Just saw that sabre-http had a similar problem that was solved by this workaround.
Nextcloud DAV was also affected by this same issue, which was fixed by @nickvergessen for NC13.
@akki42 correct. But still, even with these two fixes, the file size is capped at 2 GB. I think you found the correct location: the maximum signed integer at 32 bits is 2,147,483,647, so this may be the reason (see my screenshot). Note that JavaScript, and thus parseInt, can handle more than 32 bits.
Also, with the latest patches the file is downloaded correctly, so I think it is a matter of visualization.
I can confirm this issue on an odroid SBC (32 Bit machine).
@nickvergessen, @MorrisJobke: here is another small bug, which should be solved with 12.0.1 or 13: file sizes shown in the GUI do not go above 2 GB on 32-bit machines.
Update: I think the function "humanFileSize" is the problem. It takes an integer as argument, which is obviously capped at 32 bits here. The function is also used in other places. I will investigate further...
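For illustration, a simplified formatter along these lines (a hypothetical sketch, not the actual humanFileSize implementation) has no problem with sizes above 2 GB as long as the value reaches it intact, which points to the truncation happening before the formatting step:

```javascript
// Hypothetical human-readable size formatter. The division chain works
// fine for values above 2^31 because JS numbers are 64-bit floats; the
// bug can only appear if the input was already truncated upstream.
function humanSize(bytes) {
  const units = ['B', 'KB', 'MB', 'GB', 'TB'];
  let i = 0;
  while (bytes >= 1024 && i < units.length - 1) {
    bytes /= 1024;
    i += 1;
  }
  return bytes.toFixed(i === 0 ? 0 : 1) + ' ' + units[i];
}

console.log(humanSize(4143738495));     // "3.9 GB", the size reported above
console.log(humanSize(4143738495 | 0)); // truncated input yields nonsense
```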
Nothing new here???
No :(
Experiencing the same issues. In the directory list, large directories have a "pending" size, while the correct size is shown at the bottom when going into such a directory.
I've got no problems with a maximum file size of 2 GB (as is to be expected on 32-bit systems), but the problem is that large directories are bringing the system down: I've set a quota of 100 GB but NC says I've only got a quota of 28 GB after transferring my files.
I've already updated sabre-http manually but that obviously doesn't solve the problem.
I did not find the root cause. I still think it has to do with humanFileSize, but as I am not that familiar with the templates or PHP, I cannot help further here.
Same issue here. Fresh install on Raspberry Pi 3, Raspbian Jessie Lite, PHP 5, MariaDB 10 and Nextcloud 12.
The Windows sync client uploaded without problems on computer A but can't download the files on computer B. The error occurs only for big files: server replied "Requested Range Not Satisfiable".
Same issue here. Fresh install on Raspberry Pi 3 with NextCloudPi. I have a 3 GB file that I uploaded from an Ubuntu client. On the web its size appears as pending and I can't download it with another Ubuntu client.
Same here - Raspberry PI + nextcloud 12.
Pending file size.
Also, the latest Windows client does not show the size of folders containing more than 2 GB of content.
This error was annoying me for some time now, so I went and had a look at the code.
I found the point where the file size is cast to integer in FileInfo.php:
/**
* @return int
*/
public function getSize() {
$this->updateEntryfromSubMounts();
return isset($this->data['size']) ? (int) $this->data['size'] : 0;
}
Changing the return statement to the following:
return isset($this->data['size']) ? $this->data['size'] : 0;
solves the problem. All files and folders now show the correct size.
This corresponds to the code prior to @mrow4a / @MorrisJobke's commit "Ensure that FileInfo return values as required by its phpdoc"
for v12.0.0RC1.
Thus I'm not sure whether this might pose inadvertent side effects?
I doubt it will have any negative side effects. The cast was introduced into the code in March 2017, and before that the result was not cast to integer.
@1manprojects well, you were right... Your solution fixed my issue properly. Now I get the folder size again. Thank you.
Thanks a lot @1manprojects, your solution works perfectly for web access and Windows client sync. I hope it will be added to the next release!
But I have another problem: the Windows client can't download files larger than 2.3 GB (that's the biggest file my Windows client can download).
I tried to reproduce your problem but I can't. I uploaded a 2.4 GB file and it was synced correctly with another client on Windows 10 (Nextcloud client version 2.3.1). I will look at it again tomorrow, but I don't think the problem you are having is related to my fix.
Thank you for looking at it; perhaps the limit is not 2.4 GB but higher. I'm going to try a bigger file.
Tested the patch and the filesizes are immediately correct!
Maybe the original change to cast to integer was only tested on 64-bit systems?!
I am very surprised that no one found this error during beta testing?!
Works for me, too. Thanks!
Now, let's try and get this in ASAP.
@Nico83500 I tested it with a larger file again on another client and cannot reproduce your problem.
@Nico83500 is it possible that you cannot download any file larger than 2 GB correctly?
This (was) another issue with 32-bit devices; it will be fixed in 12.0.1 as well.
You can also try via your browser or any WebDAV client; it should not work if you are on NC11 or NC12.
@derkostka Yes, you're right, I'm on NC12. I've tried the Windows client and the browser but it doesn't work. I'm using an RPi 3, so a 32-bit device. When do you think 12.0.1 will be released? Thank you.
Sorry, I was wrong. It will not be fixed for NC12, but for NC13.
I'd suggest you update sabre-http manually, see here:
Thank you, but before that I would like to be sure this is my problem: with the Windows client, large files start to download but after some percent I get a "Connection closed" error. Is that similar to this issue?
@Nico83500 I'm also running on a Raspberry Pi 3 (32-bit, NC 12) and I have no problem syncing files larger than 2 GB with the latest Nextcloud client. To me this sounds like a wrong configuration. You can have a look at #3984.
Got the same problem and fixed it by patching FileInfo.php.
Thank you!
@1manprojects Thank you, I'll look at it. I've already changed the max upload configuration but have the same issue. What OS do you run on the RPi 3 (Raspbian Jessie, Lite or not?), and which database and PHP version? And do you have a setup tutorial or a link explaining the configuration you use?
I already mentioned my setup (Raspbian) in this thread; the best would be to open a new issue for your problem or comment on an existing one.
@derkostka Can you clarify whether this issue will be corrected officially in 13.0 or 12.0.1?
Was eventually going to migrate to a 64-bit server to get over the 2GB 32-bit file issue. You stated above that this will be corrected for 32-bit machines in an upcoming release. Will this be in the Nextcloud 12 series or in Nextcloud 13?
Sorry if this feels redundant. Just want to clarify for all.
This will be Nextcloud 13. An external plugin, Sabre http, needs to be updated and this is not planned for Nextcloud 12.
You can test your issue against the daily snapshot. I did some days ago and have no more problems related to file size or 32-bit topics right now.
@derkostka Daily client snapshot, correct?
No, it is a server issue (hope we talk about the same thing) https://download.nextcloud.com/server/daily/
Please see the related issues for details
Adapted the file "/lib/private/Files/FileInfo.php" as mentioned by @1manprojects - works as expected now. The file size is displayed properly.
(Platform: Raspberry Pi 3, 32-bit)
Confirmed - it works now. Thanks!
I made the changes in FileInfo.php as @1manprojects said and the file size on the web page is now shown right, but I wasn't able to download files larger than 2 GB, so I updated sabre/http as @derkostka suggested and now I can download files larger than 2 GB. Thanks a lot.
I am running Nextcloud 12 on a Raspberry Pi with Raspbian (32-bit) (nextcloudpi.img).
Maybe, if I upgrade Nextcloud 12 to 12.0.1 or higher, I will have problems with the files I uploaded manually. Could someone say something about that?
The fix works perfectly here. Not even a file scan required.
Perfect.
Edit: I just want to add that this also fixed my problem with slow listing of files and folders when browsing through the Nextcloud web interface, in those cases where a folder had no file/folder size.
Fixed in #5744
:+1:
Hi!
For me the problem persists. I'm on an x64 system and I just upgraded my cloud to 12.0.1 and ran an occ scan --all,
and I still have certain files with a wrong size.
One file weighs 4,748,122,364 bytes
and my cloud tells me it weighs only 453,155,068 bytes.
I just don't understand why...
Another file weighs 9,599,233,653 bytes
and my cloud tells me it weighs only 1,009,299,061 bytes.
PS: when I try to download such a file, the download stops at the size my cloud reports, so the file is broken.
Hi there,
yesterday I did the upgrade from Nextcloud 11.0.3 to 12.0.1 and the problem still exists.
I'm running Nextcloud on an Odroid-XU4 (armv7h architecture), Arch Linux ARM, PHP-FPM 7.1.8 and nginx 1.12.1. All users have a quota of 20 GB but the web interface just shows 4 GB. The usage is shown as about 40% while about 16 GB of 20 GB is in use.
I don't have large files and thus can't check whether a download of large files stops and results in broken files.
Best regards
As far as I can tell, this change will be included in Nextcloud 12.0.2, and the error is still present in 12.0.1.
Correct - this will be shipped via 12.0.3: https://github.com/nextcloud/server/pull/5925
@1manprojects Can you confirm that no side effects occur after removing the (int) cast in FileInfo.php?
Is that enough? Did you (or others) notice any strange behavior after that fix?
Is it necessary to do the sabre update?
@Ricardosgeral I have had absolutely no side effects from removing the cast and no other strange behaviors either; also, I did not do the sabre update.
@1manprojects thanks. Did it and it works.
The sabre update seems too complicated to me. I will wait for the next versions of NC12/13 and hope they incorporate those changes.
The fix is valid for me.
Raspberry Pi 2 + Jessie + PHP 5.6 + PostgreSQL 9.x + NC 12.0.1, after migrating from OC (with the recommended steps).
The sabre update is needed to be able to download files of that size. To only display the correct file size, removing the cast is sufficient. Please open a new issue in case of problems. Thanks!
This issue still exists today, even with everything up to date: NC 13.0.1, 4.14 kernel 64-bit, the FileInfo.php fix, etc.
Surely this should still be an open issue, right? Even with NC 13.0.5 and the fixes mentioned in this thread, my NC instance still bombs on anything > 2 GB. When you're trying to sync a media library with videos and whatnot, it is an absolute dealbreaker.
A temporary workaround is to add another external storage of type "Local" instead of "SMB/CIFS". This new external storage shows the right sizes; obviously write permissions are not available, but at least users can download files properly.
"me too"
14.0.0.19
php7.0-common
4.4.0-135-generic
(Why is this bug closed? Re-open?)
There is a _claim_ that this is fixed by #5744, but the issue persists on the latest version I've installed.
Could anyone familiar with this functionality point me to the rough code area? I am curious to do a root-cause analysis.
I guess you can try to start here:
https://github.com/nextcloud/server/blob/93c62d78db7847078727eafd3d8e40836a575cec/lib/private/Files/Cache/Cache.php#L169
And then you need to check all parent places from this call until the data is lost.
Hi guys! I have a VPS with 32-bit Debian Linux and I'm facing the same problem. I have a .zip whose size is ~11 GB and Nextcloud shows its size as 2 GB if I put the file in the root of the Samba disk. However, if I put that .zip file inside a folder, it shows the correct size.
UPDATE: Well, I just noticed that the downloaded file is an incomplete zip (~1 GB).
I think there are problems with the file size and free space calculation. In my case, I get the error "Not enough free space, you are uploading 685 KB but only 0 B is left" for a user without a quota and a half-full ZFS partition (similar: https://central.owncloud.org/t/not-enough-free-space-you-are-uploading-244-kb-but-only-0-b-is-left/13828). The WebDAV API works fine, as does the Android app, but not the desktop client (Linux) or the web client. It could be that my complex setup with Proxmox, LXC with system and data mountpoints, nginx and php-fastcgi has more than one problem. But where should I start with debugging? (Newest Nextcloud version running.)
Hi,
I don't know if this is the same issue, but with 15.0.4 I got wrong folder sizes in the Windows and Linux clients. I don't have this issue in another 15.0.2 install.
The folder size is about 31 GB, but the Nextcloud client reported it as 534 GB.
It is in the process of being uploaded to the server through the client, so I guess the size is reported by the desktop client first.
Then I tested with a Linux client (same user and server) and got the same size reports. The Linux client doesn't have the folder, so it just gets the size from the server.
Hi,
my issue is a different one. The desktop client and the mobile client work perfectly, but the web client does not. The answer to GET /apps/files/ajax/getstoragestats.php?dir=/ is empty. It is called in apps/files/js/files.js:
// update quota
updateStorageQuotas: function() {
var state = Files.updateStorageQuotas;
state.call = $.getJSON(OC.filePath('files','ajax','getstoragestats.php'),function(response) {
Files.updateQuota(response);
});
},
Same issue here in version 15.0.2; it is only solved by running the occ scan command after sharing a folder. How can I force a scan of user data with a hook? I need to call the hook in the successful-sharing method:
Util::emitHook(
'OCA\Files\Command\Scan',
'files:scan',
array('user_id' => &$shareWith, 'unscanned' => true)
);
I have this method, but _files:scan_ is not a hook, it is a command. How can I run a command via PHP code in my method?
This issue is solved by running the command
sudo -u www-data php occ files:scan USER_ID --unscanned
BUT only after all files are shared. I need to run this command automatically, without cron jobs, after every successfully shared folder. Do you have any idea how I can do this? Thanks.
The issue still exists in Nextcloud 16.0.1 on Ubuntu 18.04 with Apache 2.4, PHP 7.2 and MySQL 5.7; we are not able to see the size of folders, all we see is "Pending".
Still a thing on 17.0.1!!... Is anybody reading this?
"Local" folders show a size but imported "external storage" folders do not!
@dev-tejondigital's suggestion solved it for me.
I simply had to run
./occ files:scan --unscanned MY_USERNAME
Still an issue on 18.0.0. Tried running the above command mentioned by @cupcakearmy, which temporarily fixes the problem, but it quickly reverts to the error state.
I've noticed that nested folders seem to cause issues. For example, if a folder has zero folders within it (only individual files), then its size displays correctly (when you navigate to the folder's details). Otherwise, if there is a nested folder, the size appears as -1 B. The modified time remains "3 hours ago" in both cases.
Still an issue on 18.0.0, please tend to this ASAP!
Anyone have a clue for this?
A pull requests would be awesome to move forward! :smiley:
The same issue with the newest 18.0.4, any advice?
Files uploaded via the app are fine, but files uploaded via the website have a wrong size. For example, an uploaded 8 MB PDF file shows up as a 124 KB corrupted PDF file.
This is still an issue.
I've never seen an issue persist for so long. Are there fixes? Like asked above, why has a cron job not been made?
Are there links to documentation about beginning to develop with Nextcloud? Maybe I could help.
@KenwoodFox have you tried running ./occ files:scan --unscanned MY_USERNAME?
It's usually a one-time thing.
@cupcakearmy I'll give it a shot. One time, you say? Should I still look for a way to integrate it?
Sure. I had to run it once, then never again. I guess the process died midway through and got "stuck". I would guess a cron job for this would waste a lot of resources, but a user-clickable button in the settings might be very useful.
@cupcakearmy that's good to know. Yeah, perhaps you're right, a button would be nice.
It ran quite fast for me: 5 or so TB and it only took a few seconds. Then it was fixed, yay!
I am using the Nextcloud Docker image, and since I upgraded to 19 I am dealing with this issue. All large files are shown as pending. The total disk usage is shown as much less than it actually is. The Windows app cannot sync any large files and is in a constant download loop. (A file is 9 GB, but Nextcloud shows it as 200 MB, so the Windows client tries to download that 200 MB file, detects an inconsistency with the file I already have (which is 9 GB) and marks it as a conflict, on EVERY file.)
This issue is horrendous. I set up my whole Nextcloud installation multiple times with no change; my entire system is broken and a week of my life is gone trying to fix this. I am in a very sad place... Nothing works. I even tried 20 beta 1. files:scan takes 10 hours and the result is the same no matter how many times I try it (20+ times by now)...
@driverinla I've actually found that files:scan fixes the issue, but then when I add more large files and don't manually look at each folder, they are not automatically updated. I can still upload large files, though; your issue could be unrelated.
Same here since the upgrade to NC 19. Running occ with --unscanned doesn't fix the problem. Running it explicitly for all files in the path doesn't either: all files in that situation show as Pending, and running --unscanned consecutively scans the same files with no update. Files in the mobile client show with negative sizes.
I'm having the same issue after an update to NC 19, though it seems this is not directly related to the original problem.
At least for me, the file sizes are also incorrect in oc_filecache.
I already ran ./occ files:scan -p [myuser]/path/to/my/file -vvv
as well as ./occ files:scan --all -vvv
which did not correct the entries in the database.
select size from oc_filecache where name like '[MYFILENAME]';
+------------+
| size |
+------------+
| 1036308480 |
+------------+
1 row in set (0.104 sec)
ls -al /path/to/my/file
-rw-r--r-- 1 www-data www-data 5331275776 Aug 9 08:59 '/path/to/my/file'
I'm running NC on an RPi 4 with 32-bit Raspbian.
Exactly the same here. Nextcloud ran stable for a long time on an Odroid HC2 (32-bit system, via snap), and for some time (since the update to 19?) it has been completely unusable. The synchronization hangs, larger files create conflicts, and a fix with "occ ..." has not worked so far. The log files show the message "Broken webserver returning empty content length for non-empty file on resume", and in addition "pending" is shown instead of a file size for many files in the web frontend. It's incredibly frustrating; I've spent so much time trying to get this working again. And sync is the core of Nextcloud for me; all the apps are nice, but rather nice-to-haves. If the sync doesn't work, the whole thing is just useless. Maybe I have to deal with a downgrade; has anyone tried that yet?
I also use an Odroid HC2 with Openmediavault and run Nextcloud as a Docker container. It had been working since about version 10 with no issues (with files as big as 70 GB). Like you, I use it only for syncing between different computers and family members. Extremely frustrating. I guess I could do a fresh install of an older version like 18, but what is the point if this persists in the new versions and I am stuck getting no updates or fixes...
I just gave up at this point, unplugged the Odroid and stopped using the system... I am manually copying files between my computers, a bad, bad experience. I hope the developers figure the problem out and fix it, with an explanation of what caused this big bug.
The issue occurs with files of more than 2,147,483,647 bytes in Nextcloud 19.0.2 on Raspberry Pi OS 10 32-bit. Versions 17.0.9 and 18.0.8 are not affected, so in my opinion the reason is not the PHP version.
filesize() returns 0x0FFFFFFF bytes as the expected file size, and from 0x10000000 bytes a negative integer, in both PHP 7.1 and 7.3.
So Nextcloud seems to calculate the file size in another way, or not at all.
How could we get attention on an issue this old? Should we try and mention someone, maybe?
I believe it is a fairly strong regression. I tested it using the Docker image and got the same outcome as @c-0815: it works on 18, doesn't work on 19.
I believe many, many people are affected by it.
So @skjnldsv, I'm assuming this is then also a case of wontfix,
like https://github.com/nextcloud/server/issues/16431? That would really be a shame, as this would break a lot of installations for good, at least without a complete OS reinstall.
Is this really install-breaking? I mean, accessing files still works; it's just annoying to have to run an occ command to update large folders.
Updating large folders (or folders with large files) with an occ command does not work for me, unfortunately. In my case Nextcloud isn't able to sync large files. When I try to sync a large file, Nextcloud creates a copy of the file (on the client side). The file with "conflicted copy" in its name has the right size, but the "original" file now has a smaller size. So sync doesn't work, and this is install-breaking...
BTW: I also have this issue (https://github.com/nextcloud/desktop/issues/2279). Maybe it is somehow correlated?
@dasaweb Ah, I see. I don't have that issue; I can still sync files even if Nextcloud can't display their size to me.
Running the command does not fix the issue anymore. In my case I'm still able to upload and download larger files; the size shown is just wrong, and therefore the quota and used-space calculations are also off by quite a bit if you have a couple of larger files. For systems that rely on correctly reported quotas, I would describe this as install-breaking.
Single-user systems that do not rely on quotas are probably still fine, though.
At least in my case, no matter how many occ commands I run on my NC 19 installation, the size stays Pending. On NC 18 that was not the case. Be it --unscanned or --all, it stays the same. I just tried it on a small folder to make sure I wasn't mixing things up.
My files are on external storage, and since I upgraded to NC 19 from NC 18 I have also found that a scan isn't triggered when opening the external storage, which may or may not be related (i.e. it might be triggered but fail silently, as I see no log of it).
I can access and manage files as usual, though.
Size reporting is completely broken on my side: the file on the storage (Nextcloud) side is 2 GB, while the real file should be way more than that. The file is then completely corrupted and there is no point in accessing it.
It's not just the "Pending", which is also a dealbreaker for quotas and such. And the occ command doesn't fix anything for me, neither the access nor the pending.
Same issue here on a Pi 4 B (32-bit Raspberry Pi OS). The instance worked like a charm before upgrading to Nextcloud 19. I will probably back up all data and downgrade to 18. Hopefully a fix is here soon.
Maybe we should open another issue, since the current problems are connected to Nextcloud 19 only?
I installed 18 from scratch and moved all the files. Everything works as before. It seems I will not be upgrading to 19 unless there is a fix for this.
Can we at least get some info from one of the developers? Are they aware of the issue? Is there any fix coming? Are they just ignoring it? What is the prospect?
@MegaMarkey I'm up for a new issue being opened, and happy to chime in there.
For what it's worth, I just downgraded to a previous NC 18.0.8.2 backup I had, and after a few adjustments everything seems to be up and running as intended. I'm not biting the NC19 (or any major-version upgrade) bullet anytime soon, certainly not until I am confident that this is solved.
Downgrading to NC18 also resolves this problem on XU4 32-bit system.
I believe this is the right way to do it. Since it seems like this is a new issue that started with the new version 19. Would you be able to take the lead and open a new issue?
Hello, I developed a patch to fix the issue and opened a PR here: https://github.com/nextcloud/server/pull/23257
Wonderful!
It seems this is in for version 21. I would appreciate it if someone tried this and confirmed the fix.
If the PR gets merged, I'll try to have it backported to stable19.
Thanks for the fix. I just made the same changes to my 19.0.2 installation, and after a ./occ files:scan --all
everything seems to work like a charm again.
Sounds promising! Unfortunately I have to wait until the fix reaches the Nextcloud snap. But at least there is hope :D
I successfully applied the patch on Nextcloud 20.0.0 and everything works fine now. Thanks a lot, I hope it gets merged!
Hello,
Sorry to be so stupid here, but how do I apply the patch?
Thank you!
You need to edit the file "lib/private/Files/Storage/Local.php" as shown in the pull request (removing three lines and adding two). Of course, maintenance mode should be enabled beforehand. After you are done, disable maintenance mode and rescan all files via
sudo -u www-data php occ files:scan --all
Thank you a lot! Worked like a charm!
I still have the problem with Nextcloud WebDAV.