cron should finish without error. The memory_limit in /etc/php/7.4/cli/php.ini is 4096M, so that should be plenty.
Twice a day, the cron job gets killed because it exhausts its memory. Before I set a memory_limit, I only saw that PHP was killed by the kernel and that Nextcloud was down for several minutes. After setting a memory limit, I started getting the following message twice a day:
```
PHP Fatal error: Allowed memory size of 4294967296 bytes exhausted (tried to allocate 262144 bytes) in /var/www/****/cloud.****/apps/mail/lib/IMAP/Threading/Container.php on line 111
PHP Fatal error: Allowed memory size of 4294967296 bytes exhausted (tried to allocate 262144 bytes) in Unknown on line 0
```
Mail app version: 1.5.0
Mailserver or service: self-hosted dovecot
Operating system: Ubuntu 20.04
Web server: Nginx
Database: MariaDB
PHP version: 7.4
Nextcloud Version: 20.0.0
Please run occ mail:account:diagnose <accountId> for the account ID of that cron job.
And because it's related to threading you could export the account's threading data with occ mail:account:export-threads --redact <accountId> if you want to share that. All the info is redacted. You would only leak your number of emails and how those are related on an abstract level without any real IDs or anything.
> Please run `occ mail:account:diagnose <accountId>` for the account ID of that cron job.
```
IMAP capabilities:
- AUTH
- ENABLE
- ID
- IDLE
- IMAP4REV1
- LITERAL+
- LOGIN-REFERRALS
- SASL-IR
Horde error occurred: IMAP error reported by server.. See nextcloud.log for more details.
```
And here is what seems to be the corresponding error in nextcloud.log:

```
{"Exception":"Horde_Imap_Client_Exception_ServerResponse","Message":"IMAP error reported by server.","Code":0,"Trace":[{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/vendor/pear-pear.horde.org/Horde_Imap_Client/Horde/Imap/Client/Socket.php","line":4314,"function":"_getLine","class":"Horde_Imap_Client_Socket","type":"->"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/vendor/pear-pear.horde.org/Horde_Imap_Client/Horde/Imap/Client/Socket.php","line":4242,"function":"_sendCmdChunk","class":"Horde_Imap_Client_Socket","type":"->"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/vendor/pear-pear.horde.org/Horde_Imap_Client/Horde/Imap/Client/Socket.php","line":1751,"function":"_sendCmd","class":"Horde_Imap_Client_Socket","type":"->"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/vendor/pear-pear.horde.org/Horde_Imap_Client/Horde/Imap/Client/Base.php","line":1733,"function":"_status","class":"Horde_Imap_Client_Socket","type":"->"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/lib/Command/DiagnoseAccount.php","line":136,"function":"status","class":"Horde_Imap_Client_Base","type":"->"},{"function":"OCA\\Mail\\Command\\{closure}","class":"OCA\\Mail\\Command\\DiagnoseAccount","type":"->","args":["*** sensitive parameters replaced 
***"]},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/lib/Command/DiagnoseAccount.php","line":138,"function":"array_reduce"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/lib/Command/DiagnoseAccount.php","line":90,"function":"printMailboxesMessagesStats","class":"OCA\\Mail\\Command\\DiagnoseAccount","type":"->"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/vendor/symfony/console/Command/Command.php","line":255,"function":"execute","class":"OCA\\Mail\\Command\\DiagnoseAccount","type":"->"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/vendor/symfony/console/Application.php","line":1005,"function":"run","class":"Symfony\\Component\\Console\\Command\\Command","type":"->"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/vendor/symfony/console/Application.php","line":271,"function":"doRunCommand","class":"Symfony\\Component\\Console\\Application","type":"->"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/vendor/symfony/console/Application.php","line":147,"function":"doRun","class":"Symfony\\Component\\Console\\Application","type":"->"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/lib/private/Console/Application.php","line":215,"function":"run","class":"Symfony\\Component\\Console\\Application","type":"->"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/console.php","line":100,"function":"run","class":"OC\\Console\\Application","type":"->"},{"file":"/var/www/chrooted/*****/htdocs/cloud.*****/occ","line":11,"args":["/var/www/chrooted/*****/htdocs/cloud.*****/console.php"],"function":"require_once"}],"File":"/var/www/chrooted/*****/htdocs/cloud.*****/apps/mail/vendor/pear-pear.horde.org/Horde_Imap_Client/Horde/Imap/Client/Socket.php","Line":4602,"CustomMessage":"Could not get account statistics"},"userAgent":"--","version":"20.0.0.9"}
```
https://gist.github.com/vasyugan/5b2200d8d9cd554d8f787814ba72a061
I can build the threads just fine with only 24 MB:

```
php occ mail:thread /tmp/mailid.log
6944 threads built from 8076 messages
24MB of memory used
```
Can you reproduce this?
https://github.com/nextcloud/mail/pull/3871 is required, otherwise that command will fill the log with LOTS of errors.
```
php occ mail:thread /tmp/mailid.log
```

Yes, same output, except that it doesn't reveal the amount of memory used; it finishes without error.
Okay … so it's not the threading itself. It's just that, by the time threading runs, the memory has already been consumed by something else.

> Horde error occurred: IMAP error reported by server.. See nextcloud.log for more details.

That was because of one of the other bugs you reported, right?
You can also try just the sync (which includes threading) with `occ mail:account:sync <accountId>`, in the hope that the problematic mailbox or message is not marked for background sync and thus the process hopefully works.
> Okay … so it's not the threading. But just that at the point of threading the mem is used by something else.
>
> That was because of one of the other bugs you reported, right?
I don't think so. Regarding the other bugs: I have moved the offending message elsewhere, so that bug no longer occurs. The gist there is that it can't get the account stats from dovecot, probably because I haven't enabled the stats module in dovecot. I guess that is immaterial to the issue at hand.
> You can also try just the sync (which includes threading) with `occ mail:account:sync <accountId>` in the hope that the problematic mailbox or message is not marked for background sync and thus the process hopefully works.
Well, that's how I identified the offending mail account in the first place, because this threw an error. But weirdly, this time it finished without error. Which, I guess, makes tracing the cause of this bug more difficult. Let's see if I get more error mails from cron...
I see. Running out of memory is hard to trace but maybe you can find a pattern.
There is also the chance that another job runs before that Mail background sync and consumes most of the memory, then Mail does its job and reaches the limit.
> I see. Running out of memory is hard to trace but maybe you can find a pattern.
>
> There is also the chance that another job runs before that Mail background sync and consumes most of the memory, then Mail does its job and reaches the limit.
The first time I ran the sync for the account manually, it actually did run out of the memory I had given it (4 gigs!), and it did so quickly. So this script caused the memory exhaustion even when running standalone.
Good to know!
Could you maybe try running the sync in a loop a few times to see if it eventually shows up again?
Are you using the latest version of php? Just in case there's a bug on that level.
> Good to know!
>
> Could you maybe try running the sync in a loop a few times to see if it eventually shows up again?
>
> Are you using the latest version of php? Just in case there's a bug on that level.
PHP 7.4 FPM on Ubuntu 20.04
So this time I have been able to reproduce the memory exhaustion with a different account (a Gmail account). The trick is apparently to append `-f` after `sync`, which reproduces what happens during those cron jobs:
```
[debug] Account 50 has 64010 messages for threading
PHP Fatal error: Allowed memory size of 4294967296 bytes exhausted (tried to allocate 262144 bytes) in /var/www/htdocs/cloud.****/apps/mail/lib/IMAP/Threading/Container.php on line 111
```
Diagnose for this account says:
```
IMAP capabilities:
- AUTH
- CHILDREN
- ID
- IDLE
- IMAP4REV1
- NAMESPACE
- QUOTA
- SASL-IR
- UNSELECT
- X-GM-EXT-1
- XLIST
- XYZZY
Horde error occurred: The object could not be deleted because it does not exist.. See nextcloud.log for more details.
```
UPDATE: @ChristophWurst, so here is the mail thread file:
Thanks. I can't reproduce the mem exhaustion because the provided threading data triggers an infinite recursion for me
```
php occ mail:thread ~/Downloads/mail-threads.log
An unhandled exception has been thrown:
Error: Maximum function nesting level of '256' reached, aborting! in /home/christoph/workspace/nextcloud/apps/mail/lib/IMAP/Threading/Container.php:106
```
> Thanks. I can't reproduce the mem exhaustion because the provided threading data triggers an infinite recursion for me
>
> php occ mail:thread ~/Downloads/mail-threads.log An unhandled exception has been thrown: Error: Maximum function nesting level of '256' reached, aborting! in /home/christoph/workspace/nextcloud/apps/mail/lib/IMAP/Threading/Container.php:106
To get a memory exhaustion, you first have to set a memory limit. The default memory limit for the PHP CLI on Debian and Ubuntu is -1, that is, there is no limit; see /etc/php/7.?/cli/php.ini.
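For anyone following along, the relevant setting is in the CLI php.ini. The 4096M value below is just the figure used earlier in this thread, not a recommendation:

```ini
; /etc/php/7.4/cli/php.ini
; -1 (the Debian/Ubuntu CLI default) means no limit, so eventually the
; kernel OOM killer steps in; a finite value turns the crash into a
; PHP fatal error with a file name and line number instead.
memory_limit = 4096M
```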
I attached a debugger and it peaks at 110MB memory used
I am seeing this error when running `occ mail:account:sync <account_id>`; I have 140k messages.
@eleith I've only tried with accounts of ~80k messages. So please see https://github.com/nextcloud/mail/blob/master/doc/admin.md#export-threading-data and if you provide us with this redacted data we can see if this is reproducible. Please also tell us at how many bytes the memory was exhausted.
`occ mail:account:sync <account-id>`

ends with the following error:

```
PHP Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 262144 bytes) in /var/www/html/apps/mail/lib/IMAP/Threading/Container.php on line 111
```
and

`occ mail:account:export-threads --redact <account-id>`

produces a 50+ MB file. It's too large to upload to GitHub, but I can upload it elsewhere or provide you a link privately. Let me know your preference, if you still want the file.
> `occ mail:account:sync <account-id>` ends with the following error
What's the rest of the output? E.g. it should show how many emails are there for threading. Please post that output here, it shouldn't contain any sensitive data except number of emails.
> produces a 50+ MB file. It's too large to upload to GitHub, but I can upload it elsewhere or provide you a link privately.
Yes, please put it into a compressed archive and send a link to the email address you can find in the license headers.
email sent!
below is the full output of the sync command
```
./occ mail:account:sync <account-id>
[debug] account is up to date, skipping mailbox sync
[debug] Skipping mailbox sync for Archive
[debug] Skipping mailbox sync for Drafts
[debug] partial sync <account-id>:INBOX - get all known UIDs took 1s
[debug] partial sync <account-id>:INBOX - get new messages via Horde took 1s
[debug] partial sync <account-id>:INBOX - persist new messages took 0s
[debug] partial sync <account-id>:INBOX - get changed messages via Horde took 0s
[debug] partial sync <account-id>:INBOX - persist changed messages took 0s
[debug] partial sync <account-id>:INBOX - get vanished messages via Horde took 0s
[debug] partial sync <account-id>:INBOX - persist new messages took 0s
[debug] partial sync <account-id>:INBOX took 2s
[debug] Skipping mailbox sync for Junk
[debug] Skipping mailbox sync for Sent
[debug] Skipping mailbox sync for Trash
[debug] Account <account-id> has 142578 messages for threading
PHP Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 262144 bytes) in /var/www/html/apps/mail/lib/IMAP/Threading/Container.php on line 111
```
Thanks a lot. While I don't run out of memory (~280 MB used with Xdebug enabled), I get an infinite recursion.
What raises my suspicion is that your exported data contains ~140k messages. Does that seem right? Could you also run `occ mail:account:diagnose` for that account?
```
./occ mail:account:diagnose <account-id>
IMAP capabilities:
- AUTH
- IMAP4
- IMAP4REV1
- UIDPLUS
Account has 155026 messages in 7 mailboxes
```
I access this inbox with another email client and can confirm that number.
Ah right that is also what you put into your other reply. Then the data seems right.
Do you also run into out-of-memory issues when you thread the exported data, e.g. with `occ mail:thread export.json`?
@ChristophWurst I do run out of memory; here is the output:
```
php occ mail:thread email.json
PHP Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 262144 bytes) in /var/www/html/apps/mail/lib/IMAP/Threading/Container.php on line 111
PHP Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 262144 bytes) in Unknown on line 0
```
I'm seeing something similar to @eleith. I'm running NC 20.0.2 and Mail 1.7.1 on an updated Debian 10 VM running dovecot 2.3.4.1.
```
# php occ mail:account:diagnose -- 1
IMAP capabilities:
- AUTH
- ENABLE
- ID
- IDLE
- IMAP4REV1
- LITERAL+
- LOGIN-REFERRALS
- SASL-IR
Account has 170155 messages in 26 mailboxes
```
If I try to run `./occ mail:account:sync --force -- 1`, within seconds the OS invokes the oom-killer and kills apache2. That was with the php-cli memory limit set to -1, to see what would happen. When it is not set to -1, I get the same "Allowed memory size exhausted" PHP fatal errors that have been posted above.
Same problem on some (not all) of the cron runs. Too bad the error message does not tell me which account it happens with.
Environment:
- NC 20.02 on Scientific Linux 7.9 (same as CentOS 7.9)
- rh-php72-php-7.2.24
- Mail server: Dovecot 2.3.11
- around 5000 users, 5.7 TB of mail storage
Since updating to NC 20 (20.0.4), I have the same issue: my Nextcloud cron job regularly returns "Allowed memory size exhausted" errors. I use Mail app version 1.7.2 here. After some investigation, I identified one account for which Nextcloud seems to try to allocate more than the 4 GB maximum assigned to php-cli. See:
```
# sudo -u nginx php ./occ mail:account:sync -- 1
[...]
[debug] Account 1 has 42129 messages for threading
PHP Fatal error: Allowed memory size of 4294967296 bytes exhausted (tried to allocate 262144 bytes) in /var/www/localhost/htdocs/owncloud/apps/mail/lib/IMAP/Threading/Container.php on line 111
```
Another account just finishes fine without any issues. So has there been any progress in investigating this issue and identifying a solution or workaround for that issue? With older NC19 I didn't have such issues, btw.
I have the same issue with v20.0.4 and Mailapp 1.7.2.
```
PHP Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 262144 bytes) in html/apps/mail/lib/IMAP/Threading/Container.php on line 111
PHP Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 262144 bytes) in Unknown on line 0
```
The largest mailbox is maybe 1.5 GB. I can reproduce this easily by clicking through some folders in the Mail app.
I have to see if there is an easy way to get the memory footprint down.
Now it uses ALL available memory: ~15 GB via cron. I limited the PHP CLI to 2 GB for now.

```
mmap() failed: [12] Cannot allocate memory
mmap() failed: [12] Cannot allocate memory
PHP Fatal error: Out of memory (allocated 14805893120) (tried to allocate 262144 bytes) in /apps/mail/lib/IMAP/Threading/Container.php on line 111
```

Something must be really, really wrong!
I'm also facing the same issue with one of the mail accounts:
```
php occ mail:account:diagnose 10
IMAP capabilities:
- AUTH
- ENABLE
- ID
- IMAP4REV1
- LITERAL+
- SASL-IR
Account has 8648 messages in 17 mailboxes
```
I tried exporting the threads of that account:

```
occ mail:account:export-threads 10 > /tmp/mailid.log
```

The resulting file is only 2.1 MB.
And when I then run `php occ mail:thread /tmp/mailid.log` I get:

```
PHP Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 262144 bytes) in /opt/nextcloud/apps/mail/lib/IMAP/Threading/Container.php on line 111
```
I've attached the redacted version of the thread output file.
mailidR.log
Thanks for any help! PS: I'm running Nextcloud 20.0.4.
So, I see this phenomenon once again. When I set the memory limit to -1, the process just never finishes. Another redacted thread file is attached. When I run `occ mail:thread` on the file, it likewise never finishes.
threads.log
I strongly believe the problem isn't actually about memory consumption, but that the threading data builds some infinitely large data structure due to some recursion. I'll try to reproduce soonish with one of your data sets. The smaller ones are ideal, as they will hopefully allow me to manually shrink the file to a bare minimum where the problem is still reproducible. If anyone wants to help or is just curious: try to remove as many entries as possible from the JSON array. At some point we should be left with a small number of messages that cause this problem.
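The manual shrinking suggested here can be semi-automated. Below is a rough delta-debugging-style sketch in Python; it is a hypothetical helper, not part of the Mail app, and the `still_fails` predicate is a placeholder that in practice would write the candidate array to a file and check whether `occ mail:thread` still blows up:

```python
# Greedily drop chunks of the exported JSON array while a
# "still fails" predicate holds, converging on a small reproducer.

def shrink(entries, still_fails):
    """Return a reduced list that still satisfies still_fails()."""
    chunk = max(1, len(entries) // 2)
    while True:
        i = 0
        while i < len(entries):
            # Try the list with entries[i:i+chunk] removed.
            candidate = entries[:i] + entries[i + chunk:]
            if candidate and still_fails(candidate):
                entries = candidate   # this chunk is irrelevant, drop it
            else:
                i += chunk            # this chunk is needed, keep it
        if chunk == 1:
            break
        chunk //= 2
    return entries
```

For a monotone failure condition this ends with a list where no single entry can be removed, which is exactly the "bare minimum" reproducer described above.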
Isn't there some debug log that would let me see in real time what is going on?
BTW, since the trouble started only today, I assume that the culprit must be some message that arrived today. Before that, the account was just fine.
> Isn't there some debug log that allows to see in real time what is going on?
Well, we already have lots of debug info from the previous reports. My guess is that the code dies in `\OCA\Mail\IMAP\Threading\ThreadBuilder::buildIdTable`. That is where we take the list of raw message data and transform it into a tree structure based on the references between messages.
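For readers unfamiliar with the algorithm: the ID table links each message under the last entry of its References chain, JWZ-style. A minimal sketch in Python (illustrative only, not the actual `Container`/`ThreadBuilder` code) shows where a reference cycle would make parent-chain walks loop forever unless a guard refuses to create the loop:

```python
from dataclasses import dataclass, field

@dataclass
class Container:
    id: str
    parent: "Container | None" = None
    children: list["Container"] = field(default_factory=list)

    def has_ancestor(self, other: "Container") -> bool:
        # Walk the parent chain; this only terminates if no cycle exists.
        node = self.parent
        while node is not None:
            if node is other:
                return True
            node = node.parent
        return False

def build_id_table(messages):
    """messages: list of (message_id, [references...]) tuples."""
    table: dict[str, Container] = {}

    def container(mid: str) -> Container:
        if mid not in table:
            table[mid] = Container(mid)
        return table[mid]

    for mid, refs in messages:
        prev = None
        # Link each reference under the previous one, the message last.
        for ref in refs + [mid]:
            cur = container(ref)
            # Guard: never link a container under itself or under one of
            # its own descendants, otherwise the parent chain becomes a
            # cycle and every has_ancestor() walk runs forever.
            if (prev is not None and cur.parent is None
                    and prev is not cur and not prev.has_ancestor(cur)):
                cur.parent = prev
                prev.children.append(cur)
            prev = cur
    return table
```

In this sketch, a message whose References chain would form a loop is simply left unlinked; if such a guard were missing or had a gap, cyclic references could plausibly produce both the out-of-memory and the "maximum function nesting level" symptoms seen in this thread.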
> BTW, since the trouble started only today, I assume that the culprit must be some message that arrived today. Before that, the account was just fine.
That's a very good point. It might very well be the case. A completely wild guess: check whether any message references itself, i.e. its message-id is found in its own in-reply-to or references header. That wouldn't be a valid message at all, but someone could send one to trigger this bug.
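That check can be scripted against the exported threading data. A hypothetical example follows; the field names `id`, `inReplyTo`, and `references` are guesses at the export format, so adjust them to the real JSON keys:

```python
import json

def find_self_references(path: str) -> list[str]:
    """Return the IDs of messages that reference themselves."""
    with open(path) as fh:
        entries = json.load(fh)
    suspicious = []
    for entry in entries:
        mid = entry.get("id")
        # Collect everything the message claims to be a reply to.
        refs = set(entry.get("references") or [])
        if entry.get("inReplyTo"):
            refs.add(entry["inReplyTo"])
        if mid and mid in refs:
            suspicious.append(mid)
    return suspicious
```

Running this over the exported file and getting a non-empty list would support the self-reference theory.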