
I recently updated mailcow, and since then I can no longer access the web interface; it only shows a message that mailcow is apparently not available, and the message does not go away. I am now asking for help here, as this was recommended to me.
Of course I will answer any questions you have.
Please take a look at the issue template and provide more information.
Yes, if you want people to help you, provide logs and don't delete the template. :/
This is not the correct place for support anyway.
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) lua; lua_cfg_transform.lua:170: overriding actions from the legacy metric settings
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) lua; lua_cfg_transform.lua:123: overriding group MX from the legacy metric settings
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) lua; lua_cfg_transform.lua:161: group excessqp has no symbols
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) lua; lua_cfg_transform.lua:161: group excessb64 has no symbols
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) <>; lua; lua_cfg_transform.lua:467: enable options.check_all_filters for neural network
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) <>; lua; lua_cfg_transform.lua:525: converted surbl rules to rbl rules
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) <>; lua; lua_cfg_transform.lua:539: converted emails rules to rbl rules
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) cfg; rspamd_rcl_maybe_apply_lua_transform: configuration has been transformed in Lua
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) <6dsnjn>; cfg; rspamd_config_set_action_score: action greylist has been already registered with priority 0, override it with new priority: 0, old score: nan
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) <6dsnjn>; cfg; rspamd_config_set_action_score: action add header has been already registered with priority 0, override it with new priority: 0, old score: nan
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) <6dsnjn>; cfg; rspamd_config_set_action_score: action reject has been already registered with priority 0, override it with new priority: 0, old score: nan
rspamd-mailcow_1 | 2019-10-28 16:01:37 #1(main) rspamd_regexp_library_init: pcre is compiled with JIT for x86 64bit (little endian + unaligned)
rspamd-mailcow_1 | 2019-10-28 16:01:38 #1(main) <6dsnjn>; cfg; rspamd_language_detector_init: loaded 46 languages, 33122 trigramms
rspamd-mailcow_1 | 2019-10-28 16:01:38 #1(main) <6dsnjn>; cfg; chartable_module_config: init internal chartable module
rspamd-mailcow_1 | 2019-10-28 16:01:38 #1(main) <6dsnjn>; cfg; dkim_module_config: init internal dkim module
rspamd-mailcow_1 | 2019-10-28 16:01:38 #1(main) <6dsnjn>; cfg; spf_module_config: init internal spf module
rspamd-mailcow_1 | 2019-10-28 16:01:38 #1(main) <6dsnjn>; cfg; regexp_module_config: init internal regexp module, 127 regexp rules and 0 lua rules are loaded
rspamd-mailcow_1 | 2019-10-28 16:01:38 #1(main) <6dsnjn>; cfg; fuzzy_parse_rule: added fuzzy rule LOCAL_FUZZY_UNKNOWN, key: ef43ae80cc8d, shingles_key: ef43ae80cc8d, algorithm: mum
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; fuzzy_parse_rule: added fuzzy rule FUZZY_UNKNOWN, key: ef43ae80cc8d, shingles_key: ef43ae80cc8d, algorithm: mum
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; fuzzy_check_module_config: init internal fuzzy_check module, 2 rules loaded
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for antivirus
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/antivirus.wl' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module antivirus from /usr/share/rspamd/plugins/antivirus.lua; digest: 3557ab3ec8
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for arc
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module arc from /usr/share/rspamd/plugins/arc.lua; digest: e8ad736c4f
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module asn from /usr/share/rspamd/plugins/asn.lua; digest: dc7bd0b161
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module bayes_expiry from /usr/share/rspamd/plugins/bayes_expiry.lua; digest: 79db1b5a72
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; clickhouse.lua:1154: no servers are specified, disabling module
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module clickhouse from /usr/share/rspamd/plugins/clickhouse.lua; digest: 59dbc226d9
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: lua module clustering is enabled but has not been configured
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: clustering disabling unconfigured lua module
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: lua module dcc is disabled in the configuration
acme-mailcow_1 | Mon Oct 28 16:01:36 CET 2019 - Waiting for Docker API...OK
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:143: reuse url for complex map definition chqbixcj: DKIM signing networks
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for dkim_signing
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module dkim_signing from /usr/share/rspamd/plugins/dkim_signing.lua; digest: 962de18f6b
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module dmarc from /usr/share/rspamd/plugins/dmarc.lua; digest: b4f0508825
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: lua module dynamic_conf is enabled but has not been configured
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: dynamic_conf disabling unconfigured lua module
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for elastic
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; elastic.lua:445: no servers are specified, disabling module
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module elastic from /usr/share/rspamd/plugins/elastic.lua; digest: 976238ef0a
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module emails from /usr/share/rspamd/plugins/emails.lua; digest: 7ecf38ef46
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for external_services
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/antivirus.wl' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; external_services.lua:179: registered external services rule: oletools
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module external_services from /usr/share/rspamd/plugins/external_services.lua; digest: 3937a118b4
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; force_actions.lua:164: Registered symbol WHITELIST_FORWARDING_HOST_NO_GREYLIST
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; force_actions.lua:164: Registered symbol WHITELIST_FORWARDING_HOST_NO_REJECT
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module forged_recipients from /usr/share/rspamd/plugins/forged_recipients.lua; digest: 4797b13780
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: lua module fuzzy_collect is enabled but has not been configured
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: fuzzy_collect disabling unconfigured lua module
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/greylist-whitelist-domains.inc' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/maps.d/greylist-whitelist-domains.inc' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for greylist
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module greylist from /usr/share/rspamd/plugins/greylist.lua; digest: afab80696f
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module hfilter from /usr/share/rspamd/plugins/hfilter.lua; digest: 75a6e4a0b3
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for history_redis
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module history_redis from /usr/share/rspamd/plugins/history_redis.lua; digest: 8ab89b039b
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: lua module ip_score is enabled but has not been configured
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: ip_score disabling unconfigured lua module
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module maillist from /usr/share/rspamd/plugins/maillist.lua; digest: b0ac7b11d2
nginx-mailcow_1 | 2019/10/28 16:01:51 [error] 23#23: *2 connect() failed (111: Connection refused) while connecting to upstream, client: fd4d:6169:6c63:6f77::11, server: _, request: "HEAD /settings.php HTTP/1.1", upstream: "fastcgi://172.22.1.10:9001", host: "nginx"
nginx-mailcow_1 | 2019/10/28 16:01:51 [error] 23#23: *1 connect() failed (111: Connection refused) while connecting to upstream, client: fd4d:6169:6c63:6f77::11, server: _, request: "HEAD /forwardinghosts.php HTTP/1.1", upstream: "fastcgi://[fd4d:6169:6c63:6f77::e]:9001", host: "nginx"
nginx-mailcow_1 | 2019/10/28 16:01:51 [error] 23#23: *1 no live upstreams while connecting to upstream, client: fd4d:6169:6c63:6f77::11, server: _, request: "HEAD /forwardinghosts.php HTTP/1.1", upstream: "fastcgi://phpfpm", host: "nginx"
nginx-mailcow_1 | 2019/10/28 16:01:51 [error] 23#23: *2 connect() failed (111: Connection refused) while connecting to upstream, client: fd4d:6169:6c63:6f77::11, server: _, request: "HEAD /settings.php HTTP/1.1", upstream: "fastcgi://[fd4d:6169:6c63:6f77::e]:9001", host: "nginx"
nginx-mailcow_1 | fd4d:6169:6c63:6f77::11 - - [28/Oct/2019:16:01:51 +0100] "HEAD /forwardinghosts.php HTTP/1.1" 502 0 "-" "rspamd-2.0"
nginx-mailcow_1 | fd4d:6169:6c63:6f77::11 - - [28/Oct/2019:16:01:51 +0100] "HEAD /settings.php HTTP/1.1" 502 0 "-" "rspamd-2.0"
nginx-mailcow_1 | 2019/10/28 16:02:01 [error] 23#23: *6 connect() failed (111: Connection refused) while connecting to upstream, client: 80.187.114.193, server: mail.xslx.de, request: "GET / HTTP/1.1", upstream: "fastcgi://172.22.1.10:9002", host: "mail.xslx.de"
nginx-mailcow_1 | 2019/10/28 16:02:01 [error] 23#23: *6 connect() failed (111: Connection refused) while connecting to upstream, client: 80.187.114.193, server: mail.xslx.de, request: "GET / HTTP/1.1", upstream: "fastcgi://[fd4d:6169:6c63:6f77::e]:9002", host: "mail.xslx.de"
nginx-mailcow_1 | 80.187.114.193 - - [28/Oct/2019:16:02:01 +0100] "GET / HTTP/1.1" 502 1076 "-" "Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0"
nginx-mailcow_1 | 80.187.114.193 - - [28/Oct/2019:16:02:01 +0100] "GET /bower_components/bootstrap/dist/css/bootstrap.min.css HTTP/1.1" 404 146 "https://mail.xslx.de/" "Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0"
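The nginx errors above all say the same thing: the fastcgi upstream (the PHP-FPM container) is refusing connections, which is why the web interface returns 502. A minimal diagnostic sketch, assuming a standard mailcow-dockerized checkout in `/opt/mailcow-dockerized` (paths and service names may differ on your install):

```shell
# Sketch only: check why nginx gets "Connection refused" from the PHP upstream.
cd /opt/mailcow-dockerized

# Is the php-fpm container actually up, or restarting/crashed?
docker-compose ps php-fpm-mailcow

# Its recent logs usually show the cause (failed DB migration, config error, ...)
docker-compose logs --tail=100 php-fpm-mailcow

# After an update, a clean pull and restart of the whole stack often resolves
# containers stuck on an old image:
docker-compose pull
docker-compose down
docker-compose up -d
```

If `php-fpm-mailcow` keeps restarting, its log output is the piece of information the template asks for.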
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: lua module maps_stats is enabled but has not been configured
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: maps_stats disabling unconfigured lua module
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module metadata_exporter from /usr/share/rspamd/plugins/metadata_exporter.lua; digest: c0677c9de5
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module metric_exporter from /usr/share/rspamd/plugins/metric_exporter.lua; digest: c64704d4a8
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/maps.d/mid.inc' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/mid.inc' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module mid from /usr/share/rspamd/plugins/mid.lua; digest: eaabb03713
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; milter_headers.lua:648: active routines [spam-header,x-spamd-result,x-rspamd-queue-id,authentication-results,remove-spam-flag]
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module milter_headers from /usr/share/rspamd/plugins/milter_headers.lua; digest: a82ca8ab17
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/maps.d/mime_types.inc.local' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/var/lib/rspamd/mime_types.inc.local' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module mime_types from /usr/share/rspamd/plugins/mime_types.lua; digest: b65114b90b
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for multimap
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: freemail_envfrom (from)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: WHITELISTED_FWD_HOST (ip)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:121: reuse url for https://maps.rspamd.com/freemail/free.txt.zst(hash)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: freemail_from (header)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: BAD_WORDS_DE (content)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: disposable_envrcpt (rcpt)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: RCPT_MAILCOW_DOMAIN (rcpt)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:121: reuse url for https://maps.rspamd.com/freemail/free.txt.zst(hash)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: freemail_cc (header)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: BAD_WORDS (content)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:121: reuse url for https://maps.rspamd.com/freemail/free.txt.zst(hash)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: freemail_envrcpt (rcpt)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: RCPT_WANTS_SUBJECT_TAG (rcpt)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:121: reuse url for https://maps.rspamd.com/freemail/disposable.txt.zst(hash)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: disposable_to (header)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:121: reuse url for https://maps.rspamd.com/freemail/disposable.txt.zst(hash)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: disposable_replyto (header)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: LOCAL_BL_ASN (asn)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: GLOBAL_RCPT_WL (rcpt)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: SIEVE_HOST (ip)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: IP_WHITELIST (ip)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:121: reuse url for https://maps.rspamd.com/freemail/free.txt.zst(hash)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: freemail_replyto (header)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:121: reuse url for https://maps.rspamd.com/freemail/disposable.txt.zst(hash)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: disposable_from (header)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: GLOBAL_SMTP_FROM_BL (from)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: GLOBAL_SMTP_FROM_WL (from)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: GLOBAL_RCPT_BL (rcpt)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: RCPT_WANTS_SUBFOLDER_TAG (rcpt)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:121: reuse url for https://maps.rspamd.com/freemail/free.txt.zst(hash)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: freemail_to (header)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: GLOBAL_MIME_FROM_BL (header)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:121: reuse url for https://maps.rspamd.com/freemail/disposable.txt.zst(hash)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: disposable_cc (header)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: GLOBAL_MIME_FROM_WL (header)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:121: reuse url for https://maps.rspamd.com/freemail/disposable.txt.zst(hash)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: disposable_envfrom (from)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: MAILCOW_DOMAIN_HEADER_FROM (header)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; multimap.lua:1239: added multimap rule: FISHY_TLD (from)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module multimap from /usr/share/rspamd/plugins/multimap.lua; digest: 91838770d0
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for mx_check
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module mx_check from /usr/share/rspamd/plugins/mx_check.lua; digest: f7aad7447d
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for neural
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; neural.lua:1288: register ann rule LONG
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; neural.lua:1288: register ann rule SHORT
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module neural from /usr/share/rspamd/plugins/neural.lua; digest: 4d962bb6b4
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module once_received from /usr/share/rspamd/plugins/once_received.lua; digest: 81f834fec1
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: lua module p0f is disabled in the configuration
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/maps.d/redirectors.inc' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/redirectors.inc' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module phishing from /usr/share/rspamd/plugins/phishing.lua; digest: 64bb5243ed
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:772: old syntax for ratelimits: 100 / 1s
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:324: old style rate bucket config detected for to_ip: 100 / 1s
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:772: old syntax for ratelimits: 100 / 1s
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:324: old style rate bucket config detected for to: 100 / 1s
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:772: old syntax for ratelimits: 100 / 1s
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:324: old style rate bucket config detected for bounce_to: 100 / 1s
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:772: old syntax for ratelimits: 100 / 1s
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:324: old style rate bucket config detected for to_ip_from: 100 / 1s
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:772: old syntax for ratelimits: 100 / 1s
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:324: old style rate bucket config detected for bounce_to_ip: 100 / 1s
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:785: enabled ratelimit: to_ip [100 msgs burst, 100 msgs/sec rate]
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:785: enabled ratelimit: to [100 msgs burst, 100 msgs/sec rate]
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:785: enabled ratelimit: bounce_to [100 msgs burst, 100 msgs/sec rate]
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:785: enabled ratelimit: to_ip_from [100 msgs burst, 100 msgs/sec rate]
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; ratelimit.lua:785: enabled ratelimit: bounce_to_ip [100 msgs burst, 100 msgs/sec rate]
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for ratelimit
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module ratelimit from /usr/share/rspamd/plugins/ratelimit.lua; digest: 3a6baac778
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/maps.d/surbl-whitelist.inc.local' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/var/lib/rspamd/surbl-whitelist.inc.local' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:862: added URL whitelist for RBL MSBL_EBL
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule MSBL_EBL: checks: alive,replyto
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule DWL_DNSWL: checks: alive,user,local,dkim,ip
netfilter-mailcow_1 | Clearing all bans
netfilter-mailcow_1 | Initializing mailcow netfilter chain
netfilter-mailcow_1 | Watching Redis channel F2B_CHANNEL
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule RBL_UCEPROTECT_LEVEL2: checks: alive,user,local,ip
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule MAILSPIKE: checks: alive,user,local,ip
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule RBL_SEM: checks: alive,user,local,ip
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:143: reuse url for complex map definition p1tsamp7: RBL url whitelist for SEM_URIBL_UNKNOWN
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:862: added URL whitelist for RBL SEM_URIBL_UNKNOWN
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule SEM_URIBL_UNKNOWN: checks: alive,dkim,emails,urls
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule RBL_UCEPROTECT_LEVEL1: checks: alive,user,local,ip
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:143: reuse url for complex map definition p1tsamp7: RBL url whitelist for URIBL_MULTI
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:862: added URL whitelist for RBL URIBL_MULTI
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule URIBL_MULTI: checks: alive,dkim,emails,urls
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule SPAMHAUS: checks: alive,user,local,ip,received
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule RBL_VIRUSFREE_UNKNOWN: checks: alive,user,local,ip
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule RBL_NIXSPAM: checks: alive,user,local,ip
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule BLOCKLISTDE: checks: alive,user,local,ip,received
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule RBL_SENDERSCORE: checks: alive,user,local,ip
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:143: reuse url for complex map definition p1tsamp7: RBL url whitelist for RSPAMD_URIBL
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:862: added URL whitelist for RBL RSPAMD_URIBL
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule RSPAMD_URIBL: checks: alive,dkim,emails,urls
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule RBL_SEM_IPV6: checks: alive,user,local,ip
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:143: reuse url for complex map definition p1tsamp7: RBL url whitelist for SURBL_MULTI
postfix-mailcow_1 | 2019-10-28 16:01:30,116 INFO Set uid to user 0 succeeded
postfix-mailcow_1 | 2019-10-28 16:01:30,120 INFO supervisord started with pid 1
postfix-mailcow_1 | 2019-10-28 16:01:31,124 INFO spawned: 'processes' with pid 8
postfix-mailcow_1 | 2019-10-28 16:01:31,127 INFO spawned: 'postfix' with pid 9
postfix-mailcow_1 | 2019-10-28 16:01:31,142 INFO spawned: 'syslog-ng' with pid 10
postfix-mailcow_1 | Oct 28 16:01:31 mail syslog-ng[10]: syslog-ng starting up; version='3.19.1'
postfix-mailcow_1 | 2019-10-28 16:01:32,382 INFO success: processes entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:862: added URL whitelist for RBL SURBL_MULTI
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule SURBL_MULTI: checks: alive,dkim,emails,urls
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule RCVD_IN_DNSWL: checks: alive,user,local,ip
postfix-mailcow_1 | 2019-10-28 16:01:32,382 INFO success: postfix entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
postfix-mailcow_1 | 2019-10-28 16:01:32,382 INFO success: syslog-ng entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:143: reuse url for complex map definition p1tsamp7: RBL url whitelist for RSPAMD_EMAILBL
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:862: added URL whitelist for RBL RSPAMD_EMAILBL
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule RSPAMD_EMAILBL: checks: alive,replyto
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:143: reuse url for complex map definition p1tsamp7: RBL url whitelist for DBL
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:862: added URL whitelist for RBL DBL
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule DBL: checks: alive,dkim,emails,urls
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_maps.lua:143: reuse url for complex map definition p1tsamp7: RBL url whitelist for SEM_URIBL_FRESH15_UNKNOWN
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:862: added URL whitelist for RBL SEM_URIBL_FRESH15_UNKNOWN
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; rbl.lua:903: added rbl rule SEM_URIBL_FRESH15_UNKNOWN: checks: alive,dkim,emails,urls
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module rbl from /usr/share/rspamd/plugins/rbl.lua; digest: 7df28a9498
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for replies
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module replies from /usr/share/rspamd/plugins/replies.lua; digest: 8b3141b345
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for reputation
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module reputation from /usr/share/rspamd/plugins/reputation.lua; digest: 5b579e9bee
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: lua module rspamd_update is disabled in the configuration
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main)
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module settings from /usr/share/rspamd/plugins/settings.lua; digest: fe67219523
rspamd-mailcow_1 | 2019-10-28 16:01:46 #1(main) <6dsnjn>; lua; spamassassin.lua:1644: loading SA rules from /etc/rspamd/custom/sa-rules
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; lua; spamassassin.lua:1595: loaded 0 freemail domains definitions
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; lua; spamassassin.lua:1598: loaded 0 blacklist/whitelist elements
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module spamassassin from /usr/share/rspamd/plugins/spamassassin.lua; digest: e52bcd0f41
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_config_is_module_enabled: lua module spamtrap is disabled in the configuration
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; lua; trie.lua:167: no tries defined
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module trie from /usr/share/rspamd/plugins/trie.lua; digest: 21c3eed3ab
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; lua; lua_redis.lua:540: use default Redis settings for url_redirector
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; lua; url_redirector.lua:339: no redirector_hosts_map option is specified, disabling module
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module url_redirector from /usr/share/rspamd/plugins/url_redirector.lua; digest: 4ce40418ff
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/maps.d/spf_dkim_whitelist.inc.local' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/var/lib/rspamd/spf_dkim_whitelist.inc.local' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/maps.d/dmarc_whitelist.inc.local' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/var/lib/rspamd/dmarc_whitelist.inc.local' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/maps.d/dkim_whitelist.inc.local' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/var/lib/rspamd/dkim_whitelist.inc.local' is not found, but it can be loaded automatically later
dockerapi-mailcow_1 | * Serving Flask app "dockerapi" (lazy loading)
dockerapi-mailcow_1 | * Environment: production
dockerapi-mailcow_1 | WARNING: This is a development server. Do not use it in a production deployment.
dockerapi-mailcow_1 | Use a production WSGI server instead.
dockerapi-mailcow_1 | * Debug mode: off
dockerapi-mailcow_1 | * Running on https://0.0.0.0:443/ (Press CTRL+C to quit)
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/maps.d/dkim_whitelist.inc' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/local.d/maps.d/spf_whitelist.inc.local' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/var/lib/rspamd/spf_whitelist.inc.local' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/etc/rspamd/maps.d/spf_whitelist.inc' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_init_lua_filters: init lua module whitelist from /usr/share/rspamd/plugins/whitelist.lua; digest: 19cd25e815
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) rspamd_url_init: initialized 9111 url match suffixes from '/usr/share/rspamd/effective_tld_names.dat'
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; cfg; rspamd_map_parse_backend: map '/var/lib/rspamd/rspamd_dynamic' is not found, but it can be loaded automatically later
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; symcache; rspamd_symcache_process_dep: cannot find dependency on symbol MAILCOW_WHITE for symbol LOCAL_BL_ASN
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <483af3>; re_cache; rspamd_re_cache_init: loaded hyperscan engine with cpu tune 'ivy' and features ''
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <84c5si>; map; read_map_file: /etc/rspamd/local.d/maps.d/spf_whitelist.inc.local: map file is not found; it will be read automatically if created
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <84c5si>; map; read_map_file: /var/lib/rspamd/spf_whitelist.inc.local: map file is not found; it will be read automatically if created
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <8m79ek>; map; read_map_file: /etc/rspamd/local.d/maps.d/dkim_whitelist.inc.local: map file is not found; it will be read automatically if created
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <8m79ek>; map; read_map_file: /var/lib/rspamd/dkim_whitelist.inc.local: map file is not found; it will be read automatically if created
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <6dsnjn>; lua; settings.lua:988: loaded 3 elements of settings
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <48y7dd>; map; read_map_file: /etc/rspamd/local.d/maps.d/redirectors.inc: map file is not found; it will be read automatically if created
rspamd-mailcow_1 | 2019-10-28 16:01:49 #1(main) <48y7dd>; map; read_map_file: /etc/rspamd/local.d/redirectors.inc: map file is not found; it will be read automatically if created
watchdog-mailcow_1 | Waiting for containers to settle...
watchdog-mailcow_1 | Mon Oct 28 16:02:01 CET 2019 - USE_WATCHDOG=n, skipping watchdog...
olefy-mailcow_1 | olefy INFO
sogo-mailcow_1 | 2019-10-28 16:01:30,853 CRIT Set uid to user 0
sogo-mailcow_1 | 2019-10-28 16:01:30,861 INFO supervisord started with pid 1
sogo-mailcow_1 | 2019-10-28 16:01:31,866 INFO spawned: 'processes' with pid 8
sogo-mailcow_1 | 2019-10-28 16:01:31,879 INFO spawned: 'syslog-ng' with pid 9
sogo-mailcow_1 | 2019-10-28 16:01:31,894 INFO spawned: 'cron' with pid 10
sogo-mailcow_1 | 2019-10-28 16:01:31,946 INFO spawned: 'bootstrap-sogo' with pid 11
sogo-mailcow_1 | Oct 28 16:01:32 40ae6ca240f4 syslog-ng[9]: syslog-ng starting up; version='3.8.1'
sogo-mailcow_1 | 2019-10-28 16:01:33,020 INFO success: processes entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
sogo-mailcow_1 | 2019-10-28 16:01:33,020 INFO success: syslog-ng entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
sogo-mailcow_1 | 2019-10-28 16:01:33,020 INFO success: cron entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
sogo-mailcow_1 | 2019-10-28 16:01:33,020 INFO success: bootstrap-sogo entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <7n3qic>; map; read_map_file_chunks: /var/lib/rspamd/cb5a8189726ac3a8880e1c44ed6220c6d794521b.map: read map chunk, 657 bytes
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <7n3qic>; map; rspamd_map_read_http_cached_file: read cached data for https://maps.rspamd.com/rspamd/mid.inc.zst from /var/lib/rspamd/cb5a8189726ac3a8880e1c44ed6220c6d794521b.map, 657 bytes
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <7n3qic>; map; read_map_file: /etc/rspamd/local.d/maps.d/mid.inc: map file is not found; it will be read automatically if created
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <7n3qic>; map; read_map_file: /etc/rspamd/local.d/mid.inc: map file is not found; it will be read automatically if created
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <7n3qic>; map; rspamd_kv_list_fin: read hash of 23 elements
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <4dego7>; map; read_map_static: static: read map data, 180 bytes
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <4dego7>; map; rspamd_regexp_list_fin: read regexp list of 2 elements
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <4dego7>; map; read_map_static: static: read map data, 2466 bytes
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <4dego7>; map; rspamd_regexp_list_fin: read regexp list of 104 elements
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <4dego7>; map; read_map_static: static: read map data, 452 bytes
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <4dego7>; map; rspamd_regexp_list_fin: read regexp list of 24 elements
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <4dego7>; map; read_map_static: static: read map data, 47 bytes
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <4dego7>; map; rspamd_regexp_list_fin: read regexp list of 2 elements
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <4dego7>; map; read_map_static: static: read map data, 10 bytes
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <4dego7>; map; rspamd_radix_fin: read radix trie of 1 elements: ents=1 dup=0 tbm=0 lc=1 mem=1k free=0 waste=0
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; main: rspamd 2.0 is starting, build id: release
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; main: cpu features: avx, sse2, sse3, ssse3, sse4.1, sse4.2, rdrand
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; main: cryptobox configuration: curve25519(libsodium), chacha20(avx), poly1305(libsodium), siphash(libsodium), blake2(libsodium), base64(sse42)
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; main: libottery prf: AES-128
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; main: skip writing pid in no-fork mode
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; main: event loop initialised with backend: epoll
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; rspamd_fork_worker: prepare to fork process fuzzy (0); listen on: *
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; rspamd_fork_worker: prepare to fork process fuzzy (1); listen on: *
rspamd-mailcow_1 | 2019-10-28 16:01:50 #21(fuzzy) <34db67>; main; rspamd_worker_set_limits: use system max file descriptors limit: 1024KiB cur and 1024KiB max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #21(fuzzy) <34db67>; main; rspamd_worker_set_limits: use system max core size limit: -1B cur and -1B max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; rspamd_fork_worker: prepare to fork process rspamd_proxy (0); listen on: rspamd
rspamd-mailcow_1 | 2019-10-28 16:01:50 #22(fuzzy) <34db67>; main; rspamd_worker_set_limits: use system max file descriptors limit: 1024KiB cur and 1024KiB max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #22(fuzzy) <34db67>; main; rspamd_worker_set_limits: use system max core size limit: -1B cur and -1B max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; rspamd_fork_worker: prepare to fork process controller (0); listen on: /var/lib/rspamd/rspamd.sock mode=0666 owner=nobody
rspamd-mailcow_1 | 2019-10-28 16:01:50 #23(rspamd_proxy) <34db67>; main; rspamd_worker_set_limits: use system max file descriptors limit: 1024KiB cur and 1024KiB max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #23(rspamd_proxy) <34db67>; main; rspamd_worker_set_limits: use system max core size limit: -1B cur and -1B max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; rspamd_fork_worker: prepare to fork process normal (0); listen on: *
rspamd-mailcow_1 | 2019-10-28 16:01:50 #24(controller) <34db67>; main; rspamd_worker_set_limits: use system max file descriptors limit: 1024KiB cur and 1024KiB max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #24(controller) <34db67>; main; rspamd_worker_set_limits: use system max core size limit: -1B cur and -1B max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #24(controller) <6dsnjn>; controller; rspamd_controller_password_sane: your normal password is not encrypted, we strongly recommend to replace it with the encrypted one
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; rspamd_fork_worker: prepare to fork process normal (1); listen on: *
rspamd-mailcow_1 | 2019-10-28 16:01:50 #25(normal) <34db67>; main; rspamd_worker_set_limits: use system max file descriptors limit: 1024KiB cur and 1024KiB max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #25(normal) <34db67>; main; rspamd_worker_set_limits: use system max core size limit: -1B cur and -1B max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; rspamd_fork_worker: prepare to fork process normal (2); listen on: *
rspamd-mailcow_1 | 2019-10-28 16:01:50 #26(normal) <34db67>; main; rspamd_worker_set_limits: use system max file descriptors limit: 1024KiB cur and 1024KiB max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #26(normal) <34db67>; main; rspamd_worker_set_limits: use system max core size limit: -1B cur and -1B max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; rspamd_fork_worker: prepare to fork process normal (3); listen on: *
rspamd-mailcow_1 | 2019-10-28 16:01:50 #27(normal) <34db67>; main; rspamd_worker_set_limits: use system max file descriptors limit: 1024KiB cur and 1024KiB max
solr-mailcow_1 | Starting Solr 7.7.2
solr-mailcow_1 | OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
solr-mailcow_1 | 2019-10-28 15:01:35.226 INFO (main) [ ] o.e.j.u.log Logging initialized @3846ms to org.eclipse.jetty.util.log.Slf4jLog
solr-mailcow_1 | 2019-10-28 15:01:35.736 WARN (main) [ ] o.e.j.s.AbstractConnector Ignoring deprecated socket close linger time
solr-mailcow_1 | 2019-10-28 15:01:35.768 INFO (main) [ ] o.e.j.s.Server jetty-9.4.14.v20181114; built: 2018-11-14T21:20:31.478Z; git: c4550056e785fb5665914545889f21dc136ad9e6; jvm 11.0.4+11
solr-mailcow_1 | 2019-10-28 15:01:35.889 INFO (main) [ ] o.e.j.d.p.ScanningAppProvider Deployment monitor [file:///opt/solr/server/contexts/] at interval 0
solr-mailcow_1 | 2019-10-28 15:01:36.595 INFO (main) [ ] o.e.j.w.StandardDescriptorProcessor NO JSP Support for /solr, did not find org.apache.jasper.servlet.JspServlet
solr-mailcow_1 | 2019-10-28 15:01:36.630 INFO (main) [ ] o.e.j.s.session DefaultSessionIdManager workerName=node0
solr-mailcow_1 | 2019-10-28 15:01:36.631 INFO (main) [ ] o.e.j.s.session No SessionScavenger set, using defaults
solr-mailcow_1 | 2019-10-28 15:01:36.652 INFO (main) [ ] o.e.j.s.session node0 Scavenging every 600000ms
solr-mailcow_1 | 2019-10-28 15:01:36.848 INFO (main) [ ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
solr-mailcow_1 | 2019-10-28 15:01:36.859 INFO (main) [ ] o.a.s.s.SolrDispatchFilter ___ _ Welcome to Apache Solr™ version 7.7.2
solr-mailcow_1 | 2019-10-28 15:01:36.860 INFO (main) [ ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _ Starting in standalone mode on port 8983
redis-mailcow_1 | 1:C 28 Oct 2019 16:01:29.029 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
redis-mailcow_1 | 1:C 28 Oct 2019 16:01:29.029 # Redis version=5.0.6, bits=64, commit=00000000, modified=0, pid=1, just started
redis-mailcow_1 | 1:C 28 Oct 2019 16:01:29.029 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
redis-mailcow_1 | 1:M 28 Oct 2019 16:01:29.042 * Running mode=standalone, port=6379.
redis-mailcow_1 | 1:M 28 Oct 2019 16:01:29.043 # WARNING: The TCP backlog setting of 511 cannot be enforced because /proc/sys/net/core/somaxconn is set to the lower value of 128.
redis-mailcow_1 | 1:M 28 Oct 2019 16:01:29.043 # Server initialized
redis-mailcow_1 | 1:M 28 Oct 2019 16:01:29.043 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
redis-mailcow_1 | 1:M 28 Oct 2019 16:01:29.043 # WARNING you have Transparent Huge Pages (THP) support enabled in your kernel. This will create latency and memory usage issues with Redis. To fix this issue run the command 'echo never > /sys/kernel/mm/transparent_hugepage/enabled' as root, and add it to your /etc/rc.local in order to retain the setting after a reboot. Redis must be restarted after THP is disabled.
redis-mailcow_1 | 1:M 28 Oct 2019 16:01:29.123 * DB loaded from disk: 0.080 seconds
redis-mailcow_1 | 1:M 28 Oct 2019 16:01:29.123 * Ready to accept connections
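Side note on the Redis block above: the somaxconn, overcommit_memory, and THP messages are host-kernel tuning warnings, not fatal errors — Redis itself reaches "Ready to accept connections", so they are unrelated to the unreachable web interface. If you want to apply the fixes Redis suggests in its own messages, they go on the Docker host, not inside the container; a sketch (run as root, paths per the warnings themselves):

```shell
# On the Docker host, as root (not inside the redis container)
sysctl -w vm.overcommit_memory=1
echo never > /sys/kernel/mm/transparent_hugepage/enabled

# Persist the sysctl setting across reboots, as the warning recommends
echo 'vm.overcommit_memory = 1' >> /etc/sysctl.conf
```

Purely optional tuning; it will not bring the UI back.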
solr-mailcow_1 | 2019-10-28 15:01:36.861 INFO (main) [ ] o.a.s.s.SolrDispatchFilter __ / _ \ | '_| Install dir: /opt/solr
solr-mailcow_1 | 2019-10-28 15:01:36.861 INFO (main) [ ] o.a.s.s.SolrDispatchFilter |___/___/_|_| Start time: 2019-10-28T15:01:36.861737Z
solr-mailcow_1 | 2019-10-28 15:01:37.024 INFO (main) [ ] o.a.s.c.SolrResourceLoader Using system property solr.solr.home: /opt/solr/server/solr
solr-mailcow_1 | 2019-10-28 15:01:37.055 INFO (main) [ ] o.a.s.c.SolrXmlConfig Loading container configuration from /opt/solr/server/solr/solr.xml
solr-mailcow_1 | 2019-10-28 15:01:37.269 INFO (main) [ ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@463fd068, but no JMX reporters were configured - adding default JMX reporter.
solr-mailcow_1 | 2019-10-28 15:01:38.828 INFO (main) [ ] o.a.s.c.SolrResourceLoader [null] Added 0 libs to classloader, from paths: []
solr-mailcow_1 | 2019-10-28 15:01:39.961 INFO (main) [ ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=true]
solr-mailcow_1 | 2019-10-28 15:01:41.218 INFO (main) [ ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 2147483647 transient cores
solr-mailcow_1 | 2019-10-28 15:01:41.227 INFO (main) [ ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
solr-mailcow_1 | 2019-10-28 15:01:41.488 INFO (main) [ ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@463fd068
solr-mailcow_1 | 2019-10-28 15:01:41.490 INFO (main) [ ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@463fd068
solr-mailcow_1 | 2019-10-28 15:01:41.521 INFO (main) [ ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@463fd068
solr-mailcow_1 | 2019-10-28 15:01:41.630 INFO (main) [ ] o.a.s.c.CorePropertiesLocator Found 1 core definitions underneath /opt/solr/server/solr
solr-mailcow_1 | 2019-10-28 15:01:41.632 INFO (main) [ ] o.a.s.c.CorePropertiesLocator Cores are: [dovecot-fts]
solr-mailcow_1 | 2019-10-28 15:01:41.835 INFO (coreLoadExecutor-9-thread-1) [ x:dovecot-fts] o.a.s.c.SolrResourceLoader [dovecot-fts] Added 58 libs to classloader, from paths: [/opt/solr/contrib/clustering/lib, /opt/solr/contrib/extraction/lib, /opt/solr/contrib/langid/lib, /opt/solr/contrib/velocity/lib, /opt/solr/dist]
solr-mailcow_1 | 2019-10-28 15:01:41.948 INFO (main) [ ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@2f058b8a{/solr,file:///opt/solr/server/solr-webapp/webapp/,AVAILABLE}{/opt/solr/server/solr-webapp/webapp}
solr-mailcow_1 | 2019-10-28 15:01:41.993 INFO (main) [ ] o.e.j.s.AbstractConnector Started ServerConnector@40e37b06{HTTP/1.1,[http/1.1]}{0.0.0.0:8983}
solr-mailcow_1 | 2019-10-28 15:01:41.995 INFO (main) [ ] o.e.j.s.Server Started @10622ms
solr-mailcow_1 | 2019-10-28 15:01:42.074 INFO (coreLoadExecutor-9-thread-1) [ x:dovecot-fts] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.7.0
solr-mailcow_1 | 2019-10-28 15:01:42.381 INFO (coreLoadExecutor-9-thread-1) [ x:dovecot-fts] o.a.s.s.IndexSchema [dovecot-fts] Schema name=dovecot-fts
solr-mailcow_1 | 2019-10-28 15:01:42.797 INFO (coreLoadExecutor-9-thread-1) [ x:dovecot-fts] o.a.s.s.IndexSchema Loaded schema dovecot-fts/2.0 with uniqueid field id
solr-mailcow_1 | 2019-10-28 15:01:42.818 INFO (coreLoadExecutor-9-thread-1) [ x:dovecot-fts] o.a.s.c.CoreContainer Creating SolrCore 'dovecot-fts' using configuration from instancedir /opt/solr/server/solr/dovecot-fts, trusted=true
solr-mailcow_1 | 2019-10-28 15:01:42.921 INFO (coreLoadExecutor-9-thread-1) [ x:dovecot-fts] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.dovecot-fts' (registry 'solr.core.dovecot-fts') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@463fd068
solr-mailcow_1 | 2019-10-28 15:01:42.993 INFO (coreLoadExecutor-9-thread-1) [ x:dovecot-fts] o.a.s.c.SolrCore [[dovecot-fts] ] Opening new SolrCore at [/opt/solr/server/solr/dovecot-fts], dataDir=[/opt/solr/server/solr/dovecot-fts/data/]
solr-mailcow_1 | 2019-10-28 15:01:43.089 ERROR (coreLoadExecutor-9-thread-1) [ x:dovecot-fts] o.a.s.c.SolrCore [dovecot-fts] Solr index directory '/opt/solr/server/solr/dovecot-fts/data/index/' is locked (lockType=native). Throwing exception.
solr-mailcow_1 | 2019-10-28 15:01:43.091 INFO (coreLoadExecutor-9-thread-1) [ x:dovecot-fts] o.a.s.c.SolrCore [dovecot-fts] CLOSING SolrCore org.apache.solr.core.SolrCore@37d131d0
solr-mailcow_1 | 2019-10-28 15:01:43.092 INFO (coreLoadExecutor-9-thread-1) [ x:dovecot-fts] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.dovecot-fts, tag=37d131d0
solr-mailcow_1 | 2019-10-28 15:01:43.095 INFO (coreLoadExecutor-9-thread-1) [ x:dovecot-fts] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@43619847: rootName = null, domain = solr.core.dovecot-fts, service url = null, agent id = null] for registry solr.core.dovecot-fts / com.codahale.metrics.MetricRegistry@62608ee7
solr-mailcow_1 | 2019-10-28 15:01:43.132 ERROR (coreContainerWorkExecutor-2-thread-1) [ ] o.a.s.c.CoreContainer Error waiting for SolrCore to be loaded on startup
solr-mailcow_1 | org.apache.solr.common.SolrException: Unable to create core [dovecot-fts]
solr-mailcow_1 | at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1208) ~[solr-core-7.7.2.jar:7.7.2 d4c30fc2856154f2c1fefc589eb7cd070a415b94 - janhoy - 2019-05-28 23:37:48]
solr-mailcow_1 | at org.apache.solr.core.CoreContainer.lambda$load$13(CoreContainer.java:699) ~[solr-core-7.7.2.jar:7.7.2 d4c30fc2856154f2c1fefc589eb7cd070a415b94 - janhoy - 2019-05-28 23:37:48]
solr-mailcow_1 | at com.codahale.metrics.InstrumentedExecutorService$InstrumentedCallable.call(InstrumentedExecutorService.java:197) ~[metrics-core-3.2.6.jar:3.2.6]
solr-mailcow_1 | at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
unbound-mailcow_1 | Setting console permissions...
unbound-mailcow_1 | Receiving anchor key...
unbound-mailcow_1 | Receiving root hints...
unbound-mailcow_1 | #=#=# ##O#- # ##O=# # ######################################################################## 100.0%
unbound-mailcow_1 | setup in directory /etc/unbound
unbound-mailcow_1 | generating unbound_server.key
unbound-mailcow_1 | Generating RSA private key, 3072 bit long modulus (2 primes)
unbound-mailcow_1 | ................................++++
solr-mailcow_1 | at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209) [solr-solrj-7.7.2.jar:7.7.2 d4c30fc2856154f2c1fefc589eb7cd070a415b94 - janhoy - 2019-05-28 23:37:52]
solr-mailcow_1 | at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
solr-mailcow_1 | at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
solr-mailcow_1 | at java.lang.Thread.run(Thread.java:834) [?:?]
solr-mailcow_1 | Caused by: org.apache.solr.common.SolrException: Index dir '/opt/solr/server/solr/dovecot-fts/data/index/' of core 'dovecot-fts' is already locked. The most likely cause is another Solr server (or another solr core in this server) also configured to use this directory; other possible causes may be specific to lockType: native
solr-mailcow_1 | at org.apache.solr.core.SolrCore.
solr-mailcow_1 | at org.apache.solr.core.SolrCore.
solr-mailcow_1 | at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1187) ~[solr-core-7.7.2.jar:7.7.2 d4c30fc2856154f2c1fefc589eb7cd070a415b94 - janhoy - 2019-05-28 23:37:48]
solr-mailcow_1 | ... 7 more
solr-mailcow_1 | Caused by: org.apache.lucene.store.LockObtainFailedException: Index dir '/opt/solr/server/solr/dovecot-fts/data/index/' of core 'dovecot-fts' is already locked. The most likely cause is another Solr server (or another solr core in this server) also configured to use this directory; other possible causes may be specific to lockType: native
solr-mailcow_1 | at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:754) ~[solr-core-7.7.2.jar:7.7.2 d4c30fc2856154f2c1fefc589eb7cd070a415b94 - janhoy - 2019-05-28 23:37:48]
solr-mailcow_1 | at org.apache.solr.core.SolrCore.
solr-mailcow_1 | at org.apache.solr.core.SolrCore.
solr-mailcow_1 | at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1187) ~[solr-core-7.7.2.jar:7.7.2 d4c30fc2856154f2c1fefc589eb7cd070a415b94 - janhoy - 2019-05-28 23:37:48]
solr-mailcow_1 | ... 7 more
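The Solr trace above is the first real error in this log: the dovecot-fts index directory is reported as already locked (`lockType=native`), and the exception itself names the most likely cause — another Solr process (e.g. a leftover container from before the update) still holding the same data volume. A rough way to check for duplicates, assuming a standard docker-compose mailcow install (the directory path is an assumption; substitute your own):

```shell
# List all containers, including stopped/duplicate ones, for the solr service
docker ps -a | grep -i solr

# If more than one shows up, stop the whole stack so the lock is released,
# then start it again from the mailcow directory
cd /opt/mailcow-dockerized   # assumed install path; adjust to yours
docker-compose down
docker-compose up -d
```

If the lock survives a clean restart, a stale lock file inside the solr data volume would be the next thing to look at.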
unbound-mailcow_1 | ...............................................................................................................................++++
unbound-mailcow_1 | e is 65537 (0x010001)
unbound-mailcow_1 | generating unbound_control.key
unbound-mailcow_1 | Generating RSA private key, 3072 bit long modulus (2 primes)
unbound-mailcow_1 | .........................................................++++
unbound-mailcow_1 | ....................................++++
unbound-mailcow_1 | e is 65537 (0x010001)
rspamd-mailcow_1 | 2019-10-28 16:01:50 #27(normal) <34db67>; main; rspamd_worker_set_limits: use system max core size limit: -1B cur and -1B max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; rspamd_fork_worker: prepare to fork process hs_helper (0), no bind socket
rspamd-mailcow_1 | 2019-10-28 16:01:50 #28(normal) <34db67>; main; rspamd_worker_set_limits: use system max file descriptors limit: 1024KiB cur and 1024KiB max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #28(normal) <34db67>; main; rspamd_worker_set_limits: use system max core size limit: -1B cur and -1B max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #1(main) <34db67>; main; main: listening for control commands on /var/lib/rspamd/rspamd.sock
rspamd-mailcow_1 | 2019-10-28 16:01:50 #29(hs_helper) <34db67>; main; rspamd_worker_set_limits: use system max file descriptors limit: 1024KiB cur and 1024KiB max
rspamd-mailcow_1 | 2019-10-28 16:01:50 #29(hs_helper) <34db67>; main; rspamd_worker_set_limits: use system max core size limit: -1B cur and -1B max
mysql-mailcow_1 | 2019-10-28 16:01:31 0 [Note] mysqld (mysqld 10.3.18-MariaDB-1:10.3.18+maria~bionic) starting as process 1 ...
mysql-mailcow_1 | 2019-10-28 16:01:31 0 [ERROR] mysqld: Can't lock aria control file '/var/lib/mysql/aria_log_control' for exclusive use, error: 11. Will retry for 30 seconds
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [ERROR] mysqld: Got error 'Could not get an exclusive lock; file is probably in use by another process' when trying to use aria control file '/var/lib/mysql/aria_log_control'
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [ERROR] Plugin 'Aria' init function returned error.
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [ERROR] Plugin 'Aria' registration as a STORAGE ENGINE failed.
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [Note] InnoDB: Using Linux native AIO
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [Note] InnoDB: Uses event mutexes
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [Note] InnoDB: Number of pools: 1
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [Note] InnoDB: Using SSE2 crc32 instructions
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [Note] InnoDB: Initializing buffer pool, total size = 256M, instances = 1, chunk size = 128M
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [Note] InnoDB: Completed initialization of buffer pool
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [Note] InnoDB: If the mysqld execution user is authorized, page cleaner thread priority can be changed. See the man page of setpriority().
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:02 0 [Note] InnoDB: Retrying to lock the first data file
mysql-mailcow_1 | 2019-10-28 16:02:03 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:03 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
unbound-mailcow_1 | create unbound_server.pem (self signed certificate)
unbound-mailcow_1 | create unbound_control.pem (signed client certificate)
unbound-mailcow_1 | Signature ok
unbound-mailcow_1 | subject=CN = unbound-control
unbound-mailcow_1 | Getting CA Private Key
unbound-mailcow_1 | Setup success. Certificates created. Enable in unbound.conf file to use
unbound-mailcow_1 | [1572274900] unbound[1:0] notice: init module 0: validator
unbound-mailcow_1 | [1572274900] unbound[1:0] notice: init module 1: iterator
unbound-mailcow_1 | [1572274900] unbound[1:0] info: start of service (unbound 1.9.1).
unbound-mailcow_1 | [1572274904] unbound[1:0] info: generate keytag query _ta-4f66. NULL IN
mysql-mailcow_1 | 2019-10-28 16:02:04 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:04 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:05 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:05 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:06 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:06 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:07 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:07 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:08 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:08 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:09 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:09 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:10 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:10 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:11 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:11 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:12 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:12 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:13 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:13 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:14 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:14 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:15 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:15 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:16 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:16 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:17 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:17 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:18 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:18 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:19 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:19 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:20 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:20 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:21 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:21 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:22 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:22 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:23 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:23 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:24 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:24 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
mysql-mailcow_1 | 2019-10-28 16:02:25 0 [ERROR] InnoDB: Unable to lock ./ibdata1 error: 11
mysql-mailcow_1 | 2019-10-28 16:02:25 0 [Note] InnoDB: Check that you do not already have another mysqld process using the same InnoDB data or log files.
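For what it's worth, the repeated "Unable to lock ./ibdata1 error: 11" lines above are EAGAIN: a second mysqld is requesting an exclusive lock that another process already holds, typically a leftover container still attached to the same /var/lib/mysql volume. A minimal sketch reproducing that errno with a plain lock file, assuming util-linux `flock` is available (the file name here is a throwaway, not mailcow's):

```shell
# First process takes an exclusive lock and holds it briefly,
# like a running mysqld holding ibdata1.
lockfile=$(mktemp)
flock -x "$lockfile" -c 'sleep 2' &
sleep 0.5

# Second, non-blocking attempt fails the same way the log shows:
# "resource temporarily unavailable" is errno 11 (EAGAIN).
if ! flock -n -x "$lockfile" -c 'true'; then
  echo "lock busy: resource temporarily unavailable (errno 11)"
fi

wait
rm -f "$lockfile"
```

So before assuming data corruption, it is worth checking `docker ps -a` for a stale or duplicate container still using the same MySQL volume.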
clamd-mailcow_1 | Cleaning up tmp files...
clamd-mailcow_1 | Copying non-empty whitelist.ign2 to /var/lib/clamav/whitelist.ign2
clamd-mailcow_1 | Stating whitelist.ign2
clamd-mailcow_1 | File: /var/lib/clamav/whitelist.ign2
clamd-mailcow_1 | Size: 27 Blocks: 8 IO Block: 4096 regular file
clamd-mailcow_1 | Device: fe01h/65025d Inode: 10017857 Links: 1
clamd-mailcow_1 | Access: (0644/-rw-r--r--) Uid: ( 700/ clamav) Gid: ( 700/ clamav)
clamd-mailcow_1 | Access: 2019-10-28 16:01:28.619821317 +0100
clamd-mailcow_1 | Modify: 2019-10-28 16:01:28.619821317 +0100
clamd-mailcow_1 | Change: 2019-10-28 16:01:28.631820215 +0100
clamd-mailcow_1 | Birth: -
clamd-mailcow_1 | dos2unix: converting file /var/lib/clamav/whitelist.ign2 to Unix format...
clamd-mailcow_1 | Running freshclam...
clamd-mailcow_1 | Mon Oct 28 16:01:28 2019 -> ClamAV update process started at Mon Oct 28 16:01:28 2019
clamd-mailcow_1 | Mon Oct 28 16:01:48 2019 -> ^Can't query current.cvd.clamav.net
clamd-mailcow_1 | Mon Oct 28 16:01:48 2019 -> ^Invalid DNS reply. Falling back to HTTP mode.
clamd-mailcow_1 | Mon Oct 28 16:02:00 2019 -> Downloading main.cvd [100%]
clamd-mailcow_1 | Mon Oct 28 16:02:17 2019 -> main.cvd updated (version: 58, sigs: 4566249, f-level: 60, builder: sigmgr)
I have the same problem. It looks like the mysql-mailcow container does not work correctly on a fresh install. My php-fpm logs look like this:
root@mailgw /opt/mailcow-dockerized # docker-compose logs --tail=20 -f php-fpm-mailcow
Attaching to mailcowdockerized_php-fpm-mailcow_1
php-fpm-mailcow_1 | Waiting for SQL...
php-fpm-mailcow_1 | Waiting for SQL...
php-fpm-mailcow_1 | Waiting for SQL...
php-fpm-mailcow_1 | Waiting for SQL...
php-fpm-mailcow_1 | Waiting for SQL...
php-fpm-mailcow_1 | Waiting for SQL...
php-fpm-mailcow_1 | Waiting for SQL...
php-fpm-mailcow_1 | Waiting for SQL...
php-fpm-mailcow_1 | Waiting for SQL...
php-fpm-mailcow_1 | Waiting for SQL...
php-fpm-mailcow_1 | Waiting for SQL...
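The endless "Waiting for SQL..." lines suggest a retry loop that never sees a reachable, authenticated database. This is not mailcow's actual startup script, just a generic sketch of that pattern; `db_ready` is a stand-in stub for the real connectivity check (something like `mysqladmin ping` with the configured credentials):

```shell
# Generic shape of a wait-for-db loop: retry until the check succeeds.
# With wrong or missing credentials, the real check never succeeds and
# the loop prints "Waiting for SQL..." forever.
attempts=0
db_ready() { [ "$attempts" -ge 3 ]; }   # stub: pretends the DB comes up on the 3rd try

until db_ready; do
  echo "Waiting for SQL..."
  attempts=$((attempts + 1))
done
echo "SQL is up"
```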
I tried to connect to the MySQL database using the credentials from mailcow.conf and failed. I tried a few more times with various credential combinations and determined that I can log into the MySQL console as the mailcow and root users, but only without passwords. The passwords set in mailcow.conf were not accepted.
This example shows how I was unable to log into the MySQL DB:
root@mailgw /opt/mailcow-dockerized # source mailcow.conf
root@mailgw /opt/mailcow-dockerized # docker-compose exec mysql-mailcow mysql -u${DBUSER} -p${DBPASS} ${DBNAME}
ERROR 1045 (28000): Access denied for user 'mailcow'@'localhost' (using password: YES)
And here is how I was able to log in:
➜ mailcow-dockerized git:(master) docker-compose exec mysql-mailcow mysql -
Welcome to the MariaDB monitor. Commands end with ; or \g.
Your MariaDB connection id is 293
Server version: 10.3.18-MariaDB-1:10.3.18+maria~bionic mariadb.org binary distribution
Copyright (c) 2000, 2018, Oracle, MariaDB Corporation Ab and others.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
MariaDB [(none)]> show databases;
+--------------------+
| Database |
+--------------------+
| information_schema |
| mysql |
| performance_schema |
| test |
+--------------------+
4 rows in set (0.001 sec)
MariaDB [(none)]> Bye
I decided to try again with a fresh mailcow install, and it happened again: after docker-compose pull and docker-compose up -d, all containers started, but the mysql-mailcow container did not have passwords configured for its users (hence php-fpm-mailcow was unable to connect to the database), and the database itself was not created.
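Before digging further, it can help to confirm that the credentials the containers are supposed to use are actually set in the conf file. A minimal sketch against a throwaway conf (the real one lives at /opt/mailcow-dockerized/mailcow.conf; the values here are made up):

```shell
# Throwaway stand-in for mailcow.conf, illustrative values only.
conf=$(mktemp)
printf 'DBNAME=mailcow\nDBUSER=mailcow\nDBPASS=s3cret\n' > "$conf"

# Source it the same way the troubleshooting commands above do.
. "$conf"

if [ -z "$DBPASS" ]; then
  echo "DBPASS is empty -- containers cannot authenticate"
else
  echo "DBPASS is set for user $DBUSER"
fi
rm -f "$conf"
```

If DBPASS is set here but logins still only work without a password, the grant tables inside the container were never initialized with those values.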
@SamuelNitsche, @andryyy Any suggestions for other logs that could be of use?
I found a solution: comment out skip-name-resolve in /opt/mailcow-dockerized/data/conf/mysql/my.cnf. When this option is on, MySQL/MariaDB checks user access rights only against rows whose host is given as an IP address. When running in Docker, the host will be a container ID (alphanumeric), so the DB will not use this user when checking access rights, which prevents any access to the DB (php-fpm).
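A sketch of that change, run here against a throwaway copy so it is safe to test (the real file is /opt/mailcow-dockerized/data/conf/mysql/my.cnf, and mysql-mailcow needs a restart afterwards; the other option in the sample file is illustrative):

```shell
# Throwaway stand-in for data/conf/mysql/my.cnf.
cnf=$(mktemp)
printf '[mysqld]\nskip-name-resolve\nmax_connections=350\n' > "$cnf"

# Comment the option out rather than deleting it, so it is easy to revert.
sed -i 's/^skip-name-resolve/#skip-name-resolve/' "$cnf"

grep -n 'skip-name-resolve' "$cnf"   # the line should now start with '#'
rm -f "$cnf"

# On the real install, follow up with:
#   docker-compose restart mysql-mailcow
```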
@Clasoheld does this help?
I have the same problem and tried your solution. This did not work for me.
I still have
php-fpm-mailcow_1 | Waiting for SQL...
in my logs.
We moved the tz import to php-fpm-mailcow. On "slow" disks it previously failed; this is a known bug in the MariaDB image.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.