After a period of inactivity (in my case about a day or so) the database connection seems to be closed. When I visit the front page I am presented with an error 500 (see screenshot).
I am using a Docker deployment with MySQL as the database.
docker stack deploy report --compose-file ./stack.yml
My stack file:
version: '3.6'
services:
  ghost:
    image: ghost
    ports:
      - 80:2368
    environment:
      url: https://my.url
      database__client: mysql
      database__connection__host: db
      database__connection__user: root
      database__connection__password: randomly_generated
      database__connection__database: ghost
    volumes:
      - ghost:/var/lib/ghost/content
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: randomly_generated
    volumes:
      - mysql:/var/lib/mysql
volumes:
  ghost:
  mysql:
Logs:
2018-06-19T07:54:31.620481474Z [2018-06-19 07:54:31] ERROR
2018-06-19T07:54:31.620518661Z
2018-06-19T07:54:31.620526193Z CODE: ECONNRESET
2018-06-19T07:54:31.620532453Z MESSAGE: select `clients`.* from `clients` where `clients`.`slug` = 'ghost-frontend' and `clients`.`type` = 'ua' limit 1 - read ECONNRESET
2018-06-19T07:54:31.620538871Z
2018-06-19T07:54:31.620544563Z Error: read ECONNRESET
2018-06-19T07:54:31.620550436Z at exports._errnoException (util.js:1020:11)
2018-06-19T07:54:31.620556321Z at TCP.onread (net.js:580:26)
2018-06-19T07:54:31.620562180Z --------------------
2018-06-19T07:54:31.620567996Z at Protocol._enqueue (/var/lib/ghost/versions/1.24.2/node_modules/mysql/lib/protocol/Protocol.js:145:48)
2018-06-19T07:54:31.620574107Z at Connection.query (/var/lib/ghost/versions/1.24.2/node_modules/mysql/lib/Connection.js:208:25)
2018-06-19T07:54:31.620580107Z at /var/lib/ghost/versions/1.24.2/node_modules/knex/lib/dialects/mysql/index.js:161:18
2018-06-19T07:54:31.620586080Z at Promise._execute (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/debuggability.js:303:9)
2018-06-19T07:54:31.620592130Z at Promise._resolveFromExecutor (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/promise.js:483:18)
2018-06-19T07:54:31.620598328Z at new Promise (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/promise.js:79:10)
2018-06-19T07:54:31.620604372Z at Client_MySQL._query (/var/lib/ghost/versions/1.24.2/node_modules/knex/lib/dialects/mysql/index.js:155:12)
2018-06-19T07:54:31.620610420Z at Client_MySQL.query (/var/lib/ghost/versions/1.24.2/node_modules/knex/lib/client.js:206:17)
2018-06-19T07:54:31.620616377Z at Runner.<anonymous> (/var/lib/ghost/versions/1.24.2/node_modules/knex/lib/runner.js:155:36)
2018-06-19T07:54:31.620622505Z at Runner.tryCatcher (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/util.js:16:23)
2018-06-19T07:54:31.620637647Z at Runner.query (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/method.js:15:34)
2018-06-19T07:54:31.620644199Z at /var/lib/ghost/versions/1.24.2/node_modules/knex/lib/runner.js:61:21
2018-06-19T07:54:31.620650077Z at tryCatcher (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/util.js:16:23)
2018-06-19T07:54:31.620655962Z at /var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/using.js:185:26
2018-06-19T07:54:31.620661835Z at tryCatcher (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/util.js:16:23)
2018-06-19T07:54:31.620667734Z at Promise._settlePromiseFromHandler (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/promise.js:512:31)
2018-06-19T07:54:31.620673672Z
2018-06-19T07:54:31.628986660Z ERROR [2018-06-19 07:54:31] "GET /" 500 13ms
2018-06-19T07:54:31.629074994Z
2018-06-19T07:54:31.629083165Z NAME: InternalServerError
2018-06-19T07:54:31.629089226Z CODE: ECONNRESET
2018-06-19T07:54:31.629095005Z MESSAGE: select count(distinct posts.id) as aggregate from `posts` where (`posts`.`status` = 'published') and (`posts`.`page` = false) - read ECONNRESET
2018-06-19T07:54:31.629101133Z
2018-06-19T07:54:31.629106596Z level:normal
2018-06-19T07:54:31.629112373Z
2018-06-19T07:54:31.629117931Z InternalServerError: select count(distinct posts.id) as aggregate from `posts` where (`posts`.`status` = 'published') and (`posts`.`page` = false) - read ECONNRESET
2018-06-19T07:54:31.629124042Z at new GhostError (/var/lib/ghost/versions/1.24.2/core/server/lib/common/errors.js:9:26)
2018-06-19T07:54:31.629129843Z at prepareError (/var/lib/ghost/versions/1.24.2/core/server/web/middleware/error-handler.js:39:19)
2018-06-19T07:54:31.629145234Z at Layer.handle_error (/var/lib/ghost/versions/1.24.2/node_modules/express/lib/router/layer.js:71:5)
2018-06-19T07:54:31.629152560Z at trim_prefix (/var/lib/ghost/versions/1.24.2/node_modules/express/lib/router/index.js:315:13)
2018-06-19T07:54:31.629158445Z at /var/lib/ghost/versions/1.24.2/node_modules/express/lib/router/index.js:284:7
2018-06-19T07:54:31.629164140Z at Function.process_params (/var/lib/ghost/versions/1.24.2/node_modules/express/lib/router/index.js:335:12)
2018-06-19T07:54:31.629169923Z at next (/var/lib/ghost/versions/1.24.2/node_modules/express/lib/router/index.js:275:10)
2018-06-19T07:54:31.629175671Z at Layer.handle_error (/var/lib/ghost/versions/1.24.2/node_modules/express/lib/router/layer.js:67:12)
2018-06-19T07:54:31.629214708Z at trim_prefix (/var/lib/ghost/versions/1.24.2/node_modules/express/lib/router/index.js:315:13)
2018-06-19T07:54:31.629223386Z at /var/lib/ghost/versions/1.24.2/node_modules/express/lib/router/index.js:284:7
2018-06-19T07:54:31.629229400Z at Function.process_params (/var/lib/ghost/versions/1.24.2/node_modules/express/lib/router/index.js:335:12)
2018-06-19T07:54:31.629240146Z at Immediate.next (/var/lib/ghost/versions/1.24.2/node_modules/express/lib/router/index.js:275:10)
2018-06-19T07:54:31.629246653Z at Immediate.<anonymous> (/var/lib/ghost/versions/1.24.2/node_modules/express/lib/router/index.js:635:15)
2018-06-19T07:54:31.629252729Z at runCallback (timers.js:674:20)
2018-06-19T07:54:31.629258373Z at tryOnImmediate (timers.js:645:5)
2018-06-19T07:54:31.629263989Z at processImmediate [as _immediateCallback] (timers.js:617:5)
2018-06-19T07:54:31.629269658Z
2018-06-19T07:54:31.629275063Z Error: read ECONNRESET
2018-06-19T07:54:31.629289926Z at exports._errnoException (util.js:1020:11)
2018-06-19T07:54:31.629297816Z at TCP.onread (net.js:580:26)
2018-06-19T07:54:31.629303669Z --------------------
2018-06-19T07:54:31.629309329Z at Protocol._enqueue (/var/lib/ghost/versions/1.24.2/node_modules/mysql/lib/protocol/Protocol.js:145:48)
2018-06-19T07:54:31.629315290Z at Connection.query (/var/lib/ghost/versions/1.24.2/node_modules/mysql/lib/Connection.js:208:25)
2018-06-19T07:54:31.629321062Z at /var/lib/ghost/versions/1.24.2/node_modules/knex/lib/dialects/mysql/index.js:161:18
2018-06-19T07:54:31.629326828Z at Promise._execute (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/debuggability.js:303:9)
2018-06-19T07:54:31.629332879Z at Promise._resolveFromExecutor (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/promise.js:483:18)
2018-06-19T07:54:31.629338756Z at new Promise (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/promise.js:79:10)
2018-06-19T07:54:31.629344616Z at Client_MySQL._query (/var/lib/ghost/versions/1.24.2/node_modules/knex/lib/dialects/mysql/index.js:155:12)
2018-06-19T07:54:31.629350571Z at Client_MySQL.query (/var/lib/ghost/versions/1.24.2/node_modules/knex/lib/client.js:206:17)
2018-06-19T07:54:31.629356402Z at Runner.<anonymous> (/var/lib/ghost/versions/1.24.2/node_modules/knex/lib/runner.js:155:36)
2018-06-19T07:54:31.629362380Z at Runner.tryCatcher (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/util.js:16:23)
2018-06-19T07:54:31.629368185Z at Runner.query (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/method.js:15:34)
2018-06-19T07:54:31.629374020Z at /var/lib/ghost/versions/1.24.2/node_modules/knex/lib/runner.js:61:21
2018-06-19T07:54:31.629379660Z at tryCatcher (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/util.js:16:23)
2018-06-19T07:54:31.629385388Z at /var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/using.js:185:26
2018-06-19T07:54:31.629391175Z at tryCatcher (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/util.js:16:23)
2018-06-19T07:54:31.629397077Z at Promise._settlePromiseFromHandler (/var/lib/ghost/versions/1.24.2/node_modules/bluebird/js/release/promise.js:512:31)
2018-06-19T07:54:31.629403105Z
2018-06-19T07:54:31.753664948Z INFO [2018-06-19 07:54:31] "GET /assets/built/screen.css?v=eee8f52bf2" 200 1ms
2018-06-19T07:54:31.903233465Z INFO [2018-06-19 07:54:31] "GET /favicon.ico" 302 6ms
2018-06-19T07:54:31.929420024Z INFO [2018-06-19 07:54:31] "GET /favicon.ico" 302 2ms
2018-06-19T07:54:55.043872223Z INFO [2018-06-19 07:54:55] "GET /" 200 36ms
2018-06-19T07:54:55.157143346Z INFO [2018-06-19 07:54:55] "GET /assets/built/screen.css?v=eee8f52bf2" 304 2ms
2018-06-19T07:54:55.169146742Z INFO [2018-06-19 07:54:55] "GET /public/ghost-sdk.min.js?v=eee8f52bf2" 200 0ms
2018-06-19T07:54:55.180905659Z INFO [2018-06-19 07:54:55] "GET /assets/js/jquery.fitvids.js?v=eee8f52bf2" 304 2ms
2018-06-19T07:54:55.182154748Z INFO [2018-06-19 07:54:55] "GET /assets/js/infinitescroll.js?v=eee8f52bf2" 304 3ms
2018-06-19T07:54:55.206722352Z INFO [2018-06-19 07:54:55] "GET /content/images/2018/06/graphic_400px.png" 200 3ms
2018-06-19T07:54:55.237869105Z INFO [2018-06-19 07:54:55] "GET /content/images/2018/06/zany-jadraque-571205-unsplash.jpg" 304 1ms
2018-06-19T07:58:25.211952872Z INFO [2018-06-19 07:58:25] "GET /content/images/2018/06/zany-jadraque-571205-unsplash.jpg" 200 18ms
Hey @ChappIO 👋 We ask that you please do not use GitHub for help or support; the default issue template pointed you to our forum for this type of question 😄 We use GitHub solely for bug-tracking and ongoing feature development, so we try to keep it noise-free.
Many questions can be answered by reviewing our docs for self-hosters, our theme API, or our public API. If you can't find an answer there, our forum is a great place to get community support, plus it helps create a central location for searching problems/solutions.
FYI: many projects have their own support guidelines, and GitHub will highlight them for you, or the project owners will use issue templates to point you in the right direction; please read them before opening issues.
@kevinansfield alright no problem, I will open a topic there. You might want to adjust your Bug Report issue template as well to prevent this from happening more often.
My goal was not to receive support but to report this issue and maybe help you guys fix it.
Have a good day!
@kevinansfield I have seen this issue in other products as well. It is a result of the unusual way MySQL manages idle connections (server-side). I am willing to create a pull request to fix the issue, but before I work on it I would like to know whether you would be willing to accept such a pull request.
The issue is that MySQL cleans up idle connections server-side (by default this means no activity for 8 hours). However, the MySQL driver for Node does NOT clean up its side of those connections. So the next time an idle connection is used, the query fails, and a new connection is created so that the next request works. A very annoying issue on the MySQL side (and it has been like this for years).
The workaround/fix several people employ is to register a scheduled job that runs a `SELECT 1;` query every hour or so. This way the connection never becomes idle.
I would add this, no problem. But I don't want to waste time, so please tell me if you'd be willing to accept that pull request.
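For illustration, the keepalive workaround described above could be sketched like this. The wiring is hypothetical: `ping` stands for any cheap query function, e.g. `() => knex.raw('SELECT 1')` in a Ghost/knex setup.

```javascript
// Keepalive sketch: run a cheap query periodically so pooled MySQL
// connections never sit idle past the server's wait_timeout.
// `ping` is any function issuing a trivial query (assumption: the
// caller supplies it, e.g. () => knex.raw('SELECT 1')).
function startKeepalive(ping, intervalMs = 60 * 60 * 1000) {
  const timer = setInterval(() => {
    Promise.resolve(ping()).catch((err) => {
      // A failed ping just means the connection was already dead;
      // the pool will hand out a fresh one on the next real query.
      console.error('keepalive ping failed:', err.message);
    });
  }, intervalMs);
  // Don't let the keepalive timer alone keep the process alive.
  if (typeof timer.unref === 'function') timer.unref();
  // Return a stop function so the job can be cancelled on shutdown.
  return () => clearInterval(timer);
}
```

A real fix inside Ghost would hook this into its scheduling/bootstrap code rather than a bare `setInterval`, but the idea is the same: keep at least one query flowing per `wait_timeout` window.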
I think this is more in @kirrg001's territory
For @kirrg001, You can read more about this issue here: https://github.com/mysqljs/mysql/issues/464
Knex handles this situation well as far as I can tell.
It was reported and optimised a couple of times: https://github.com/tgriesser/knex/issues?q=is%3Aissue+econnreset+is%3Aclosed
We could reply on one of the closed issues in the knex repository and ask for help.
@ChappIO Which pool configuration are you using?
@kirrg001 I am not sure exactly what you mean by pool configuration, but I posted my (full) configuration in the issue description. It contains the Docker stack file as well as the exact Docker image tag. So I guess the default?
@ChappIO MySQL pool configuration (you can find it in your production config). But I guess it's the default if you haven't changed anything.
@kirrg001 correct
I just found the file but it looks like it is not configured to use mysql.
{
  "url": "http://localhost:2368",
  "server": {
    "port": 2368,
    "host": "0.0.0.0"
  },
  "database": {
    "client": "sqlite3",
    "connection": {
      "filename": "/var/lib/ghost/content/data/ghost.db"
    }
  },
  "mail": {
    "transport": "Direct"
  },
  "logging": {
    "transports": [
      "file",
      "stdout"
    ]
  },
  "process": "systemd",
  "paths": {
    "contentPath": "/var/lib/ghost/content"
  }
}
I used the docker environment variables to configure the container:
GHOST_INSTALL=/var/lib/ghost
NODE_VERSION=6.14.3
HOSTNAME=1351a8131140
TERM=xterm
url=https://itaintstupidifitworks.com
GHOST_VERSION=1.24.4
GHOST_CONTENT=/var/lib/ghost/content
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/ghost/current/node_modules/knex-migrator/bin
PWD=/var/lib/ghost
NODE_ENV=production
SHLVL=1
HOME=/root
GHOST_CLI_VERSION=1.8.1
GOSU_VERSION=1.10
YARN_VERSION=1.6.0
database__client=mysql
database__connection__database=ghost
database__connection__host=db
database__connection__user=root
database__connection__password=extremely_secret
_=/usr/bin/env
@ChappIO Can you please post steps to reproduce?
I am unable to reproduce this problem - it recovers okay for me.
Please share as many details as possible. We can post them on one of the closed knex issues and ask for help. Thanks!
@kirrg001 I will walk through the steps again in a new environment to document what I do. It might take a while since it is caused by idle connections. I'll get back to you when I am done. If you like I can also open up a port for you so you can ssh into this test vm. Maybe it helps if you get to take a look for yourself after I have the steps.
@kirrg001 I must apologise: I cannot seem to reproduce this reliably. I will continue to use my environment and debug some more if it occurs again. It is not the wait_timeout issue, because I reduced that to a minute and, like you said, everything works fine.
Thanks for your time and have a nice evening!
@ChappIO Thanks for sharing 👍
I found this solution:
https://netslum.it/dealing-with-ghost-econnreset-errors-on-docker-swarm/
The problem is related to Docker Swarm and connection pooling.
Either disable pooling (pool min: 0),
or check out this:
https://success.docker.com/article/ipvs-connection-timeout-issue
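If Ghost passes its database config straight through to knex, disabling the pool from the stack file might look like the fragment below. The `database__pool__min` key path is my assumption based on Ghost's nested env-var syntax, so verify it against your Ghost version before relying on it.

```yaml
services:
  ghost:
    environment:
      # ... existing database__connection__* variables ...
      # Assumed knex pool knob, exposed via Ghost's nested env-var syntax:
      database__pool__min: "0"
```

With a minimum pool size of 0, idle connections are torn down instead of being kept around long enough for Swarm's IPVS timeout (or MySQL's wait_timeout) to silently kill them.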