kibana 5.3.0 returns error when elasticsearch isn't available and security is disabled

Created on 18 Apr 2017 · 16 comments · Source: elastic/kibana

I've created an elasticsearch cluster using the 5.3.0 docker images. It appears to be working properly with the expected log entries and curl response.

Kibana version:
docker.elastic.co/kibana/kibana:5.3.0

Elasticsearch version:
docker.elastic.co/elasticsearch/elasticsearch:5.3.0

Server OS version:
Ubuntu 16.10

Original install method (e.g. download page, yum, from source, etc.):
docker run -p 5601:5601 -e ELASTICSEARCH_URL=http://10.108.76.86 -e XPACK_SECURITY_ENABLED=false docker.elastic.co/kibana/kibana:5.3.0

Kibana produces log messages indicating that its connection to Elasticsearch isn't healthy, and a request then produces error output.

Steps to reproduce:

  1. Run Kibana: docker run -p 5601:5601 -e ELASTICSEARCH_URL=http://10.108.76.86:9200 -e XPACK_SECURITY_ENABLED=false -e ELASTICSEARCH_USERNAME -e ELASTICSEARCH_PASSWORD docker.elastic.co/kibana/kibana:5.3.0
  2. Wait for the server to initialize
  3. Make a request to localhost:5601/app/kibana

Errors in browser console (if relevant):

$ curl -v localhost:5601/app/kibana
*   Trying 127.0.0.1...
* Connected to localhost (127.0.0.1) port 5601 (#0)
> GET /app/kibana HTTP/1.1
> Host: localhost:5601
> User-Agent: curl/7.50.1
> Accept: */*
> 
< HTTP/1.1 500 Internal Server Error
< kbn-name: kibana
< kbn-version: 5.3.0
< content-type: application/json; charset=utf-8
< cache-control: no-cache
< content-length: 96
< Date: Tue, 18 Apr 2017 20:03:47 GMT
< Connection: keep-alive
< 
* Connection #0 to host localhost left intact
{"statusCode":500,"error":"Internal Server Error","message":"An internal server error occurred"}
$ docker run -p 5601:5601 -e ELASTICSEARCH_URL=http://10.108.76.86:9200 -e XPACK_SECURITY_ENABLED=false -e ELASTICSEARCH_USERNAME -e ELASTICSEARCH_PASSWORD docker.elastic.co/kibana/kibana:5.3.0
{"type":"log","@timestamp":"2017-04-18T19:59:24Z","tags":["info","optimize"],"pid":7,"message":"Optimizing and caching bundles for graph, monitoring, kibana, timelion and status_page. This may take a few minutes"}
{"type":"log","@timestamp":"2017-04-18T20:01:45Z","tags":["info","optimize"],"pid":7,"message":"Optimization of bundles for graph, monitoring, kibana, timelion and status_page complete in 140.41 seconds"}
{"type":"log","@timestamp":"2017-04-18T20:01:45Z","tags":["status","plugin:[email protected]","info"],"pid":7,"state":"green","message":"Status changed from uninitialized to green - Ready","prevState":"uninitialized","prevMsg":"uninitialized"}
{"type":"log","@timestamp":"2017-04-18T20:01:45Z","tags":["status","plugin:[email protected]","info"],"pid":7,"state":"yellow","message":"Status changed from uninitialized to yellow - Waiting for Elasticsearch","prevState":"uninitialized","prevMsg":"uninitialized"}
{"type":"log","@timestamp":"2017-04-18T20:01:45Z","tags":["status","plugin:[email protected]","info"],"pid":7,"state":"yellow","message":"Status changed from uninitialized to yellow - Waiting for Elasticsearch","prevState":"uninitialized","prevMsg":"uninitialized"}
{"type":"log","@timestamp":"2017-04-18T20:01:45Z","tags":["status","plugin:[email protected]","info"],"pid":7,"state":"yellow","message":"Status changed from uninitialized to yellow - Waiting for Elasticsearch","prevState":"uninitialized","prevMsg":"uninitialized"}
{"type":"log","@timestamp":"2017-04-18T20:01:45Z","tags":["status","plugin:[email protected]","info"],"pid":7,"state":"green","message":"Status changed from uninitialized to green - Ready","prevState":"uninitialized","prevMsg":"uninitialized"}
{"type":"log","@timestamp":"2017-04-18T20:01:45Z","tags":["reporting","warning"],"pid":7,"message":"Generating a random key for xpack.reporting.encryptionKey. To prevent pending reports from failing on restart, please set xpack.reporting.encryptionKey in kibana.yml"}
{"type":"log","@timestamp":"2017-04-18T20:01:45Z","tags":["status","plugin:[email protected]","info"],"pid":7,"state":"yellow","message":"Status changed from uninitialized to yellow - Waiting for Elasticsearch","prevState":"uninitialized","prevMsg":"uninitialized"}
{"type":"log","@timestamp":"2017-04-18T20:01:48Z","tags":["status","plugin:[email protected]","error"],"pid":7,"state":"red","message":"Status changed from yellow to red - Request Timeout after 3000ms","prevState":"yellow","prevMsg":"Waiting for Elasticsearch"}
{"type":"log","@timestamp":"2017-04-18T20:01:48Z","tags":["status","plugin:[email protected]","error"],"pid":7,"state":"red","message":"Status changed from yellow to red - Request Timeout after 3000ms","prevState":"yellow","prevMsg":"Waiting for Elasticsearch"}
{"type":"log","@timestamp":"2017-04-18T20:01:48Z","tags":["status","plugin:[email protected]","error"],"pid":7,"state":"red","message":"Status changed from yellow to red - Request Timeout after 3000ms","prevState":"yellow","prevMsg":"Waiting for Elasticsearch"}
{"type":"log","@timestamp":"2017-04-18T20:01:48Z","tags":["status","plugin:[email protected]","error"],"pid":7,"state":"red","message":"Status changed from yellow to red - Request Timeout after 3000ms","prevState":"yellow","prevMsg":"Waiting for Elasticsearch"}
{"type":"log","@timestamp":"2017-04-18T20:01:58Z","tags":["status","plugin:[email protected]","error"],"pid":7,"state":"red","message":"Status changed from uninitialized to red - Request Timeout after 3000ms","prevState":"uninitialized","prevMsg":"uninitialized"}
{"type":"log","@timestamp":"2017-04-18T20:01:58Z","tags":["status","plugin:[email protected]","error"],"pid":7,"state":"red","message":"Status changed from uninitialized to red - Request Timeout after 3000ms","prevState":"uninitialized","prevMsg":"uninitialized"}
{"type":"log","@timestamp":"2017-04-18T20:01:58Z","tags":["status","plugin:[email protected]","info"],"pid":7,"state":"green","message":"Status changed from uninitialized to green - Ready","prevState":"uninitialized","prevMsg":"uninitialized"}
{"type":"log","@timestamp":"2017-04-18T20:01:58Z","tags":["status","plugin:[email protected]","info"],"pid":7,"state":"green","message":"Status changed from uninitialized to green - Ready","prevState":"uninitialized","prevMsg":"uninitialized"}
{"type":"log","@timestamp":"2017-04-18T20:01:58Z","tags":["listening","info"],"pid":7,"message":"Server running at http://0:5601"}
{"type":"log","@timestamp":"2017-04-18T20:01:58Z","tags":["status","ui settings","error"],"pid":7,"state":"red","message":"Status changed from uninitialized to red - Elasticsearch plugin is red","prevState":"uninitialized","prevMsg":"uninitialized"}


{"type":"log","@timestamp":"2017-04-18T20:03:11Z","tags":["warning","process"],"pid":7,"level":"error","message":"Unhandled promise rejection (rejection id: 18): Error: reply interface called twice","error":{"message":"Unhandled promise rejection (rejection id: 18): Error: reply interface called twice","name":"UnhandledPromiseRejectionWarning","stack":"UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 18): Error: reply interface called twice\n    at emitPendingUnhandledRejections (internal/process/promises.js:57:27)\n    at runMicrotasksCallback (internal/process/next_tick.js:61:9)\n    at _combinedTickCallback (internal/process/next_tick.js:67:7)\n    at process._tickDomainCallback (internal/process/next_tick.js:122:9)"}}
{"type":"error","@timestamp":"2017-04-18T20:03:11Z","tags":[],"pid":7,"level":"error","message":"Cannot read property 'toJSON' of undefined","error":{"message":"Cannot read property 'toJSON' of undefined","name":"TypeError","stack":"TypeError: Cannot read property 'toJSON' of undefined\n    at withXpackInfo (/usr/share/kibana/plugins/x-pack/plugins/xpack_main/server/lib/replace_injected_vars.js:5:32)\n    at replaceInjectedVars$ (/usr/share/kibana/plugins/x-pack/plugins/xpack_main/server/lib/replace_injected_vars.js:10:12)\n    at tryCatch (/usr/share/kibana/node_modules/regenerator/runtime.js:61:40)\n    at GeneratorFunctionPrototype.invoke [as _invoke] (/usr/share/kibana/node_modules/regenerator/runtime.js:328:22)\n    at GeneratorFunctionPrototype.prototype.(anonymous function) [as next] (/usr/share/kibana/node_modules/regenerator/runtime.js:94:21)\n    at invoke (/usr/share/kibana/node_modules/regenerator/runtime.js:136:37)\n    at callInvokeWithMethodAndArg (/usr/share/kibana/node_modules/regenerator/runtime.js:172:16)\n    at previousPromise (/usr/share/kibana/node_modules/regenerator/runtime.js:194:19)\n    at AsyncIterator.enqueue (/usr/share/kibana/node_modules/regenerator/runtime.js:193:13)\n    at AsyncIterator.prototype.(anonymous function) [as next] (/usr/share/kibana/node_modules/regenerator/runtime.js:94:21)\n    at Object.runtime.async (/usr/share/kibana/node_modules/regenerator/runtime.js:215:14)\n    at replaceInjectedVars (/usr/share/kibana/plugins/x-pack/plugins/xpack_main/server/lib/replace_injected_vars.js:3:22)\n    at /usr/share/kibana/src/ui/index.js:67:22\n    at undefined.next (native)\n    at step (/usr/share/kibana/src/ui/index.js:9:273)\n    at /usr/share/kibana/src/ui/index.js:9:443\n    at /usr/share/kibana/src/ui/index.js:9:99\n    at tryCatcher (/usr/share/kibana/node_modules/bluebird/js/main/util.js:26:23)\n    at ReductionPromiseArray._promiseFulfilled (/usr/share/kibana/node_modules/bluebird/js/main/reduce.js:109:18)\n    at 
ReductionPromiseArray.init (/usr/share/kibana/node_modules/bluebird/js/main/promise_array.js:92:18)\n    at ReductionPromiseArray.init (/usr/share/kibana/node_modules/bluebird/js/main/reduce.js:42:10)\n    at Async._drainQueue (/usr/share/kibana/node_modules/bluebird/js/main/async.js:128:12)"},"url":{"protocol":null,"slashes":null,"auth":null,"host":null,"port":null,"hostname":null,"hash":null,"search":"","query":{},"pathname":"/app/kibana","path":"/app/kibana","href":"/app/kibana"}}
{"type":"response","@timestamp":"2017-04-18T20:03:11Z","tags":[],"pid":7,"method":"get","statusCode":500,"req":{"url":"/app/kibana","method":"get","headers":{"host":"localhost:5601","connection":"keep-alive","cache-control":"max-age=0","upgrade-insecure-requests":"1","user-agent":"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36","accept":"text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8","dnt":"1","referer":"http://localhost:5601/","accept-encoding":"gzip, deflate, sdch, br","accept-language":"en-US,en;q=0.8"},"remoteAddress":"172.17.0.1","userAgent":"172.17.0.1","referer":"http://localhost:5601/"},"res":{"statusCode":500,"responseTime":85,"contentLength":9},"message":"GET /app/kibana 500 85ms - 9.0B"}

Operations bug

All 16 comments

The error message is a bug - this could happen if elasticsearch is unavailable and security is disabled. Can you check if your elasticsearch url is reachable from the docker container?

I'll hold off on the bug label until we can get the title more specific.

Not having access was indeed an issue. I reran with host networking, though, and got a similar error (slightly different call stack).

$ docker run --network=host -p 5601:5601 -e ELASTICSEARCH_URL=http://10.108.76.86:9200 -e XPACK_SECURITY_ENABLED=false -e ELASTICSEARCH_USERNAME -e ELASTICSEARCH_PASSWORD docker.elastic.co/kibana/kibana:5.3.0
<snip>
{"type":"log","@timestamp":"2017-04-18T22:04:08Z","tags":["warning","process"],"pid":6,"level":"error","message":"Unhandled promise rejection (rejection id: 68): Error: reply interface called twice","error":{"message":"Unhandled promise rejection (rejection id: 68): Error: reply interface called twice","name":"UnhandledPromiseRejectionWarning","stack":"UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 68): Error: reply interface called twice\n    at emitPendingUnhandledRejections (internal/process/promises.js:57:27)\n    at runMicrotasksCallback (internal/process/next_tick.js:61:9)\n    at _combinedTickCallback (internal/process/next_tick.js:67:7)\n    at process._tickDomainCallback (internal/process/next_tick.js:122:9)"}}
{"type":"error","@timestamp":"2017-04-18T22:04:08Z","tags":[],"pid":6,"level":"error","message":"Cannot read property 'toJSON' of undefined","error":{"message":"Cannot read property 'toJSON' of undefined","name":"TypeError","stack":"TypeError: Cannot read property 'toJSON' of undefined\n    at withXpackInfo (/usr/share/kibana/plugins/x-pack/plugins/xpack_main/server/lib/replace_injected_vars.js:5:32)\n    at replaceInjectedVars$ (/usr/share/kibana/plugins/x-pack/plugins/xpack_main/server/lib/replace_injected_vars.js:10:12)\n    at tryCatch (/usr/share/kibana/node_modules/regenerator/runtime.js:61:40)\n    at GeneratorFunctionPrototype.invoke [as _invoke] (/usr/share/kibana/node_modules/regenerator/runtime.js:328:22)\n    at GeneratorFunctionPrototype.prototype.(anonymous function) [as next] (/usr/share/kibana/node_modules/regenerator/runtime.js:94:21)\n    at invoke (/usr/share/kibana/node_modules/regenerator/runtime.js:136:37)\n    at callInvokeWithMethodAndArg (/usr/share/kibana/node_modules/regenerator/runtime.js:172:16)\n    at previousPromise (/usr/share/kibana/node_modules/regenerator/runtime.js:194:19)\n    at AsyncIterator.enqueue (/usr/share/kibana/node_modules/regenerator/runtime.js:193:13)\n    at AsyncIterator.prototype.(anonymous function) [as next] (/usr/share/kibana/node_modules/regenerator/runtime.js:94:21)\n    at Object.runtime.async (/usr/share/kibana/node_modules/regenerator/runtime.js:215:14)\n    at replaceInjectedVars (/usr/share/kibana/plugins/x-pack/plugins/xpack_main/server/lib/replace_injected_vars.js:3:22)\n    at /usr/share/kibana/src/ui/index.js:67:22\n    at undefined.next (native)\n    at step (/usr/share/kibana/src/ui/index.js:9:273)\n    at /usr/share/kibana/src/ui/index.js:9:443\n    at /usr/share/kibana/src/ui/index.js:9:99\n    at tryCatcher (/usr/share/kibana/node_modules/bluebird/js/main/util.js:26:23)\n    at ReductionPromiseArray._promiseFulfilled (/usr/share/kibana/node_modules/bluebird/js/main/reduce.js:109:18)\n    at 
ReductionPromiseArray.init (/usr/share/kibana/node_modules/bluebird/js/main/promise_array.js:92:18)\n    at ReductionPromiseArray.init (/usr/share/kibana/node_modules/bluebird/js/main/reduce.js:42:10)\n    at Async._drainQueue (/usr/share/kibana/node_modules/bluebird/js/main/async.js:128:12)\n    at Async._drainQueues (/usr/share/kibana/node_modules/bluebird/js/main/async.js:133:10)\n    at Immediate.Async.drainQueues (/usr/share/kibana/node_modules/bluebird/js/main/async.js:15:14)"},"url":{"protocol":null,"slashes":null,"auth":null,"host":null,"port":null,"hostname":null,"hash":null,"search":"","query":{},"pathname":"/app/kibana","path":"/app/kibana","href":"/app/kibana"}}
{"type":"response","@timestamp":"2017-04-18T22:04:08Z","tags":[],"pid":6,"method":"get","statusCode":500,"req":{"url":"/app/kibana","method":"get","headers":{"host":"localhost:5601","connection":"keep-alive","cache-control":"max-age=0","upgrade-insecure-requests":"1","user-agent":"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.110 Safari/537.36","accept":"text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8","dnt":"1","referer":"http://localhost:5601/","accept-encoding":"gzip, deflate, sdch, br","accept-language":"en-US,en;q=0.8"},"remoteAddress":"127.0.0.1","userAgent":"127.0.0.1","referer":"http://localhost:5601/"},"res":{"statusCode":500,"responseTime":23,"contentLength":9},"message":"GET /app/kibana 500 23ms - 9.0B"}

Verification that the container can reach the ES cluster:

$ docker ps|grep kibana
95dfc09a605d        docker.elastic.co/kibana/kibana:5.3.0                                                                                            "/bin/sh -c /usr/loca"   2 minutes ago       Up 2 minutes                            big_archimedes
$ docker exec -it big_archimedes bash
kibana@mathewslaptop:~$ curl http://10.108.76.86:9200
{
  "name" : "es-client-3477934031-qmwt9",
  "cluster_name" : "myesdb",
  "cluster_uuid" : "_na_",
  "version" : {
    "number" : "5.3.0",
    "build_hash" : "3adb13b",
    "build_date" : "2017-03-23T03:31:50.652Z",
    "build_snapshot" : false,
    "lucene_version" : "6.4.1"
  },
  "tagline" : "You Know, for Search"
}

Thanks for the update. It looks like you're already doing it, but I would also double-check ELASTICSEARCH_USERNAME and ELASTICSEARCH_PASSWORD. The Kibana security plugin will automatically fill in the defaults, but not if it's disabled or if the username and password are wrong. That may also be a cause of this error.

Your comments got me to investigate the docker elasticsearch cluster I'd spun up. It wasn't as healthy as I thought. Once I resolved that kibana connected and started correctly.

So @jbudz, I'd agree with your original assessment:

The error message is a bug - this could happen if elasticsearch is unavailable and security is disabled.

The correct resolution should be to return a 500 with a message body stating that Elasticsearch isn't functional, and perhaps the Elasticsearch query and response to aid in debugging. I'll edit the title and text to reflect this.
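To illustrate the suggested behavior, here is a hedged sketch in plain JavaScript, not the actual Kibana fix: when the x-pack info object is missing (as happens in the `withXpackInfo` stack trace above when Elasticsearch is unreachable), build an explanatory 500 body instead of the generic "An internal server error occurred". The function name `buildErrorBody` and the body shape are illustrative assumptions.

```javascript
// Hypothetical sketch, not Kibana's actual code: guard against a missing
// x-pack info object (which is what triggers the "Cannot read property
// 'toJSON' of undefined" TypeError) and produce an informative 500 body.
function buildErrorBody(xpackInfo, esUrl) {
  if (!xpackInfo) {
    return {
      statusCode: 500,
      error: 'Internal Server Error',
      message: `Elasticsearch at ${esUrl} is unavailable or unhealthy; ` +
        'check connectivity and cluster health before retrying',
    };
  }
  return null; // x-pack info is present: no error, proceed normally
}

// Example: simulate Elasticsearch being down (xpackInfo is undefined).
const body = buildErrorBody(undefined, 'http://10.108.76.86:9200');
console.log(body.message);
```

This keeps the status code at 500 but replaces the opaque message with one that points the operator at the Elasticsearch connection, which is what the resolution above asks for.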

@jbudz, thanks for your patience and help.

Shouldn't Kibana at least load and show the ES cluster status as Red? I think that is the normal Wait/Help page you used to get before 5.x.

If that is the historical behavior, it might be preferable, though I can't see what good bringing up the Kibana interface without a working Elasticsearch would do. Getting the 500 error with an explanation of the problem seems like the fastest way to get to a working stack.

Sorry, I meant the Kibana Status page (see image) should be displayed instead of just a JSON-formatted error. Maybe the Status page for mine was not displayed because my other indexes were being initialized _before_ the Kibana index was started?

[Screenshot: Kibana Status page, 26 Apr 2017]

I can reproduce this: Docker, with security disabled.

I also run this on Docker without X-Pack security, and only the "Monitoring" tab works. Opening any other tab shows a 502 (Bad Gateway) in the browser console and in the Kibana logs.

We recently merged a PR to fix this. It should be out in the next release, either 5.4.1 or 5.5.0.

@jazoom that sounds like a different issue. Would you mind opening up a new ticket with details if it's still happening?

@jbudz, I reverted to Elasticsearch 2.4. I was having too many issues with 5.3. Hopefully the issue got fixed :-)

Is there any workaround to get 5.3.0 to work with X-Pack security disabled? I'm trying to deploy Kibana 5.3.0 onto Kubernetes, pointed at an AWS Elasticsearch instance.

From inside the running Kibana container, I can curl my AWS Elasticsearch instance and get the standard JSON response, but the Kibana web app throws a stack trace on start, and /app/kibana shows:

{
  "statusCode": 500,
  "error": "Internal Server Error",
  "message": "An internal server error occurred"
}

Because AWS only provides Elasticsearch 5.3, I'm constrained to Kibana 5.3.

Of note: I'm aware that AWS Elasticsearch provides a Kibana plugin, but I want to deploy a separate Kibana onto Kubernetes so that I can put a gateway/proxy in front of it to perform authentication.

@jar349 this may be worth opening a separate issue for. Are there any kibana server error logs?

I had the same issue. After installing X-Pack and disabling security, the browser prompted me for credentials when doing a simple ES URI check: "http://localhost:9200/_cluster/health?wait_for_status=yellow&timeout=60s". I used the default username elastic / password changeme. Kibana didn't prompt, so I checked my kibana.yml and found default values set for the ES username/password. I changed them to the ES credentials above, and then I got a prompt for Kibana at "http://localhost:5601/app/kibana#/discover?_g=()". I entered the Kibana credentials (username kibana / password changeme) and it loaded. Hope this helps someone. FYI, I'm using version 5.1, but 5.3 "should" be the same.
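For reference, the kibana.yml settings being discussed look roughly like this. This is a hedged config fragment, not a verified fix: the kibana/changeme built-in user is the 5.x default, and the URL is a placeholder; adjust both to your own setup.

```yaml
# Assumed 5.x defaults; replace with your own Elasticsearch URL and
# the credentials of the user Kibana's backend should connect as.
elasticsearch.url: "http://localhost:9200"
elasticsearch.username: "kibana"
elasticsearch.password: "changeme"
```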
