Hello Kibana team,
Even after following the index creation documentation (linked from the Kibana interface and found on the web), I'm not able to access the ES data in Kibana. I keep getting the warning "No default index pattern. You must select or create one to continue."
FYI, I didn't have this much trouble getting indexing and visualization working in 6.0, if that helps.
GET /_cat/indices?v
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
yellow open logstash-jobs nHtULW65QWistGDCmqTUOA 5 1 2033 0 2.9mb 2.9mb
yellow open metricbeat-6.2.2-2018.02.25 asLt2TcaR4e09Sg14G_Lrg 1 1 79231 0 13.3mb 13.3mb
yellow open bank 95IoLJkIQyG8ulWsPGardQ 5 1 1000 0 475.2kb 475.2kb
green open .kibana _lCtMjcKTa-kq9UDhfAJpw 1 0 1 0 3.7kb 3.7kb
yellow open logstash-2013.12.11 5pd7IExpSmyNpUXKWGmiUQ 5 1 1 0 12.5kb 12.5kb
yellow open logstash-2015.05.18 x3S3XIDsTYeMjzT3eVHy6w 5 1 4631 0 21mb 21mb
yellow open metricbeat-6.2.2-2018.02.23 HYNnU_WLQzSBG71PehYLZw 1 1 33768 0 5.9mb 5.9mb
yellow open logstash-2015.05.20 4hLt-qsgQP2rvHOGY5CMqw 5 1 4750 0 21mb 21mb
yellow open logstash-2011.05.18 c3C9mCqSTIClbi957jP1Kg 5 1 2 0 26.4kb 26.4kb
yellow open metricbeat-6.2.2-2018.02.24 uw4NdzCTRFOxhsXROeZvrA 1 1 137924 0 22.9mb 22.9mb
yellow open logstash-2011.05.19 cX3Co3AgQa6jiuJTU-CyTg 5 1 1 0 13.8kb 13.8kb
yellow open shakespeare WV0rqPgmTM6f8G9E1lTl5g 5 1 111396 0 21.3mb 21.3mb
yellow open logstash-2015.05.19 98m0utEBTFWWvuLY3IadPg 5 1 4624 0 21.1mb 21.1mb
I created the logstash-jobs index as follows:
PUT /logstash-jobs
{
  "mappings": {
    "doc": {
      "properties" : {
        "@id" : { "type" : "integer" },
        "@title" : { "type" : "text" },
        "@description" : { "type" : "text" },
        "@company" : { "type" : "text" },
        "@city" : { "type" : "text" },
        "@state" : { "type" : "text" },
        "@salary" : { "type" : "text" },
        "@duration" : { "type" : "text" },
        "@link" : { "type" : "text" },
        "@provider" : { "type" : "text" },
        "@update" : { "type" : "date", "format": "yyyy-MM-dd HH:mm:ss" }
      }
    }
  }
}
My relevant Kibana advanced settings are:
indexPattern:placeholder = logstash-*
timelion:es.timefield = @timestamp
timelion:es.default_index = _all
I'm not sure how to investigate this from there.
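One check I can run is to look at what Kibana has stored in its .kibana index (a sketch against my local node; the query body assumes the 6.x saved-object layout):
curl -H 'Content-Type: application/json' 'localhost:9200/.kibana/_search?pretty' -d '{"query":{"terms":{"type":["index-pattern","config"]}}}'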
Feel free to ask me for logs, actions, results, ...
Thank you very much for your help
Best regards
Arnaud
Kibana version: 6.2.2
Elasticsearch version: 6.2.2
Server OS version: Centos7
Linux Elastic.newfrontierdata.com 3.10.0-693.11.6.el7.x86_64 #1 SMP Thu Jan 4 01:06:37 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
Browser version: Firefox 58.0.2 (64-bit)
Browser OS version: Windows 10
Original install method (e.g. download page, yum, from source, etc.):
Extracted .tar.gz for ELK 6.2.2
Description of the problem including expected versus actual behavior:
Steps to reproduce:
GET /_cat/indices?v
in console: success
Errors in browser console (if relevant):
Couldn't find any Elasticsearch data
You'll need to index some data into Elasticsearch before you can create an index pattern. Learn how.
Provide logs and/or server output (if relevant):
No errors/warnings
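To rule out an empty index, I also ran a document count against the main index (host and port are the local defaults in my setup):
curl 'localhost:9200/logstash-jobs/_count?pretty'
The result should match the docs.count reported by _cat/indices above.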
In fact, I found a workaround for the index pattern issue, but I still haven't reached a stable situation.
Running:
POST /.kibana/doc/index-pattern:logstash-jobs
{
  "type": "index-pattern",
  "index-pattern": {
    "title": "logstash-jobs",
    "timeFieldName": "update"
  }
}
displayed the index in the index pattern list. I still don't see the others; can you tell me why? Do I have to add all the index patterns manually?
But when I try to display any chart on this dataset, the dataset seems empty.
For example, I get a blank map when I try to count by state, and, even more telling, a metric visualization shows a global count of 0.
Something seems to be wrong with the mapping. Can you point me in the right direction?
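In case it helps, this is the kind of spot check I can run to see what the indexed documents actually look like (against my local node):
curl 'localhost:9200/logstash-jobs/_search?size=1&pretty'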
Thank you very much
Best regards
Arnaud
OK,
I worked around the "no data" problem in visualizations (0 counts) by transforming the data into JSON with this script:
import csv
import json
csvfile = open('jobs.csv', 'r')
jsonfile = open('jobs.json', 'w')
fieldnames = ("id", "title", "description", "company", "city", "state", "salary", "duration", "link", "provider", "update")
reader = csv.DictReader(csvfile, fieldnames)
for row in reader:
    json.dump(row, jsonfile)
    jsonfile.write('\n')
and then by running: sed -e 's/^/{ "index" : {} }\n/' -i jobs.json
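After that step, each line of jobs.json is preceded by a bulk action line, so the file looks roughly like this (values are placeholders, not my real data):
{ "index" : {} }
{"id": "1", "title": "...", "description": "...", "company": "...", "city": "...", "state": "...", "salary": "...", "duration": "...", "link": "...", "provider": "...", "update": "2018-02-25 10:00:00"}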
Then, I changed the types of the fields I wanted to aggregate on to keyword (another issue I had, which I fixed easily).
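The keyword change just meant creating the index with an explicit mapping before the bulk load, roughly along these lines (only a sketch, showing a few of the fields):
curl -H 'Content-Type: application/json' -XPUT 'localhost:9200/aja-jobs' -d '{"mappings":{"doc":{"properties":{"state":{"type":"keyword"},"city":{"type":"keyword"},"company":{"type":"keyword"},"provider":{"type":"keyword"}}}}}'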
I finally ran the following command:
curl -H 'Content-Type: application/x-ndjson' -XPOST 'localhost:9200/aja-jobs/doc/_bulk?pretty' --data-binary @jobs.json
and that's it; it's working now.
I don't want to do that for future datasets. Can you tell me what's wrong with my Logstash configuration?
jobs.yml
input {
  file {
    path => "/home/elastic/logstash-6.2.2/config/jobs.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["id", "title", "description", "company", "city", "state", "salary", "duration", "link", "provider", "update"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "nfd-jobs"
    template => "/home/elastic/logstash-6.2.2/config/jobs.template.json"
  }
  stdout {}
}
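As a sanity check, I validate the pipeline file before running it with something like this (assuming I start Logstash from its install directory):
bin/logstash -f config/jobs.yml --config.test_and_exit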
jobs.template.json
{
  "template": "jobs",
  "settings" : {
    "number_of_shards" : 1,
    "number_of_replicas" : 1,
    "index" : {
      "query" : { "default_field" : "id" }
    }
  },
  "mappings": {
    "_default_": {
      "_all": { "enabled": false },
      "_source": { "compress": true },
      "dynamic_templates": [
        {
          "text_template" : {
            "match" : "*",
            "mapping": { "type": "text", "index": "not_analyzed" },
            "match_mapping_type" : "text"
          }
        }
      ],
      "properties" : {
        "id" : { "type" : "integer" },
        "title" : { "type" : "keyword" },
        "description" : { "type" : "text" },
        "company" : { "type" : "keyword" },
        "city" : { "type" : "keyword" },
        "state" : { "type" : "keyword" },
        "salary" : { "type" : "text" },
        "duration" : { "type" : "text" },
        "link" : { "type" : "text" },
        "provider" : { "type" : "keyword" },
        "@update" : { "type" : "date", "format": "yyyy-MM-dd HH:mm:ss" }
      }
    }
  }
}
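For completeness, this is how I can check which index templates Elasticsearch has actually loaded (run against my local node):
curl 'localhost:9200/_cat/templates?v'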
Is your problem that you cannot set up an index pattern correctly in the Management area?
I'm not exactly sure what the problem is here.
To view ES data within Kibana, you need to create index patterns that identify indices within ES.
Have you done that part yet?
I have this problem as well. I have used Logstash to create 3 indices in ES. They are seen when I use: curl -XGET http://elk.foo.loc:9200/_cat/indices?v. The issue is that when I go to Kibana->Management->Index Patterns nothing shows up. In previous versions, I was able to define my index pattern manually by typing it in, but in 6.2 it has to see them first before I can define them.
@parky118 @jaspart I am having a hard time following the descriptions of the problem here.
Here's what I see when I have data in ES and have not yet defined an index pattern (screenshot of the index pattern creation page omitted):
Can one of you please walk me through what you are seeing, maybe with some screenshots?
I am having the same issue and am eagerly waiting for a reply. I have 4 indices created and working fine, but they are not showing up here. I ticked the checkbox to include system indices, but in vain.
I'm also experiencing this issue. I have created an index and can see it in Elasticsearch by running curl -X GET "localhost:9200/_cat/indices?v".
However, I cannot define an index pattern, as the index cannot be found by Kibana for some reason.
I also have this issue. I even tried it with a fresh instance of Elasticsearch and Kibana, but I get the same result.
One thing to double check is that the indices contain at least a single document. The index pattern creation UI will _only_ detect indices that contain documents.
If that's not the case, then please let me know and include more detailed information about your environment, including versions, OS, and browser. Please also include the full output of _cat/indices.
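For example, something like this shows just the per-index document counts (host and port are placeholders for your environment):
curl 'localhost:9200/_cat/indices?v&h=index,docs.count,status'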
Thanks
Closing this for lack of response. I'm assuming that people have discovered that index pattern creation can only see indices with documents in them. Please reopen with detailed reproduction steps if this continues to be an issue.
I have a similar problem with ELK 6.3.2 and 6.4.0 running from https://hub.docker.com/r/sebp/elk/.
I definitely have growing data in my indices; I tried loading the Shakespeare demo, etc., and nothing shows up.
But I believe the problem is not with the indices or Elasticsearch. One very interesting point: even if I install the one-click sample data about flights, it appears and works perfectly.
If I delete the definition of this index in Kibana with the "trash" button (the indices with data are still intact in Elasticsearch) and try to add them once again, nothing helps.
However, I looked at what Kibana is searching for with the inspector, and I see that whatever I type in the index pattern box, the UI makes a request to /elasticsearch/elastic*/_search with this payload: {"size":0,"aggs":{"indices":{"terms":{"field":"_index","size":200}}}} and of course it returns 404.
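For reference, the same request can be replayed directly against Elasticsearch, roughly like this (host adjusted for my setup):
curl -H 'Content-Type: application/json' 'localhost:9200/elastic*/_search?pretty' -d '{"size":0,"aggs":{"indices":{"terms":{"field":"_index","size":200}}}}'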
Steps to reproduce:
My logstash output config:
output {
  elasticsearch {
    hosts => ["localhost"]
    manage_template => false
    index => "elasticsearch-%{+YYYY.MM.dd}"
    document_type => "_doc"
  }
}
My indices look like:
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
yellow open elasticsearch-2018.09.19 02x26DIVSVyBkDebh9vvXw 5 1 4066 0 6.2mb 6.2mb
Thanks for the details @justinasjaronis I'll take a look into this some more.
@justinasjaronis I think you may have your elasticsearch URL configured incorrectly in Kibana. This URL: https://logs.impressbox.eu/elasticsearch/elastic/_search yields nothing but this URL: https://logs.impressbox.eu/elastic/_search works and returns hits. Kibana is not adding in that "elasticsearch" string to the URL. What is the elasticsearch URL set to in kibana.yml?
@justinasjaronis that docker image you are using is not the official one. I think there is a config bug in that image. I recommend looking at this documentation and using the official docker images: https://www.elastic.co/guide/en/elasticsearch/reference/6.4/docker.html
kibana.yml was the default (all settings commented out). Thanks for the insight about the official image; we have been using this one for ages and I hadn't thought to check.
However, I found the problem. Kibana index pattern creation works perfectly if I talk to Kibana directly (i.e. open port 5601), but in this setup there is an nginx reverse proxy (using jwilder/docker-gen), and somehow the URLs get mangled by it. Maybe there needs to be some additional Kibana setting when it sits behind a reverse proxy.
It is interesting that 5.x worked perfectly with the same proxy.
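A quick way I can compare the direct and proxied paths (hostnames are from my setup; /api/status is just a convenient Kibana endpoint to hit):
curl -i 'http://localhost:5601/api/status'
curl -i 'https://logs.impressbox.eu/api/status'
If the second request doesn't come back with the same status JSON, the proxy is rewriting or dropping something.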
Is the proxy config identical between the 5.x versions of the docker image and the ones where you are having the issue? Anyway, I would suggest opening a bug with the author of the docker image. Again, this image is not associated with Elastic. We don't even use the ELK stack terminology anymore and haven't for some time.
Yes, the proxy config was identical (it was not touched). In the meantime, we'll try to migrate to the official stack. If the problem persists with the official images, I'll try to dig deeper and provide more information.