While attempting to migrate to 2.x, I've run into a strange problem: range filters fail to match under any mapping I try that holds epoch_millis values. Maybe related to #10971
Given a document with a time_stamp field in the mapping:
"time_stamp": {
"type": "date",
"format": "epoch_millis"
}
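For reference, a complete create-index request with that mapping would look something like this (weather and observation are placeholder index/type names, not from my actual setup):

PUT /weather
{
  "mappings": {
    "observation": {
      "properties": {
        "time_stamp": {
          "type": "date",
          "format": "epoch_millis"
        }
      }
    }
  }
}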
With some data:
station: "ASN00033312",
precip_tenths_mm: 2,
time_stamp: 1390712400000,
location: {
  lat: -23.2233,
  lon: 150.605
}
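Indexing the sample document is then something like:

PUT /weather/observation/1
{
  "station": "ASN00033312",
  "precip_tenths_mm": 2,
  "time_stamp": 1390712400000,
  "location": {
    "lat": -23.2233,
    "lon": 150.605
  }
}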
I would expect the following to match:
"query": {
"filtered": {
"filter": {
"bool": {
"must": [
{
"range": {
"time_stamp": {
"gte": 1390712400000,
"format" : "epoch_millis"
}
}
}
]
}
}
}
}
The above returns no documents, even though there are thousands that should match. Removing the format at query time doesn't help. Mapping the time_stamp field as format "basic_date", as I had in 1.x, doesn't seem to return anything either.
Bug? PEBCAK?
I seem to have tracked it down... On indexing, my epoch timestamps are in float format, i.e. 1390798800000.0.
ES is ignoring them, perhaps because of these index settings:
"index.mapping.ignore_malformed" : true,
"index.mapping.coerce" : true
I would've expected ES to "coerce" the float to a long and treat it as an epoch_millis value, but instead it seems to be ignoring the field altogether.
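For example (again with placeholder index/type names), a document like this indexes without an error:

PUT /weather/observation/2
{
  "station": "ASN00033312",
  "time_stamp": 1390798800000.0
}

Yet an exists filter on time_stamp never matches it, which suggests the field is being dropped rather than coerced:

POST /weather/_search
{
  "query": {
    "filtered": {
      "filter": {
        "exists": { "field": "time_stamp" }
      }
    }
  }
}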
Not sure if this is the expected behaviour, so feel free to open/close the issue depending.
Hi @royrusso
Your diagnosis seems correct. I agree that we could coerce floats to longs for epoch_millis.
👍 This is a pretty bad bug IMO. Go's json encoder may give you floats, and JSON numbers are by definition floats; there's only the one numeric type. Is there a workaround?
I ran into a similar situation: ES rejects timestamps when the JSON parser returns them as floating-point values. coerce does not seem to work with the date type either. Is there any plan to either allow floating point as a date or add a coerce option to the date type?
Just wanted to mention that I'm also having problems with this when upgrading from Elasticsearch 1.x to 2.x. Elasticsearch 1.x accepted floats with the date_time format, but 2.x no longer does. I tried specifying date_time||epoch_seconds, but it still doesn't parse floats (where the part after the decimal point represents milliseconds). So there doesn't seem to be a way to get the old behaviour.
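For reference, the multi-format mapping attempt described above would be something like the following (note that the built-in format name is epoch_second, singular); even so, a value such as 1450000000.5 still fails to parse:

"time_stamp": {
  "type": "date",
  "format": "date_time||epoch_second"
}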
I ran into this too. For anyone else running into this (and using 5.x), one solution is to convert the timestamp in an ingest pipeline. Something like
{
  "description": "my pipe",
  "processors": [
    {
      "script": {
        "lang": "painless",
        "inline": "ctx.timestamp = (long) (ctx.timestamp * 1000)"
      }
    }
  ]
}
Simpler than adding Logstash to the stack (unless it's already there, of course...).
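For completeness, wiring that up is roughly as follows (my-pipe, my-index, my-type, and the timestamp field are placeholders for your own names). First register the pipeline, then reference it when indexing:

PUT _ingest/pipeline/my-pipe
{
  "description": "my pipe",
  "processors": [
    {
      "script": {
        "lang": "painless",
        "inline": "ctx.timestamp = (long) (ctx.timestamp * 1000)"
      }
    }
  ]
}

PUT my-index/my-type/1?pipeline=my-pipe
{
  "timestamp": 1450000000.5
}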
The float conversion was working for me in 6.5.4, but I could not get it to work in 7.1.1, so I used the _ingest/pipeline workaround. Thanks for sharing.