InfluxDB: fill(linear) returns invalid JSON even though InfluxDB responds with HTTP code 200 (OK)

Created on 20 Jun 2017 · 8 comments · Source: influxdata/influxdb

Bug report

__System info:__
InfluxDB version: 1.2
Operating system: Ubuntu 16.04.2 LTS

__Steps to reproduce:__

  1. Insert 2 groups of point data into the database.
  • Group 1: sampling interval = 15 mins, i.e. the points are timestamped 00:00, 00:15, 00:30 and so on.

  • Group 2: sampling interval = 30 mins, i.e. the points are timestamped 00:00, 00:30, 01:00, 01:30 and so on.

Sample data set:

bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=999999945.898 1483225200000000000
bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=999999957.398 1483227000000000000
bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=999999945.898 1483225200000000000
bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=999999970.898 1483228800000000000
bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=999999985.102 1483230600000000000
bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=99999999907.801 1483232400000000000
bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=99999999926.301 1483234200000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123455.6 1483225200000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123455.8 1483226100000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123456.0 1483227000000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123456.5 1483227900000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123456.9 1483228800000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123457.2 1483229700000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123457.5 1483230600000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123457.9 1483231500000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123458.5 1483232400000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123458.6 1483233300000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123459.0 1483234200000000000
bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123459.1 1483235100000000000

Note: You can use the NodeJS influxdb driver to insert the data above.
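
A minimal sketch of such a loader, assuming the node-influx client (npm package influx); the host, database name and the two sample points shown are placeholders, not part of the original report:

// Minimal sketch, assuming the node-influx client (npm package "influx").
// Host and database name are placeholders; extend `points` with the full dataset above.
const Influx = require('influx');

const influx = new Influx.InfluxDB({
  host: 'localhost',
  database: 'mydb',
});

const points = [
  {
    measurement: 'bug',
    tags: { ref: 'stoh.DON_KEY_002_ele', id: 'stoh.DON_KEY_002_ele.cupcake' },
    fields: { value: 999999945.898 },
    timestamp: '1483225200000000000', // nanosecond epoch, exactly as in the line protocol above
  },
  {
    measurement: 'bug',
    tags: { ref: 'stoh.DON_KEY_003_ele', id: 'stoh.DON_KEY_003_ele.cupcake' },
    fields: { value: 123455.6 },
    timestamp: '1483225200000000000',
  },
];

influx
  .writePoints(points, { precision: 'n' }) // write with nanosecond precision
  .then(() => console.log('points written'))
  .catch((err) => console.error('write failed:', err.message));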

[Dataset screenshot]

  2. Do a SELECT query with GROUP BY time and fill(linear).
SELECT mean(value)
FROM bug
WHERE (ref = 'stoh.DON_KEY_002_ele' OR ref = 'stoh.DON_KEY_003_ele') AND
      time >= '2017-01-01T00:00:00Z' AND
      time <= '2017-01-01T00:15:00Z'
GROUP BY time(5m, 0s), ref
fill(linear)
  3. InfluxDB returns HTTP code 200 (OK) but invalid JSON.
    [InfluxDB admin result screenshot]

If the same request is made with curl or Postman, you get
_Unexpected 'j'_
[Postman screenshot]

__Expected behavior:__
I am expecting InfluxDB to return:

  1. valid JSON
  2. a data set containing the linearly interpolated result

__Actual behavior:__
InfluxDB responds with HTTP code 200 (OK) but produces invalid JSON.
When Kapacitor and the InfluxDB admin UI try to parse the result,
the following error is observed:
_Invalid JSON: Unexpected 'j'_

__Additional info:__
The query works fine if the SELECT is limited to just one series, but fails when the two groups are selected together.

It also works fine if I extend the SELECT window to 30 mins, e.g. time >= '2017-01-01T00:00:00Z' AND time <= '2017-01-01T00:30:00Z':

SELECT mean(value) FROM bug WHERE (ref = 'stoh.DON_KEY_002_ele' OR ref = 'stoh.DON_KEY_003_ele') AND time >= '2017-01-01T00:00:00Z' AND time <= '2017-01-01T00:30:00Z' GROUP BY time(5m, 0s), ref fill(linear)

I am happy to provide a quick and dirty 10-line NodeJS application to load the above test data and reproduce the bug. Just let me know.


All 8 comments

Can you give the output from the same command without trying to parse it? It looks like there was an error with marshaling that is getting ignored. The j is likely from a message that says json: <some error message>.

@jsternberg Thanks for getting back.

This is the raw response from InfluxDB:
json: error calling MarshalJSON for type httpd.Response: json: error calling MarshalJSON for type *influxql.Result: json: unsupported value: -Inf
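
That plain-text body also explains the _Unexpected 'j'_ error seen earlier: the response is not JSON at all, it begins with the literal word json:, so any JSON parser fails on the very first character. A minimal illustration of the client-side behaviour in Node.js (this is what Kapacitor, the admin UI and Postman effectively do; it is not InfluxDB code):

// The raw body returned by InfluxDB in this case is plain text, not JSON.
const body =
  'json: error calling MarshalJSON for type httpd.Response: ' +
  'json: error calling MarshalJSON for type *influxql.Result: ' +
  'json: unsupported value: -Inf';

try {
  JSON.parse(body);
} catch (err) {
  // V8 reports something like: "Unexpected token j in JSON at position 0"
  console.error('Invalid JSON:', err.message);
}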

Let me know if the error message makes sense. I'm happy to assist further in any way.

json: unsupported value: -Inf

This sounds like a bug in the implementation for fill(linear). We shouldn't allow ±∞ as an output value in that code path.
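
For illustration only (not necessarily the exact code path inside InfluxDB): linear interpolation derives a slope from two reference points, and degenerate inputs, such as a zero time delta, push that slope to ±Infinity in IEEE-754 floating point. A JSON encoder that rejects non-finite numbers, as Go's encoding/json does, then fails with exactly the json: unsupported value: -Inf error quoted above. A small JavaScript sketch of the arithmetic (lerp is a hypothetical helper, not an InfluxDB function):

// Hypothetical helper: linearly interpolate a value at time t from the
// two known points (t1, v1) and (t2, v2). Not InfluxDB's actual code.
function lerp(t, t1, v1, t2, v2) {
  const slope = (v2 - v1) / (t2 - t1); // +/-Infinity when t2 === t1
  return v1 + slope * (t - t1);
}

// Degenerate case: both reference points share the same timestamp.
const filled = lerp(15, 10, 100, 10, 50);
console.log(filled); // -Infinity

// Go's encoding/json refuses to marshal non-finite floats, producing the error above;
// JavaScript's JSON.stringify instead silently turns the value into null.
console.log(JSON.stringify({ mean: filled })); // {"mean":null}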

@jsternberg @mark-rushakoff
Thanks for getting back to us.
I can see that this issue has now been officially marked as an InfluxDB bug.

Unfortunately for us this is a show stopper. :sob: I hope you don't mind me asking, but is there a typical time frame in which we can expect this to be fixed?

I understand this is an open source project and I shouldn't be asking this sort of question. Just curious; if it is too difficult a question then that's fine.

In the meantime I'll try to set up an InfluxDB dev environment in my spare time to see if I can produce something for a quick win.

@SamuelToh I added this to the storage and query team backlog. We will groom it this week or next and determine a timeframe for a fix.

Thanks for bringing it to our attention.

@rbetts Awesome news! Thanks a lot for this.

@SamuelToh Can you verify that you're still experiencing this issue on InfluxDB 1.3? I have been unable to reproduce the issue locally.

Here are the steps I followed

Create database and insert data

> create database mydb
> use mydb
Using database mydb
> insert bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=999999945.898 1483225200000000000
> insert bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=999999957.398 1483227000000000000
> insert bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=999999945.898 1483225200000000000
> insert bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=999999970.898 1483228800000000000
> insert bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=999999985.102 1483230600000000000
> insert bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=99999999907.801 1483232400000000000
> insert bug,ref=stoh.DON_KEY_002_ele,id=stoh.DON_KEY_002_ele.cupcake value=99999999926.301 1483234200000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123455.6 1483225200000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123455.8 1483226100000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123456.0 1483227000000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123456.5 1483227900000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123456.9 1483228800000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123457.2 1483229700000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123457.5 1483230600000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123457.9 1483231500000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123458.5 1483232400000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123458.6 1483233300000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123459.0 1483234200000000000
> insert bug,ref=stoh.DON_KEY_003_ele,id=stoh.DON_KEY_003_ele.cupcake value=123459.1 1483235100000000000

Issue a curl command with the appropriate query and pipe the results to jq

curl http://localhost:8086/query?db=mydb --data-urlencode "q=SELECT mean(value) FROM bug WHERE (ref = 'stoh.DON_KEY_002_ele' OR ref = 'stoh.DON_KEY_003_ele') AND time >= '2017-01-01T00:00:00Z' AND time <= '2017-01-01T00:15:00Z' GROUP BY time(5m, 0s), ref fill(linear)" | jq

Which yields

{
  "results": [
    {
      "statement_id": 0,
      "series": [
        {
          "name": "bug",
          "tags": {
            "ref": "stoh.DON_KEY_002_ele"
          },
          "columns": [
            "time",
            "mean"
          ],
          "values": [
            [
              "2017-01-01T00:00:00Z",
              999999970.898
            ],
            [
              "2017-01-01T00:05:00Z",
              null
            ],
            [
              "2017-01-01T00:10:00Z",
              null
            ],
            [
              "2017-01-01T00:15:00Z",
              null
            ]
          ]
        },
        {
          "name": "bug",
          "tags": {
            "ref": "stoh.DON_KEY_003_ele"
          },
          "columns": [
            "time",
            "mean"
          ],
          "values": [
            [
              "2017-01-01T00:00:00Z",
              123456.9
            ],
            [
              "2017-01-01T00:05:00Z",
              123457
            ],
            [
              "2017-01-01T00:10:00Z",
              123457.09999999999
            ],
            [
              "2017-01-01T00:15:00Z",
              123457.2
            ]
          ]
        }
      ]
    }
  ]
}

@desa Sorry for getting back to you late.

It seems like this is no longer an issue in the latest 1.3.x release. Something must have fixed it between 1.2.x and 1.3.x.

Hooray! I'll close this.
