**Thanos, Prometheus and Golang version used**
```
thanos, version v0.1.0-rc.1 (branch: master, revision: 5b66b72d01b7ca4dd3a5bca2ac10e8057d32cf29)
build user: circleci@9ff8295b71d0
build date: 20180615-13:12:22
go version: go1.10.3
```
The Prometheus 2.3.0 official Docker image is used.
**What happened**
I got the error `receive series: rpc error: code = Internal desc = runtime error: index out of range` when running a query with a huge result (29367 time series in total), or when computing over a large number of time series in the Thanos web UI, and not all of the results are returned. When I run the same query directly against Prometheus, everything is fine. I have only one sidecar and one querier, without a store or any other component configured, because for now I only need HA for the Prometheus servers. The error does not always appear; sometimes everything works.
**What you expected to happen**
I expect to see all metrics.
**How to reproduce it (as minimally and precisely as possible)**:
I am not sure, but if you have a lot of metrics generated and stored in Prometheus, I think you will hit this error too. I tried different versions with different options, but no luck so far.
**Full logs to relevant components**
Here is a sample of the log from the sidecar:
```
level=error ts=2018-06-20T14:51:52.657720389Z caller=main.go:195 component=store msg="recovered from panic" panic="runtime error: index out of range" stack="goroutine 2283 [running]:\nruntime/debug.Stack(0xf06c20, 0xc4202c7e00, 0xf06c20)\n\t/usr/local/go/src/runtime/debug/stack.go:24 +0xa7\nmain.defaultGRPCServerOpts.func1(0xd1d8e0, 0x1434e60, 0xc420358400, 0xb5f948)\n\t/go/src/github.com/improbable-eng/thanos/cmd/thanos/main.go:195 +0x84\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery.recoverFrom(0xd1d8e0, 0x1434e60, 0xc4203020f0, 0x3, 0xc42003c570)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery/interceptors.go:47 +0x43\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery.StreamServerInterceptor.func1.1(0xc4200b2b30, 0xc4269b79d0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery/interceptors.go:35 +0x63\npanic(0xd1d8e0, 0x1434e60)\n\t/usr/local/go/src/runtime/panic.go:502 +0x229\ngithub.com/improbable-eng/thanos/pkg/store.(*PrometheusStore).Series(0xc4202b0e60, 0xc420382e60, 0xf19e40, 0xc420078b80, 0x0, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/pkg/store/prometheus.go:150 +0xb02\ngithub.com/improbable-eng/thanos/pkg/store/storepb._Store_Series_Handler(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc423e1d640, 0xc425668130, 0xc420421970)\n\t/go/src/github.com/improbable-eng/thanos/pkg/store/storepb/rpc.pb.go:404 +0x10e\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1.1(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc423e1d640, 0xf086a0, 0xc420078b70)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:69 +0xe0\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery.StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18be0, 
0xc423e1d640, 0xc423e1d600, 0xc420382dc0, 0x0, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery/interceptors.go:39 +0x89\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1.1(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc423e1d640, 0xc4200d23c0, 0x14)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:72 +0x93\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/tracing/opentracing.StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc423e1d640, 0xc423e1d600, 0xc420382dc0, 0xc423e1d620, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/tracing/opentracing/server_interceptors.go:46 +0x14c\ngithub.com/improbable-eng/thanos/pkg/tracing.StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18d00, 0xc423e1d620, 0xc423e1d600, 0xc420382dc0, 0xc4200d2301, 0xc423e1d620)\n\t/go/src/github.com/improbable-eng/thanos/pkg/tracing/grpc.go:35 +0x162\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1.1(0xdbea40, 0xc4202b0e60, 0xf18d00, 0xc423e1d620, 0x14, 0xc420382e10)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:72 +0x93\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-prometheus.(*ServerMetrics).StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18fa0, 0xc4200da8c0, 0xc423e1d600, 0xc420382dc0, 0xc420421c40, 0x4108d8)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-prometheus/server_metrics.go:125 +0x12a\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1(0xdbea40, 0xc4202b0e60, 0xf18fa0, 0xc4200da8c0, 0xc423e1d600, 0xe8e960, 0xc4209dec30, 
0xc4209dec30)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:75 +0x166\ngithub.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).processStreamingRPC(0xc4200c74a0, 0xf1a1a0, 0xc42040d800, 0xc4200dc140, 0xc420302270, 0x1437ea0, 0x0, 0x0, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:1060 +0x3bb\ngithub.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).handleStream(0xc4200c74a0, 0xf1a1a0, 0xc42040d800, 0xc4200dc140, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:1147 +0x12b1\ngithub.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).serveStreams.func1.1(0xc420038180, 0xc4200c74a0, 0xf1a1a0, 0xc42040d800, 0xc4200dc140)\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:638 +0x9f\ncreated by github.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).serveStreams.func1\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:636 +0xa1\n"
level=error ts=2018-06-20T14:52:11.640568706Z caller=main.go:195 component=store msg="recovered from panic" panic="runtime error: index out of range" stack="goroutine 2260 [running]:\nruntime/debug.Stack(0xf06c20, 0xc4202c7e00, 0xf06c20)\n\t/usr/local/go/src/runtime/debug/stack.go:24 +0xa7\nmain.defaultGRPCServerOpts.func1(0xd1d8e0, 0x1434e60, 0xc42005e400, 0xb5f948)\n\t/go/src/github.com/improbable-eng/thanos/cmd/thanos/main.go:195 +0x84\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery.recoverFrom(0xd1d8e0, 0x1434e60, 0xc4203020f0, 0x3, 0xc42003c570)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery/interceptors.go:47 +0x43\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery.StreamServerInterceptor.func1.1(0xc4200b2b30, 0xc4435f19d0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery/interceptors.go:35 +0x63\npanic(0xd1d8e0, 0x1434e60)\n\t/usr/local/go/src/runtime/panic.go:502 +0x229\ngithub.com/improbable-eng/thanos/pkg/store.(*PrometheusStore).Series(0xc4202b0e60, 0xc4203b28c0, 0xf19e40, 0xc45fe894a0, 0x0, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/pkg/store/prometheus.go:150 +0xb02\ngithub.com/improbable-eng/thanos/pkg/store/storepb._Store_Series_Handler(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc45ff0ae20, 0xc42038a730, 0xc420424970)\n\t/go/src/github.com/improbable-eng/thanos/pkg/store/storepb/rpc.pb.go:404 +0x10e\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1.1(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc45ff0ae20, 0xf086a0, 0xc45fe89490)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:69 +0xe0\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery.StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18be0, 
0xc45ff0ae20, 0xc45ff0ade0, 0xc4203b2820, 0x0, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery/interceptors.go:39 +0x89\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1.1(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc45ff0ae20, 0xc4200d23c0, 0x14)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:72 +0x93\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/tracing/opentracing.StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc45ff0ae20, 0xc45ff0ade0, 0xc4203b2820, 0xc45ff0ae00, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/tracing/opentracing/server_interceptors.go:46 +0x14c\ngithub.com/improbable-eng/thanos/pkg/tracing.StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18d00, 0xc45ff0ae00, 0xc45ff0ade0, 0xc4203b2820, 0xc4200d2301, 0xc45ff0ae00)\n\t/go/src/github.com/improbable-eng/thanos/pkg/tracing/grpc.go:35 +0x162\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1.1(0xdbea40, 0xc4202b0e60, 0xf18d00, 0xc45ff0ae00, 0x14, 0xc4203b2870)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:72 +0x93\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-prometheus.(*ServerMetrics).StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18fa0, 0xc42024c960, 0xc45ff0ade0, 0xc4203b2820, 0xc420424c40, 0x4108d8)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-prometheus/server_metrics.go:125 +0x12a\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1(0xdbea40, 0xc4202b0e60, 0xf18fa0, 0xc42024c960, 0xc45ff0ade0, 0xe8e960, 0xc4228f2330, 
0xc4228f2330)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:75 +0x166\ngithub.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).processStreamingRPC(0xc4200c74a0, 0xf1a1a0, 0xc42040d800, 0xc420208c80, 0xc420302270, 0x1437ea0, 0x0, 0x0, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:1060 +0x3bb\ngithub.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).handleStream(0xc4200c74a0, 0xf1a1a0, 0xc42040d800, 0xc420208c80, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:1147 +0x12b1\ngithub.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).serveStreams.func1.1(0xc420038180, 0xc4200c74a0, 0xf1a1a0, 0xc42040d800, 0xc420208c80)\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:638 +0x9f\ncreated by github.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).serveStreams.func1\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:636 +0xa1\n"
level=error ts=2018-06-20T14:53:39.73453544Z caller=main.go:195 component=store msg="recovered from panic" panic="runtime error: index out of range" stack="goroutine 2270 [running]:\nruntime/debug.Stack(0xf06c20, 0xc4202c7e00, 0xf06c20)\n\t/usr/local/go/src/runtime/debug/stack.go:24 +0xa7\nmain.defaultGRPCServerOpts.func1(0xd1d8e0, 0x1434e60, 0xc420314000, 0xb5f948)\n\t/go/src/github.com/improbable-eng/thanos/cmd/thanos/main.go:195 +0x84\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery.recoverFrom(0xd1d8e0, 0x1434e60, 0xc4203020f0, 0x3, 0xc42003a070)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery/interceptors.go:47 +0x43\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery.StreamServerInterceptor.func1.1(0xc4200b2b30, 0xc4201799d0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery/interceptors.go:35 +0x63\npanic(0xd1d8e0, 0x1434e60)\n\t/usr/local/go/src/runtime/panic.go:502 +0x229\ngithub.com/improbable-eng/thanos/pkg/store.(*PrometheusStore).Series(0xc4202b0e60, 0xc4203748c0, 0xf19e40, 0xc45e69d580, 0x0, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/pkg/store/prometheus.go:150 +0xb02\ngithub.com/improbable-eng/thanos/pkg/store/storepb._Store_Series_Handler(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc45c124e00, 0xc4203ca030, 0xc420357970)\n\t/go/src/github.com/improbable-eng/thanos/pkg/store/storepb/rpc.pb.go:404 +0x10e\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1.1(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc45c124e00, 0xf086a0, 0xc45e69d570)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:69 +0xe0\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery.StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18be0, 
0xc45c124e00, 0xc45c124dc0, 0xc420374820, 0x0, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/recovery/interceptors.go:39 +0x89\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1.1(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc45c124e00, 0xc4200d23c0, 0x14)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:72 +0x93\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/tracing/opentracing.StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18be0, 0xc45c124e00, 0xc45c124dc0, 0xc420374820, 0xc45c124de0, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/tracing/opentracing/server_interceptors.go:46 +0x14c\ngithub.com/improbable-eng/thanos/pkg/tracing.StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18d00, 0xc45c124de0, 0xc45c124dc0, 0xc420374820, 0xc4200d2301, 0xc45c124de0)\n\t/go/src/github.com/improbable-eng/thanos/pkg/tracing/grpc.go:35 +0x162\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1.1(0xdbea40, 0xc4202b0e60, 0xf18d00, 0xc45c124de0, 0x14, 0xc420374870)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:72 +0x93\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-prometheus.(*ServerMetrics).StreamServerInterceptor.func1(0xdbea40, 0xc4202b0e60, 0xf18fa0, 0xc45f31eaa0, 0xc45c124dc0, 0xc420374820, 0xc420357c40, 0x4108d8)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-prometheus/server_metrics.go:125 +0x12a\ngithub.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware.ChainStreamServer.func1(0xdbea40, 0xc4202b0e60, 0xf18fa0, 0xc45f31eaa0, 0xc45c124dc0, 0xe8e960, 0xc45e6cab70, 
0xc45e6cab70)\n\t/go/src/github.com/improbable-eng/thanos/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:75 +0x166\ngithub.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).processStreamingRPC(0xc4200c74a0, 0xf1a1a0, 0xc42040d800, 0xc420209180, 0xc420302270, 0x1437ea0, 0x0, 0x0, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:1060 +0x3bb\ngithub.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).handleStream(0xc4200c74a0, 0xf1a1a0, 0xc42040d800, 0xc420209180, 0x0)\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:1147 +0x12b1\ngithub.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).serveStreams.func1.1(0xc420038180, 0xc4200c74a0, 0xf1a1a0, 0xc42040d800, 0xc420209180)\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:638 +0x9f\ncreated by github.com/improbable-eng/thanos/vendor/google.golang.org/grpc.(*Server).serveStreams.func1\n\t/go/src/github.com/improbable-eng/thanos/vendor/google.golang.org/grpc/server.go:636 +0xa1\n"
```
**Anything else we need to know**
Here are my runtime flags for Prometheus:
```
- '--config.file=/etc/prometheus/prometheus.yml'
- '--storage.tsdb.path=/prometheus'
- '--storage.tsdb.retention=31d'
- '--web.enable-admin-api'
- '--storage.tsdb.max-block-duration=2h'
- '--storage.tsdb.min-block-duration=2h'
- '--web.external-url=http://[ip]:9090/'
```
And here are the runtime flags for the sidecar:
```
- 'sidecar'
- '--log.level=debug'
- '--tsdb.path=/prometheus'
- '--prometheus.url=http://localhost:9090'
- '--cluster.peers=[external ip]:10900'
- '--cluster.address=[external ip]:10900'
- '--grpc-address=[external ip]:10901'
- '--http-address=0.0.0.0:10902'
```
**Environment**:
```
Linux dlukyanchikov-prom-dev-1 4.4.0-127-generic #153-Ubuntu SMP Sat May 19 10:58:46 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
```
Also, a question: do I need to set max-block-duration and min-block-duration to the same value (2h) if I only need HA mode, with a sidecar next to Prometheus and a querier on another server?
Today I noticed that it is not always a problem with a huge query result; sometimes the error appears with a small result too, although the original metrics without aggregation are huge.
Hello! Does anyone have any idea?
@Bplotka
Thanks for letting us know and sorry for the massive delay.
By the look of it, we are just getting quite unexpected results from Prometheus itself. The logs you gave from the sidecar are super useful!
```
github.com/improbable-eng/thanos/pkg/store.(*PrometheusStore).Series(0xc4202b0e60, 0xc4203748c0, 0xf19e40, 0xc45e69d580, 0x0, 0x0)
	/go/src/github.com/improbable-eng/thanos/pkg/store/prometheus.go:150
```
So basically this line is raising the panic:
```go
MinTime: int64(e.Samples[0].Timestamp),
```
It is clear that your Prometheus just gave us a series with no chunks and no samples. I haven't tried Thanos with Prometheus 2.3.0 yet, but maybe something changed? ):
This seems easy to fix on our side, and I am doing the fix now, but it is worth checking what's going on in the upstream logic and why that happens.
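For illustration, a minimal sketch of the kind of guard that avoids this panic. The `Sample`/`TimeSeries` types here are hypothetical simplifications, not the actual Thanos code: the idea is simply to check `len(e.Samples)` before indexing `e.Samples[0]`, and skip series Prometheus returned with no samples.

```go
package main

import "fmt"

// Sample and TimeSeries are simplified, hypothetical stand-ins for the
// series data the sidecar receives from Prometheus.
type Sample struct {
	Timestamp int64
	Value     float64
}

type TimeSeries struct {
	Samples []Sample
}

// seriesTimeRange returns the min/max timestamps covered by a series,
// guarding against the empty-samples case that triggered the
// index-out-of-range panic. ok is false when the series should be skipped.
func seriesTimeRange(e TimeSeries) (minTime, maxTime int64, ok bool) {
	if len(e.Samples) == 0 {
		// Prometheus can hand back a series with no samples;
		// skip it instead of panicking on e.Samples[0].
		return 0, 0, false
	}
	return e.Samples[0].Timestamp, e.Samples[len(e.Samples)-1].Timestamp, true
}

func main() {
	// An empty series is skipped rather than panicking.
	if _, _, ok := seriesTimeRange(TimeSeries{}); !ok {
		fmt.Println("skipped empty series")
	}

	// A normal series yields its time range.
	s := TimeSeries{Samples: []Sample{{Timestamp: 100}, {Timestamp: 200}}}
	lo, hi, _ := seriesTimeRange(s)
	fmt.Println(lo, hi)
}
```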
I asked on the Prometheus-dev IRC channel whether that behaviour is right; waiting for a response.
For everyone wondering about that log message: from a discussion with @bwplotka on Slack, it's clear that Thanos handles it gracefully and it doesn't lead to data corruption or any other negative effects.
But it would be interesting to find the root cause (:
Hm, our sidecars' logs are flooded with those messages even with the most recent versions of Prometheus and Thanos, so the issue persists.
@bwplotka did you get any response on the IRC or any other place?