For all general issues, please provide the following details for fast resolution:
OS: Ubuntu 16.04

Logstash pipeline configuration:

input {
kafka {
auto_offset_reset => "earliest"
bootstrap_servers => "XXX"
enable_auto_commit => "false"
fetch_max_bytes => "5242880"
group_id => "YYY"
max_partition_fetch_bytes => "104857600"
max_poll_records => "100000"
topics => ["ZZZ"]
}
}
filter {
json {
source => "message"
remove_field => ["message", "@version"]
}
date {
match => ["timestamp", "UNIX_MS"]
remove_field => ["timestamp"]
}
}
output {
elasticsearch {
action => "index"
document_id => "%{id}"
hosts => ["XXX"]
index => "YYY"
manage_template => false
resurrect_delay => 2
retry_initial_interval => 1
retry_max_interval => 4
}
}
Every once in a while, we start seeing the following error repeatedly in logstash log file:
[2019-04-23T07:58:48,268][ERROR][logstash.outputs.elasticsearch] An unknown error occurred sending a bulk request to Elasticsearch. We will retry indefinitely {:error_message=>"bignum too big to convert into `long'", :error_class=>"LogStash::Json::GeneratorError", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/json.rb:27:in `jruby_dump'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-9.4.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in `block in bulk'", "org/jruby/RubyArray.java:2577:in `map'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-9.4.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in `block in bulk'", "org/jruby/RubyArray.java:1792:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-9.4.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:117:in `bulk'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-9.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:286:in `safe_bulk'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-9.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:191:in `submit'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-9.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:159:in `retrying_submit'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-9.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:38:in `multi_receive'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:118:in `multi_receive'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:390:in `block in output_batch'", "org/jruby/RubyHash.java:1419:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:389:in `output_batch'", 
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:341:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:304:in `block in start_workers'"]}
The only way to make the error go away is to restart Logstash.
I believe a specific field value causes the conversion to fail, so the same bulk request is retried forever.
Has anyone run into this?
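(For reference: a Java long holds at most 2**63 - 1, so any event field at or beyond 2**63 trips the long conversion in older jrjackson versions. A quick plain-Ruby check of the boundary, with made-up values:)

```ruby
# Java's long is a signed 64-bit integer; its range is the hard limit
# behind the "bignum too big to convert into `long'" message.
JAVA_LONG_MAX = 2**63 - 1   # 9223372036854775807

ok  = 2**63 - 1  # largest value that still fits in a Java long
bad = 2**63      # smallest positive value that overflows it

puts ok <= JAVA_LONG_MAX   # true
puts bad <= JAVA_LONG_MAX  # false: this is the kind of value in the failing event
```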
Upstream issue: https://github.com/guyboertje/jrjackson/issues/73
This is fixed and released upstream in jrjackson. Logstash will be released soon with an updated version.
If you don't want to wait for the update you can "patch" the jrjackson gem to use the new code:

1. Download the jrjackson release .zip and unzip it.
2. In lib/jrjackson/jars, get ready to copy jrjackson-1.2.26.jar.
3. Open vendor/bundle/jruby/2.5.0/gems/jrjackson-0.4.7-java/lib/jrjackson/build_info.rb of your Logstash installation. On line 20, change the 25 to 26 so it reads '1.2.26' and save it.
4. Copy the jar into vendor/bundle/jruby/2.5.0/gems/jrjackson-0.4.7-java/lib/jrjackson/jars/.

Test changes with:
$ bin/logstash -i irb
irb(main):001:0> require 'jrjackson'
=> false
irb(main):002:0> puts JrJackson::Base.generate({"b"=> 2**63})
{"b":9223372036854775808}
=> nil
You should see the large number is serialised. You might like to do a before and after test to be really certain. You can reverse this by simply changing build_info.rb back to how it was before.
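If you suspect a specific field is responsible, you can scan a sample event for integers outside the Java long range. A plain-Ruby sketch (no Logstash required; the sample data below is made up):

```ruby
# Java long range: -2**63 .. 2**63 - 1. Values outside it make
# jrjackson's long conversion (and Elasticsearch's mapper) fail.
JAVA_LONG_MAX = 2**63 - 1
JAVA_LONG_MIN = -(2**63)

# Recursively collect the paths of integer values that cannot fit in a long.
def oversized_fields(obj, path = [])
  case obj
  when Hash
    obj.flat_map { |k, v| oversized_fields(v, path + [k]) }
  when Array
    obj.each_with_index.flat_map { |v, i| oversized_fields(v, path + [i]) }
  when Integer
    (obj > JAVA_LONG_MAX || obj < JAVA_LONG_MIN) ? [path.join(".")] : []
  else
    []
  end
end

sample = { "id" => 1, "nested" => { "b" => 2**63 }, "list" => [1, 2**64] }
puts oversized_fields(sample).inspect  # => ["nested.b", "list.1"]
```

Feed it the parsed JSON of a suspect message to find the offending field path.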
This is really bad. We see the same error in Logstash from time to time and can't tell when or why it happens. The log is a joke, only telling me that there is an error. We are currently on 7.0.0 and have had problems all over since 6.4. In the past week we removed all our filters to identify the root cause, but the error remains. Sometimes it shows up after 5 minutes, sometimes only after a day.
How have you been able to identify the cause, though? That would help in the future. I am not a programmer, but spamming the logs with generic error text will not help anyone. And the best part? Logstash does not even crash. It stays active and running but stops working, so watching the service for failures will not help here. GREAT
Correct folder is: /usr/share/logstash/vendor/...
Any expected release date for the fix?
I can confirm the fix restores a working pipeline, but we still see errors in the Logstash and Elasticsearch logs, and some events seem to be lost.
In logstash 6.7.2:
[2019-05-10T13:09:57,335][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.05.10", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x3b98a2c9>], :response=>{"index"=>{"_index"=>"logstash-2019.05.10", "_type"=>"doc", "_id"=>"hwemoWoBUS-NnizQYRAV", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"No matching token for number_type [BIG_INTEGER]"}}}}}
In Elasticsearch (6.7.2):
[2019-05-10T13:17:52,851][DEBUG][o.e.a.b.TransportShardBulkAction] [waT_qhC] [logstash-2019.05.10][0] failed to execute bulk item (index) index {[logstash-2019.05.10][doc][tQetoWoBUS-NnizQpJPN], source[n/a, actual length: [2.2kb], max length: 2kb]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
at org.elasticsearch.index.mapper.DocumentParser.wrapInMapperParsingException(DocumentParser.java:174) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:72) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:281) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:799) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.shard.IndexShard.applyIndexOperation(IndexShard.java:775) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.shard.IndexShard.applyIndexOperationOnPrimary(IndexShard.java:744) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.lambda$executeIndexRequestOnPrimary$3(TransportShardBulkAction.java:454) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.executeOnPrimaryWhileHandlingMappingUpdates(TransportShardBulkAction.java:477) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.executeIndexRequestOnPrimary(TransportShardBulkAction.java:452) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.executeBulkItemRequest(TransportShardBulkAction.java:216) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.performOnPrimary(TransportShardBulkAction.java:159) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.performOnPrimary(TransportShardBulkAction.java:151) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:139) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:79) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryShardReference.perform(TransportReplicationAction.java:1050) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryShardReference.perform(TransportReplicationAction.java:1028) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.support.replication.ReplicationOperation.execute(ReplicationOperation.java:104) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.runWithPrimaryShardReference(TransportReplicationAction.java:424) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.lambda$doRun$0(TransportReplicationAction.java:370) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:61) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.shard.IndexShardOperationPermits.acquire(IndexShardOperationPermits.java:273) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.shard.IndexShardOperationPermits.acquire(IndexShardOperationPermits.java:240) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.shard.IndexShard.acquirePrimaryOperationPermit(IndexShard.java:2561) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.support.replication.TransportReplicationAction.acquirePrimaryOperationPermit(TransportReplicationAction.java:987) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.doRun(TransportReplicationAction.java:369) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryOperationTransportHandler.messageReceived(TransportReplicationAction.java:324) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryOperationTransportHandler.messageReceived(TransportReplicationAction.java:311) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler$1.doRun(SecurityServerTransportInterceptor.java:250) [x-pack-security-6.7.2.jar:6.7.2]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler.messageReceived(SecurityServerTransportInterceptor.java:308) [x-pack-security-6.7.2.jar:6.7.2]
at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:66) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.transport.TransportService$7.doRun(TransportService.java:692) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:751) [elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.7.2.jar:6.7.2]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_171]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_171]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_171]
Caused by: java.lang.IllegalStateException: No matching token for number_type [BIG_INTEGER]
at org.elasticsearch.common.xcontent.json.JsonXContentParser.convertNumberType(JsonXContentParser.java:209) ~[elasticsearch-x-content-6.7.2.jar:6.7.2]
at org.elasticsearch.common.xcontent.json.JsonXContentParser.numberType(JsonXContentParser.java:67) ~[elasticsearch-x-content-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.createBuilderFromDynamicValue(DocumentParser.java:768) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseDynamicValue(DocumentParser.java:825) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseValue(DocumentParser.java:621) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:410) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:384) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:485) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:523) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:395) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:384) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:485) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:523) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:395) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:384) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:485) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:523) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:395) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:384) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:485) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:523) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:395) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:384) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:485) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:505) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:395) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:384) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:485) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:505) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:395) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:384) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.internalParseDocument(DocumentParser.java:96) ~[elasticsearch-6.7.2.jar:6.7.2]
at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:69) ~[elasticsearch-6.7.2.jar:6.7.2]
... 36 more
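Even with the jrjackson fix applied, Elasticsearch itself cannot index integers above the Java long range, which is what the "No matching token for number_type [BIG_INTEGER]" errors above show. One defensive option is to stringify out-of-range values before they reach the output. A sketch using a ruby filter (this only handles top-level fields and is an assumption about your schema, not a tested fix; adapt as needed):

```
filter {
  ruby {
    code => '
      # Stringify any top-level integer that cannot fit in a Java long,
      # so Elasticsearch does not reject the document as BIG_INTEGER.
      java_long_max = (2**63) - 1
      java_long_min = -(2**63)
      event.to_hash.each do |k, v|
        next if k.start_with?("@")
        if v.is_a?(Integer) && (v > java_long_max || v < java_long_min)
          event.set(k, v.to_s)
        end
      end
    '
  }
}
```

The trade-off is that such fields become strings in the index, but the document is no longer dropped.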
Hi @guyboertje. Do you know when you are planning to include it in a new release? I am currently using the public Docker image, version 7.0.0 (https://www.docker.elastic.co/). Thanks in advance.
I have packaged the workaround to make applying it easier.
Copy this file to /tmp on the target host
ls-7.0.1-jrjackson-fix-1.tar.gz
then (assuming logstash is installed in /usr/share/logstash):
cd /usr/share/logstash
tar zxvf /tmp/ls-7.0.1-jrjackson-fix-1.tar.gz
service logstash restart
NOTE: This is for Logstash 7.0.1. It may be OK for other versions, but I have not tested it.
According to https://github.com/elastic/logstash/blob/7.1/versions.yml, jrjackson was bumped to 0.4.8 in v7.1, so 7.1 and later should include the fix. Can you confirm, @robbavey?
@kostasb I believe that is correct, yes
The issue is expected to be resolved in any recent LS 7.x release. Let us know if that's not the case for >= 7.2.