Beats: Some filebeat modules incorrectly parse timestamps

Created on 2 Oct 2019 · 2 comments · Source: elastic/beats

There are still some Filebeat modules with issues similar to the ones fixed in https://github.com/elastic/beats/pull/13308, probably caused by https://github.com/elastic/beats/pull/12253. Timestamps without a timezone are parsed as UTC and then converted to a different timezone, which is incorrect: they should be parsed directly in the final timezone.
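The difference between the two behaviours can be illustrated with a minimal Python sketch (the log line and the `Europe/Madrid` host timezone are made-up examples, not taken from the affected modules):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

raw = "2019-10-02 12:00:00"          # timestamp from a log line, no timezone info
local_tz = ZoneInfo("Europe/Madrid")  # hypothetical host timezone

# Buggy behaviour: parse the naive timestamp as UTC, then convert it
# to the local timezone, which shifts the wall-clock time.
wrong = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S").replace(
    tzinfo=ZoneInfo("UTC")).astimezone(local_tz)

# Correct behaviour: interpret the naive timestamp directly in the
# local timezone, keeping the wall-clock time intact.
right = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S").replace(tzinfo=local_tz)

print(wrong.isoformat())  # 2019-10-02T14:00:00+02:00 (shifted by two hours)
print(right.isoformat())  # 2019-10-02T12:00:00+02:00
```

The same log line ends up two hours off in the first case, which is the symptom seen in the affected modules.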

https://github.com/elastic/beats/pull/13874 has been created to detect unexpected changes in timestamps earlier when non-UTC timezones are used. We identified some modules where this still seems to be happening:

  • [x] logstash (plain at least; also reported in https://github.com/elastic/beats/pull/13308#issuecomment-536407658, fix in https://github.com/elastic/beats/pull/13890)
  • [x] Cisco (asa and ftd) #13893
  • [x] Cisco (ios doesn't have a date parser) elastic/beats#13893
  • [x] iptables #13926
  • [x] mssql #13926
  • [x] panw (also reported in https://github.com/elastic/beats/issues/13867) elastic/beats#13926
  • [x] Rabbitmq (https://github.com/elastic/beats/pull/13879)
  • [x] Consider removing event.timezone from events that don't need it (see https://github.com/elastic/beats/pull/13874#pullrequestreview-296215892) elastic/beats#13918
  • [x] Add an if: "ctx.event.timezone == null" condition to date processors in pipelines where another processor exists with the opposite condition. (https://github.com/elastic/beats/pull/13883)
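The last item describes a pairing pattern used in the module ingest pipelines: one date processor runs when no timezone was shipped with the event, and a second one applies `event.timezone` when it was. A minimal sketch of that pattern (the `mymodule.timestamp` field and format string are hypothetical placeholders, not from any specific module):

```json
{
  "processors": [
    {
      "date": {
        "if": "ctx.event.timezone == null",
        "field": "mymodule.timestamp",
        "target_field": "@timestamp",
        "formats": ["yyyy-MM-dd HH:mm:ss"]
      }
    },
    {
      "date": {
        "if": "ctx.event.timezone != null",
        "field": "mymodule.timestamp",
        "target_field": "@timestamp",
        "formats": ["yyyy-MM-dd HH:mm:ss"],
        "timezone": "{{ event.timezone }}"
      }
    }
  ]
}
```

Because exactly one of the two conditions matches, the timestamp is parsed once, directly in the right timezone, instead of being parsed as UTC and converted afterwards.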
Labels: Filebeat, Integrations, bug, in progress, meta

All 2 comments

Hi,

I can confirm that timezone conversion for Logstash plain logs is an issue with Filebeat 7.3.2. I checked the generated ingest pipeline, and I can resolve the issue by refactoring the date processing to look the same as in the Kafka module. The fix for me was to do the following in the ingest pipeline:

  • Do not use the ISO8601 format shorthand, but instead use an explicit format string (yyyy-MM-dd'T'HH:mm:ss,SSS)
  • Do not chain the two date processors, but instead use logstash.log.timestamp as the source field and @timestamp as the target field in both processors
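Combining both points, the refactored processors might look like the following sketch. This is only an illustration of the comment above, not the actual change merged in the module fix:

```json
{
  "date": {
    "if": "ctx.event.timezone == null",
    "field": "logstash.log.timestamp",
    "target_field": "@timestamp",
    "formats": ["yyyy-MM-dd'T'HH:mm:ss,SSS"]
  }
},
{
  "date": {
    "if": "ctx.event.timezone != null",
    "field": "logstash.log.timestamp",
    "target_field": "@timestamp",
    "formats": ["yyyy-MM-dd'T'HH:mm:ss,SSS"],
    "timezone": "{{ event.timezone }}"
  }
}
```

Both processors read from the original logstash.log.timestamp field rather than one feeding the other, so the value is only interpreted once.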

Not sure if my fix is the preferred way to do it though :)

/Andreas

Hey @atoom, yes, this is the fix we are applying in modules with similar issues; we'll do the same in the logstash module.
