I want to transfer data from MongoDB to Elasticsearch via Logstash. The transfer works as long as I exclude _id.
Otherwise, the following error is generated:
Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::OrgLogstash::MissingConverterException: Missing Converter handling for full class name=org.bson.types.ObjectId, simple name=ObjectId>}
Is it possible that the ObjectId is not converted correctly? In my opinion, only the unique identifier (a string) should be returned, not an ObjectId.
Input section of the Logstash config (in folder pipeline):
input {
  jdbc {
    jdbc_driver_library => "mongojdbc1.5.jar"
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_user => "<username>"
    jdbc_password => "<password>"
    jdbc_connection_string => "jdbc:mongodb://<username>:<password>@<server>:<port>/?AuthMechanism=SCRAM-SHA-1&authSource=<db>"
    statement => "<db>.<collection>.find({ },{'_id': false})" # this works
    # statement => "<db>.<collection>.find()" # this raises the mentioned error
  }
}
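One possible workaround that keeps _id is to convert it to a string on the MongoDB side before it ever reaches the JDBC layer. This is only a sketch: it assumes MongoDB 4.0+ (for the $toString aggregation operator) and that the DbSchema driver accepts aggregate() statements, which I have not verified:

```
statement => "<db>.<collection>.aggregate([{ '$addFields': { '_id': { '$toString': '$_id' } } }])"
```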
Dockerfile
FROM docker.elastic.co/logstash/logstash:6.8.3
# Install jdbc input plugin
RUN bin/logstash-plugin install logstash-input-jdbc && \
bin/logstash-plugin install logstash-output-stdout
# Remove the default pipeline
RUN rm -f /usr/share/logstash/pipeline/logstash.conf
# Inject jdbc driver directly into java classpath
# (see issue https://github.com/logstash-plugins/logstash-filter-jdbc_static/issues/47)
COPY ./drivers /usr/share/logstash/logstash-core/lib/jars/
# Add logstash and pipeline config
COPY config/ /usr/share/logstash/config
COPY pipeline/ /usr/share/logstash/pipeline
ENTRYPOINT ["bin/logstash"]
Drivers
Downloaded from https://dbschema.com/jdbc-drivers/MongoDbJdbcDriver.zip and unzipped into ./drivers.
I'm in the same situation... have you been able to find a solution?
I moved away from Logstash and wrote a short Python script that did the transfer.
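For what it's worth, such a script can be quite short. This is a minimal sketch, not the author's actual script: it assumes the pymongo and elasticsearch packages, and the URI, database, collection, and index names are placeholders. The key point is that str() on a bson ObjectId yields its hex string, which sidesteps the converter problem entirely:

```python
def to_es_action(doc, index):
    """Turn a MongoDB document into an Elasticsearch bulk action.

    str() on the ObjectId in _id gives its 24-character hex string,
    which becomes the Elasticsearch document id."""
    doc = dict(doc)          # shallow copy so the original is untouched
    doc_id = str(doc.pop("_id"))
    return {"_index": index, "_id": doc_id, "_source": doc}

def transfer(mongo_uri, db, collection, es_hosts, index):
    # Imports are local so the transform above can be used without the drivers.
    from pymongo import MongoClient
    from elasticsearch import Elasticsearch, helpers

    client = MongoClient(mongo_uri)
    es = Elasticsearch(es_hosts)
    cursor = client[db][collection].find({})
    helpers.bulk(es, (to_es_action(doc, index) for doc in cursor))

if __name__ == "__main__":
    # Placeholder connection details, mirroring the Logstash config above.
    transfer("mongodb://<username>:<password>@<server>:<port>/?authSource=<db>",
             "<db>", "<collection>", ["http://localhost:9200"], "<collection>")
```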
Any solutions?