Something of a cry for help here: when I try to index a MySQL value into an Elasticsearch nested field using Logstash, I get the following error.
{"exception"=>"expecting List or Map, found class org.logstash.bivalues.StringBiValue", "backtrace"=>["org.logstash.Accessors.newCollectionException(Accessors.java:195)"
Using the following configuration file:
input {
  jdbc {
    jdbc_driver_library => "/logstash/mysql-connector-java-5.1.42-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/data"
    jdbc_user => "username"
    jdbc_password => "password"
    statement => "SELECT id, suggestions, address_count FROM `suggestions` WHERE id <= 100"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
  }
}
filter {
  mutate {
    rename => { 'address_count' => '[suggestions][payload][count]' }
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "dev_suggestions"
    document_type => "address"
  }
}
However, if I rename address_count to a field that is not already in my mapping, it works fine and the value is correctly added as a nested property. I tried other fields in my index as well, not just [suggestions][payload][count], and I hit the same problem: it only works if the target field is not defined in the mapping.
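To see why the rename blows up, here is a minimal plain-Ruby sketch using ordinary hashes (not Logstash internals; the sample values are made up):

```ruby
# The SQL statement selects a `suggestions` column, so the event carries
# it as a plain string before the mutate filter runs (sample value made up):
event = { "suggestions" => "1 Main Street", "address_count" => 42 }

# rename => { 'address_count' => '[suggestions][payload][count]' } asks
# Logstash to treat event["suggestions"] as a map and create a nested
# "payload" hash inside it, but the existing value is a String -- hence
# "expecting List or Map, found class org.logstash.bivalues.StringBiValue".
parent = event["suggestions"]
puts parent.class            # String, not Hash: nothing to nest into
puts parent.is_a?(Hash)      # false
```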
This has caused me serious headaches; I have spent the last 48 hours banging my head against the table, so if someone could help me past this problem I would be very grateful!
I originally assumed I could simply do the following in the MySQL query:
SELECT id, suggestion, address_count AS '[suggestions][payload][count]' FROM `suggestions` WHERE id <= 100
Then I also tried
SELECT id, suggestion, address_count AS 'suggestions.payload.count' FROM `suggestions` WHERE id <= 100
Neither could insert the value as a nested field, and the latter failed with an error saying that field names cannot contain dots.
And finally, the mapping:
{
  "mappings": {
    "address": {
      "properties": {
        "suggestions": {
          "type": "completion",
          "payloads": true
        }
      }
    }
  }
}
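For reference, a document matching this mapping (an Elasticsearch 2.x completion field with payloads enabled) would be shaped roughly like the following; the field values here are made up:

```json
{
  "suggestions": {
    "input":   ["1 Main Street"],
    "payload": { "id": 1, "address_count": 42 }
  }
}
```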
Thanks to Val! For future readers in the same situation as me, needing to convert MySQL data into nested Elasticsearch objects with Logstash, here is a working solution using Logstash 5 and Elasticsearch 2.
input {
  jdbc {
    jdbc_driver_library => "/logstash/mysql-connector-java-5.1.42-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/data"
    jdbc_user => "username"
    jdbc_password => "password"
    statement => "SELECT addrid, suggestion, address_count FROM `suggestions` WHERE id <= 20"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
  }
}
filter {
  ruby {
    code => "
      event.set('[suggestions][input]', event.get('suggestion'))
      event.set('[suggestions][payload][address_count]', event.get('address_count'))
      event.set('[suggestions][payload][id]', event.get('addrid'))
    "
    remove_field => [ 'suggestion', 'address_count', 'addrid' ]
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "dev_suggestions"
    document_type => "address"
  }
}
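The ruby filter builds the nested structure explicitly instead of relying on mutate/rename. A plain-Ruby sketch of the same reshaping (the sample row values are hypothetical):

```ruby
# A hypothetical row as it comes out of the jdbc input:
row = { "addrid" => 1, "suggestion" => "1 Main Street", "address_count" => 42 }

# Equivalent of the three event.set calls plus remove_field -- the flat
# columns become one nested object under "suggestions":
event = {
  "suggestions" => {
    "input"   => row["suggestion"],
    "payload" => {
      "address_count" => row["address_count"],
      "id"            => row["addrid"]
    }
  }
}

puts event["suggestions"]["payload"]["address_count"]  # => 42
```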