logstash 1.5.3 failing to index same json with a float value

Posted 2019-09-11 09:50

Question:

I am seeing a weird error when I use Kafka as an input and Elasticsearch as an output in my Logstash configuration. I am able to send a JSON object such as the following:

{
 "user": "foo",
 "amount": 1
}

but when it tries to write:

{
 "user": "foo",
 "amount": 0.1
}

it fails with :message=>"failed action with response of 400, dropping action:

That is the only difference between the two documents. It spits out an error that classifies amount as follows: \"amount\"=>#<BigDecimal:37335f46,'0.15197E3',5(8)>

I couldn't find any examples of this issue anywhere online. Interestingly, when I manually POST the documents with curl, both work. Logstash seems to fail when the amount is a BigDecimal.
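
For reference, the manual POST was roughly like the following; the index and type names here are just placeholders for whatever Logstash writes to:

curl -XPOST 'http://localhost:9200/myindex/logs' -d '{
  "user": "foo",
  "amount": 0.1
}'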

Answer 1:

If you don't specify a mapping, Elasticsearch guesses the field type based on the first input.

If the first input is an integer, the field will be mapped as an integer type. Trying to put a decimal into an integer field is seemingly not allowed (though I would imagine the opposite would be fine).

Check your mapping for the field.
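
For example, assuming a default local Elasticsearch and the standard logstash-* index names, something like this will show how the field was mapped; if the integer document was indexed first, expect to see the field typed as long:

curl -XGET 'http://localhost:9200/logstash-*/_mapping?pretty'

# look for the field in the output, e.g.
#   "amount" : {
#     "type" : "long"
#   }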

Note that you can set the field type in Logstash, either in your grok pattern or with mutate->convert. This won't change an existing index, but it will apply to subsequent new indexes.
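
A minimal sketch of the mutate approach, using the array form of convert that Logstash 1.5.x accepts (the field name is taken from your documents):

filter {
  mutate {
    # cast "amount" to float so newly created indexes map it as a floating-point type
    convert => [ "amount", "float" ]
  }
}

As noted above, an index that has already mapped the field as an integer type won't change; you would have to reindex it, or wait for the next index to be created.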