I am seeing a weird error when I use Kafka as an input and Elasticsearch as an output in my Logstash configuration. I am able to send a JSON object such as the following:
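For reference, a simplified sketch of my config (hosts and topic name are placeholders, and option names are from memory for the plugin versions I'm running):

```
input {
  kafka {
    zk_connect => "zk-host:2181"   # placeholder ZooKeeper address
    topic_id   => "events"         # placeholder topic
  }
}
output {
  elasticsearch {
    host => "es-host"              # placeholder Elasticsearch host
  }
}
```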
{
  "user": "foo",
  "amount": 1
}
but when it tries to write:
{
  "user": "foo",
  "amount": 0.1
}
it fails with
:message=>"failed action with response of 400, dropping action:
That is the only difference between the two documents. The error it spits out shows amount as follows:
\"amount\"=>#<BigDecimal:37335f46,'0.15197E3',5(8)>
I couldn't find any examples of this issue by searching the internet. Interestingly, when I manually POST the document with curl, both versions work. Logstash seems to fail only when the amount is a BigDecimal.