I am logging to Logstash in JSON format. My logs have the following fields; each field is a string, and the atts field is a stringified JSON object (note: the atts sub-fields are different each time). Here is an example:

{"name":"bob","last":"builder", "atts":"{\"a\":111, \"b\":222}"}
I would like to parse it to something like this:
{
  "name" => "bob",
  "last" => "builder",
  "atts" => {
    "a" => 111,
    "b" => 222
  }
}
Here is my configuration:

input { stdin { } }
filter {
  json {
    source => "message"
    target => "parsed"
  }
}
output { stdout { codec => rubydebug }}
OK, so now I get this:

{
  "@timestamp" => 2017-04-05T12:19:04.090Z,
  "parsed" => {
    "atts" => "{\"a\":111, \"b\":222}",
    "name" => "bob",
    "last" => "builder"
  },
  "@version" => "1",
  "host" => "0.0.0.0"
}
How can I parse the atts field to JSON so that I receive:

{
  "@timestamp" => 2017-04-05T12:19:04.090Z,
  "parsed" => {
    "atts" => {
      "a" => 111,
      "b" => 222
    },
    "name" => "bob",
    "last" => "builder"
  },
  "@version" => "1",
  "host" => "0.0.0.0"
}
There is a json filter. Just pass it the field you want to parse and a target where you want it. Something like:
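(A rough sketch of the idea, assuming the first json filter has already put the stringified value under [parsed][atts]; the bracket syntax is Logstash's nested field reference:)

filter {
  json {
    source => "message"
    target => "parsed"
  }
  # second pass: re-parse the stringified atts value in place
  json {
    source => "[parsed][atts]"
    target => "[parsed][atts]"
  }
}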
I'm not sure if you can write the parsed result straight back into atts as the target. It might or might not work. If it doesn't, parse into a temporary field and use the mutate filter's remove_field and rename options.
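For example (a sketch only; the temporary field name atts_parsed is just an illustration):

filter {
  json {
    source => "[parsed][atts]"
    target => "[parsed][atts_parsed]"
  }
  # drop the original stringified field, then move the parsed copy into its place
  mutate { remove_field => [ "[parsed][atts]" ] }
  mutate { rename => { "[parsed][atts_parsed]" => "[parsed][atts]" } }
}

Using two separate mutate blocks keeps the ordering explicit; within a single mutate block, remove_field is applied after the rename, which would delete the field you just renamed.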
Thanks to @Alcanzar, here is what I did:
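(The final config itself isn't shown above, so this is a reconstruction of the approach rather than a verbatim copy: the full pipeline with a second json filter re-parsing the stringified atts field.)

input { stdin { } }
filter {
  json {
    source => "message"
    target => "parsed"
  }
  # turn the stringified atts value into a nested object
  json {
    source => "[parsed][atts]"
    target => "[parsed][atts]"
  }
}
output { stdout { codec => rubydebug }}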