I have a logfile that stores events with a timestamp and a JSON message. For example:
timestamp {"foo": 12, "bar": 13}
I would like to decompose the keys (foo and bar) in the JSON part into fields in the Logstash output.
I'm aware that I can set the format field of the Logstash file input to json_event, but in that case I would have to include the timestamp in the JSON. There is also a json filter, but that adds a single field containing the complete JSON data structure, instead of using the keys.
Any ideas how this can be done?
I've done this with the following config:
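A minimal sketch of what such a config could look like on 1.2.0 (the field name json_data and the timestamp pattern are assumptions; adjust them to your log format):

```
filter {
  # Split the line into the timestamp and the raw JSON part.
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:json_data}" ]
  }
  # Parse the JSON part so its keys (foo, bar) become top-level fields.
  json {
    source => "json_data"
  }
}
```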
By the way, this config is for the new version, 1.2.0.
In version 1.1.13 you need to set a target on the json filter, and the grok filter references the message field as @message.
Try the latest Logstash, 1.2.1, and use the json codec to parse JSON events directly.
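A sketch of what that could look like (the path is hypothetical; note this applies when the whole line is JSON, so the timestamp would need to live inside the JSON object):

```
input {
  file {
    path  => "/var/log/app.log"  # hypothetical path
    codec => "json"              # each line is decoded as JSON into event fields
  }
}
```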
You can just use plain grok filters (regex-style patterns) and assign each matched value to a named field for easy organization, filtering, and searching.
An example:
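A grok-only sketch for the line in the question, matching the two keys literally (this is brittle by design, since it hard-codes the foo/bar layout rather than parsing arbitrary JSON):

```
filter {
  grok {
    match => [ "message", '%{TIMESTAMP_ISO8601:timestamp} \{"foo": %{NUMBER:foo}, "bar": %{NUMBER:bar}\}' ]
  }
}
```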
Something along those lines.
Use the Grok Debugger to help out if you get stuck on the syntax, the patterns, or things you think should match but aren't.
Hope that helps a bit.
Your JSON is invalid:
{"foo": 12, "bar" 13}
should be:
{"foo": 12, "bar": 13}