Decompose Logstash JSON message into fields

Posted 2019-03-10 06:06

I have a logfile that stores events with a timestamp and a JSON message. For example:

timestamp {"foo": 12, "bar": 13}

I would like to decompose the keys (foo and bar) in the JSON part into fields in the Logstash output.

I'm aware that I can set the format option on the Logstash file input to json_event, but in that case I would have to include the timestamp inside the JSON. There is also a json filter, but that adds a single field containing the complete JSON data structure instead of splitting out the keys.

Any ideas how this can be done?
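Outside Logstash, the desired decomposition amounts to splitting each line at the start of the JSON object and merging the parsed keys into the event as top-level fields. A minimal Python sketch of that idea (the helper name and the sample timestamp format are illustrative, not part of the question):

```python
import json

def decompose(line):
    # Split at the first "{": everything before it is the timestamp,
    # the rest (plus the "{" itself) is the JSON payload.
    timestamp, _, payload = line.partition("{")
    event = {"timestamp": timestamp.strip()}
    # Promote the JSON keys to top-level fields on the event.
    event.update(json.loads("{" + payload))
    return event

print(decompose('2019-03-10 06:06:00 {"foo": 12, "bar": 13}'))
# → {'timestamp': '2019-03-10 06:06:00', 'foo': 12, 'bar': 13}
```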

Tags: logstash
4 answers
闹够了就滚
Answer #2 · 2019-03-10 06:12

I've done this with the following config:

filter {
  grok {
    match => ["message", "\[%{WORD}:%{LOGLEVEL}\] %{TIMESTAMP_ISO8601:tstamp} :: %{GREEDYDATA:msg}"]
  }
  date {
    match => [ "tstamp", "yyyy-MM-dd HH:mm:ss" ]
  }
  json {
    source => "msg"
  }
}

By the way, this config is for the new version, 1.2.0.

In version 1.1.13 you need to include a target on the json filter, and the grok filter references the message field as @message.
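For reference, the grok + json combination in this config can be emulated outside Logstash. The Python sketch below uses simplified stand-ins for the WORD, LOGLEVEL, and TIMESTAMP_ISO8601 grok patterns (Python spells named groups `(?P<name>...)` where grok/Oniguruma uses `(?<name>...)`), and the sample log line is invented for illustration:

```python
import json
import re

# Simplified stand-ins for the grok patterns used in the filter above.
LINE = re.compile(
    r"\[(?P<prog>\w+):(?P<level>\w+)\] "
    r"(?P<tstamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) :: (?P<msg>.*)"
)

def parse(line):
    event = LINE.match(line).groupdict()
    # The json filter's job: parse "msg" and promote its keys to fields.
    event.update(json.loads(event.pop("msg")))
    return event

print(parse('[app:INFO] 2019-03-10 06:06:00 :: {"foo": 12, "bar": 13}'))
# → {'prog': 'app', 'level': 'INFO', 'tstamp': '2019-03-10 06:06:00', 'foo': 12, 'bar': 13}
```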

冷血范
Answer #3 · 2019-03-10 06:26

Try the latest Logstash (1.2.1) and use a codec to parse JSON events directly.

input {
    file {
        type => "tweetfile"
        path => ["/home/nikhil/temp/feed/*.txt"]
        codec => "json"
    }
}
filter{
    json{
        source => "message"
        target => "tweet"
    }
}
output {
    stdout { }
    elasticsearch { embedded => true }
}
混吃等死
Answer #4 · 2019-03-10 06:28

You can just use plain grok filters (regex-style filters/patterns) and assign each matched value to a named field for easy organization, filtering, and searching.

An example:

(?<foo_identifier>\"foo\"):\s*(?<foo_variable_value>\d+)

Something along those lines.
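To see what a pattern like that captures, here is the same idea rendered in Python (note that Python's named-group syntax is `(?P<name>...)`, whereas grok uses `(?<name>...)`; the sample input is invented):

```python
import re

# Match the "foo" key and its numeric value, as in the grok pattern above.
pat = re.compile(r'(?P<foo_identifier>"foo"):\s*(?P<foo_variable_value>\d+)')

m = pat.search('{"foo": 12, "bar": 13}')
print(m.group("foo_identifier"), m.group("foo_variable_value"))
# → "foo" 12
```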

Use the GrokDebugger to help out if you get stuck on the syntax, patterns and things you think should be matching but aren't.

Hope that helps a bit.

萌系小妹纸
Answer #5 · 2019-03-10 06:36

Your JSON is wrong: {"foo": 12, "bar" 13} (note the missing colon after "bar").

should be:

{"foo": 12, "bar": 13}
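A quick way to confirm the difference is to run both strings through a JSON parser; the corrected one parses, the original raises:

```python
import json

# The corrected string parses cleanly.
json.loads('{"foo": 12, "bar": 13}')

# The original (missing colon after "bar") is rejected.
try:
    json.loads('{"foo": 12, "bar" 13}')
except json.JSONDecodeError as err:
    print("invalid JSON:", err)
```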
