Grok parse error while parsing multi-line messages

Posted 2019-09-17 01:47

I am trying to figure out a grok pattern for parsing multi-line messages such as exception traces. Below is one such log:

2017-03-30 14:57:41 [12345] [qtp1533780180-12] ERROR com.app.XYZ - Exception occurred while processing
java.lang.NullPointerException: null
        at spark.webserver.MatcherFilter.doFilter(MatcherFilter.java:162)
        at spark.webserver.JettyHandler.doHandle(JettyHandler.java:61)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:189)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:119)
        at org.eclipse.jetty.server.Server.handle(Server.java:517)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:302)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:242)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:245)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
        at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:75)
        at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceAndRun(ExecuteProduceConsume.java:213)
        at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:147)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:654)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572)
        at java.lang.Thread.run(Thread.java:745)

Here is my logstash.conf

input {
  file {
    path => ["/debug.log"]
    codec => multiline {
      # Grok pattern names are valid! :)
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => previous
    }
  }
}

filter {

  mutate {
    gsub => ["message", "r", ""]
  }
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{NOTSPACE:uid}\] \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:class}\-%{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss" ]
  }
}


output {
  elasticsearch { hosts => ["localhost"] }
  stdout { codec => rubydebug }
}
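For reference, my understanding of the codec's `negate => true` / `what => previous` behavior is that any line which does not start with a timestamp gets appended to the previous event. A stand-alone sketch (the `group_events` helper is hypothetical, not part of Logstash):

```python
import re

# Sketch of the multiline codec with negate => true / what => previous:
# a line that does NOT start with an ISO8601-style timestamp is
# appended to the previous event.
START = re.compile(r"^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}")

def group_events(lines):
    events = []
    for line in lines:
        if START.match(line) or not events:
            events.append(line)           # timestamp -> new event
        else:
            events[-1] += "\n" + line     # continuation (e.g. stack frame)
    return events

log = [
    "2017-03-30 14:57:41 [12345] [qtp1533780180-12] ERROR com.app.XYZ - Exception occurred while processing",
    "java.lang.NullPointerException: null",
    "        at spark.webserver.MatcherFilter.doFilter(MatcherFilter.java:162)",
]
print(len(group_events(log)))  # 1
```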

This works fine for parsing single-line logs, but multi-line exception traces fail and the events are tagged with

    "_grokparsefailure"

Can someone please suggest the correct filter pattern for parsing multi-line logs?

2 Answers

女痞
2019-09-17 02:30

If you are working with multi-line logs, you can use the multiline filter provided by Logstash. You first need to tell it how to recognize the start of a new record; from your logs I can see that each new record starts with a timestamp. Example usage:

filter {
  multiline {
    pattern => "^%{TIMESTAMP_ISO8601}"
    negate => true
    what => "previous"
  }
}

You can then use mutate's `gsub` to replace the "\n" and "\r" characters that the multiline filter adds to your record. After that, use grok.
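That cleanup step might look like the following sketch (assuming the joined text lives in the default `message` field):

```
filter {
  mutate {
    # drop carriage returns and flatten the newlines the multiline
    # filter inserts, so grok sees a single line
    gsub => ["message", "\r", "", "message", "\n", " "]
  }
}
```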

何必那么认真
2019-09-17 02:37

The above logstash config worked fine after removing

mutate { gsub => ["message", "r", ""] }
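One likely reason that line is harmful: the gsub pattern is a regular expression, so `"r"` (presumably intended to be `"\r"`) deletes every lowercase `r` from the message before grok runs. A quick sketch of the effect:

```python
# gsub => ["message", "r", ""] behaves like this: every lowercase
# "r" is removed, mangling the text before grok can match it.
msg = "Exception occurred while processing"
print(msg.replace("r", ""))  # Exception occued while pocessing
```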

Here is the working logstash config for parsing both single-line and multi-line inputs with the above log pattern:

input {
  file {
    path => ["./debug.log"]
    codec => multiline {
      # Grok pattern names are valid! :)
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => previous
    }
  }
}

filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{NOTSPACE:uid}\] \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:class}\-%{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss" ]
  }
}


output {
  elasticsearch { hosts => ["localhost"] }
  stdout { codec => rubydebug }
}
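For reference, the grok expression above corresponds roughly to the following regular expression (a hypothetical Python translation; `DATA` becomes a lazy `.*?` and `GREEDYDATA` a greedy `.*`, and the `class` field is renamed `cls`):

```python
import re

# Rough Python equivalent of the grok pattern; group names mirror
# the grok field names. Note that, like %{DATA:class}\-, the cls
# group keeps the trailing space before the dash.
LOG_RE = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})"
    r" \[(?P<uid>\S+)\] \[(?P<thread>\S+)\]"
    r" (?P<loglevel>[A-Z]+) (?P<cls>.*?)-(?P<message>.*)",
    re.DOTALL,
)

event = ("2017-03-30 14:57:41 [12345] [qtp1533780180-12] ERROR "
         "com.app.XYZ - Exception occurred while processing\n"
         "java.lang.NullPointerException: null")
m = LOG_RE.match(event)
print(m.group("loglevel"))  # ERROR
print(m.group("cls"))       # com.app.XYZ (plus a trailing space)
```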