In my log file I have entries like the following:
2014-06-25 12:36:18,176 [10] ((null)) INFO [s=(null)] [u=(null)] Hello from Serilog, running as "David"! [Program]
2014-06-25 12:36:18,207 [10] ((null)) WARN [s=(null)] [u=(null)] =======MyOwnLogger====== Hello from log4net, running as David! [MyOwnLogger]
2014-06-25 12:36:18,209 [10] ((null)) ERROR [s=(null)] [u=(null)] =======MyOwnLogger====== Hello from log4net, running as David! [MyOwnLogger]
which are of log levels INFO, WARN, and ERROR respectively.
What I would like to do is to only output to Elasticsearch those entries which are of ERROR level. Here is my Logstash configuration file:
input {
  file {
    path => "Somepath/*.log"
  }
}

# This filter doesn't work
filter {
  if [loglevel] != "error" {
    drop { }
  }
}

output {
  elasticsearch { host => localhost }
  stdout {}
}
Currently, nothing gets sent to Elasticsearch at all. I know the filter is the cause, because if I remove it, all the entries get sent to Elasticsearch.
Try a grok filter; it works for me with your logs.
The problem is that your conditional tests a `loglevel` field that doesn't exist yet — the `file` input puts the whole raw line into `message` and nothing parses it. You first need a grok filter to extract the loglevel into its own field; only then can you use that field in a conditional to decide whether to drop the event. Also note that grok will capture the level exactly as it appears in the log (`ERROR`, uppercase), so the comparison must match that.
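Here is a sketch of such a configuration, based on the sample lines in the question. The grok pattern (and the field names `thread` and `msg`) are my own guesses at your layout, so adjust them if your format differs:

```
filter {
  grok {
    # Parse lines like:
    # 2014-06-25 12:36:18,209 [10] ((null)) ERROR [s=(null)] [u=(null)] ... [MyOwnLogger]
    # TIMESTAMP_ISO8601 and LOGLEVEL are standard grok patterns.
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:thread}\] \(\(null\)\) %{LOGLEVEL:loglevel} %{GREEDYDATA:msg}" ]
  }

  # loglevel is captured as it appears in the log, i.e. uppercase.
  if [loglevel] != "ERROR" {
    drop { }
  }
}
```

With this in place, only the ERROR entries reach the `output` section; INFO and WARN events are dropped before they ever get to Elasticsearch.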