I am using Logstash 2.4.0.
My output looks like this:
{
"@timestamp" => "2017-05-10T18:14:47.269Z",
"message" => "[2017-01-14 10:59:58,591][WARN ][index.search.slowlog.query] [yaswanth] [bank][3] took[50ms], took_millis[50], types[details], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}], extra_source[], \r",
"@version" => "1",
"path" => "F:\\logstash-2.4.0\\logstash-2.4.0\\bin\\picaso.txt",
"host" => "yaswanth",
"TIMESTAMP" => "2017-01-14 10:59:58,591",
"LEVEL" => "WARN",
"QUERY" => "index.search.slowlog.query",
"QUERY1" => "yaswanth",
"INDEX-NAME" => "bank",
"SHARD" => "3",
"TOOK" => "50ms",
"TOOKM" => 50,
"types" => "details",
"search_type" => "QUERY_THEN_FETCH",
"total_shards" => "5",
"source_query" => "{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}"
}
{
"@timestamp" => "2017-05-10T18:14:47.270Z",
"message" => "[2017-01-14 10:59:58,591][WARN ][index.search.slowlog.query] [yaswanth] [bank][2] took[50.2ms], took_millis[50], types[details], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}], extra_source[], \r",
"@version" => "1",
"path" => "F:\\logstash-2.4.0\\logstash-2.4.0\\bin\\picaso.txt",
"host" => "yaswanth",
"TIMESTAMP" => "2017-01-14 10:59:58,591",
"LEVEL" => "WARN",
"QUERY" => "index.search.slowlog.query",
"QUERY1" => "yaswanth",
"INDEX-NAME" => "bank",
"SHARD" => "2",
"TOOK" => "50.2ms",
"TOOKM" => 50,
"types" => "details",
"search_type" => "QUERY_THEN_FETCH",
"total_shards" => "5",
"source_query" => "{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}"
}
But what I want is like this:
{
"@timestamp" => "2017-05-10T18:14:47.269Z",
"message" => "[2017-01-14 10:59:58,591][WARN ][index.search.slowlog.query] [yaswanth] [bank][3] took[50ms], took_millis[50], types[details], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}], extra_source[], \r",[2017-01-14 10:59:58,591][WARN ][index.search.slowlog.query] [yaswanth] [bank][2] took[50.2ms], took_millis[50], types[details], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}], extra_source[], \r"
"@version" => "1",
"path" => "F:\\logstash-2.4.0\\logstash-2.4.0\\bin\\picaso.txt",
"host" => "yaswanth",
"TIMESTAMP" => "2017-01-14 10:59:58,591",
"LEVEL" => "WARN",
"QUERY" => "index.search.slowlog.query",
"QUERY1" => "yaswanth",
"INDEX-NAME" => "bank",
"SHARD" => "3",
"TOOK" => "50ms",
"TOOKM" => 50,
"types" => "details",
"search_type" => "QUERY_THEN_FETCH",
"total_shards" => "5",
"source_query" => "{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}"
}
I want to combine the message fields from multiple events into a single event so I can send them in one email.
Is there anything wrong in the above config? Do I have to use the aggregate filter for this type of requirement?
Thanks
What you could do is aggregate a number of events at the level of the file input plugin before sending them to the output plugin. A good example is given here.
You might have to modify your grok filter a little bit.
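As a minimal sketch of that idea, you can use the multiline codec on the file input to glue consecutive slowlog lines into one event. The path is taken from your output; the pattern, max_lines and auto_flush_interval values below are assumptions you will want to tune:

input {
  file {
    path => "F:/logstash-2.4.0/logstash-2.4.0/bin/picaso.txt"
    start_position => "beginning"
    codec => multiline {
      # Every slowlog line starts with "[", so each matching line is
      # appended to the event currently being built.
      pattern => "^\["
      what => "previous"
      # Assumed flush settings: emit the combined event after 10 lines
      # or after 5 seconds with no new data.
      max_lines => 10
      auto_flush_interval => 5
    }
  }
}

With this, the message field of each event will contain several slowlog lines joined together, so your grok filter will only parse the first one unless you adapt it. The aggregate filter could also be used, but it needs a common task_id to correlate events, which these independent slowlog lines do not have, so grouping at the input level is simpler here.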