Does Logstash have a size limit for each event message?

Posted 2019-02-18 09:12

I am implementing a monitoring tool on the servers of my company's service. To do that, I am using Logstash. Our applications send their logs to Logstash via a log4net UDP appender (udp input); Logstash then groks them and sends them to Elasticsearch. When I display my logs in Kibana, I see that some of them are truncated: for big logs, the last part is missing. So my question is: does Logstash have a size limit for each message event it receives? If yes, is it possible to increase it? I need all my logs, and none of them truncated.

Tags: logstash
3 Answers
Summer. ? 凉城
answered 2019-02-18 09:39

I have tested this with Logstash 1.4.0 and Logstash 1.3.3, and found that the maximum size of an event is 4095 bytes.

So, if your logs are larger than this, you may have to split them into multiple events at the time you send them to Logstash.
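The splitting suggested above can be sketched on the sender side. This is a minimal Python sketch, not log4net's actual appender behavior: `chunk_message`, `send_log`, the port number, and the 4095-byte limit (taken from the test above) are all illustrative assumptions.

```python
import socket

MAX_EVENT_SIZE = 4095  # per-event limit observed in the test above (assumption)

def chunk_message(data: bytes, limit: int = MAX_EVENT_SIZE) -> list:
    """Split a byte string into pieces no larger than `limit` bytes."""
    return [data[i:i + limit] for i in range(0, len(data), limit)]

def send_log(message: str, host: str = "localhost", port: int = 5514) -> int:
    """Send `message` over UDP, one datagram per chunk; returns datagram count.
    Host and port are placeholders for your Logstash udp input."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        chunks = chunk_message(message.encode("utf-8"))
        for chunk in chunks:
            sock.sendto(chunk, (host, port))
        return len(chunks)
    finally:
        sock.close()
```

Note that each chunk arrives at Logstash as a separate event, so you would need some way (e.g. a shared ID field) to reassemble or correlate them downstream.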

趁早两清
answered 2019-02-18 09:41

For the UDP case, I think I have found the solution: increase the buffer_size parameter in the udp.rb file.

I cannot test it now, but I will tell you if it works.

Emotional °昔
answered 2019-02-18 09:45

Logstash's buffer_size property is set to 8192 by default. That is why messages sent to Logstash over UDP are truncated at the 8192nd byte.

Try increasing the UDP buffer_size in your Logstash input configuration.
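The udp input plugin exposes buffer_size directly in the pipeline configuration, so editing udp.rb should not be necessary. A minimal sketch (the port and the chosen value are illustrative placeholders):

```
input {
  udp {
    port        => 5514     # placeholder: match your log4net appender's target port
    buffer_size => 65536    # raised from the 8192 default described above
  }
}
```

The value caps the size of a single received datagram, so it must be at least as large as your biggest log message.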

