I'm only outputting my parsed data from Logstash into MongoDB, but is there any way to tell when the logs are finished parsing, so that I can kill Logstash? Since a lot of logs are being processed, I cannot output my data to stdout.
Since you are using a file input, there should be a .sincedb file somewhere. That file keeps track of how much of each file has already been parsed. As far as I understand it, each line contains an inode number, which identifies a file (so if you are parsing several files, or if your file is being rolled over, there will be several lines), followed by a byte offset that acts as a bookmark for Logstash to remember how much it has already read (in case you process the same file in several passes). So basically, when that offset stops moving up, it should mean that Logstash is done parsing the file.
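If it helps, here is a minimal Python sketch of that idea: it polls the sincedb file and the log file, and stops once the recorded offset reaches the file size or stops advancing. The paths, the poll interval, and the assumption that the byte offset is the fourth column of the sincedb line are guesses you would need to adapt to your setup and Logstash version.

```python
import time
from pathlib import Path

SINCEDB = Path.home() / ".sincedb_example"   # hypothetical path; locate your actual sincedb file
LOGFILE = Path("/var/log/myapp.log")         # hypothetical log file being parsed
POLL_SECONDS = 10

def current_offset():
    # Take the last sincedb line; if several files are tracked, pick the one
    # matching your file's inode instead.
    line = SINCEDB.read_text().strip().splitlines()[-1]
    # Assumption: the byte offset is the fourth whitespace-separated field.
    return int(line.split()[3])

last = -1
while True:
    offset = current_offset()
    size = LOGFILE.stat().st_size
    print(f"sincedb offset: {offset} / file size: {size}")
    if offset >= size or offset == last:
        print("offset stopped advancing; Logstash appears to be done")
        break
    last = offset
    time.sleep(POLL_SECONDS)
```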
Alternatively, if you have no multiline filter set up, you could simply compare the number of lines in the file to the number of records in MongoDB.
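A rough sketch of that comparison with pymongo; the connection URI, database and collection names are placeholders for whatever your mongodb output is configured with:

```python
from pathlib import Path
from pymongo import MongoClient  # pip install pymongo

LOGFILE = Path("/var/log/myapp.log")               # hypothetical log file
client = MongoClient("mongodb://localhost:27017")  # adjust to your MongoDB
collection = client["logs"]["events"]              # hypothetical database/collection

# Count physical lines in the source file. This only maps 1:1 to events
# when no multiline filter is in play.
with LOGFILE.open("rb") as f:
    line_count = sum(1 for _ in f)

record_count = collection.count_documents({})
print(f"{record_count}/{line_count} events stored in MongoDB")
if record_count >= line_count:
    print("all lines appear to be indexed; safe to stop Logstash")
```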
Third possibility: you can set up another output, not necessarily stdout. It could for example be a pipe to a script that simply drops the data and prints a message once it has received nothing new for some time, or some other alternative; see the docs.
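For the pipe idea, the consuming script could be as simple as this sketch: it reads events from stdin, discards them, and reports once nothing new has arrived for a while. The idle timeout is arbitrary, and you would wire the script up through Logstash's pipe output (Unix only, since it uses select on stdin).

```python
import select
import sys

IDLE_SECONDS = 60  # how long to wait with no new events before assuming Logstash is done

while True:
    # Wait for data on stdin (the pipe from Logstash) for up to IDLE_SECONDS.
    ready, _, _ = select.select([sys.stdin], [], [], IDLE_SECONDS)
    if not ready:
        print("no new events for a while; Logstash is probably done", file=sys.stderr)
        break
    line = sys.stdin.readline()
    if not line:  # pipe closed, Logstash exited
        break
    # Otherwise drop the event; we only care about when events stop arriving.
```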