I'm using grep to extract lines across a set of files:
grep somestring *.log
Is it possible to limit the maximum number of matches per file to the last n matches from each file?
Piping the grep output into tail will list the last 10 matches, since tail prints the last 10 lines by default. If you wish to get a different number, pass -n to tail, where the argument is the number of lines you want.
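A minimal sketch of that pipeline, using the somestring / *.log example from the question (note that a single pipeline like this gives the last matches of the combined output, not per file):

grep somestring *.log | tail          # last 10 matching lines by default
grep somestring *.log | tail -n 25    # last 25 matching lines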
Well, I think grep does not support limiting to the last N matches from a file, so this is what you have to do:
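The exact command is missing from the original answer; based on the tail -1 and -H mentioned below, it was presumably a per-file loop along these lines (the loop itself is my reconstruction):

# last match from each file; quoting "$f" keeps file names with spaces intact
for f in *.log; do grep -H somestring "$f" | tail -1; done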
Replace the 1 in tail -1 with N. (The -H option is to print the file name, which otherwise won't be printed when grep is run on a single file, and that's exactly what we are doing above.) NOTE: the above solution works fine with file names containing spaces.
For N matches from the start of the file:
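A sketch of that variant, assuming grep's -m option (stop after N matches, applied per input file):

grep -m1 -H somestring *.log    # first match from each file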
Replace the 1 in -m1 with N.

Kind of off the cuff here, but read How to do something to every file in a directory using bash? as a starting point. Here's my take, assuming just the last 20 matches from each file.
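The original command is not shown here; a sketch of what such a take might look like, assuming a plain bash loop over the log files (the echo header is just my own way of labeling each file's output):

for f in *.log; do
    echo "== $f =="
    grep somestring "$f" | tail -n 20    # last 20 matches from this file
done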
It might not be completely correct (I don't have files in front of me to check with), but it should be a starting point.
Last occurrence of search pattern in every log file under current directory:
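The original command is missing here; modeled on the find | xargs pipeline shown at the end of this answer, it was presumably something like (pattern is a placeholder):

find . -name \*log\* | xargs -I{} sh -c "grep -iH pattern {} | tail -n1"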
First occurrence of search pattern in every log file under current directory:
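Presumably the same pipeline with grep's -m1 (stop after the first match) in place of tail; again a sketch, not the answer's exact command:

find . -name \*log\* | xargs -I{} sh -c "grep -iH -m1 pattern {}"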
Replace the 1 in -n1 with the number of occurrences you want.

Alternatively, you can use find's -exec option instead of xargs.
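For example, the -exec form might look like this (my own sketch, not the answer's exact command; quoting "$1" keeps file names with spaces intact):

find . -name \*log\* -exec sh -c 'grep -iH pattern "$1" | tail -n1' _ {} \;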
You can use -mtime with find to limit your search to log files modified within, say, the last 5 days:

find . -mtime -5 -name \*log\* | xargs -I{} sh -c "grep --color=always -iH pattern {} | tail -n1"