I've been bumping into a problem. I have a log file on a Linux box to which several running processes write their output. This file can get really big sometimes, and I need to read its last line.
The problem is that this action will be called via an AJAX request fairly often, and once the log grows past 5-6 MB that becomes hard on the server. So I want to read only the last line, without scanning the whole file or loading it into RAM, because that would grind my box to a halt.
Is there any way to optimize this operation so that it runs smoothly and doesn't harm the server or kill Apache?
The other option I have is to exec('tail -n 1 /path/to/log'), but that doesn't sound like a good idea.
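For what it's worth, a sketch of the tail(1) option, with the path escaped in case it ever comes from user input (the file here is a temporary one so the example is self-contained):

```php
<?php
// Sketch: shelling out to tail(1) for the last line of a log.
// A temporary file stands in for the real log path.
$path = tempnam(sys_get_temp_dir(), 'log');
file_put_contents($path, "first\nsecond\nlast line\n");

// escapeshellarg() guards against shell injection.
$last = exec('tail -n 1 ' . escapeshellarg($path));
echo $last, "\n"; // prints "last line"

unlink($path);
```

tail only reads the end of the file, so this is cheap even for huge logs; the cost is spawning a process per request.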
Later edit: I do NOT want to load the file into RAM, because it might get huge. Reading it in full via fopen() is not an option.
I would use file(), which reads the file into an array of lines, and then take the last element:

$lines = file($filename);
$last_line = array_pop($lines);

(Calling array_pop() directly on the return value of file() raises a notice, because array_pop() takes its argument by reference.) Keep in mind this still reads the entire file into memory.
If you want performance, open the file and use the file pointer to seek within it.
Source: http://forums.devshed.com/php-development-5/php-quick-way-to-read-last-line-156010.html
Your problem looks similar to this one
The best approach, which avoids loading the whole file into memory, seems to be this: you're looking for the fseek() function. There are working examples of reading the last line of a file in the comments section there.
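As a sketch of that approach (not taken from the linked comments): seek from the end of the file and walk backwards, one byte at a time, until the previous newline is found.

```php
<?php
// Sketch: read only the tail of the file by seeking from the end
// and stepping backwards until we hit the previous newline.
function read_last_line(string $path): string {
    $fp = fopen($path, 'r');
    $line = '';
    $pos = -1;

    // fseek() returns -1 once we try to seek before the start,
    // which ends the loop for files with no leading newline.
    while (fseek($fp, $pos, SEEK_END) !== -1) {
        $char = fgetc($fp);
        if ($char === "\n" && $line !== '') {
            break; // reached the end of the previous line
        }
        if ($char !== "\n") {
            // Trailing newlines are skipped; everything else is
            // prepended, since we read back to front.
            $line = $char . $line;
        }
        $pos--;
    }
    fclose($fp);
    return $line;
}

$path = tempnam(sys_get_temp_dir(), 'log');
file_put_contents($path, "first\nsecond\nlast line\n");
echo read_last_line($path), "\n"; // prints "last line"
unlink($path);
```

Only as many bytes as the last line contains are ever read, so memory use stays constant no matter how large the log gets.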
Would it be possible to optimize this from the other side? If so, just have the logging application truncate the file on every write (i.e. > instead of >>), so it only ever contains the last line.
Some optimization might be achieved by "guessing", though: open the file and, using the average log line width, estimate where the last line starts. Jump to that position with fseek() and find the last line from there.
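A sketch of that guessing approach, where the average line width (200 bytes here) is a made-up tuning value; note that if the real last line is longer than the guessed window, this returns only its tail:

```php
<?php
// Sketch of the "guessing" approach: jump near the end using an
// assumed average line width, read the remainder, keep the last line.
// AVG_LINE_WIDTH is a hypothetical tuning value.
const AVG_LINE_WIDTH = 200;

function guess_last_line(string $path): string {
    $size = filesize($path);
    $fp = fopen($path, 'r');

    // Jump back roughly two "average" lines, but never before the start.
    $offset = max(0, $size - 2 * AVG_LINE_WIDTH);
    fseek($fp, $offset);
    $chunk = stream_get_contents($fp);
    fclose($fp);

    // Drop trailing newlines, then keep everything after the last "\n".
    $chunk = rtrim($chunk, "\n");
    $pos = strrpos($chunk, "\n");
    return $pos === false ? $chunk : substr($chunk, $pos + 1);
}

$path = tempnam(sys_get_temp_dir(), 'log');
file_put_contents($path, "first\nsecond\nlast line\n");
echo guess_last_line($path), "\n"; // prints "last line"
unlink($path);
```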
This is my solution with only one loop
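The answer's code itself is not shown here; a guess at what a single-loop version could look like is to read fixed-size chunks backwards from the end until the buffer contains a complete last line:

```php
<?php
// Hypothetical single-loop version: read chunks backwards from the
// end until the buffer holds at least one interior newline.
function last_line_one_loop(string $path, int $chunkSize = 4096): string {
    $fp = fopen($path, 'r');
    $size = filesize($path);
    $buffer = '';
    $pos = $size;

    while ($pos > 0) {
        $read = min($chunkSize, $pos);
        $pos -= $read;
        fseek($fp, $pos);
        // Prepend, since we read the file back to front.
        $buffer = fread($fp, $read) . $buffer;

        // Stop as soon as the buffer holds a complete last line.
        if (strpos(rtrim($buffer, "\n"), "\n") !== false) {
            break;
        }
    }
    fclose($fp);

    $trimmed = rtrim($buffer, "\n");
    $nl = strrpos($trimmed, "\n");
    return $nl === false ? $trimmed : substr($trimmed, $nl + 1);
}

$path = tempnam(sys_get_temp_dir(), 'log');
file_put_contents($path, "first\nsecond\nlast line\n");
echo last_line_one_loop($path), "\n"; // prints "last line"
unlink($path);
```

Compared with the byte-at-a-time variant, this issues far fewer read calls on logs with long lines.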