This script resets the counter after 100,000. What do I need to change to prevent the reset and instead keep counting?
<?php
$filename1 = 'content/general_site_data/total_site_page_loads.txt';
if (file_exists($filename1)) {
    $fh = fopen("content/general_site_data/total_site_page_loads.txt", "a+");
    if ($fh == false)
        die("unable to create file");
    $filec = 'content/general_site_data/total_site_page_loads.txt';
    if (!is_writable($filec))
        die('not writable');
    $total_site_page_loads = trim(file_get_contents($filec)) + 1;
    fwrite(fopen($filec, 'w'), $total_site_page_loads);
    echo '------------------------------<br />
Site Wide Page Views: ' . $total_site_page_loads . '<br />';
} else {
    $fh = fopen($filename1, "a");
    $total_site_page_loads = trim(file_get_contents($filename1)) + 1;
    fwrite($fh, $total_site_page_loads);
    fclose($fh);
    echo '------------------------------<br />
Site Wide Page Views: ' . $total_site_page_loads . '<br />';
}
?>
Your code may be suffering from a race condition. Midway through, you re-open the file in 'w' mode, which truncates it to zero length. If another copy of your script opens and reads the file after it has been truncated but before the new count has been written, the counter will be reset to zero. Here is an updated version of your code:
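A sketch of such a version, using flock() for mutual exclusion (the increment_counter helper name and the is_dir guard are illustrative additions, not part of the original script):

```php
<?php
// Read, bump, and rewrite a counter file safely with respect to
// concurrent requests, using an exclusive flock().
function increment_counter(string $filename): int
{
    // 'a+' opens for reading and writing, creates the file if it
    // doesn't exist, and never truncates existing contents.
    $fh = fopen($filename, 'a+');
    if ($fh === false) {
        die('unable to open counter file');
    }

    // Block until we hold an exclusive lock; concurrent requests
    // queue here instead of racing each other.
    if (!flock($fh, LOCK_EX)) {
        die('unable to lock counter file');
    }

    // 'a+' leaves the pointer at the end, so rewind before reading.
    rewind($fh);
    $count = (int) trim(stream_get_contents($fh)) + 1;

    // Truncate and rewrite while still holding the lock, so no
    // other request can ever observe the momentarily empty file.
    ftruncate($fh, 0);
    fwrite($fh, (string) $count);

    flock($fh, LOCK_UN);
    fclose($fh);

    return $count;
}

// Guarded so the sketch also runs where the data directory is absent.
if (is_dir('content/general_site_data')) {
    $total_site_page_loads = increment_counter(
        'content/general_site_data/total_site_page_loads.txt'
    );
    echo '------------------------------<br />
Site Wide Page Views: ' . $total_site_page_loads . '<br />';
}
?>
```

Because the lock is held across the read, truncate, and write, the empty intermediate state is never visible to another request.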
Because the initial open is in append-or-create ('a+') mode, the file is created automatically, so you don't need a separate branch for a missing file; the only failure you have to handle is the open itself failing.
With the file locking in place, this code should never reset the counter in the file, no matter how many concurrent requests there are. (Unless you happen to also have other code writing to the file, of course.)
I can't see where any reset would occur, but how the script works seems pretty straightforward. Maybe try editing total_site_page_loads.txt to something like 99990 and watch what happens to that file as you cross over to 100000?
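A one-off way to do that seeding in PHP, assuming the same relative path as the script above (the directory creation is only there so the snippet works from a fresh checkout):

```php
<?php
$filec = 'content/general_site_data/total_site_page_loads.txt';

// Make sure the data directory exists, then seed the counter
// just below the suspected rollover point.
if (!is_dir(dirname($filec))) {
    mkdir(dirname($filec), 0777, true);
}
file_put_contents($filec, '99990');
?>
```

Then reload the page repeatedly and check whether the stored value climbs past 100000 or snaps back to zero.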