Caching includes in PHP for iterated reuse

Posted 2019-05-17 23:17

Is there a way to cache a PHP include effectively for reuse, without APC et al.?

Simple (albeit stupid) example:

// rand.php
return rand(0, 999);

// index.php
$file = 'rand.php';
$i = 0;
while($i++ < 1000){
    echo include($file);
}

Again, while ridiculous, this pair of scripts dumps 1000 random numbers. However, for every iteration, PHP has to hit the filesystem (correct? There's no inherent caching functionality I've missed, is there?)

Basically, how can I prevent the previous scenario from resulting in 1000 hits to the filesystem?

The only idea I've come up with so far is a goofy one, and it may not prove effective at all (untested, written off the cuff here, probably error-prone, but you get the idea):

// rand.php
return rand(0, 999);

// index.php
$file = 'rand.php';
$cache = array();
$i = 0;
while($i++ < 1000){
    if(isset($cache[$file])){
        // Re-evaluate the cached source instead of hitting the disk.
        // (Assumes rand.php starts with an opening <?php tag, as a real file would.)
        echo eval('?>' . $cache[$file]);
    }else{
        // First hit: read the raw source once, and include normally this time.
        $cache[$file] = file_get_contents($file);
        echo include($file);
    }
}

A more realistic and less silly example:

When including files for view generation, given that a view file is used a number of times within a request (a widget or something), is there a realistic way to capture and re-evaluate the view script without a filesystem hit?
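Something like this rough, untested sketch is what I have in mind (render_view() and widget.php are made-up names, purely for illustration):

// view-cache.php (hypothetical sketch)
function render_view($file, array $data) {
    static $source = array();

    // Read the view's source from disk only the first time it's used.
    if(!isset($source[$file])){
        $source[$file] = file_get_contents($file);
    }

    // Re-evaluate the cached source with $data's variables in scope,
    // capturing the output rather than echoing it directly.
    // (Assumes the view begins with an opening <?php tag or is plain markup.)
    extract($data);
    ob_start();
    eval('?>' . $source[$file]);
    return ob_get_clean();
}

// Render the same widget twice; only one filesystem read happens.
echo render_view('widget.php', array('title' => 'First'));
echo render_view('widget.php', array('title' => 'Second'));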

4 Answers
叼着烟拽天下
#2 · 2019-05-17 23:33

Another option:

$i = 0;
while($i++ < 1000) echo rand(0, 999);
smile是对你的礼貌
#3 · 2019-05-17 23:41

As with Sabeen Malik's answer, you could capture the output of each include with output buffering, concatenate it all together, and save that to a file, then include that one file each time.

This one collective include could be kept for an hour by checking the cache file's modification time, rewriting and re-including the underlying includes only once an hour.
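A minimal sketch of that idea (the cache path, the one-hour TTL, and the list of includes are placeholder assumptions):

// cached-include.php (sketch)
$cacheFile = 'cache/combined.html';
$ttl       = 3600; // one hour, in seconds
$includes  = array('widget1.php', 'widget2.php');

// Rebuild the collective file if it's missing or older than the TTL.
if(!file_exists($cacheFile) || filemtime($cacheFile) < time() - $ttl){
    ob_start();
    foreach($includes as $inc){
        include $inc; // capture each include's output
    }
    file_put_contents($cacheFile, ob_get_clean());
}

// Every request after that is a single include of one file.
include $cacheFile;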

我欲成王,谁敢阻挡
#4 · 2019-05-17 23:45

This would only make sense if the include file were accessed across a network.

There is no inherent caching functionality I've missed, is there?

All operating systems are very highly optimized to reduce physical I/O and to speed up file operations. On a properly configured system, in most cases the OS will rarely have to go back to disk to fetch PHP code.

Sit down with a spreadsheet and think about how long it would take to process PHP code if every file had to be fetched from disk - it would be ridiculous. For example, suppose your script is /var/www/htdocs/index.php and it includes /usr/local/php/resource.inc.php - that's 8 seek operations just to locate the files; at 8 ms each, that's 64 ms just to find them. Run some timings on your test case - you'll see it's running much, much faster than that.
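A quick way to see this for yourself (a rough timing sketch using the rand.php file from the question; actual numbers depend on your OS cache state):

// time-includes.php (sketch)
$start = microtime(true);

$i = 0;
while($i++ < 1000){
    include 'rand.php'; // same file, 1000 times
}

$ms = (microtime(true) - $start) * 1000;
printf("1000 includes took %.2f ms (%.4f ms each)\n", $ms, $ms / 1000);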

叛逆
#5 · 2019-05-17 23:48

I think a better design would be something like this:

// rand.php
function get_rand() {
    return rand(0, 999);
}

// index.php
$file = 'rand.php';
include($file); // one filesystem hit; defines get_rand()

$i = 0;
while($i++ < 1000){
    echo get_rand();
}