It is very common to use include files, and I think the feature is overused to keep code tidy without thinking about performance. For every include, the disk has to read the file, and heavy disk access can be slow. However, the disk read itself does not seem to be the main cost or the rate-limiting step, since loading the same file with file_get_contents is a few times faster.
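Roughly, the kind of comparison meant here could be measured like this (just a sketch; inc.php is a placeholder name for a small PHP file that defines no functions or classes, so it can be included repeatedly):

    <?php
    // Sketch: compare repeatedly including a file with repeatedly reading it.
    $t = microtime(true);
    for ($i = 0; $i < 10000; $i++) {
        include 'inc.php';                       // read, parsed, and executed
    }
    $includeTime = microtime(true) - $t;

    $t = microtime(true);
    for ($i = 0; $i < 10000; $i++) {
        $src = file_get_contents('inc.php');     // only read into a string
    }
    $readTime = microtime(true) - $t;

    printf("include: %.4fs  file_get_contents: %.4fs\n", $includeTime, $readTime);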
I think this include overhead is why major websites embed JavaScript directly in the HTML rather than loading it from separate files. Alternatively, it can be a good idea to split a large JS file into several small ones, since parallel HTTP requests can load the whole of the JS code faster. But this is different for PHP files, because the PHP script reads include files one by one as it runs.
Please comment on how serious this problem can be. Imagine a webpage that loads in 0.60s: can including 10 PHP files turn that into 0.70s? Although the effect should be negligible, I would like to know whether there are approaches to speed up this process. I do not mean PHP caching like APC.
P.S. This question is not about a practical application (a typical case), but about theoretical considerations in general.
PHP has to parse the code whether it is in the main PHP file or in an include, so putting it in an include probably makes no difference. Disk speed makes little difference either, since the file will be cached after the first read.
Consider this:
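(The original snippet is not reproduced here; the following is a minimal sketch of the kind of loop being described, timing 100,000 include() calls of an empty somefile.php.)

    <?php
    // index.php (sketch): time 100,000 includes of an empty somefile.php.
    $start = microtime(true);
    for ($i = 0; $i < 100000; $i++) {
        include('somefile.php');
    }
    echo microtime(true) - $start, " seconds\n";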
index.php takes ~115 seconds (for me) to process 100,000 iterations while including somefile.php, even though somefile.php has nothing in it.
However:
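(Again a sketch rather than the original code: the same loop with the include() removed, and a variant that echoes $i on each pass.)

    <?php
    // index.php (sketch): same loop, include() removed.
    $start = microtime(true);
    for ($i = 0; $i < 100000; $i++) {
        // empty loop body
    }
    echo microtime(true) - $start, " seconds\n";

    // Variant: echo $i on every iteration.
    $start = microtime(true);
    for ($i = 0; $i < 100000; $i++) {
        echo $i;
    }
    echo microtime(true) - $start, " seconds\n";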
index.php now takes 0.002 seconds to complete without an include() construct.
index.php takes 0.02 seconds to echo 100,000 iterations of $i.
Of course, this is a pretty extreme example because of the large number of iterations, but it does show that simply adding an include construct can inflate script execution time dramatically. Keep this in mind the next time you write a process with a large number of iterations, e.g. reading or writing large XML files. It's best to keep your code inline, even if that means it's less "manageable": not only are you adding ~115 seconds (~2 minutes) of execution time per ~100,000 iterations just by adding an include(), but consider what happens if the included file (somefile.php) has work of its own to do. My example merely adds an include() construct; the included file contained nothing.
Now, for including a few files here and there on a webpage, the times would be negligible. I was only pointing out that the include() construct requires extra processing regardless of its contents.
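As an alternative to fully inlining the code, one way to avoid the per-iteration cost (a hedged sketch, not from the original answer; helpers.php and do_work() are hypothetical names) is to include the file once, outside the loop, and call what it defines inside the loop:

    <?php
    // Include once: the file is read and parsed a single time.
    include 'helpers.php';          // hypothetical file defining do_work()

    for ($i = 0; $i < 100000; $i++) {
        do_work($i);                // the included code still runs each pass,
                                    // but the file is not re-read or re-parsed
    }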
So, of course, including files may slow down a PHP script. Sometimes you need includes to keep your code structure, architecture and design clear, but there are methods and solutions to improve the speed of a PHP script that has many includes.
Solutions
You can speed up your script by using these methods:
"... engine.php, and I'm using it via including it by the router." Yes, it does. The libraries you are used to using will bring a performance penalty due to the many includes underneath. The best approach to improve performance can speed up your solution by 22 times; read more here.
include and its ilk are a necessity. It is similar to import in Java and Python in that it is used for class and function definitions. include should be extremely fast, but using it will delay script execution compared to not using it at all. include is totally different from file_get_contents(): the latter is a function rather than a language construct, and it returns a string, whereas include actually executes the code of the included file. Your statement about splitting JS files is incorrect, as script downloads from the same domain block parallel downloads, and it is generally recommended to have as few script includes as possible.
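To make the include vs. file_get_contents() distinction concrete (a sketch; greet.php is a hypothetical file containing a single echo statement):

    <?php
    // Assume greet.php (hypothetical) contains:   <?php echo "hello";
    include 'greet.php';                    // executes the code: prints  hello

    $src = file_get_contents('greet.php');  // only reads the bytes
    echo $src;                              // prints the source text itself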
I highly doubt that having multiple includes, assuming all of them are necessary, is going to slow down the performance of your page. If you are having performance problems, look elsewhere. If you want to speed up PHP, look into using a PHP compiler.