So, I'm developing my own CMS, which dynamically adjusts to the settings I have set for a specific domain. I developed this whole CMS on my local machine and am now facing the problem that I have multiple projects, each on a different domain (same hosting provider though, I don't know if that's relevant). On my localhost it's no problem, because I just point to the specific CMS folder.
Example: my CMS files are in a subdirectory of www.mycms.com, like www.mycms.com/cms/, and my project is hosted on www.project-a.com. Now I assumed I just need to include my various CMS files, like www.mycms.com/cms/classes/user.php, etc.
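In other words, something like this (with `user.php` just as an example):

```php
// inside www.project-a.com, trying to pull a class file straight from the CMS domain
include 'http://www.mycms.com/cms/classes/user.php';
```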
Since this gives me a "permission denied" error, and everyone on the web seems to be against `allow_url_fopen` because of the security risks, I need to know whether I have other possibilities to make this work without putting all the files of /cms/ into each domain. That would force me to upload all CMS files to each domain separately every time, even if I just change minor things.

I don't know if this is comparable to an API like the Google Maps API or something, where you also just include remote files? (As far as I know.)

What are my options? And if there are no options left, should I reconsider using `allow_url_fopen` but implement security measures to avoid attacks, etc.?
1.) Don’t do this.
Each site should have independent libraries. Imagine that one day you decide to add a new feature or change an existing one in your CMS for one of your sites. Suddenly you have to test this change on all of your sites before you can roll it out. It's a better idea to version all your files and then deploy them to each site as it becomes practical to do so.
2.) If you must do this, and your hosting provider gives you one file space for all your domains, you can do it, no problem; e.g., create a file structure that looks like:
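For instance (the folder names here are assumptions based on the domains in the question):

```
~/public_html/
    cms/
        classes/
            user.php
    project-a.com/
        index.php
    project-b.com/
        index.php
```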
For each website (in a config file just for that site or possibly in the index.php if all traffic routes through that):
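For example (the constant name `CMS_PATH` and the absolute path are placeholders; adjust them to your host's layout):

```php
// config.php for project-a.com -- point at the shared CMS directory
define('CMS_PATH', '/home/youruser/public_html/cms/');
```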
Now each site can access CMS files as:
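Continuing the sketch above, with the assumed `CMS_PATH` constant:

```php
// anywhere inside project-a.com
include CMS_PATH . 'classes/user.php';
```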
You can also get a little clever here, so you don't have to define the path for every site by hand. If you're defining it in the file ~/public_html/project-a.com/index.php, you could use a path derived from the file's own location, as in the sketch below.
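A minimal sketch, assuming the layout above, where cms/ sits one level above each site's folder:

```php
// ~/public_html/project-a.com/index.php
// __DIR__ is this site's folder, so its parent directory is ~/public_html
define('CMS_PATH', dirname(__DIR__) . '/cms/');
```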
My answer is based on the following assumptions:

- `allow_url_fopen` seems a viable option to you; besides that, `allow_url_include` is what you're actually looking for;
- `allow_url_include` won't magically connect to the remote server and load the PHP files server-side;
- `allow_url_include` should definitely be disabled, due to security risks;
- `allow_url_include` could also greatly reduce the performance of your project;
- you're rather a beginner (your idea of what a webservice is derives from the Google Maps API) - no intent to offend!

Let me address the issues with `allow_url_fopen` and `allow_url_include`:
- `allow_url_fopen` and `allow_url_include` would require you to open a potential security risk.
- `allow_url_*` would require you to expose the source code of your application via HTTP. Based on my assumption that you're rather a beginner, I see the risk that a potential attacker would find a way to formulate some hack very easily. An unfair generalization, but this is almost always the case when I review the source code of younger developers.
- `allow_url_fopen` and `allow_url_include`, or even a webservice, could noticeably reduce the performance of your CMS, since all files that are subject to inclusion would be streamed over the network instead of from (usually much better connected) local storage.

Solution 1) The most common solution is to configure your domains' DNS A and AAAA records to all point to the same server IP, and to configure your webserver (e.g. Apache, nginx/Varnish) to direct all traffic to a single virtual host. Your CMS then has to deal with requests from different origin addresses, and you can deliver the appropriate content based on what's in the super-global `$_SERVER['HTTP_HOST']` variable. Beware that this variable could be named differently if your server environment is behind a reverse proxy (you would know this; it does not differ per visitor).
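For illustration, a minimal sketch of such a host-based dispatch (the config file names and the array layout are my invention, not part of the question):

```php
<?php
// front controller on the single virtual host:
// pick the site configuration based on the requested hostname
$host = isset($_SERVER['HTTP_HOST']) ? strtolower($_SERVER['HTTP_HOST']) : '';
$host = preg_replace('/^www\./', '', $host); // treat www.project-a.com like project-a.com

// map of known hosts to their config files (hypothetical paths)
$sites = array(
    'project-a.com' => __DIR__ . '/config/project-a.php',
    'project-b.com' => __DIR__ . '/config/project-b.php',
);

if (!isset($sites[$host])) {
    http_response_code(404);
    exit('Unknown host');
}

// each config file is assumed to return an array of site settings
$config = require $sites[$host];
```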
Solution 2) If you want those projects to be physically separated, as your request suggests, you can take a look at various deployment systems that would enable you to push changes to multiple destinations. Your CMS should be built in a way that treats CMS core files (the PHP files) and actual content differently, and that is able to update core files without affecting content. You don't have many options here. You could use SCM systems like Git or SVN to sync changes with remote projects easily, but this is rather discouraged.
Solution 3) You can indeed build some kind of webservice (REST is an often-used technique these days). The web project hosted on project-a.zyx would then be a rather simple thin client that mostly forwards requests to some REST endpoint. You would normally also want some kind of HTTPS-based authentication here. This would require your clients to be able to request content (not actual source code!) from another endpoint via HTTP (which is sometimes disabled on shared hosting environments), optionally apply some transformation to that content, and emit it. Since your request seems to imply that this is not the ideal option, you should really look into the first solution.
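To make that concrete, here is a minimal sketch of such a thin client; the endpoint URL, the bearer-token authentication, and the `page` parameter are all assumptions for illustration (cURL is used because it works even where `allow_url_fopen` is off):

```php
<?php
// thin client on project-a.zyx: fetch rendered content from a (hypothetical) REST endpoint
$page = isset($_GET['page']) ? $_GET['page'] : 'home';
$url  = 'https://www.mycms.com/api/v1/content/' . rawurlencode($page);

$ch = curl_init($url);
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,                                      // return the body instead of printing it
    CURLOPT_HTTPHEADER     => array('Authorization: Bearer MY_API_KEY'), // assumed auth scheme
    CURLOPT_SSL_VERIFYPEER => true,                                      // keep TLS certificate checks on
    CURLOPT_TIMEOUT        => 5,
));
$body = curl_exec($ch);
curl_close($ch);

if ($body === false) {
    http_response_code(502);
    exit('CMS backend unreachable');
}

// optionally transform the content here before emitting it
echo $body;
```

Note that the endpoint returns rendered content, not PHP source, so the CMS code itself never leaves your server.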