This question already has an answer here:
How to scrape websites when cURL and allow_url_fopen is disabled (4 answers)
I've got several functions in my PHP app that rely on calls to file_get_contents(), file_put_contents(), and getimagesize().
The problem is that when allow_url_fopen and allow_url_include are disabled in php.ini, I get errors from these critical functions.
Warning: getimagesize() [function.getimagesize]: URL file-access is
disabled in the server configuration in
/home/content/.../html/_sites/mysite/wp-content/themes/mytheme/functions.php
on line 2534
What are the options for working around these issues?
EDIT: These are all local files on the same webserver as the calling php script. Is there a preferred method for reading/writing file contents?
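Worth noting for the local-file case: allow_url_fopen only restricts the URL wrappers (http://, ftp://, and so on); file_get_contents(), file_put_contents(), and getimagesize() keep working with plain filesystem paths. A minimal self-contained sketch (it writes a 1x1 GIF to a temp file so there's something to read; in a real app you'd use your own paths):

```php
<?php
// allow_url_fopen only restricts URL wrappers (http://, ftp://, ...);
// these functions still work with ordinary filesystem paths.
// A 1x1 GIF is written to a temp file so the sketch is self-contained.
$path = sys_get_temp_dir() . '/flower.gif';

// write a local file
file_put_contents($path, base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7'));

// read it back
$data = file_get_contents($path);

// getimagesize() on a local path is unaffected by allow_url_fopen
$info = getimagesize($path);   // $info[0] = width, $info[1] = height
```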
You can use cURL to get remote pages. You can store the result of the cURL request in a variable and echo() it, in place of using the URL wrapper to fetch content.
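For example, a drop-in replacement for file_get_contents() over HTTP might look like this (the helper name and URL are placeholders, not part of any library):

```php
<?php
// Hypothetical helper: fetch a URL into a string with cURL instead of
// relying on allow_url_fopen. Returns the body, or false on failure.
function curl_get_contents($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    $body = curl_exec($ch);
    curl_close($ch);
    return $body;
}

// echo curl_get_contents('http://example.com/page.html');
```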
In theory, you could also eval() the returned data. But running remotely fetched PHP code is an ENORMOUS security risk: any PHP code included this way can do literally anything your script can. DON'T DO IT! The same goes for allow_url_include.
If you have access to your web server, you may need to edit your php.ini file, for example:
/etc/php5/apache/php.ini
And use these options:
; http://php.net/allow-url-fopen
allow_url_fopen = On
; http://php.net/allow-url-include
allow_url_include = Off
If you're using some kind of hosting account, these options may be available interactively in a control panel.
The second solution is to use cURL. I suppose you were trying to call getimagesize() with a URL. Documentation here: http://php.net/manual/en/book.curl.php
// download http://example.com/image.php straight into a local file
$ch = curl_init('http://example.com/image.php');
$fp = fopen('/my/folder/flower.gif', 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);   // write the response body to $fp
curl_setopt($ch, CURLOPT_HEADER, 0);   // don't include HTTP headers in the output
curl_exec($ch);
curl_close($ch);
fclose($fp);
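If what you actually need is the image dimensions rather than the file itself, you can fetch the bytes into a variable and hand them to getimagesizefromstring() (available since PHP 5.4), so no URL wrapper is ever used. A sketch, with a placeholder helper name and URL:

```php
<?php
// Fetch an image into memory with cURL, then measure it with
// getimagesizefromstring() instead of getimagesize($url).
function remote_image_size($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // get the bytes back as a string
    $bytes = curl_exec($ch);
    curl_close($ch);

    if ($bytes === false) {
        return false;
    }
    return getimagesizefromstring($bytes); // same result shape as getimagesize()
}

// $size = remote_image_size('http://example.com/image.php');
// echo $size[0] . 'x' . $size[1];
```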