I'm trying to set a cookie for my phpBB forums from a MediaWiki login page. Using the hook that fires after a successful login to the wiki, I want to run a PHP script that sets the cookie.
The script works when I run it independently or when I use GET, but for security reasons I want to POST to the script. For this I figured cURL would be the best option.
Unfortunately, even a basic script like this:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/ForumLogin.php");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
gives me a 403 Forbidden error. There are no rules in robots.txt that should interfere. What else could I try to get the script to work, or are there any other ways I could run the script from within MediaWiki?
My solution for this was to set the user-agent option, so that cURL can pretend to be a browser. An example of this set up in PHP (the user-agent string below is just a placeholder; any realistic browser string will do) is:
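$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/ForumLogin.php");
// Pretend to be a browser by sending a typical browser user-agent string
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0 Safari/537.36");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);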
To make the request look like it comes from a real client, try the "curl/7.39.0" user agent, or pick a random user agent from an array of browser user-agent strings, along these lines (the strings below are only illustrative):
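// Illustrative list of browser user-agent strings; pick one at random per request
$userAgents = array(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0 Safari/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.1.2 Safari/603.3.8",
    "Mozilla/5.0 (X11; Linux x86_64; rv:54.0) Gecko/20100101 Firefox/54.0",
);
curl_setopt($ch, CURLOPT_USERAGENT, $userAgents[array_rand($userAgents)]);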
I'd suspect the justification for this is explicitly to stop automated behaviour - an anti-bot or general security measure. You may wish to look at the source code of the destination site and check for any such measures - a quick search of the code for '403' might offer some insight. It may even be the case that POST requests are not legitimate in that context - and thus prevented for security reasons.
I'm not sure what you mean by 'for security reasons' by the way. POST isn't more secure than GET. They're both open to just as much scrutiny.
For my specific project, the server would throw a 403 error when something went wrong, but it would still return data. So to get around the issue, I did something along these lines (a sketch; the key option is CURLOPT_FAILONERROR, and the URL is just a placeholder):
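$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/ForumLogin.php");
curl_setopt($ch, CURLOPT_FAILONERROR, false);    // don't treat HTTP codes >= 400 as failures
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);     // return the body instead of printing it
$data = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE); // e.g. 403, while $data still holds the response
curl_close($ch);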
If you disable the fail-on-error behaviour, you might still get some data back. Hope that helps.