I have two forms in sequence on my website, both submitted with the POST method. The first page of my website, first.php, contains this code:
<form action="a.php" method="POST" target="_blank">
<input name="value" type="hidden" value="foo"/>
<div class="button"><label><span class="icon"></span>
<input type="submit" class="button-graphic ajax" value="Click Here"></label></div></form>
a.php can be accessed only via this POST request (otherwise the user gets a 405 Method Not Allowed error). Once submitted, this form opens a.php in an AJAX modal window. a.php contains another form:
<form action="b.php" method="POST" target="_blank">
<input name="bar" type="hidden" value="none"/>
<div class="border"><label><input type="submit" class="button-graphic2 tracking" value="Continue"></label></div></form>
When a user clicks Submit in the second form, it opens b.php, which can also be accessed only via a POST request (otherwise: a 405 error).
The only difference I can think of between these forms is that the second one contains a tracking JS class (which opens an iframe). This is the JS code:
$(document).ready(function() {
    $(".tracking").click(function() {
        var iframe = document.createElement('iframe');
        iframe.style.width = '0px';
        iframe.style.height = '0px';
        iframe.style.display = 'block';
        document.body.appendChild(iframe);
        iframe.src = '/track.htm';
    });
});
This is done in order to track a conversion using a third-party script which is executed from track.htm.
I noticed that I am having a problem with about 5% of my iPad visitors. They open a.php properly with a POST request, but when they go ahead and continue to b.php, about 5% send out a GET request instead of the desired POST request, get a 405 error, and leave the website. I know that these are real human users, as I can see some of them trying several times to open b.php and repeatedly getting these 405 errors.
Could this be caused by their device simultaneously using a GET request to obtain track.htm? Is this some glitch? How can this be solved?
EDIT 4.4.2015:
Since there's a chance that firing the tracking script is causing this, I would like to know if there's another way to fire it (or to track that AdWords conversion) without causing these iPad users to send GET requests for the form as well.
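One hedged alternative, since the iframe seems implicated: fire the tracking hit with a plain Image beacon instead of an iframe. The beacon performs its own GET for the pixel only and cannot change how the form itself submits. A sketch, where the cb cache-buster parameter name is my own assumption, not part of the original tracking setup:

```javascript
// Pure helper: build a cache-busted tracking URL.
function buildTrackingUrl(base, ts) {
  return base + "?cb=" + ts;
}

// Browser wiring (guarded so this file also loads outside a browser):
if (typeof document !== "undefined") {
  document.addEventListener("click", function (e) {
    if (e.target.closest && e.target.closest(".tracking")) {
      var img = new Image();            // image beacon, no iframe involved
      img.src = buildTrackingUrl("/track.htm", Date.now());
    }
  });
}
```

Whether the third-party script inside track.htm tolerates being loaded as an image rather than a document depends on that script, so this needs testing against the actual conversion tracker.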
EDIT 10.4.2015:
This is the jQuery code of the ajax class, which affects both first.php and perhaps a.php, as first.php is the parent frame:
$(document).ready(function() {
    $(".ajax").click(function(t) {
        t.preventDefault();
        var e = $(this).closest("form");
        return $.colorbox({
            href: e.attr("action"),
            transition: "elastic",
            overlayClose: !1,
            maxWidth: $("html").hasClass("ie7") ? "45%" : "false",
            opacity: .7,
            data: {
                value: e.find('input[name="value"]').val()
            }
        }), !1;
    });
});
Technically, it shouldn't happen. The iframe created by your tracking script points to /track.htm, so there shouldn't be any GET request to your b.php page.

On the other hand, just thinking out loud here, there are a few scenarios that could happen because of "real world" users:

1. The users happen to have bookmarked the b.php page, causing them to open it using GET when they re-open the page from their bookmark.
2. The users tried to refresh b.php, then got warned about "form re-submission". Being as clueless as most real users are, they canceled the re-submission, then clicked the address bar and hit GO with the sole intention of reloading the page. This also causes a GET request to be sent to the b.php page.

Considering best practice when designing the page flow for form submission, it might be better for you to only "process" your form data in b.php and then return a 302 Redirect to another page that shows the result via a GET request. This lets users refresh the page without double-submitting the form, and also lets them bookmark the result page.

Try doing the tracking in the callback of the original request, to ensure it's loaded?
Also, you could look into something like the ajaxForm plugin by malsup.
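The Post/Redirect/Get flow suggested above can be sketched framework-agnostically. This is only an illustration; the /result.php URL and the id parameter are hypothetical names, not anything from the site in question:

```javascript
// Post/Redirect/Get: the POST does the real work exactly once, then a 302
// sends the browser to a GET-able result page, so refresh and bookmark
// never re-submit the form.
function handleB(method, processForm) {
  if (method === "POST") {
    var resultId = processForm();                  // process the form data
    return { status: 302, location: "/result.php?id=" + resultId };
  }
  return { status: 405 };                          // current behavior kept
}
```

The key property is that after the redirect, the URL in the address bar is a plain GET target, so the two "real world" scenarios above stop producing 405s.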
Your ajax call doesn't contain method: "POST". This can be the cause.

I would like to suggest checking the permissions of your b.php page. Please make sure the page has the right permissions for all users; that is one more chance of the POST request not going through.
I know it's a workaround, but if, as I suppose, you have a bunch of checks on the $_POST variables, then when you receive a GET request you could try replacing the POST data with the GET data, since we don't know why these iPads have the issue, and between GET and POST there isn't much difference (at least if you don't need to upload files).

The only way a POST form can be sent as GET is via script (changing the method attribute directly, or replacing the form behavior, for example with an ajax request bound to the "submit" event), so I suggest you check every script that runs in the parent and the child pages.
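A sketch of that fallback as a request-normalizing helper, shown in JavaScript for illustration; in PHP terms it mirrors something like `if (empty($_POST)) $_POST = $_GET;`:

```javascript
// Use the POST body when it has data; otherwise fall back to the GET query.
// A workaround sketch only: it deliberately treats the two as equivalent.
function formParams(post, get) {
  return Object.keys(post).length > 0 ? post : get;
}
```

Note that doing this quietly gives up the POST-only restriction, so it should be combined with some other validity check (such as the token approach below in this thread).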
This doesn't answer your question about the GET glitch, but as things stand, ~5% of your iPad visitors can't sign up because the code only accepts POST, and so far no one has figured this out. So I propose a change of strategy, at least in the meantime.

Preventing CSRF by only accepting POST requests is already known not to work. Your choice of accepting only this request method as a security measure is what ultimately results in the 405. There are better ways.
One example is using a CSRF token, specifically the Synchronizer Token Pattern.

The idea behind a CSRF token is that when you generate the form, you also generate a "key" which you tie to the form. When that form is submitted, if it doesn't have the key, or the key isn't the right one, you don't bother processing the form. The Synchronizer Token Pattern gets fancy in that it changes the expected key each time: in the form-field implementation, it gives the <input type="hidden"> field a new name attribute each time, in addition to a new value.

Have your code in a.php generate a random token and store it as a session variable on the server. Output the token in the form as a hidden field.

Before processing the request in b.php, ensure the token value is in the request data and has the expected value. You can first check for $_POST data and, if it is missing, check for $_GET data. Regardless of which array contains the data, if the data does not carry a valid CSRF token, respond with a 4xx error. If the token is good, consume it and process the request.
Another way would be to set your field names to random values each time the form is generated, instead of the fixed <input name="value" type="hidden" value="foo"/> or <input name="bar" type="hidden" value="none"/>.

It doesn't answer your question of why 5% are sending GET requests, but it does solve your overall problem on both the security and the user level.
EDIT: To specifically answer OPs questions in comments:
"(1) does this require using cookies? (a session means cookies right?)"
Read up on PHP sessions and look for a session library. (Yes, PHP sessions use a cookie by default to hold the session ID.) There are plenty of libraries out there; one heavyweight is Zend (http://framework.zend.com/manual/1.12/en/zend.session.html). You can save to a database instead for protected server-side sessions. I made one similar to Kohana's.
"(2) I didn't understand the 'another way' part - how does it differ from the method you described at first?"
The first method just adds a token to your form and checks that the token has the expected value upon submission. If the form doesn't have it, you reject the submission.

The second method dynamically sets the field names upon form generation AND adds a token field. Submitting the proper form data from a program, bot, or outside source now first requires fetching the form, since they won't know which field names to use (instead of just posting data with fixed field names).
"(3) most important, I am less worried about CSRF attacks, I just don't want bots/crawler to crawl into my forms, would this method prevent it from them, as opposed to humans? why? and is there an easier method to achieve that?"
If you mean bots like Google/SEO/respectful web crawlers, robots.txt exists for this purpose. robots.txt is a very simple text file placed in your site's root directory. You'll see requests for /robots.txt in your web server's access logs. This file tells search engines and other robots which areas of your site they are allowed to visit and index. You can read more on the Robot Exclusion Standard on many websites.

As noted above, don't use robots.txt to hide information. It is a public file and visible to anyone. Also, malicious bots won't respect it.

I'm not sure whether by bots you mean just crawlers or also spambots (bots trying to submit data). If it's crawlers, robots.txt takes care of them. If it's spambots, you can add a hidden field (hidden with CSS, not HTML) with a common name; when it is filled out, you know the submission is invalid. You can also add a captcha, etc.
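The honeypot idea in that last suggestion can be sketched like this; the field name "website" is a hypothetical common name a spambot would happily fill in, not anything from the forms above:

```javascript
// The form would include e.g. <input name="website"> hidden via CSS
// (display:none in a stylesheet). Humans never see or fill it, so any
// non-empty value marks the submission as a bot.
function isSpamSubmission(fields) {
  return typeof fields.website === "string" && fields.website !== "";
}
```

The server-side check then simply drops (or 4xx's) any submission where the honeypot is non-empty, which costs real users nothing.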