Examining our web logs, we find that a significant number of clicks are either double-clicks or repeat-clicks (e.g. when the system is busy and has not reacted quickly enough).
Double-clicking a SUBMIT button may cause a form to be processed twice (generally we program against this, but I'd like to avoid the possibility of errors that we have not programmed against), and even double-clicking a link means that the server has to process the response twice (usually the server will detect a "disconnect" on the first click and abort processing for it - but we still incur the server time for the effort, which is compounded when the server is under heavy load).
Having said that, there are times when I never get a response to a click, and it's only the re-click that works.
One action we do see is a mis-click: click on a link, realise that it was not the desired link, and then click on the correct, adjacent link - clearly we still need to allow that.
How do you handle this / what do you suggest? And what is the best way to achieve this, generically, across the whole application?
1) We could disable the link/button after click (perhaps for a set period of time, then re-enable) - see the sketch after this list
2) We could hide the "body" of the page, leaving just the "banner" pane (which looks the same on all pages), which gives the appearance of the next page loading. We have done this in the past, but it does not play well with the BACK button in some browsers, and it also mucks up users who mis-clicked
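For reference, a minimal jQuery sketch of option (1) for a form might look like this (the two-second window is an arbitrary choice; note that a disabled button's name/value pair is not submitted with the form, which matters if the server checks which button was clicked):

$('form').submit(function() {
    var $buttons = $(this).find(':submit');
    // Disable to block repeat clicks; the submission itself still proceeds.
    $buttons.attr('disabled', 'disabled');
    setTimeout(function() {
        // Re-enable after a grace period so a stalled request can be retried.
        $buttons.removeAttr('disabled');
    }, 2000);
});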
You could do this with a combination of .delegate() and $.data():
$(document).delegate('a, :button', 'click', function(e) {
    // Time of the previous click on this element, if any.
    var lastClicked = $.data(this, 'lastClicked'),
        now = new Date().getTime();
    if (lastClicked && (now - lastClicked < 1000)) {
        // Repeat click within one second: suppress it.
        e.preventDefault();
    } else {
        $.data(this, 'lastClicked', now);
    }
});
Because the handler is delegated to the document rather than bound to every element, this will prevent constant rebinding, so it should have decent performance.
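Note that .delegate() was superseded by .on() in jQuery 1.7, so on newer versions the equivalent binding would be:

$(document).on('click', 'a, :button', function(e) {
    // ...same handler body as above...
});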
You can set a custom attribute once the element is clicked, and then check for that attribute: if it exists, ignore the click.
This will not change the UI of the element; it will just ignore repetitive clicks.
A rough example using pure JavaScript (as you didn't tag your question with jQuery) is available here: http://jsfiddle.net/248g8/
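The fiddle aside, a minimal sketch of this approach in plain JavaScript might look like the following (the data-clicked attribute name and the one-second window are illustrative choices, not taken from the fiddle):

// Mark each link/button on first click; ignore clicks that arrive
// while the marker attribute is still present.
var clickables = document.querySelectorAll('a, button, input[type="submit"]');
for (var i = 0; i < clickables.length; i++) {
    clickables[i].onclick = function(e) {
        e = e || window.event;
        if (this.getAttribute('data-clicked')) {
            // Repeat click: swallow it.
            if (e.preventDefault) { e.preventDefault(); }
            return false;
        }
        this.setAttribute('data-clicked', 'true');
        var el = this;
        setTimeout(function() {
            el.removeAttribute('data-clicked'); // allow retries after 1s
        }, 1000);
    };
}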
If this is a big concern for you - and if the obvious answer of "make sure your server always responds really fast" isn't possible ;-) - I would suggest a modified version of your (2) is the way forward.
The critical thing here is to give the user enough feedback that they feel something is happening - ideally without blocking the possibility of clicking again in those few cases where something genuinely has gone wrong.
Using JavaScript to show a small swirly "loading..." graphic may be effective here - and it's easy to set this up so that browsers that don't support JavaScript (or have it disabled) fall back to the standard link behaviour. That said, I would only do this for forms that are expected to take a long time (or where the wait might scare the user) - it will make the site rather distracting to use, and in any case (a) users are used to links occasionally being slow on the internet, and (b) your server should be powerful enough to cope with the occasional extra hit :-)
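As a minimal sketch (the form class and the #loading element holding the spinner graphic are hypothetical; with JavaScript disabled the handler never runs and the form submits as normal):

$('form.slow').submit(function() {
    // Show the spinner; the submission itself proceeds because we
    // don't return false or call preventDefault().
    $('#loading').show();
});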
You can disable the link or submit button - but this is frustrating for the user in the case where the submission fails for some reason (my bank does this, and TBH it scares me that they don't realise they should instead "program round" the double-submit issue as you described it!).
I certainly wouldn't disable the link and then re-enable it after a timeout - this would be very confusing for the user...
If you're using jQuery, then maybe you can listen for double-clicks across the <BODY> tag and then prevent propagation:
$("body").live('dblClick',function()
{
return false;
});