Like it or not, occasionally you have to write tests for classes that make internal use of timers.
Say, for example, a class that takes reports of system availability and raises an event if the system has been down for too long:
public class SystemAvailabilityMonitor {
    public event Action SystemBecameUnavailable = delegate { };
    public event Action SystemBecameAvailable = delegate { };

    public void SystemUnavailable() {
        //..
    }

    public void SystemAvailable() {
        //..
    }

    public SystemAvailabilityMonitor(TimeSpan bufferBeforeRaisingEvent) {
        //..
    }
}
I have a couple of tricks that I use (I'll post these as an answer), but I wonder what other people do, since I'm not fully satisfied with either of my approaches.
If you are looking for answers to this problem, you might be interested in this blog: http://thorstenlorenz.blogspot.com/2009/07/mocking-timer.html
In it I explain a way to override the usual behavior of the System.Timers.Timer class so that it fires on Start().
Here is the short version:
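The code block from the blog did not survive in this copy, and the blog's version is C#. As a rough Java sketch of the same idea (the interface and class names here are mine, not the blog's): hide the timer behind an interface, and give the test an implementation that fires its callback synchronously the moment it is started.

```java
// Hypothetical interface; the production implementation would wrap a real timer.
interface AlarmTimer {
    void schedule(Runnable task, long delayMillis);
    void cancel();
}

// Test double: ignores the delay and runs the task immediately on schedule(),
// which is the Java analogue of making the timer "fire on Start()".
class FireOnStartTimer implements AlarmTimer {
    @Override
    public void schedule(Runnable task, long delayMillis) {
        task.run(); // no waiting in tests
    }

    @Override
    public void cancel() {
        // nothing is ever pending
    }
}
```

The class under test only ever sees the interface, so in production it gets a real timer, and in tests the scheduled callback runs instantly and deterministically.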
Of course, this requires that you can pass the timer into the class under test. If that's not possible, then the class's design is flawed from a testability standpoint, since it doesn't support dependency injection. Change the design if you can; otherwise you may be out of luck and unable to test anything about the class that involves its internal timer.
For a more thorough explanation visit the blog.
Sounds like one should mock the timer, but alas... after a quick Google, this other SO question with some answers was the top search hit. But then I caught that the question is about classes using timers internally, doh. Anyhow, in game/engine programming you sometimes pass the timers as reference parameters to the constructors, which would make mocking them possible again, I guess? But then again, I'm a coding noob ^^
One way that I usually handle this is to refactor so that the temporal value is a parameter to the method, then create another method that does nothing but pass in the correct value. That way all the actual behavior is isolated and easily testable across all the weird edge cases, leaving only the very trivial parameter insertion untested.
As an extremely trivial example, if I started with this:
I would refactor to this, and unit test the second method:
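The original before-and-after snippets were not preserved in this copy, so here is a minimal reconstruction of the pattern in Java (the class and method names are invented for illustration):

```java
// Illustrative class: decides whether a cached entry is too old.
class Cache {
    private final long maxAgeMillis;

    Cache(long maxAgeMillis) {
        this.maxAgeMillis = maxAgeMillis;
    }

    // Trivial wrapper: its only job is to supply the current time.
    // This is the one untested method.
    boolean isStale(long createdAtMillis) {
        return isStale(createdAtMillis, System.currentTimeMillis());
    }

    // All the real logic lives here, with "now" as an explicit parameter,
    // so every edge case can be unit tested deterministically.
    boolean isStale(long createdAtMillis, long nowMillis) {
        return nowMillis - createdAtMillis > maxAgeMillis;
    }
}
```

The two-argument overload is what the unit tests exercise; the wrapper stays so trivial that leaving it untested carries almost no risk.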
I realize this is a C# question, but it may be of interest to show how it's done in the Perl world: you can simply override the core time functions in your tests. :) This may seem horrifying, but it means you don't have to inject a lot of extra indirection into your production code just to test it. Test::MockTime is one example. Freezing time in your test makes some things a lot easier, like those touchy non-atomic time-comparison tests where you run something at time X and by the time you check it's X+1. There's an example in the code below.
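The original Perl example code was not preserved here. As a hedged analogue in Java (names below are illustrative), the closest equivalent to overriding the core time functions is injecting a java.time.Clock and handing the test a frozen one:

```java
import java.time.Clock;
import java.time.Duration;
import java.time.Instant;

// Illustrative class: reads time only through an injected Clock.
class SessionChecker {
    private final Clock clock;

    SessionChecker(Clock clock) {
        this.clock = clock;
    }

    // With a frozen clock, the "time X vs. X+1" flakiness disappears:
    // the clock cannot advance between the scheduling and the check.
    boolean expired(Instant startedAt, Duration ttl) {
        return Instant.now(clock).isAfter(startedAt.plus(ttl));
    }
}
```

In production you pass Clock.systemUTC(); in tests, Clock.fixed(...) freezes time completely, much like Test::MockTime does for Perl's built-ins.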
A bit more conventionally: I recently had a PHP class that pulls data from an external database, and I wanted the pull to happen at most once every X seconds. To test it, I made both the last update time and the update interval attributes of the object. I'd originally made them constants, so this change for testing also improved the code. The test could then fiddle with those values like so:
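The PHP snippet itself is missing from this copy; a minimal sketch of the same idea in Java (names are illustrative) makes the last-update time and the interval plain object state that the test can set:

```java
// Illustrative class: fetches at most once per interval.
class ThrottledFetcher {
    long lastUpdateMillis = 0;    // attributes, not constants, so tests can fiddle
    long updateIntervalMillis;
    int fetchCount = 0;           // stand-in for the real database pull

    ThrottledFetcher(long updateIntervalMillis) {
        this.updateIntervalMillis = updateIntervalMillis;
    }

    // Pulls from the external source only if the interval has elapsed.
    void maybeFetch() {
        long now = System.currentTimeMillis();
        if (now - lastUpdateMillis >= updateIntervalMillis) {
            fetchCount++;
            lastUpdateMillis = now;
        }
    }
}
```

A test can set lastUpdateMillis to "just now" to prove no fetch happens inside the window, then drop updateIntervalMillis to zero to prove one does.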
I extract the timer from the object that reacts to the alarm. In Java, for example, you can pass in a ScheduledExecutorService. In unit tests I pass an implementation that I can control deterministically, such as jMock's DeterministicScheduler.
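jMock's DeterministicScheduler runs scheduled tasks only when the test advances virtual time. A tiny hand-rolled stand-in (not the jMock API, just the same idea) can be sketched like this:

```java
import java.util.ArrayList;
import java.util.List;

// Tasks run only when the test calls tick(), never on a background thread,
// so tests are fast and fully deterministic.
class DeterministicTimer {
    private static class Task {
        final long dueMillis;
        final Runnable action;
        Task(long dueMillis, Runnable action) {
            this.dueMillis = dueMillis;
            this.action = action;
        }
    }

    private final List<Task> tasks = new ArrayList<>();
    private long nowMillis = 0;

    void schedule(Runnable action, long delayMillis) {
        tasks.add(new Task(nowMillis + delayMillis, action));
    }

    // Advance virtual time and run everything that has come due.
    void tick(long millis) {
        nowMillis += millis;
        List<Task> due = new ArrayList<>();
        for (Task t : tasks) {
            if (t.dueMillis <= nowMillis) due.add(t);
        }
        tasks.removeAll(due);
        for (Task t : due) t.action.run();
    }
}
```

The class under test schedules its "system down too long" callback on this object; the test then ticks past the buffer and asserts the event fired, with no real waiting.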