Like it or not, occasionally you have to write tests for classes that make internal use of timers.
Take, for example, a class that accepts reports of system availability and raises an event if the system has been down for too long:
    public class SystemAvailabilityMonitor {
        public event Action SystemBecameUnavailable = delegate { };
        public event Action SystemBecameAvailable = delegate { };

        public void SystemUnavailable() {
            //..
        }

        public void SystemAvailable() {
            //..
        }

        public SystemAvailabilityMonitor(TimeSpan bufferBeforeRaisingEvent) {
            //..
        }
    }
I have a couple of tricks that I use (I will post these as an answer), but I wonder what other people do, since I'm not fully satisfied with either of my approaches.
This is what I am using. I found it in the book Test Driven: Practical TDD and Acceptance TDD for Java Developers by Lasse Koskela.
Notice that the default time source, DEFAULTSRC, is System.currentTimeMillis(). It is replaced in unit tests; the normal behavior, however, is to use the standard system time.
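The helper looks roughly like this (a minimal sketch: DEFAULTSRC and reset() come from the description above, while TimeSource, SystemTime, asMillis() and setTimeSource() are approximated names):

```java
// Sketch of the replaceable time-source hook described above.
interface TimeSource {
    long millis();
}

class SystemTime {
    // Default source: the real system clock.
    private static final TimeSource DEFAULTSRC = System::currentTimeMillis;
    private static TimeSource source = null;

    // Production code asks for "now" through this method only.
    public static long asMillis() {
        return (source != null ? source : DEFAULTSRC).millis();
    }

    // Unit tests install a fake source...
    public static void setTimeSource(TimeSource fake) {
        source = fake;
    }

    // ...and restore the real clock when they are done.
    public static void reset() {
        source = null;
    }
}
```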
This is where it is used:
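Since the original snippet is not reproduced here, the following is a hypothetical class under test illustrating the idea: every clock reading goes through the replaceable time source rather than calling System.currentTimeMillis() directly. Only setEndTime() is named in the text; the StopWatch name and the other members are invented for illustration, and a small stand-in for the time-source helper is included so the sketch compiles on its own.

```java
// Minimal stand-in for the time-source helper described above.
interface TimeSource { long millis(); }

class SystemTime {
    private static final TimeSource DEFAULTSRC = System::currentTimeMillis;
    private static TimeSource source = null;
    public static long asMillis() { return (source != null ? source : DEFAULTSRC).millis(); }
    public static void setTimeSource(TimeSource fake) { source = fake; }
    public static void reset() { source = null; }
}

// Hypothetical class under test.
class StopWatch {
    private long startTime;
    private long endTime;

    // Both readings go through SystemTime, never System.currentTimeMillis().
    public void start()      { startTime = SystemTime.asMillis(); }
    public void setEndTime() { endTime = SystemTime.asMillis(); }

    public long elapsedMillis() { return endTime - startTime; }
}
```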
And here is the unit test:
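As a hedged reconstruction of the scenario described below, the test pins the clock at 1000 milliseconds, starts timing, moves the clock to 1020 milliseconds, stops, and expects a 20-millisecond difference. Plain asserts stand in for whatever test framework is actually used, and the helper classes are repeated so the sketch is self-contained.

```java
// Stand-ins for the helper and the class under test, repeated for completeness.
interface TimeSource { long millis(); }

class SystemTime {
    private static final TimeSource DEFAULTSRC = System::currentTimeMillis;
    private static TimeSource source = null;
    public static long asMillis() { return (source != null ? source : DEFAULTSRC).millis(); }
    public static void setTimeSource(TimeSource fake) { source = fake; }
    public static void reset() { source = null; }
}

class StopWatch {
    private long startTime, endTime;
    public void start()         { startTime = SystemTime.asMillis(); }
    public void setEndTime()    { endTime = SystemTime.asMillis(); }
    public long elapsedMillis() { return endTime - startTime; }
}

// Hypothetical reconstruction of the unit test.
class StopWatchTest {
    static long run() {
        SystemTime.setTimeSource(() -> 1000L);  // fake starting time
        StopWatch watch = new StopWatch();
        watch.start();
        SystemTime.setTimeSource(() -> 1020L);  // fake finishing time
        watch.setEndTime();
        SystemTime.reset();                     // back to the real clock
        return watch.elapsedMillis();           // 1020 - 1000
    }
}
```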
In the unit test, I replaced the TimeSource with one that simply returns a fixed number, 1000 milliseconds, which serves as the starting time. Before calling setEndTime(), I set the fake source to 1020 milliseconds for the finishing time. This gave me a controlled 20-millisecond difference.
There is no test-specific code in the production code; it simply reads the normal system time.
Make sure to call reset() after testing, so that subsequent code goes back to using the real system time rather than the faked one.