I'm not brand new to the concept of unit testing, but at the same time I haven't mastered it yet either.
The question that has been going through my head recently, as I write unit tests alongside my code using the TDD methodology, is: to what level should I be testing?
Sometimes I wonder if I'm being excessive in the use of unit testing.
At what point should a developer stop writing unit tests and get actual work done?
I might need to clarify that question before people assume I'm against using TDD...
What I'm struggling with is the granularity of my tests...
- When my app has a config file, do I test that values can be retrieved from the file? I lean towards yes... but...
- Do I then write a unit test for each possible config value that will be present? i.e. check that they exist... and can be parsed to the correct type...
- When my app writes errors to a log, do I need to test that it is able to write to the log? Do I then need to write tests to verify that entries are actually made to the log? (A rough sketch of the kind of test I mean follows this list.)
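For concreteness, here is roughly the level of testing I'm asking about. This is just an illustration in Python; the config keys and the logger name are made up for the example:

```python
import configparser
import logging
import unittest


class ConfigAndLoggingTests(unittest.TestCase):
    """The kind of fine-grained tests I'm unsure about."""

    def setUp(self):
        # In the real app this would be read from a file on disk.
        self.config = configparser.ConfigParser()
        self.config.read_string("[server]\nport = 8080\n")

    def test_port_setting_exists_and_parses_as_int(self):
        # Is a per-setting test like this worth having?
        self.assertTrue(self.config.has_option("server", "port"))
        self.assertEqual(self.config.getint("server", "port"), 8080)

    def test_error_is_written_to_the_log(self):
        # Or a test that an entry actually lands in the log?
        logger = logging.getLogger("myapp")
        with self.assertLogs(logger, level="ERROR") as captured:
            logger.error("something went wrong")
        self.assertIn("something went wrong", captured.output[0])


if __name__ == "__main__":
    unittest.main()
```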
I want to be able to use my unit tests to verify the behavior of my app...but I'm not quite sure where to stop. Is it possible to write tests that are too trivial?
If you find yourself spending all your debugging time in the testing routines, you may have gone overboard.
I think this rule should also apply to TDD to prevent excessive unit testing.
One thing to note, based on some of the answers given: if you find that you need to write numerous unit tests that do the same thing over and over, consider refactoring the code that is the root cause of the duplication.
Do you need to write a test for every place you access a configuration setting? No. You can test it once if you refactor and create a single point of entry for that functionality. I believe in testing as much functionality as is feasible. But it is really important to realize that if you skip the refactoring step, your code coverage will plummet as you continue to accumulate "one-off" implementations throughout the codebase.
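As a sketch of what a "single point of entry" might look like (Python, with a hypothetical `Settings` wrapper; the names are invented for the example): you test the wrapper once, instead of testing every place that reads a setting.

```python
import configparser


class Settings:
    """Hypothetical single point of entry for configuration access.

    Every part of the app asks this class for settings, so parsing and
    fallback behaviour only need to be tested here, once.
    """

    def __init__(self, text: str):
        self._parser = configparser.ConfigParser()
        self._parser.read_string(text)

    def get_int(self, section: str, key: str, default: int) -> int:
        try:
            return self._parser.getint(section, key)
        except (configparser.Error, ValueError):
            return default


def test_get_int_parses_value_and_falls_back_to_default():
    settings = Settings("[server]\nport = 8080\ntimeout = not-a-number\n")
    assert settings.get_int("server", "port", 80) == 8080
    assert settings.get_int("server", "timeout", 30) == 30   # unparsable -> default
    assert settings.get_int("server", "missing", 5) == 5     # absent -> default
```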
The TDD methodology is about designing - having a suite of tests for later is a rather welcome side effect. Test-driven development yields completely different code than writing tests "just" as an afterthought.
So, when you ask in the TDD context: it's easy to over-design a solution until it is "overengineered". Stop when the design is fixed firmly enough that it will not slip. There is no need to bolt it down, strap it, and cover it with cement - the design needs to stay flexible enough to be changed during the next refactoring.
My personal story of overengineered testing is an over-mocked test where the implementation of some classes was more or less mirrored in 'expect' calls on the respective mock objects. Talk about resistance to adapting to changed requirements...
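A made-up illustration of that kind of over-mocked test (Python with `unittest.mock`; the function and collaborator names are invented): the test simply restates the implementation as a list of expected calls, so any refactoring breaks it even when the behaviour is unchanged.

```python
from unittest import mock


def place_order(inventory, payments, notifier, order):
    """Toy implementation used only for the illustration."""
    inventory.reserve(order)
    payments.charge(order)
    notifier.send_confirmation(order)


def test_place_order_mirrors_the_implementation():
    # Every collaborator is mocked and every internal call is asserted,
    # so the test is welded to *how* the code works, not *what* it does.
    inventory, payments, notifier = mock.Mock(), mock.Mock(), mock.Mock()
    order = {"id": 42}

    place_order(inventory, payments, notifier, order)

    inventory.reserve.assert_called_once_with(order)
    payments.charge.assert_called_once_with(order)
    notifier.send_confirmation.assert_called_once_with(order)
```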
The point of unit tests - besides providing guidance in design - is to give you feedback on whether you actually did get work done. Remember that old adage: if it doesn't have to work, I'm finished now.
In Lean vocabulary, tests are "necessary waste" - they don't provide any direct value. So the art is in writing only those tests that provide indirect value - by helping us gain confidence that what we produced actually works.
So the ultimate guide to which tests to write should be your level of confidence in the production code. That's where the Extreme Programming mantra "test everything that could possibly break" comes from - if something could possibly break, we need a test as our safety net so we can move quickly in the future by refactoring with confidence. If something "couldn't possibly break" (as is often said about simple accessors), writing tests for it would be pure waste.
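For example (Python, names invented), a test like this for a plain accessor adds essentially no confidence and is the kind of "couldn't possibly break" test I would skip:

```python
class Customer:
    def __init__(self, name: str):
        self._name = name

    @property
    def name(self) -> str:  # trivial accessor - "couldn't possibly break"
        return self._name


def test_name_returns_the_name():
    # This only restates the accessor; if it ever fails, something far
    # bigger than the accessor is broken. Writing it is waste.
    assert Customer("Ada").name == "Ada"
```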
Of course, your assessment will be off from time to time; it takes experience to find the right balance. Most importantly, whenever you get a bug report against your code, think about what kind of test would have prevented this bug from going out into the wild and will prevent similar bugs in the future. Then add that kind of test to your collection of tests for code "that could possibly break".
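As a sketch of turning a bug report into a test (the function and the bug are made up): suppose empty input once crashed a parser. The fix gets pinned down by a regression test so that class of bug can't quietly come back.

```python
def parse_retry_count(raw: str) -> int:
    """Hypothetical function that once crashed on empty input."""
    if not raw or not raw.strip():
        return 0  # the fix: treat empty input as "no retries"
    return int(raw.strip())


def test_empty_input_does_not_crash():
    # Regression test written straight from the bug report.
    assert parse_retry_count("") == 0
    assert parse_retry_count("   ") == 0


def test_normal_values_still_parse():
    assert parse_retry_count(" 3 ") == 3
```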
[Update:] Found the concise answer to this question in TDD By Example, p. 194.
[/Update]
I think the prevalent problem these days is a lack of unit testing, not excessive testing. That said, I see what you're getting at... I wouldn't call it excessive unit testing so much as not being smart about where you focus your efforts.
So, to answer your question, here are some guidelines: