I'm not brand new to the concept of unit testing, but at the same time I haven't mastered it yet either.
The one question that has been going through my head recently, as I write unit tests alongside my code using TDD, is: to what level should I be testing?
Sometimes I wonder if I'm being excessive in the use of unit testing.
At what point should a developer stop writing unit tests and get actual work done?
I might need to clarify that question before people assume I'm against using TDD...
What I'm struggling with is the granularity of my tests...
- When my app has a config file, do I test that values can be retrieved from the file? I lean towards yes... but...
- Do I then write a unit test for each possible config value that will be present? i.e. check that each one exists and can be parsed to the correct type...
- When my app writes errors to a log, do I need to test that it is able to write to the log? Do I then need to write tests to verify that entries are actually made to the log?
I want to be able to use my unit tests to verify the behavior of my app...but I'm not quite sure where to stop. Is it possible to write tests that are too trivial?
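To make this concrete, here's roughly the kind of test I'm talking about (pytest style; `AppConfig`, `get_int`, and `get_float` are just placeholders for my real config class):

```python
# Rough sketch of the kind of test I'm unsure about; AppConfig and its
# getters are hypothetical stand-ins for my real config class.
from myapp.config import AppConfig  # hypothetical module


def test_can_load_config_file(tmp_path):
    config_file = tmp_path / "app.ini"
    config_file.write_text("[server]\nport = 8080\ntimeout = 2.5\n")
    config = AppConfig(config_file)
    assert config.get_int("server", "port") == 8080


def test_each_value_parses_to_expected_type(tmp_path):
    # Is a per-setting test like this going too far?
    config_file = tmp_path / "app.ini"
    config_file.write_text("[server]\nport = 8080\ntimeout = 2.5\n")
    config = AppConfig(config_file)
    assert isinstance(config.get_float("server", "timeout"), float)
```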
In unit testing, you would write a test that shows it is possible to read items from the config file. You'd also test any likely quirks so that you have a representative set of tests: e.g. can you read an empty string, a very long string, or a string with escaped characters, and can the system distinguish between an empty value and a missing one?
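For instance, a representative set of those quirk tests might look something like this (just a sketch; `read_value` is a made-up helper assumed to return `None` for a missing key and to unescape values):

```python
import pytest

from myapp.config import read_value  # hypothetical config-reading helper


@pytest.mark.parametrize("raw, expected", [
    ("key=", ""),                            # empty string
    ("key=" + "x" * 10_000, "x" * 10_000),   # very long string
    (r"key=line1\nline2", "line1\nline2"),   # assumes the reader unescapes \n
])
def test_reading_quirky_values(raw, expected, tmp_path):
    config_file = tmp_path / "app.cfg"
    config_file.write_text(raw + "\n")
    assert read_value(config_file, "key") == expected


def test_missing_key_is_distinct_from_empty_value(tmp_path):
    config_file = tmp_path / "app.cfg"
    config_file.write_text("present=\n")
    assert read_value(config_file, "present") == ""
    assert read_value(config_file, "absent") is None
```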
With that test done, it is not necessary to re-check that capability every time another class uses the facility you've already tested. Otherwise, for every function you test, you'd have to re-test every operating system feature it relied on. The tests for a given feature only need to test what that feature's code is responsible for getting right.
Sometimes if this is hard to judge, it indicates something that needs refactoring to make the question easier to answer. If you have to write the same test lots of times for different features, this may indicate that those features share something inside them that could be moved out into a single function or class, tested once and then reused.
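As a sketch of that refactoring (all names here are invented for illustration): pull the parsing into one small class, test it once, and hand already-parsed settings to the features, so their tests never need to touch the file at all.

```python
# Hypothetical example: parsing lives in one place and is tested once;
# features receive an already-parsed Settings object, so their own tests
# no longer need to exercise config reading.
from dataclasses import dataclass


@dataclass
class Settings:
    port: int
    log_path: str

    @classmethod
    def from_lines(cls, lines):
        values = dict(line.split("=", 1) for line in lines if "=" in line)
        return cls(port=int(values["port"]), log_path=values["log_path"])


def test_settings_parsing():  # tested once, here
    settings = Settings.from_lines(["port=8080", "log_path=/tmp/app.log"])
    assert settings.port == 8080
    assert settings.log_path == "/tmp/app.log"


def start_server(settings: Settings):  # features just take a Settings object
    ...
```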
In broader terms this is an economics question. Assuming you've eliminated needless duplicated tests, how complete can you afford your tests to be? It is effectively impossible to write genuinely complete tests for any non-trivial program, because of the sheer number of combinations of circumstances that can occur, so you have to make a judgement call.

Many successful products have taken over the world despite having no unit tests when they originally launched, including some of the most famous desktop applications of all time. They were unreliable, but good enough, and if they'd invested more in reliability their competitors would have beaten them to first place in market share. (Look at Netscape, who took first place with a product that was notoriously unreliable, and then died out completely when they took time out to do everything the right way.) This is not what we as engineers want to hear, and hopefully these days customers are more discerning, but I suspect not by much.
Excessive unit testing often arises when you use code generation to generate really obvious unit tests. However, since generated unit tests do not really hurt anyone (and do not affect the cost-benefit ratio negatively), I say leave them in - they might come in useful when you least expect it.
If two test cases will run exactly the same code, then there's no need to test them separately. For example, with your config file, you only need to test that each type of value can be read correctly (and that reading fails in the correct manner when asked for a nonexistent or invalid value).
If you test that it correctly reads in every single value in the config file, then you are testing the config file, not the code.
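In practice that might look something like this (a sketch; `ConfigReader` and `MissingKeyError` are hypothetical): one test per value *type* plus the failure cases, not one test per entry in the real file.

```python
import pytest

from myapp.config import ConfigReader, MissingKeyError  # hypothetical


@pytest.fixture
def reader(tmp_path):
    config_file = tmp_path / "app.cfg"
    config_file.write_text("name=demo\nretries=3\nverbose=true\n")
    return ConfigReader(config_file)


def test_reads_string_value(reader):
    assert reader.get_str("name") == "demo"


def test_reads_int_value(reader):
    assert reader.get_int("retries") == 3


def test_reads_bool_value(reader):
    assert reader.get_bool("verbose") is True


def test_missing_key_raises(reader):
    with pytest.raises(MissingKeyError):
        reader.get_str("does_not_exist")


def test_invalid_int_raises(reader):
    # assumes the hypothetical reader raises ValueError on a bad parse
    with pytest.raises(ValueError):
        reader.get_int("name")  # "demo" is not an integer
```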
Yes, unit testing can be taken to excess/extremes.
Keep in mind that it is only necessary to test features; everything else follows from that.
So no, you don't have to test that you can read values from a config file, because one (or more) of the features will need to read values from a config file - and if they don't, then you don't need a config file!
EDIT: There seems to be some confusion as to what I am trying to say. I am not saying that unit testing and feature testing are the same thing - they are not. Per Wikipedia: "a unit is the smallest testable part of an application", and logically such 'units' are smaller than most 'features'.
What I am saying is that unit testing is the extreme, and is rarely necessary - with the possible exception of super-critical software (real-time control systems where lives may be endangered, for example) or projects with no limits on budget and timeline.
For most software, from a practical point of view, testing features is all that is required. Testing units smaller than features won't hurt, and it might help, but the trade-off between the cost in productivity and the improvement in quality is debatable.
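To illustrate the feature-first view (names invented for the example): the test below targets a user-visible behavior, and config reading gets exercised only because the feature happens to need it.

```python
# Hypothetical feature-level test: we test "the greeting honors the
# configured language", not "values can be read from a config file".
# Config reading is covered implicitly because the feature depends on it.
from myapp.greeting import greet_user  # hypothetical feature entry point


def test_greeting_uses_configured_language(tmp_path):
    config_file = tmp_path / "app.cfg"
    config_file.write_text("language=fr\n")
    assert greet_user("Ada", config_path=config_file) == "Bonjour, Ada!"
```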
Yes, it is indeed possible to write an excessive number of unit tests.
In practice the problem isn't that people write too many tests, it's that they distribute their tests unevenly. Sometimes you'll see people who are new to unit testing write hundreds of tests for the things that are easy to test, but then they run out of steam before they put any tests where they're most needed.