I believe that using preprocessor directives like #if UsingNetwork is bad OO practice - other coworkers do not.
I think that when using an IoC container (e.g. Spring), components can be easily configured if programmed accordingly. In this context, either a property IsUsingNetwork can be set by the IoC container or, if the "using network" implementation behaves differently, another implementation of that interface should be written and injected (e.g. IService, ServiceImplementation, NetworkingServiceImplementation), as sketched below.
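To illustrate, a minimal sketch of the injection-based approach (the interface and class names are the ones from above; the Send method and its behaviour are made up for illustration):

    public interface IService
    {
        void Send(string message);
    }

    // Default implementation: no network involved.
    public class ServiceImplementation : IService
    {
        public void Send(string message)
        {
            // handle the message locally
        }
    }

    // Networking implementation: same contract, different behaviour.
    public class NetworkingServiceImplementation : IService
    {
        public void Send(string message)
        {
            // send the message over the network
        }
    }

The IoC container decides at runtime which of the two implementations a consumer receives; no conditional compilation is needed, and each implementation can be unit-tested in isolation.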
Can somebody please provide citations from OO gurus, or references to books, which basically read "preprocessor usage is bad OO practice if you try to configure behaviour which should be configured via an IoC container"?
I need these citations to convince coworkers to refactor...
Edit: I do know and agree that using preprocessor directives to switch target-platform-specific code during compilation is fine; that is what preprocessor directives are made for. However, I think that runtime configuration should be preferred over compile-time configuration in order to get well-designed, testable classes and components. In other words: using #defines and #ifs beyond what they are meant for leads to code that is difficult to test and to badly designed classes.
Has anybody read something along these lines and can give me a reference I can point to?
IMHO, you are talking about C and C++, not about OO practice in general - and C is not object-oriented. In both languages the preprocessor is actually useful; just use it correctly.
I think the answer to this belongs in the C++ FAQ: [29.8] Are you saying that the preprocessor is evil?.
I hope this source is authoritative enough :-)
Henry Spencer (with Geoff Collyer) wrote a paper called #ifdef Considered Harmful.
Also, Bjarne Stroustrup himself, in chapter 18 of his book The Design and Evolution of C++, frowns on the use of the preprocessor and wishes to eliminate it completely. However, Stroustrup also recognizes the necessity of the #ifdef directive and conditional compilation, and goes on to illustrate that there is no good alternative to it in C++.
Finally, Pete Goodliffe, in chapter 13 of his book Code Craft: The Practice of Writing Excellent Code, gives an example of how, even when used for its original purpose, #ifdef can make a mess of your code.
Hope this helps. However, if your co-workers won't listen to reasonable arguments in the first place, I doubt book quotes will help convince them ;)
One quick point to tell your coworkers is this: because the C/C++ preprocessor works by textual substitution, macros break operator precedence in mathematical expressions when the arguments or the macro body are not fully parenthesized. For example, with #define SQUARE(x) x * x, the call SQUARE(1 + 2) expands to 1 + 2 * 1 + 2, which evaluates to 5 rather than the expected 9.
Preprocessor directives in C# have very clearly defined and practical use cases. The ones you're specifically talking about, called conditional directives, are used to control which parts of the code are compiled and which aren't.
There is a very important difference between not compiling parts of code and controlling how your object graph is wired via IoC. Let's look at a real-world example: XNA. When you're developing XNA games that you plan to deploy on both Windows and XBox 360, your solution will typically have at least two platforms that you can switch between in your IDE. There will be several differences between them, but one of those differences will be that the XBox 360 platform defines a conditional symbol XBOX360 which you can use in your source code with the following idiom:
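    #if XBOX360
        // code compiled only into the XBox 360 build
        // (e.g. reading input from the gamepad)
    #else
        // code compiled only into the Windows build
        // (e.g. reading input from keyboard and mouse)
    #endif

(The branch contents above are illustrative; the #if XBOX360 / #else / #endif structure is the idiom itself.)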
You could, of course, factor out these differences using a Strategy design pattern and control via IoC which implementation gets instantiated, but conditional compilation offers at least three major advantages:

- Code in the excluded branch does not have to compile on the other platform at all, so each branch can freely use platform-specific APIs and assemblies that simply don't exist on the other platform.
- The excluded code is not compiled into the assembly, so it is never deployed to a platform where it cannot run.
- There is no runtime cost: no extra interface, no virtual dispatch, and no container involved in choosing the code path.
Bjarne Stroustrup provides his answer (in general, not specific to IoC) here:
So, what's wrong with using macros?
(excerpt)
IMO it is important to differentiate between #if and #define. Both can be useful and both can be overused. My experience is that #define is more likely to be overused than #if.
I spent 10+ years doing C and C++ programming. In the projects I worked on (commercially available software for DOS / Unix / Macintosh / Windows) we used #if and #define primarily to deal with code portability issues.
I spent enough time working with C++ / MFC to learn to detest #define when it is overused - which I believe to be the case in MFC circa 1996.
I then spent 7+ years working on Java projects. I cannot say that I missed the preprocessor (although I most certainly did miss things like enumerated types and templates / generics which Java did not have at the time).
I've been working in C# since 2003. We have made heavy use of #if and [Conditional("DEBUG")] for our debug builds - but #if is just a more convenient and slightly more efficient way of doing the same things we did in Java.
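A minimal sketch of the [Conditional("DEBUG")] part of that pattern (the DebugLog class and its Trace method are made up for illustration):

    using System;
    using System.Diagnostics;

    public static class DebugLog
    {
        // The compiler removes every call to this method entirely
        // unless the DEBUG symbol is defined for the build, so the
        // release build carries no logging overhead.
        [Conditional("DEBUG")]
        public static void Trace(string message)
        {
            Console.WriteLine(message);
        }
    }

In a debug build, DebugLog.Trace("loading level"); prints as expected; in a release build the call, including the evaluation of its arguments, is omitted from the compiled code.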
Moving forward, we have started to prepare our core engine for Silverlight. While everything we are doing could be done without #if, it is less work with #if, which means we can spend more time adding features that our customers are asking for. For example, we have a value class which encapsulates a system color for storage in our core engine. Below are the first few lines of code. Because of the similarity between System.Drawing.Color and System.Windows.Media.Color, the #define at the top gets us a lot of functionality in normal .NET and in Silverlight without duplicating code:
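Something along these lines (simplified here, with illustrative names; the SILVERLIGHT symbol and the EngineColor type are stand-ins):

    // The #define at the top picks which platform color type we wrap;
    // SILVERLIGHT is the symbol defined by our Silverlight build.
    #if !SILVERLIGHT
    #define USE_SYSTEM_DRAWING
    #endif

    #if USE_SYSTEM_DRAWING
    using SystemColor = System.Drawing.Color;        // normal .NET
    #else
    using SystemColor = System.Windows.Media.Color;  // Silverlight
    #endif

    // Value class storing a system color for the core engine.
    // Because both Color types expose the same R, G, B and A byte
    // properties, the same code compiles against either alias.
    public struct EngineColor
    {
        private SystemColor value;

        public EngineColor(SystemColor value) { this.value = value; }

        public byte R { get { return value.R; } }
        public byte G { get { return value.G; } }
        public byte B { get { return value.B; } }
        public byte A { get { return value.A; } }
    }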
The bottom line for me is that there are many language features which can be overused, but this is not a good enough reason to leave these features out or to make strict rules prohibiting their use. I must say that moving to C# after programming in Java for so long helps me to appreciate this because Microsoft (Anders Hejlsberg) has been more willing to provide features which might not appeal to a college professor, but which make me more productive in my job and ultimately enable me to build a better widget in the limited time anybody with a ship date has.