I believe that the use of preprocessor directives like #if UsingNetwork is bad OO practice - other coworkers do not.
I think that when using an IoC container (e.g. Spring), components can be configured easily if they are programmed accordingly. In this context either a property IsUsingNetwork can be set by the IoC container or, if the "using network" implementation behaves differently, another implementation of that interface should be written and injected (e.g. IService, ServiceImplementation, NetworkingServiceImplementation).
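A minimal sketch of the injection approach I mean, with the class bodies and names made up for illustration (the actual wiring depends on whichever IoC framework is used):

    public interface IService
    {
        void Send(string message);
    }

    // Default implementation: no network involved.
    public class ServiceImplementation : IService
    {
        public void Send(string message)
        {
            // e.g. write to a local queue or a log file
        }
    }

    // Networking variant: same contract, different behaviour.
    public class NetworkingServiceImplementation : IService
    {
        public void Send(string message)
        {
            // e.g. push the message over the wire
        }
    }

    // The caller depends only on the interface; the container decides
    // at runtime which implementation gets injected.
    public class Client
    {
        private readonly IService _service;

        public Client(IService service)
        {
            _service = service;
        }

        public void DoWork()
        {
            _service.Send("hello");
        }
    }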
Can somebody please provide citations from OO gurus or references to books which basically read "Preprocessor usage is bad OO practice if you try to configure behaviour which should be configured via an IoC container"?
I need these citations to convince coworkers to refactor...
Edit: I do know and agree that using preprocessor directives to change target-platform-specific code during compilation is fine and that is what preprocessor directives are made for. However, I think that runtime configuration should be used rather than compile-time configuration to get well-designed and testable classes and components. In other words: using #defines and #ifs beyond what they are meant for will lead to difficult-to-test code and badly designed classes.
Has anybody read something along these lines and can give me a reference I can point to?
In C# / VB.NET I would not say it's evil.
For example, when debugging Windows services, I put the following in Main so that when in Debug mode, I can run the service as an application.
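The original snippet isn't reproduced here; a minimal sketch of that kind of guard, assuming a ServiceBase-derived class called MyService and a hypothetical helper that runs the same code as OnStart:

    static void Main()
    {
    #if DEBUG
        // Run the service logic as a plain console application
        // so the startup code can be stepped through in the debugger.
        var service = new MyService();
        service.StartAsConsole();   // hypothetical helper calling the same code as OnStart
        Console.WriteLine("Running in console mode, press any key to stop...");
        Console.ReadKey();
        service.Stop();
    #else
        // Normal Windows service startup.
        System.ServiceProcess.ServiceBase.Run(new MyService());
    #endif
    }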
This is configuring the behavior of the application, and is certainly not evil. At the very least, it's not as evil as trying to debug a service startup routine.
Please correct me if I read your OP wrong, but it seems that you are complaining about others using the preprocessor when a simple boolean would suffice. If that's the case, don't damn the preprocessor; damn those using it in such a fashion.
EDIT: Re: first comment. I don't get how that example ties in here. The problem is that the preprocessor is being misused, not that it is evil.
I'll give you another example. We have an application that does version checking between client and server on startup. In development, we often have different versions and don't want to do a version check. Is this evil?
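Roughly this shape, with made-up names (not the actual code):

    using System;

    static class VersionGuard
    {
        // Illustration only: the check is compiled out of Debug builds,
        // where client and server versions are often intentionally out of sync.
        public static void EnsureVersionsMatch(Version client, Version server)
        {
    #if !DEBUG
            if (client != server)
            {
                throw new InvalidOperationException(
                    "Client/server version mismatch: " + client + " / " + server);
            }
    #endif
        }
    }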
I guess what I am trying to say is that the preprocessor is not evil, even when changing program behavior. The problem is that someone is misusing it. What is wrong with saying that? Why are you trying to dismiss a language feature?
Much later EDIT: FWIW: I haven't used preprocessor directives for this in a few years. I do use Environment.UserInteractive with a specific arg set ("-c" = Console), and a neat trick I picked up from somewhere here on SO to not block the application but still wait for user input.
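Roughly this shape (the Worker and MyService classes are hypothetical stand-ins, and the wait trick is paraphrased, not the exact SO snippet):

    using System;
    using System.Threading;

    static class Program
    {
        static void Main(string[] args)
        {
            if (Environment.UserInteractive && args.Length > 0 && args[0] == "-c")
            {
                // Console mode: run the worker directly instead of as a service.
                var worker = new Worker();              // hypothetical worker class
                worker.Start();

                // Wait for Ctrl+C without blocking the worker's threads.
                var exitEvent = new ManualResetEvent(false);
                Console.CancelKeyPress += (sender, e) => { e.Cancel = true; exitEvent.Set(); };
                Console.WriteLine("Running in console mode, press Ctrl+C to stop...");
                exitEvent.WaitOne();

                worker.Stop();
            }
            else
            {
                // Normal service startup.
                System.ServiceProcess.ServiceBase.Run(new MyService());
            }
        }
    }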
One problem with preprocessor #ifdefs is that each one effectively doubles the number of compiled versions that, in theory, you should test thoroughly so that you can say your delivered code is correct.
OK, now I can produce the "Debug" version and the "Release" version. This is fine with me; I always do it, because I have assertions and debug traces which are only executed in the debug version.
If someone comes along and writes (real-life example)
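The original snippet isn't shown here; it was a private build switch along these lines (symbol and code made up to illustrate the shape):

    public static class Lookup
    {
        public static bool Contains(int[] sortedValues, int value)
        {
    #if FAST_LOOKUP   // hypothetical "pet optimization" symbol
            // Optimized path, only ever compiled when FAST_LOOKUP is defined.
            return System.Array.BinarySearch(sortedValues, value) >= 0;
    #else
            // Original path.
            foreach (var v in sortedValues)
            {
                if (v == value) return true;
            }
            return false;
    #endif
        }
    }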
And they propagate that pet optimization to four or five different classes, then suddenly you have FOUR possible ways to compile your code.
If you add just one more piece of #ifdef-dependent code, you'll have EIGHT possible versions to generate, and, what's more disturbing, FOUR of them will be possible release versions.
Of course runtime if()s, like loops and whatever, create branches that you have to test - but I find it much more difficult to guarantee that every compile-time variation of the configuration remains correct.
This is the reason why I think, as a policy, all #ifdefs except the one for the Debug/Release version should be temporary, i.e. you're doing an experiment in development code and you'll decide soon whether it stays one way or the other.
The support for preprocessing in C# is minimal... verging on useless. Is that evil?
Does the preprocessor have anything to do with OO? Surely it's for build configuration.
For instance, I have a lite version and a pro version of my app. I might want to exclude some code from the lite version without having to resort to building very similar versions of the code.
I might not want to ship a lite version which is just the pro version with different runtime flags.
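For example, something in this direction (the build symbol and the feature are made up):

    public class ReportExporter
    {
        public void Export(string path)
        {
            // ...export code common to both editions...

    #if PRO_EDITION   // hypothetical symbol defined only in the pro build configuration
            // PDF export is compiled only into the pro edition, so the
            // lite binaries do not even contain this code.
            ExportAsPdf(path);
    #endif
        }

    #if PRO_EDITION
        private void ExportAsPdf(string path)
        {
            // pro-only feature
        }
    #endif
    }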
Tony
Preprocessor code injection is to the compiler what triggers are to the database. And it's pretty easy to find such assertions about triggers.
I mainly think of #define being used to inline a short expression because it saves the overhead of a function call. In other words, it's premature optimization.
I have no guru statement regarding the usage of preprocessor directives in mind and cannot add a reference to a famous one. But I want to give you a link to a simple sample found on Microsoft's MSDN.
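From memory the sample was roughly along these lines (not copied verbatim from the MSDN page):

    #define DEBUG
    // #define VC_V7

    using System;

    public class MyClass
    {
        static void Main()
        {
    #if (DEBUG && !VC_V7)
            Console.WriteLine("DEBUG is defined");
    #elif (!DEBUG && VC_V7)
            Console.WriteLine("VC_V7 is defined");
    #elif (DEBUG && VC_V7)
            Console.WriteLine("DEBUG and VC_V7 are defined");
    #else
            Console.WriteLine("DEBUG and VC_V7 are not defined");
    #endif
        }
    }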
This results in simple output, but I think it is not very easy to read because you have to look at the top of the file to see what exactly is defined at this point. It gets even more complex if the symbol is defined somewhere else.
For me it looks much simpler to create different implementations and inject them into a caller instead of switching defines to create "new" class definitions (and because of this I understand why you compare the usage of preprocessor definitions with the usage of IoC). Besides the poor readability of code using preprocessor instructions, I rarely use preprocessor definitions because they increase the complexity of testing your code, since they result in multiple code paths (but this is also a problem when multiple implementations are injected by an external IoC container).
Microsoft itself has used a lot of preprocessor definitions within the Win32 API, and you might know/remember the ugly switching between the char and wchar_t variants of function calls.
Maybe you should not say "Don't use it". Tell them "how to use it" and "when to use it" instead. I think everyone will agree with you if you come up with good (more understandable) alternatives and can describe the risks of using preprocessor defines/macros.
No need for a guru... just be a guru. ;-)