I've been a Java and VB.Net programmer for about 4 years and a C# programmer for about 6 months. I've also used a bunch of dynamic languages like Perl, Python, PHP, and JavaScript.
I've never had a need for a preprocessor.
My question is: why do you see such extensive use of preprocessors in C, C++, and Objective-C but rarely (or never) see it in languages like Java, C#, or Scala?
Because the design and aims of these languages are not the same.
C was built with the preprocessor in mind as a powerful tool. It was used to implement very basic things (such as inclusion guards), and developers could use it to optimize their code through macros or to optionally include/exclude certain blocks of code, among other things. C++ inherited most of C's idioms. Macros are no longer used for speed (because inline was introduced), but they are still used for plenty of things; see the post "What are preprocessor macros good for?"
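A minimal sketch of those uses (util.h, MAX, and ENABLE_TRACE are invented names for illustration, not from any real project):

    /* util.h -- hypothetical header illustrating classic preprocessor uses */
    #ifndef UTIL_H                 /* inclusion guard: skip the body if already included */
    #define UTIL_H

    #include <stdio.h>

    /* macro expanded in place, historically used to avoid function-call overhead */
    #define MAX(a, b) ((a) > (b) ? (a) : (b))

    /* optionally include or exclude a block of code at compile time */
    #ifdef ENABLE_TRACE
    #define TRACE(msg) fprintf(stderr, "%s\n", (msg))
    #else
    #define TRACE(msg) ((void)0)
    #endif

    #endif /* UTIL_H */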
Java was designed to avoid several features that make C++ hard to use.
C# copied (or inherited) most of its design decisions from Java.
Higher-level programming languages avoid this kind of low-level artifact.
Because Gosling and Hejlsberg both understand the perils and technical debt incurred by the misuse of preprocessing!
Every language needs a mechanism for separate compilation. Ideally the language distinguishes interfaces from implementations, and a module depends only on the interfaces of the modules it imports. (See, e.g., Ada, Clu, Modula, and so on.)
C has no language construct for interfaces or implementations. Because it's vital that different .c files share a single view of interfaces, the programming discipline evolved of putting declarations (i.e., interfaces) in .h files and sharing those declarations/interfaces using textual inclusion (#include). In principle, #define and #ifdef could be dispensed with, but #include could not.

Nowadays language designers recognize that textual inclusion is no way to run a railroad, so languages tend to run either to separately compiled interfaces (Ada, Modula, OCaml), to compiler-generated interfaces (Haskell), or to dynamic systems that guarantee interface consistency (Java, Smalltalk). With such a mechanism, there is no need for a preprocessor, and plenty of reasons not to have one (think source-code analysis and debugging).
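The discipline described above looks roughly like this (a minimal sketch; stack.h, stack.c, and the function names are invented for illustration):

    /* stack.h -- the shared "interface": declarations only */
    #ifndef STACK_H
    #define STACK_H
    void stack_push(int value);
    int  stack_pop(void);
    #endif

    /* stack.c -- one implementation of that interface */
    #include "stack.h"        /* textual inclusion: the preprocessor pastes the file here */
    static int items[100];
    static int top = 0;
    void stack_push(int value) { items[top++] = value; }
    int  stack_pop(void)       { return items[--top]; }

    /* main.c -- every client includes the same header, so all .c files share one view */
    #include "stack.h"
    int main(void) { stack_push(42); return stack_pop() == 42 ? 0 : 1; }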
I don't know Objective-C, so my answer will be about contrasting the use of the preprocessor in C and C++.
The preprocessor was originally necessary for C for several reasons. If I remember correctly, originally C did not have constants, so #define was needed to avoid magic numbers. Prior to 1999 C did not have inline functions, so again #define was used to create macros or "pseudo-functions" to save the overhead of a function call while keeping code structured. C also doesn't have run-time or compile-time polymorphism, so #ifdefs were needed for conditional compilation. Compilers were typically not smart enough to optimize away unreachable code, so, again, #ifdefs were used to insert debugging or diagnostic code.

Using the preprocessor in C++ is a throwback to C and is generally frowned upon. Language features such as constants, inline functions, and templates can be used in most situations where in C you would have used the preprocessor.
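A rough side-by-side sketch of that replacement (BUFFER_SIZE, SQUARE, LOG, and their C++ counterparts are invented names for illustration only):

    #include <cstdio>

    /* C-style preprocessor idioms */
    #define BUFFER_SIZE 512                     /* constant via macro: untyped, invisible to the debugger */
    #define SQUARE(x) ((x) * (x))               /* "pseudo-function" to avoid call overhead */
    #ifdef DEBUG
    #define LOG(msg) std::fprintf(stderr, "%s\n", (msg))
    #else
    #define LOG(msg) ((void)0)
    #endif

    /* the C++ language features that usually replace them */
    const int bufferSize = 512;                 /* typed constant */
    inline int square(int x) { return x * x; }  /* inline function, without macro pitfalls */
    template <typename T>
    T maxOf(T a, T b) { return a > b ? a : b; } /* templates give compile-time polymorphism */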
The few cases where the use of a preprocessor in C++ is acceptable or even necessary include guards for header files (to prevent the same header from being included multiple times), #ifdef __cplusplus to use the same header for both C and C++, __FILE__ and __LINE__ for logging, and a few others.

The preprocessor is also often used for platform-specific definitions, although C++ Gotchas by Stephen Dewhurst advises keeping the platform-specific definitions in separate include directories and using them in separate build configurations for each platform.
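Put together, a header exercising those acceptable uses might look something like this (a sketch only; logging.h, LOG_HERE, and flush_log are invented names):

    /* logging.h */
    #ifndef LOGGING_H
    #define LOGGING_H              /* include guard: prevents double inclusion */

    #include <stdio.h>

    #ifdef __cplusplus
    extern "C" {                   /* lets the same header be used from both C and C++ */
    #endif

    /* __FILE__ and __LINE__ expand at the call site, which an ordinary function cannot do */
    #define LOG_HERE(msg) fprintf(stderr, "%s:%d: %s\n", __FILE__, __LINE__, (msg))

    void flush_log(void);

    #ifdef __cplusplus
    }
    #endif

    #endif /* LOGGING_H */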
Preprocessing is very, very common in the Java world. It's used to compensate for the language's lack of adequate built-in abstraction facilities, which would otherwise lead to endless copied-and-pasted boilerplate code.
The reason many people don't realise this is true is that in the Java world it's called "code generation" rather than "preprocessing", because "preprocessor" sounds like nasty old C, while "code generation" sounds like a professional tool that facilitates mature enterprise processes. It's still preprocessing, though, even if you have to pay a fortune for an incompatible non-standard proprietary tool to do it, instead of just using facilities built into the language.