In many programs a #define serves the same purpose as a constant. For example:
#define FIELD_WIDTH 10
const int fieldWidth = 10;
I commonly see the first form preferred over the other, relying on the pre-processor to handle what is basically an application decision. Is there a reason for this tradition?
Expanding on R's answer a little bit:
fieldWidth is not a constant expression; it's a const-qualified variable. Its value is not established until run time, so it cannot be used where a compile-time constant expression is required (such as in an array declaration, or a case label in a switch statement, etc.).

Compare with the macro FIELD_WIDTH, which after preprocessing expands to the constant expression 10; this value is known at compile time, so it can be used for array dimensions, case labels, etc.
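A small illustration of that difference (the snippet is mine, not from the answer):

#define FIELD_WIDTH 10
const int fieldWidth = 10;

char buf[FIELD_WIDTH];          /* OK: FIELD_WIDTH expands to the constant expression 10 */
/* char buf2[fieldWidth]; */    /* error at file scope: fieldWidth is not a constant expression */

int classify(int w)
{
    switch (w) {
    case FIELD_WIDTH:           /* OK: case labels require constant expressions */
        return 1;
    /* case fieldWidth: */      /* constraint violation for the same reason */
    default:
        return 0;
    }
}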
They're different.

const is just a qualifier, which says that a variable cannot be changed at runtime. But all other features of the variable persist: it has allocated storage, and this storage may be addressed. So code does not just treat it as a literal, but refers to the variable by accessing the specified memory location (except if it is static const, then it can be optimized away), and loading its value at runtime. And as a const variable has allocated storage, if you add it to a header and include it in several C sources, you'll get a "multiple symbol definition" linkage error unless you mark it as extern. And in this case the compiler can't optimize code against its actual value (unless global optimization is on).
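The usual header arrangement alluded to here might look like this (the file names are my own illustration):

/* constants.h */
extern const int fieldWidth;    /* declaration only: no storage allocated in each includer */

/* constants.c */
const int fieldWidth = 10;      /* the single definition that the other sources link against */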
#define simply substitutes a name with its value. Furthermore, a #define'd constant may be used in the preprocessor: you can use it with #if or #ifdef to do conditional compilation based on its value or presence, or use the stringizing operator # to get a string with its value. And as the compiler knows its value at compile time, it may optimize code based on that value. For example:
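The original example isn't reproduced here; a minimal sketch of the idea, with SCALE and the function name as illustrative choices:

#define SCALE 1

int scaled(int x)
{
    return x * SCALE;   /* the compiler can fold this to just x, since SCALE expands to 1 */
}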
When SCALE is defined as 1, the compiler can eliminate the multiplication, as it knows that x * 1 == x; but if SCALE is an (extern) const, it will need to generate code to fetch the value and perform the multiplication, because the value will not be known until the linking stage. (extern is needed to use the constant from several source files.)

A closer equivalent to using #define is using enumerations:
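The snippet referred to isn't preserved here; presumably it was something along these lines:

enum { FIELD_WIDTH = 10 };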
But this is restricted to integer values and doesn't have the advantages of #define, so it is not widely used.

const is useful when you need to import a constant value from some library where it was compiled in. Or if it is used with pointers. Or if it is an array of constant values accessed through a variable index. Otherwise, const has no advantages over #define.
According to K&R (2nd edition, page 211), the "const and volatile properties are new with the ANSI standard". This may imply that really old, pre-ANSI code did not have these keywords at all, so it really is just a matter of tradition. Moreover, it says that a compiler should detect attempts to change const variables, but other than that it may ignore these qualifiers. I think this means that some compilers may not optimize a const variable into an immediate value in the machine code (the way a #define is), and this might cost additional time for accessing far memory and affect performance.
The best way to define numeric constants in C is using enum. Read the corresponding chapter of K&R's The C Programming Language, page 39.
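For instance (my illustration, not the example from the book):

enum { BUFFER_SIZE = 512 };

char buffer[BUFFER_SIZE];   /* an enumeration constant is a compile-time constant expression */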
Some C compilers will store all const variables in the binary, which, if you are preparing a large list of coefficients, can use up a tremendous amount of space in the embedded world.

Conversely: using const allows flashing over an existing program to alter specific parameters.

To add to R.'s and Bart's answer: there is only one way to define symbolic compile-time constants in C: enumeration type constants. The standard imposes that these are of type int. I personally would write your example as an enumeration constant (see the sketch below), but I guess that tastes differ a lot among C programmers on that.
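A guess at what that could look like, using the identifier from the question:

enum { fieldWidth = 10 };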