I've just been going back over a bit of C studying using Ivor Horton's Beginning C book. I got to the bit about declaring constants which seems to get mixed up with variables in the same sentence.
Just to clarify, what is the difference in specifying constants and variables in C, and really, when do you need to use a constant instead of a variable? I know folks say to use a constant when the information doesn't change during program execution but I can't really think of a time when a variable couldn't be used instead.
For one, performance optimization.
More importantly, this is for human readers. Remember that your target audience is not only the compiler. It helps to express intent in the code itself rather than in comments.
It's a really easy way to trap a certain class of errors. If you declare a variable const and accidentally try to modify it, the compiler will call you on it. A constant is also what you want when you just need to share memory for a value that never changes.
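For example, here is a minimal sketch (max_retries is just an illustrative name) of the compiler catching an accidental write:

```c
#include <stdio.h>

int main(void)
{
    const int max_retries = 5;   /* fixed for the lifetime of the program */

    /* Uncommenting the next line makes the build fail; GCC reports
     * something like "assignment of read-only variable 'max_retries'". */
    /* max_retries = 10; */

    printf("retries allowed: %d\n", max_retries);
    return 0;
}
```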
Constants have several advantages over variables.
Constants provide some level of guarantee that code can't change the underlying value. This is not of much importance for a smaller project, but matters on a larger project with multiple components written by multiple authors.
Constants also provide a strong hint to the compiler for optimization. Since the compiler knows the value can't change, it doesn't need to keep loading it from memory and can tailor the generated code to that exact value (for instance, it can use shifts for multiplication and division when the constant is a power of 2).
Constants can also be made static - you can declare the constant and its value in a header file and not have to worry about defining it in exactly one place.
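A rough sketch of both points, assuming a made-up header buffer_config.h and constant BUFFER_SIZE: declaring the constant static const in the header lets every .c file include it without duplicate-definition problems, and because its value is a known power of 2 an optimizing compiler can turn the division and modulo into a shift and a mask.

```c
/* buffer_config.h -- hypothetical header shared by several .c files */
#ifndef BUFFER_CONFIG_H
#define BUFFER_CONFIG_H

/* static gives the constant internal linkage, so every file that includes
 * this header gets its own private copy and the linker never complains
 * about duplicate definitions. */
static const unsigned int BUFFER_SIZE = 1024;   /* power of 2 */

#endif
```

```c
/* main.c */
#include <stdio.h>
#include "buffer_config.h"

int main(void)
{
    unsigned int offset = 3000;

    /* Because BUFFER_SIZE is known to be 1024, an optimizing compiler is
     * free to compile these as a shift and a mask instead of real division. */
    unsigned int block = offset / BUFFER_SIZE;
    unsigned int index = offset % BUFFER_SIZE;

    printf("block %u, index %u\n", block, index);
    return 0;
}
```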
A variable, as you can guess from the name, varies over time. If it doesn't vary, there is no loss in saying so. When you tell the compiler that the value will not change, it can do a whole bunch of optimizations, like inlining the value directly and never allocating any space for the constant on the stack.
However, you cannot always count on your compiler to be smart enough to correctly determine whether a value will change once set. In any situation where the compiler cannot determine this with 100% confidence, it will err on the side of safety and assume the value could change. This can have various performance impacts, such as skipping inlining, leaving certain loops unoptimized, and generating object code that is less parallelism-friendly.
Because of this, and since readability is also important, you should strive to use an explicit constant whenever possible and leave variables for things that can actually change.
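Here's a small sketch of that idea (scale, CONST_SCALE, and sum_scaled are invented names): the const qualifier gives the compiler a guarantee it may not be able to prove on its own, so it can fold the value straight into the generated code.

```c
#include <stdio.h>

/* A plain global: the compiler must generally assume other code might
 * change it, so it usually cannot bake the value 3 into callers. */
int scale = 3;

/* A const global: the value is guaranteed never to change, so the
 * compiler can treat every use as the literal 3. */
static const int CONST_SCALE = 3;

int sum_scaled(const int *values, int n)
{
    int total = 0;
    for (int i = 0; i < n; ++i)
        total += values[i] * CONST_SCALE;  /* candidate for constant folding */
    return total;
}

int main(void)
{
    int data[] = { 1, 2, 3, 4 };
    printf("%d\n", sum_scaled(data, 4));   /* prints 30 */
    return 0;
}
```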
As to why constants are used instead of literal numbers:
1) It makes code more readable. Everyone knows what 3.14 is (hopefully), but not everyone knows that 3.07 is the income tax rate in PA. This is domain-specific knowledge, and not everyone maintaining your code in the future (e.g., tax software) will know it (see the sketch after this list).
2) It saves work when you make a change. Going through and changing every 3.07 to 3.18 if the tax rate changes in the future will be annoying. You always want to minimize changes, ideally to a single place. The more scattered changes you have to make, the higher the risk that you will forget one, leading to errors.
3) You avoid risky errors. Imagine that there were two states with an income tax rate of 3.07, and then one of them changes to 3.18 while the other stays at 3.07. With a blind find-and-replace you could end up with severe errors. Of course, many integer or string constant values are more common than "3.07". For example, the number 7 could represent the number of days in the week, or something else entirely. In large programs, it is very difficult to determine what each literal value means.
4) In the case of string text, it is common to use symbolic names for strings so that the string pool can be changed easily, for example when supporting multiple languages.
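As a sketch of points 1-3 (PA_INCOME_TAX_RATE and the helper function are purely illustrative), compare a bare literal against a named constant:

```c
#include <stdio.h>

/* One obvious place to change when the rate changes. */
static const double PA_INCOME_TAX_RATE = 0.0307;   /* 3.07% */

double pa_income_tax(double income)
{
    /* Without the constant this would read "income * 0.0307", and a future
     * maintainer would have to guess what 0.0307 means. */
    return income * PA_INCOME_TAX_RATE;
}

int main(void)
{
    printf("Tax on $50,000: $%.2f\n", pa_income_tax(50000.0));
    return 0;
}
```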
Note that in addition to variables and "constant variables", some languages also have enumerations. An enumeration actually allows you to define a type for a small group of constants (e.g., return values), so using them provides type safety.
For example, if I have an enumeration for the days of the week and one for the months, I will be warned if I assign a month to a day. If I just use integer constants, there will be no warning when day 3 is assigned to month 3. You always want type safety, and it improves readability. Enumerations are also better for defining order. Imagine that you have constants for the days of the week, and now you want your week to start on Monday rather than Sunday: with an enumeration you only reorder the enumerators and their values follow, whereas with individual integer constants you would have to renumber each one by hand.
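A quick sketch in C (the enum names are invented for the example); note that C itself is looser than some languages about mixing enums with plain int, but the enumeration still documents intent and keeps the ordering in one place:

```c
#include <stdio.h>

/* Reordering the week to start on Monday only requires moving the
 * enumerators around; their numeric values follow automatically. */
enum weekday { MONDAY, TUESDAY, WEDNESDAY, THURSDAY, FRIDAY, SATURDAY, SUNDAY };
enum month   { JANUARY = 1, FEBRUARY, MARCH, APRIL /* ... */ };

int main(void)
{
    enum weekday today = WEDNESDAY;

    if (today < SATURDAY)                   /* ordering comparisons read naturally */
        printf("Still a workday (day %d of the week).\n", today + 1);

    /* In C this would compile anyway, but a C++ compiler or a static
     * analyzer would flag assigning a month to a weekday variable. */
    /* enum weekday oops = MARCH; */

    return 0;
}
```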
Not using const can mean someone on a team project could declare int FORTY_TWO = 42 and another team member could set FORTY_TWO = 41 somewhere else. Therefore the end of the world happens and you also lose the answer to life. With const, none of this will ever happen. Plus, const data may be stored elsewhere in memory (typically a read-only section) compared to normal variables, and can be more efficient.