Why is it sensible for a language to allow implicit declarations of functions and typeless variables? I get that C is old, but allowing declarations to be omitted, defaulting to a function returning int (or to int, in the case of variables), doesn't seem sane to me, even back then.
So, why was it originally introduced? Was it ever really useful? Is it actually (still) used?
Note: I realise that modern compilers give you warnings (depending on which flags you pass them), and you can suppress this feature. That's not the question!
Example:
int main() {
    static bar = 7;   // defaults to "static int bar"
    return foo(bar);  // foo defaults to "int foo()"
}

int foo(int i) {
    return i;
}
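For comparison, here is the same example with everything declared explicitly, which is what modern C requires (a sketch; the behaviour is identical):

    int foo(int i);          /* declare foo before its first use */

    int main() {
        static int bar = 7;  /* explicit int */
        return foo(bar);
    }

    int foo(int i) {
        return i;
    }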
Probably the best explanation for "why" comes from here:
Systems programming languages don't necessarily need types; you're mucking around with bytes and words, not floats and ints and structs and strings. The type system was grafted onto it in bits and pieces, rather than being part of the language from the very beginning. As C has moved from being primarily a systems programming language to a general-purpose programming language, it has become more rigorous in how it handles types. But, even though paradigms come and go, legacy code is forever. There's still a lot of code out there that relies on that implicit int, and the standards committee is reluctant to break anything that's working. That's why it took almost 30 years to get rid of it.

A long, long time ago, back in the K&R, pre-ANSI days, functions looked quite different than they do today.
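A definition in the old style looked something like this sketch (add_numbers is the function discussed next; note the missing return type, which defaults to int, and the parameter type declarations between the parameter list and the body):

    add_numbers(x, y)
    int x, y;        /* parameter types declared here; omitted ones default to int */
    {
        return x + y;
    }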
When you call a function like add_numbers, there is an important difference in the calling conventions: all types are "promoted" when the function is called. So if you do this:
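(a minimal sketch; the narrow types are the point here, and z is just illustrative)

    short x = 3;
    char y = 5;
    int z = add_numbers(x, y);   /* no prototype for add_numbers is in scope */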
What happens is that x is promoted to int, y is promoted to int, and the return type is assumed to be int by default. Likewise, if you pass a float it is promoted to double. These rules ensured that prototypes weren't necessary, as long as you got the right return type, and as long as you passed the right number and type of arguments.

Note that the syntax for prototypes is different:
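For instance, a prototype for the same function puts the types inside the parameter list (a sketch):

    int add_numbers(int x, int y);   /* prototype: explicit return and parameter types */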
A common practice back in the old days was to avoid header files for the most part, and just stick the prototypes directly in your code:
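Something like this sketch (the single-file layout is illustrative):

    /* prototypes at the top of the .c file, instead of in a header */
    int add_numbers(int x, int y);

    int main(void) {
        return add_numbers(2, 3);
    }

    int add_numbers(int x, int y) {
        return x + y;
    }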
Header files are accepted as a necessary evil in C these days, but just as modern C derivatives (Java, C#, etc.) have gotten rid of header files, old-timers didn't really like using header files either.
Type safety
From what I understand about the old, old days of pre-C, there wasn't always much of a static typing system. Everything was an int, including pointers. In this old language, the only point of function prototypes would be to catch arity errors.

So if we hypothesize that functions were added to the language first, and then a static type system was added later, this theory explains why prototypes are optional. This theory also explains why arrays decay to pointers when used as function arguments -- since in this proto-C, arrays were nothing more than pointers which get automatically initialized to point to some space on the stack. For example, something like the following may have been possible:
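What survives of that behaviour can be sketched in modern C (the proto-C syntax itself is conjecture, so today's notation is used; fill is a hypothetical helper):

    #include <stdio.h>

    void fill(int *p, int n) {    /* the "array" argument arrives as a pointer */
        for (int i = 0; i < n; i++)
            p[i] = i;
    }

    int main(void) {
        int v[5];                 /* in the hypothesized proto-C, v itself was a
                                     pointer to space on the stack */
        fill(v, 5);               /* v decays to &v[0] at the call */
        printf("%d\n", v[4]);     /* prints 4 */
        return 0;
    }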
Citations

See Dennis Ritchie's "The Development of the C Language": http://cm.bell-labs.com/who/dmr/chist.html. It covers the typelessness of C's ancestors, the equivalence of integers and pointers, and evidence for the theory that prototypes were omitted due to size constraints. This typelessness persisted in C until the authors started porting it to machines with different word lengths.
Programming languages evolve as programming practices change. In modern C and the modern programming environment, where many programmers have never written assembly language, the notion that ints and pointers are interchangeable may seem nearly unfathomable and unjustifiable.
Some food for thought. (It's not really an answer; we actually know the answer: it's permitted for backward compatibility.)
And people should look at a COBOL code base or F66 libraries before asking why it wasn't cleaned up in 30 years or so!
gcc with its default options does not spit out any warnings. With -Wall, or with gcc -std=c99, it does spit out the correct warnings: the lint functionality built into modern gcc is showing its colors.

Interestingly, the modern clone of lint, the secure lint (I mean splint), gives only one warning by default. The llvm C compiler, clang, which also has a static analyser built into it like gcc, spits out the two warnings by default.

People used to think we wouldn't need backward compatibility for 80's stuff, that all the old code would be cleaned up or replaced. But it turns out that's not the case. A lot of production code remains stuck in prehistoric, non-standard times.
EDIT:

I didn't look through the other answers before posting mine, so I may have misunderstood the intention of the poster. But the thing is, there was a time when you hand-compiled your code and used toggle switches to put the binary pattern into memory. Those programmers didn't need a "type system". Nor did the PDP machine in front of which Ritchie and Thompson posed like this:

Don't look at the beards, look at the "toggles", which I have heard were used to bootstrap the machine.
Also look at how they used to boot UNIX, described in this paper from the Unix 7th edition manual:
http://wolfram.schneider.org/bsd/7thEdManVol2/setup/setup.html
The point of the matter is that they didn't need much of a software layer to manage a machine with KB-sized memory. Knuth's MIX has 4000 words. You don't need all these types to program a MIX computer. You can happily compare an integer with a pointer on a machine like that.

I thought the reason they did this was quite self-evident, so I focused on how much is left to be cleaned up.
It's the usual story — hysterical raisins (aka 'historical reasons').
In the beginning, the big computers that C ran on (DEC PDP-11) had 64 KiB for data and code (later 64 KiB for each). There was a limit to how complex you could make the compiler and still have it run. Indeed, there was scepticism that you could write an O/S using a high-level language such as C, rather than needing to use assembler. So, there were size constraints. Also, we are talking a long time ago, in the early to mid 1970s. Computing in general was not as mature a discipline as it is now (and compilers specifically were much less well understood). Also, the languages from which C was derived (B and BCPL) were typeless. All these were factors.
The language has evolved since then (thank goodness). As has been extensively noted in comments and down-voted answers, in strict C99, implicit int for variables and implicit function declarations have both been made obsolete. However, most compilers still recognize the old syntax and permit its use, with more or fewer warnings, to retain backwards compatibility, so that old source code continues to compile and run as it always did. C89 largely standardized the language as it was, warts (gets()) and all; this was necessary to make the C89 standard acceptable.

There is still old code around using the old notations. I spend quite a lot of time working on an ancient code base (circa 1982 for the oldest parts) which still hasn't been fully converted to prototypes everywhere (and that annoys me intensely, but there's only so much one person can do on a code base with multiple millions of lines of code). Very little of it still has 'implicit int' for variables; there are too many places where functions are not declared before use, and a few places where the return type of a function is still implicitly int. If you don't have to work with such messes, be grateful to those who have gone before you.