Possible Duplicate:
size of int, long, etc
Does the size of an int depend on the compiler and/or processor?
I'm not sure if similar questions have been asked before on SO (at least, I couldn't find any while searching, so I thought of asking myself).

What determines the size of int (and other data types) in C? I've read that it depends on the machine/operating system/compiler, but I haven't come across a clear or detailed enough explanation of things like which one overrides the others, etc. Any explanation or pointers will be really helpful.
Ultimately the compiler does, but in order for compiled code to play nicely with system libraries, most compilers match the behavior of the compiler[s] used to build the target system.
So loosely speaking, the size of int is a property of the target hardware and OS (two different OSs on the same hardware may have a different size of int, and the same OS running on two different machines may have a different size of int; there are reasonably common examples of both).

All of this is also constrained by the rules in the C standard: int must be large enough to represent all values between -32767 and 32767, for example.
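As a quick sanity check, a small program like this sketch shows what the compiler actually chose and the range it provides (the printed values will of course differ between platforms):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* sizeof(int) is implementation-defined, but the standard guarantees
           that int can represent at least the range -32767..32767 */
        printf("sizeof(int) = %zu bytes\n", sizeof(int));
        printf("INT_MIN     = %d\n", INT_MIN);
        printf("INT_MAX     = %d\n", INT_MAX);
        return 0;
    }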
int is the "natural" size for the platform, and in practice that means one of
- the processor's register size, or
- a size that's backward compatible with the existing code base (e.g. 32-bit int in Win64).
A compiler vendor is free to choose any size with ≥ 16 value bits, except that (for desktop platforms and higher) a size that doesn't work with the OS's API will mean that few if any copies of the compiler are sold. ;-)
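To see which trade-off a given toolchain made, a short probe such as the following sketch compares int and long against the pointer width (LLP64 and LP64 are the usual names for the Win64 and 64-bit Unix choices):

    #include <stdio.h>

    int main(void)
    {
        /* On Win64 (LLP64), int and long stay 32-bit while pointers are 64-bit;
           on 64-bit Linux (LP64), long and pointers are both 64-bit. */
        printf("sizeof(int)    = %zu\n", sizeof(int));
        printf("sizeof(long)   = %zu\n", sizeof(long));
        printf("sizeof(void *) = %zu\n", sizeof(void *));
        return 0;
    }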
The size of C data types is constrained by the C standard, usually in the form of constraints on the minimum size. The host environment (target machine + OS) may impose further restrictions, i.e. constraints on the maximum size. And finally, the compiler is free to choose suitable values between these minimum and maximum values.
Generally, it's considered bad practice to make assumptions about the size of C data types. Besides, it's not necessary, since C will tell you:
- the sizeof operator tells you an object's size in bytes
- the macro CHAR_BIT from limits.h tells you the number of bits per byte

Hence, sizeof(foo) * CHAR_BIT tells you the size of type foo, in bits, including padding.
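For example (a minimal sketch; struct foo here is just a made-up type for illustration):

    #include <stdio.h>
    #include <limits.h>

    struct foo {
        char c;
        int  i;   /* the compiler may insert padding between c and i */
    };

    int main(void)
    {
        /* size in bits, padding included */
        printf("struct foo is %zu bits wide\n",
               sizeof(struct foo) * CHAR_BIT);
        return 0;
    }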
Anything else is just assumptions. Note that the host environment might as well consist of 10,000 people with pocket calculators and a huge blackboard, pulling size constraints out of thin air.
SO does not know everything, but Wikipedia almost does...
See Integer_(computer_science). Note (b) there says:
"The sizes of short, int, and long in C/C++ are dependent upon the implementation of the language; dependent on data model, even short can be anything from 16-bit to 64-bit. For some common platforms:
On older, 16-bit operating systems, int was 16-bit and long was 32-bit.
On 32-bit Linux, DOS, and Windows, int and long are 32-bits, while long long is 64-bits. This is also true for 64-bit processors running 32-bit programs.
On 64-bit Linux, int is 32-bits, while long and long long are 64-bits."
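To see which of these cases a particular compiler/OS combination falls into, a sketch like this prints the sizes it actually picked:

    #include <stdio.h>

    int main(void)
    {
        /* Compare against the platforms quoted above, e.g. int and long are
           both 4 bytes on 32-bit Linux, while long grows to 8 bytes on
           64-bit Linux. */
        printf("short     : %zu bytes\n", sizeof(short));
        printf("int       : %zu bytes\n", sizeof(int));
        printf("long      : %zu bytes\n", sizeof(long));
        printf("long long : %zu bytes\n", sizeof(long long));
        return 0;
    }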