#include <stdio.h>
int main()
{
int c;
return 0;
} // on Intel architecture
#include <stdio.h>
int main()
{
int c;
return 0;
} // on AMD architecture
/* Here I have the same code on two different machines, and I want to know: is the size of the data types dependent on the machine? */
Quick answer: Yes, mostly, but ...
The sizes of types in C are dependent on the decisions of compiler writers, subject to the requirements of the standard.
The decisions of compiler writers tend to be strongly influenced by the CPU architecture. For example, the C standard says:
"A 'plain' int object has the natural size suggested by the architecture of the execution environment."
though that leaves a lot of room for judgement.
Such decisions can also be influenced by other considerations, such as compatibility with compilers from the same vendor for other architectures, and the convenience of having types for each supported size. For example, on a 64-bit system, the obvious "natural size" for int is 64 bits, but many compilers still have a 32-bit int. (With an 8-bit char and a 64-bit int, short would probably be either 16 or 32 bits, and you couldn't have fundamental integer types covering both sizes.)
(C99 introduces "extended integer types", which could solve the issue of covering all the supported sizes, but I don't know of any compiler that implements them.)
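One way to see what a particular compiler chose is simply to print the sizes; a minimal sketch (the numbers it prints are whatever your compiler decided, not fixed properties of C):
#include <stdio.h>

int main(void)
{
    /* What gets printed depends on the compiler and target:
       a typical LP64 build (64-bit Linux/macOS) prints 1 2 4 8 8,
       while an LLP64 build (64-bit Windows) prints 1 2 4 4 8. */
    printf("char:      %zu\n", sizeof(char));
    printf("short:     %zu\n", sizeof(short));
    printf("int:       %zu\n", sizeof(int));
    printf("long:      %zu\n", sizeof(long));
    printf("long long: %zu\n", sizeof(long long));
    return 0;
}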
Yes. The size of the basic data types depends on the underlying CPU architecture. ISO C (and C++) guarantees only minimum sizes for data types.
But it's not even consistent across compiler vendors for the same CPU. Consider that there are compilers with 32-bit long ints for Intel 386 CPUs, and other compilers that give you 64-bit longs.
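You can detect which convention a given compiler picked using only the limits.h macros required by the standard; a sketch (the messages are just illustrative):
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* LONG_MAX is what actually differs between the two conventions:
       2^31 - 1 for a 32-bit long, 2^63 - 1 for a 64-bit long. */
#if LONG_MAX == 2147483647L
    puts("this compiler gives you 32-bit longs");
#else
    puts("this compiler gives you longs wider than 32 bits");
#endif
    return 0;
}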
And don't forget about the decade or so of pain that MS programmers had to deal with during the era of the Intel 286 machines, what with all of the different "memory models" that compilers forced on us. 16-bit pointers versus 32-bit segmented pointers. I for one am glad that those days are gone.
See here: size guarantee for integral/arithmetic types in C and C++
Fundamental C type sizes depend on the implementation (compiler) and architecture; however, they have some guaranteed lower bounds. One should therefore never hardcode type sizes and should instead use sizeof(TYPENAME) to get the length in bytes.
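For instance, a portable allocation derives the element size from the object itself instead of hardcoding a byte count; a minimal sketch:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t n = 10;
    /* sizeof *arr adapts automatically if the element type changes;
       a hardcoded 4 would only be right on some machines. */
    int *arr = malloc(n * sizeof *arr);
    if (arr == NULL)
        return 1;
    printf("allocated %zu bytes for %zu ints\n", n * sizeof *arr, n);
    free(arr);
    return 0;
}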
It usually does, for performance reasons. The C standard defines the minimum value ranges for all types like char, short, int, long, long long and their unsigned counterparts.
However, x86 CPUs from Intel and AMD are essentially the same hardware to most x86 compilers. At least, they expose the same registers and instructions to the programmer, and most of them operate identically (if we consider only what's officially defined and documented).
At any rate, it's up to the compiler or its developers to use any other size, not necessarily matching the natural operand size of the target hardware, as long as that size satisfies the minimum ranges in the C standard.
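To see the actual ranges your implementation chose (all of which must meet or exceed the standard's minimums), you can print the limits.h macros; a minimal sketch:
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* The standard only guarantees minimums, e.g. INT_MAX >= 32767
       and LONG_MAX >= 2147483647; the values printed are whatever
       this particular compiler/target provides. */
    printf("int:  [%d, %d]\n", INT_MIN, INT_MAX);
    printf("long: [%ld, %ld]\n", LONG_MIN, LONG_MAX);
    return 0;
}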