Question:
Why do different compilers give different outputs for the same C program? If there is a standard C, why don't these well-known compilers follow it completely? Some of the differences in output are attributed to 16-bit vs. 32-bit compilers, so what are the issues that cause the differences?
Answer 1:
The language standard leaves several degrees of freedom to the implementations.
Firstly, even if the program is implemented correctly, its behavior might depend on implementation-defined language features. For example, different implementations can have different ranges for basic integer types.
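As an added illustration (not part of the original answer), a program like the one below queries the limits its implementation actually uses via <limits.h>; a conforming 16-bit compiler may print INT_MAX = 32767 while a typical 32-bit one prints 2147483647:
#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* These limits are implementation-defined within bounds set by
       the standard (e.g. INT_MAX must be at least 32767). */
    printf("INT_MIN = %d\n", INT_MIN);
    printf("INT_MAX = %d\n", INT_MAX);
    return 0;
}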
Secondly, the program might simply be broken. A broken program in this context is a program that relies on behavior that is either undefined or unspecified by the language. This program will appear to "work" in some way, but its actual behavior will depend on unpredictable factors and, therefore, will be inconsistent.
Practice shows that in many (if not most) cases where people complain about inconsistently behaving C programs, they are actually failing to realize that they are dealing with a broken program.
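For example (an illustration added here, not part of the original answer), the following program is "broken" in exactly this sense: it modifies i twice without an intervening sequence point, which is undefined behavior, so different compilers may print different pairs of numbers and none of them is wrong:
#include <stdio.h>

int main(void)
{
    int i = 0;
    /* Undefined behavior: i is modified twice between sequence
       points, so any output (or none at all) is permitted. */
    printf("%d %d\n", i++, i++);
    return 0;
}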
Answer 2:
Do you have an example?
The language is standardized, but a lot of aspects of it are implementation-defined or even undefined.
For example, this:
printf("sizeof (int) = %u\n", (unsigned)sizeof (int));
will print different numbers on different systems, depending on how big int is.
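As an added sketch, here is that snippet expanded into a complete program; the standard only fixes minimum widths (int at least 16 bits, long at least 32), so the printed sizes legitimately differ between implementations:
#include <stdio.h>

int main(void)
{
    /* Typical results: sizeof (int) is 2 on a 16-bit compiler and
       4 on most 32- and 64-bit compilers; both are conforming. */
    printf("sizeof (short) = %u\n", (unsigned)sizeof (short));
    printf("sizeof (int)   = %u\n", (unsigned)sizeof (int));
    printf("sizeof (long)  = %u\n", (unsigned)sizeof (long));
    return 0;
}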
Answer 3:
There are many things within C that are implementation-defined, which means the people who write the compilers can choose how to handle those situations. For portability, it is generally best not to rely on implementation-defined or undefined behavior, even when most or all compilers handle it the same way. If you post some code that different compilers treat differently, perhaps we can tell you why they do that and how to fix it.
An example of undefined behavior is dereferencing an uninitialized pointer, like this:
int *a;            /* never initialized */
printf("%d", *a);  /* undefined behavior: a points nowhere in particular */
In most implementations, you will see some junk integer that doesn't mean anything (along with a compiler warning); it is whatever value happened to be stored in the memory location pointed to by a. Technically, however, an implementation could specify that dereferencing an uninitialized pointer always gives 0 (or something like that). The reason this is not done is that it would be much more difficult to implement. Nevertheless, you should not count on getting whatever happened to be in the memory location pointed to by a; it may or may not be what you get.
Answer 4:
For the most part, differences show up when a programmer relies on undefined behaviour, which each compiler is free to handle in its own way.
An example of another potential problem: C++0x just got approved, but the final draft isn't publicly available. As soon as it becomes available, some compilers may add support before others.
Part of a solution: many compilers have a way to compile against a specific standard. For instance, with GCC you can compile with "gcc programname.c -ansi" to make sure that your code adheres to the ANSI (C90) standard. This goes some way toward ensuring consistency.
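As an added illustration (the file name hello.c is arbitrary), the program below compiles with plain "gcc hello.c" but fails with "gcc hello.c -ansi", because -ansi selects C90 mode and disables recognition of C++-style // comments:
/* Fails under "gcc hello.c -ansi": the // comment below is not
   recognized in C90 mode, although gcc accepts it by default. */
#include <stdio.h>

int main(void)
{
    // a C99/C++-style comment: not valid in C90
    printf("hello\n");
    return 0;
}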