I am an amateur C programmer and I encountered this question in a book; can someone give me a valid explanation?
I am getting confused as to what this ^ sign is doing in a C program.
#include <stdio.h>

int main(void)
{
    int a;
    printf("%d", (3^6) + (a^a));
    return 0;
}
int a;
printf("%d",(3^6)+(a^a));
The evaluation of the expression (3^6)+(a^a) invokes undefined behavior, as a is not initialized and has an indeterminate value.
(C11, 6.3.2.1p2) "If the lvalue designates an object of automatic storage duration that could have been declared with the register storage class (never had its address taken), and that object is uninitialized (not declared with an initializer and no assignment to it has been performed prior to use), the behavior is undefined."
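To see the intended arithmetic without the undefined behavior, here is a minimal sketch that simply initializes a (any value works, since a^a is then 0); it prints 5:

#include <stdio.h>

int main(void)
{
    int a = 0;                       /* any initialized value works: a^a == 0 */
    printf("%d\n", (3^6) + (a^a));   /* 3^6 is 5, so this prints 5 */
    return 0;
}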
I am getting confused as to what this ^ sign is doing in a C program.
^ is the bitwise XOR operator (not to be confused with a power operator, which unfortunately is not available in C).
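To illustrate the difference, here is a small sketch of mine: 3^6 performs bitwise XOR, while exponentiation has to go through pow() from <math.h> (on some systems you may need to link with -lm):

#include <stdio.h>
#include <math.h>

int main(void)
{
    printf("%d\n", 3 ^ 6);            /* bitwise XOR: 011 ^ 110 == 101, i.e. 5 */
    printf("%.0f\n", pow(3.0, 6.0));  /* exponentiation: 729; C has no ^ power operator */
    return 0;
}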
Output of the expression (3^6)+(a^a) in C language?
The output of the program is a garbage value because your program's behavior is undefined. Why? Because a is not initialized.
n1570: Annex J: J.2 Undefined behavior
The behavior is undefined in the following circumstances:
...
— An lvalue designating an object of automatic storage duration that could have been
declared with the register storage class is used in a context that requires the value
of the designated object, but the object is uninitialized (6.3.2.1).
The program's behavior is undefined because a is not initialized, so the output can be any garbage value.
^ stands for bitwise XOR.
XORing identical bits returns 0; XORing different bits returns 1.
E.g. 1^0 == 1, 1^1 == 0
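For example, looping over all one-bit combinations prints the XOR truth table (a small illustration of mine):

#include <stdio.h>

int main(void)
{
    /* XOR is 1 exactly when the two bits differ */
    for (int x = 0; x <= 1; x++)
        for (int y = 0; y <= 1; y++)
            printf("%d ^ %d == %d\n", x, y, x ^ y);
    return 0;
}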
The width of an int in C is implementation-defined: typically 16 bits with a 16-bit compiler or 32 bits with a 32-bit compiler. So, whether it is initialized or not, a will hold some 16/32-bit pattern.
Considering a 16-bit compiler:
Bit pattern of 3 is 0000 0000 0000 0011
XOR
Bit pattern of 6 is 0000 0000 0000 0110
Result is --> 0000 0000 0000 0101 --> 5
It doesn't matter which value a holds: a^a is always 0, since both operands have the same bit pattern (though, strictly speaking, reading an uninitialized a is undefined behavior, as the other answers point out).
Therefore (3^6) + (a^a) = 5 + 0 = 5.
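You can verify the a^a part with a few arbitrary initialized values (the sample values are my own choice); every line prints 0 because identical bit patterns cancel:

#include <stdio.h>

int main(void)
{
    int values[] = { 0, 1, -7, 32767 };   /* arbitrary sample values */
    for (int i = 0; i < 4; i++)
        printf("%d ^ %d == %d\n", values[i], values[i], values[i] ^ values[i]);
    return 0;
}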
Also, if the question is (3^6) + (a^~a), then, as explained above, 3^6 --> 5.
Considering a 16-bit compiler, with a holding a garbage value of integer type, let us assume a = 1.
Then a will be 0000 0000 0000 0001
and ~a will be 1111 1111 1111 1110
so a^~a will be --> 1111 1111 1111 1111 --> 65535 (as an unsigned int)
Therefore (3^6) + (a^~a) = 5 + 65535 = 65540, which is out of the 16-bit range.
The result wraps around modulo 65536: 65540 - 65536 = 4. (Equivalently, treated as a signed int, a^~a is -1, so 5 + (-1) = 4.)
Answer = 4
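A sketch of the same arithmetic using the fixed-width types from <stdint.h> (my addition, to make the 16-bit assumption explicit): a ^ ~a sets all 16 bits, which reads as 65535 when unsigned, and adding 5 wraps around modulo 65536 to 4.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    int16_t a = 1;                                   /* the assumed "garbage" value */
    uint16_t all_ones = (uint16_t)(a ^ ~a);          /* 1111 1111 1111 1111 == 65535 */
    uint16_t sum = (uint16_t)((3 ^ 6) + all_ones);   /* 5 + 65535 == 65540, wraps to 4 */
    printf("%d\n", all_ones);                        /* prints 65535 */
    printf("%d\n", sum);                             /* prints 4 */
    return 0;
}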