I was taking a .NET quiz when I came across a question like the one below:
Char ch = Convert.ToChar('a' | 'e' | 'c' | 'a');
In the console we can see that the output for the ch variable is g.
Can someone describe what is happening? Thanks!
This is not what it looks like at first glance. It is really binary arithmetic on the int representation of these chars:
Here is a full article explaining this with examples: Article
So the result of the bitwise OR of 'a' | 'e' | 'c' | 'a' is 103. If you convert that to char, it is g.
Edit:
I see this answer attracted more attention than I thought it would, so it deserves more details.
There is an implicit conversion from char to int (int i = 'a' compiles), so what the compiler actually does is:
Convert.ToChar((int)'a' | (int)'e' | (int)'c' | (int)'a');
Since these are hard-coded values, the compiler does more work:
Convert.ToChar(97 | 101 | 99 | 97);
and finally:
Convert.ToChar(103); // g
If these were not hard-coded values:
private static char BitwiseOr(char c1, char c2, char c3, char c4)
{
return Convert.ToChar(c1 | c2 | c3 | c4);
}
Using Roslyn you get:
private static char BitwiseOr(char c1, char c2, char c3, char c4)
{
return Convert.ToChar((int)c1 | c2 | c3 | c4);
}
Converted to IL (note the or (bitwise) IL instruction being used):
.method private hidebysig static char BitwiseOr(char c1,
char c2,
char c3,
char c4) cil managed
{
//
.maxstack 2
.locals init (char V_0)
IL_0000: nop
IL_0001: ldarg.0
IL_0002: ldarg.1
IL_0003: or
IL_0004: ldarg.2
IL_0005: or
IL_0006: ldarg.3
IL_0007: or
IL_0008: call char [mscorlib]System.Convert::ToChar(int32)
IL_000d: stloc.0
IL_000e: br.s IL_0010
IL_0010: ldloc.0
IL_0011: ret
} // end of method Program::BitwiseOr
| is the bitwise OR operator.
'a' binary representation is 01100001
'e' binary representation is 01100101
'c' binary representation is 01100011
The result of the OR is 01100111, whose char representation is g.
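The binary strings above can be reproduced with Convert.ToString(value, 2) (a small sketch; PadLeft is only there to align the columns):

```csharp
using System;

class Program
{
    static void Main()
    {
        foreach (char c in new[] { 'a', 'e', 'c' })
        {
            // char converts implicitly to int, so the (int, toBase) overload applies
            Console.WriteLine($"{c} = {Convert.ToString(c, 2).PadLeft(8, '0')}");
        }

        int result = 'a' | 'e' | 'c';
        Console.WriteLine($"OR  = {Convert.ToString(result, 2).PadLeft(8, '0')} -> {(char)result}");
    }
}
```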
Go to a Unicode table.
'a' decimal value is 97; in binary it's 01100001.
'e' decimal value is 101; in binary it's 01100101.
'c' decimal value is 99; in binary it's 01100011.
'a' decimal value is 97; in binary it's 01100001.
The bitwise OR operator is |.
So your expression is equal to:
01100001
OR
01100101
OR
01100011
OR
01100001
and the result is
01100111.
OR yields 1 in a column if at least one input has a 1 in that column.
01100111 converted to decimal is 103.
Going back to the Unicode table, we see that 103 in decimal is equal to 'g'.
So, to answer your question: the expression ORs the binary values, converts the result to a decimal value, and returns the Unicode character for it.
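The whole chain, from Unicode code points through the bitwise OR and back to a character, can be mirrored in a short sketch:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Decimal Unicode code points, as read off a Unicode table
        Console.WriteLine((int)'a'); // 97
        Console.WriteLine((int)'e'); // 101
        Console.WriteLine((int)'c'); // 99

        int result = 97 | 101 | 99 | 97; // column-wise OR of the binary values
        Console.WriteLine(result);       // 103
        Console.WriteLine((char)result); // g
    }
}
```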