Why does this code print 48 (the ASCII code for '0') instead of 0?
char c = '0';
Console.WriteLine(Convert.ToInt32(c));
Is there any special reason for this?
Because, as MSDN states, Convert.ToInt32 Method (Char) returns:
A 32-bit signed integer that represents the UTF-16 encoded code point of the value parameter.
and code point means:
In character encoding terminology, a code point or code position is any of the numerical values that make up the code space. For example, ASCII comprises 128 code points in the range 0h to 7Fh
And from a purely logical point of view, what result would you expect from Convert.ToInt32('a')? Convert is for type conversion, not for interpretation. Interpretation is done by Parse-style methods such as int.Parse.
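To illustrate the difference between conversion and interpretation, here is a minimal sketch (int.Parse and char.GetNumericValue are standard .NET members used here purely for illustration; they are not part of the original question):

using System;

class CodePointVsValue
{
    static void Main()
    {
        char c = '7';

        // Conversion: Convert.ToInt32(char) returns the UTF-16 code point.
        Console.WriteLine(Convert.ToInt32(c));        // 55

        // Interpretation: parse the digit as a number.
        Console.WriteLine(int.Parse(c.ToString()));   // 7

        // char.GetNumericValue also interprets the digit (returns a double).
        Console.WriteLine(char.GetNumericValue(c));   // 7
    }
}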
Because that's what the overload you're calling does. You're expecting it to behave the same as Convert.ToInt32(string input), but you're actually invoking Convert.ToInt32(char input), and if you check the docs, they explicitly state that it returns the Unicode value (in this case the same as the ASCII value).
Docs here: http://msdn.microsoft.com/en-us/library/ww9t2871(v=vs.110).aspx
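A short sketch of the two overloads side by side, assuming the same char c = '0' as in the question:

using System;

class OverloadDemo
{
    static void Main()
    {
        char c = '0';

        // Convert.ToInt32(char): returns the UTF-16 code point of the character.
        Console.WriteLine(Convert.ToInt32(c));            // 48

        // Convert.ToInt32(string): parses the text as a number.
        Console.WriteLine(Convert.ToInt32(c.ToString())); // 0
    }
}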
You should do this:
char c = '0';
Console.WriteLine(Convert.ToInt32(Convert.ToString(c)));
To be fancy:
char c = '0';
Console.WriteLine(c - '0');
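This works because the digits '0' through '9' occupy consecutive code points (U+0030 to U+0039), and both char operands are implicitly widened to int before the subtraction. A small sketch showing this over the full digit range:

using System;

class DigitSubtraction
{
    static void Main()
    {
        // '0'..'9' are consecutive code points, so c - '0' yields each digit's value.
        for (char c = '0'; c <= '9'; c++)
        {
            Console.WriteLine($"{c} -> {c - '0'}");
        }
    }
}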