using System;
using System.Globalization;

var numberFormat = new NumberFormatInfo();
numberFormat.NumberDecimalSeparator = ".";
numberFormat.NumberDecimalDigits = 2;
decimal a = 10.00M;
decimal b = 10M;
Console.WriteLine(a.ToString(numberFormat));
Console.WriteLine(b.ToString(numberFormat));
Console.WriteLine(a == b ? "True": "False");
In console:
10.00
10
True
Why is the output different? More importantly, how do I call ToString() to ensure the same output no matter how the variable is initialized?
The question of how to make it output consistently has been answered, but here is why they output differently in the first place:
A decimal value contains, internally, fields for a scale and a coefficient. In the case of 10M, the value encoded has a coefficient of 10 and a scale of 0. In the case of 10.00M, the value encoded has a coefficient of 1000 and a scale of 2. You can sort of see this by inspecting the values in memory:
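For example, a minimal sketch using decimal.GetBits (the loop and hex formatting here are just one way to do the inspection):

using System;

decimal a = 10.00M;
decimal b = 10M;

// decimal.GetBits returns four ints: the low, middle, and high 32 bits of the
// 96-bit coefficient, followed by a flags word whose bits 16-23 hold the scale.
foreach (decimal value in new[] { a, b })
{
    int[] bits = decimal.GetBits(value);
    Console.WriteLine("0x{0:X}, 0x{1:X}, 0x{2:X}, 0x{3:X}", bits[0], bits[1], bits[2], bits[3]);
}

// Prints:
// 0x3E8, 0x0, 0x0, 0x20000   (10.00M: coefficient 1000, scale 2)
// 0xA, 0x0, 0x0, 0x0         (10M: coefficient 10, scale 0)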
(0xA is 10 in hex, and 0x3E8 is 1000 in hex; the 0x20000 in the last field of 10.00M is its scale of 2, stored in bits 16-23 of the flags word.)
This behaviour is outlined in section 2.4.4.3 of the C# spec.
The NumberDecimalDigits property is used with the "F" and "N" standard format strings, not with the ToString method called without a format string. You can use:
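For example, a minimal sketch reusing the question's numberFormat (the "N" specifier would behave the same way, with group separators added):

using System;
using System.Globalization;

var numberFormat = new NumberFormatInfo();
numberFormat.NumberDecimalSeparator = ".";
numberFormat.NumberDecimalDigits = 2;

decimal a = 10.00M;
decimal b = 10M;

// The "F" standard format string reads NumberDecimalDigits from the provider,
// so both values print with exactly two decimal places.
Console.WriteLine(a.ToString("F", numberFormat)); // 10.00
Console.WriteLine(b.ToString("F", numberFormat)); // 10.00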
Try this:
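A minimal sketch, assuming a custom numeric format string such as "0.00" (the kind of format covered on the page linked below):

using System;

decimal a = 10.00M;
decimal b = 10M;

// "0.00" is a custom numeric format string: always exactly two decimal places,
// regardless of how the decimal was initialized.
Console.WriteLine(String.Format("{0:0.00}", a)); // 10.00
Console.WriteLine(String.Format("{0:0.00}", b)); // 10.00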
The output will always have two decimal places. More examples here:
http://www.csharp-examples.net/string-format-double/