I have the following code:
#include <iostream>
#include <limits>

int main()
{
    std::cout << std::numeric_limits<unsigned long long>::digits10 << std::endl;
    return 0;
}
- GCC 4.4 returns 19
- MS VS 9.0 returns 18
Can someone please explain why there is a difference between the two? I would have expected such a constant to be the same regardless of the compiler.
If Visual C++ 2008 returns 18 for std::numeric_limits<unsigned long long>::digits10, it is a bug (I don't have Visual C++ 2008 installed to verify the described behavior).

In Visual C++ (at least for 32-bit and 64-bit Windows), unsigned long long is a 64-bit unsigned integer type and is capable of representing every integer between zero and 18,446,744,073,709,551,615 (2^64 - 1).

Therefore, the correct value for digits10 here is 19, because an unsigned long long can represent 9,999,999,999,999,999,999 (19 digits) but cannot represent 99,999,999,999,999,999,999 (20 digits). That is, it can represent every 19-digit number but not every 20-digit number.

When compiled with Visual C++ 2010, your program prints the expected 19.
numeric_limits::digits10 specifies the number of decimal digits that the type can represent without loss of precision. So, I guess it may differ from compiler to compiler depending on their implementation details.
This is not correct. A compiler can implement any value as long as it conforms to the standard. For example, some unusual compiler on a 32- or 64-bit computer may have CHAR_BIT = 9, in which case unsigned long long wouldn't be 64 bits any more, or it may use ones' complement or some other number encoding, so the result may vary between compilers.

I've just checked, and VS 2008 still returns 19. Maybe because of one of the hotfixes applied when updating. VS2008 result http://imageshack.com/a/img826/6228/6qo4.png