Why does bitset store the bits in reverse order? After struggling many times I have finally written this binary_to_dec. Could it be simplified?
#include <bitset>
#include <string>

int binary_to_dec(std::string bin)
{
    std::bitset<8> bit;
    int c = bin.size();
    for (size_t i = 0; i < bin.size(); i++, c--)
    {
        // Character i (reading left to right) maps to bit c-1 (counting down from the top).
        bit.set(c - 1, (bin[i] - '0' ? true : false));
    }
    return bit.to_ulong();
}
Bitset stores its bits in what you consider to be "reverse" order because we write the digits of a number in decreasing order of significance, while the characters of a string are indexed in increasing order.
If we wrote our numbers in little-endian order, then you wouldn't have this confusion because the character at index 0 of your string would represent bit 0 of the bitset. But we write our numbers in big-endian order. I'm afraid I don't know the details of human history that led to that convention. (And note that the endianness that any particular CPU uses to store multi-byte numbers is irrelevant. I'm talking about the endianness we use when displaying numbers for humans to read.)
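As a quick illustration (a minimal sketch; the names and values are just for the example), bit 0 of a bitset is its least significant bit, yet streaming the bitset prints the most significant bit first, exactly as we write numbers:

#include <bitset>
#include <iostream>

int main()
{
    std::bitset<4> b(12);              // decimal 12 is binary 1100
    std::cout << b << '\n';            // prints "1100": most significant bit first
    std::cout << b[3] << b[0] << '\n'; // prints "10": bit 3 is the leftmost written digit, bit 0 the rightmost
}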
For example, if we write the decimal number 12 in binary, we get 1100. The least significant bit is on the right. We call that "bit 0." But if we put that in a string, "1100", the character at index 0 of that string represents bit 3, not bit 0. If we created a bitset with the bits in the same order as the characters, to_ulong would return 3 instead of 12.
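To see that concretely, here is a small sketch (the loop and names are illustrative, not from the original post) that copies character i into bit i and gets the "reversed" value:

#include <bitset>
#include <iostream>
#include <string>

int main()
{
    std::string bin = "1100";
    std::bitset<4> naive;
    // Character order and bit order disagree, so this stores the digits backwards.
    for (size_t i = 0; i < bin.size(); i++)
        naive.set(i, bin[i] == '1');
    std::cout << naive.to_ulong() << '\n'; // prints 3, not 12
}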
The bitset class has a constructor that accepts a std::string, and it already expects the string in the order we write numbers: the last character of the string corresponds to bit 0. So there is no need for the manual loop, and nothing has to be reversed; just pass the string to the constructor. Try this:
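A minimal sketch of that simplification (the answer's original code block isn't preserved here, so this is a reconstruction; the return type is unsigned long because that is what to_ulong returns):

#include <bitset>
#include <string>

unsigned long binary_to_dec(const std::string& bin)
{
    // The string constructor maps the rightmost character to bit 0,
    // so "1100" becomes the value 12 with no manual loop.
    return std::bitset<8>(bin).to_ulong();
}

Note that this constructor throws std::invalid_argument if the string contains characters other than '0' and '1', and the result is limited to the 8 bits of the bitset.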