I have a byte array generated by a random number generator. I want to put it into the STL bitset.
Unfortunately, it looks like bitset only supports the following constructors:
- A string of 1's and 0's like "10101011"
- An unsigned long (my byte array will be longer than that)
The only solution I can think of now is to read the byte array bit by bit and make a string of 1's and 0's. Does anyone have a more efficient solution?
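For reference, the bit-by-bit string approach I have in mind would look roughly like this (the helper name is just for illustration, and I'm assuming the first byte's most significant bit should become the leftmost character):

```cpp
#include <bitset>
#include <climits>
#include <cstddef>
#include <string>

// Expand raw bytes into a string of '1'/'0' characters for the bitset constructor.
std::string bytesToString(const unsigned char *data, std::size_t numBytes) {
    std::string s;
    s.reserve(numBytes * CHAR_BIT);
    for (std::size_t i = 0; i < numBytes; ++i)
        for (int bit = CHAR_BIT - 1; bit >= 0; --bit)   // most significant bit first
            s += ((data[i] >> bit) & 1) ? '1' : '0';
    return s;
}

// Usage: std::bitset<32> bits(bytesToString(data, 4));
```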
Guys, I have spent a lot of time writing a reverse function (bitset -> byte/char array). Here it is:
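A minimal sketch of that reverse direction, assuming bit 0 of the bitset maps to the lowest bit of the first byte (the function name is just illustrative):

```cpp
#include <bitset>
#include <climits>
#include <cstddef>
#include <vector>

// Pack a bitset back into a byte array, least significant bit of each byte first.
template <std::size_t N>
std::vector<unsigned char> bitsetToBytes(const std::bitset<N> &bits) {
    std::vector<unsigned char> bytes((N + CHAR_BIT - 1) / CHAR_BIT, 0);
    for (std::size_t i = 0; i < N; ++i)
        if (bits[i])
            bytes[i / CHAR_BIT] |= static_cast<unsigned char>(1u << (i % CHAR_BIT));
    return bytes;
}
```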
You can initialize the bitset from a stream. I can't remember how to wrangle a byte[] into a stream, but...
from http://www.sgi.com/tech/stl/bitset.html
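A hedged sketch of one way the stream idea could look: expand the bytes into '0'/'1' characters and let `operator>>` read them into the bitset (the function name and bit ordering are assumptions):

```cpp
#include <bitset>
#include <climits>
#include <cstddef>
#include <sstream>
#include <string>

// Build a '0'/'1' string from the bytes, then extract it into the bitset via a stream.
template <std::size_t N>
std::bitset<N> bitsetFromStream(const unsigned char *data, std::size_t numBytes) {
    std::string s;
    for (std::size_t i = 0; i < numBytes; ++i)
        for (int bit = CHAR_BIT - 1; bit >= 0; --bit)   // most significant bit first
            s += ((data[i] >> bit) & 1) ? '1' : '0';
    std::istringstream iss(s);
    std::bitset<N> bits;
    iss >> bits;   // std::bitset has an operator>> that reads '0'/'1' characters
    return bits;
}
```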
There's a third constructor for `bitset<>` - it takes no parameters and sets all the bits to 0. I think you'll need to use that, then walk through the array calling `set()` for each bit in the byte array that's a 1. A bit brute-force, but it'll work. There will be a bit of complexity to convert the byte index and bit offset within each byte to a bitset index, but it's nothing a little bit of thought (and maybe a run through under the debugger) won't solve. I think it's most likely simpler and more efficient than trying to run the array through a string conversion or a stream.
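A minimal sketch of that brute-force approach (the function name and the byte/bit-to-index mapping are assumptions):

```cpp
#include <bitset>
#include <climits>
#include <cstddef>

// Default-construct the bitset (all zeros), then set() every bit that is 1 in the bytes.
template <std::size_t N>
void fillBitset(const unsigned char *bytes, std::size_t numBytes, std::bitset<N> &bits) {
    for (std::size_t byteIndex = 0; byteIndex < numBytes; ++byteIndex)
        for (std::size_t bitOffset = 0; bitOffset < CHAR_BIT; ++bitOffset)
            if ((bytes[byteIndex] >> bitOffset) & 1)
                bits.set(byteIndex * CHAR_BIT + bitOffset);   // byte index + bit offset -> bitset index
}
```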
Here is my implementation using template meta-programming.
Loops are done at compile time.
I took @strager's version and modified it in order to prepare it for TMP:
Modified version with loops at run time:
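(A hedged sketch; the function name and index mapping are my own, not necessarily @strager's exact code.)

```cpp
#include <bitset>
#include <climits>
#include <cstddef>

// Plain run-time loops: one over the bytes, one over the bits of each byte.
template <std::size_t N>
void bytesToBitsetRunTime(const unsigned char *data, std::bitset<N> &result) {
    for (std::size_t byte = 0; byte < N / CHAR_BIT; ++byte)
        for (std::size_t bit = 0; bit < CHAR_BIT; ++bit)
            result[byte * CHAR_BIT + bit] = (data[byte] >> bit) & 1;
}
```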
TMP version based on it:
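(Again a sketch; the `BitLoop`/`ByteLoop` names and the wrapper are assumptions. Both loops are unrolled by recursive template instantiation.)

```cpp
#include <bitset>
#include <climits>
#include <cstddef>

// Unroll over the bits of one byte, counting Bit down to 0.
template <std::size_t N, std::size_t Byte, std::size_t Bit>
struct BitLoop {
    static void apply(const unsigned char *data, std::bitset<N> &result) {
        result[Byte * CHAR_BIT + (Bit - 1)] = (data[Byte] >> (Bit - 1)) & 1;
        BitLoop<N, Byte, Bit - 1>::apply(data, result);
    }
};

template <std::size_t N, std::size_t Byte>
struct BitLoop<N, Byte, 0> {   // done with this byte
    static void apply(const unsigned char *, std::bitset<N> &) {}
};

// Unroll over the bytes, counting Byte down to 0.
template <std::size_t N, std::size_t Byte>
struct ByteLoop {
    static void apply(const unsigned char *data, std::bitset<N> &result) {
        BitLoop<N, Byte - 1, CHAR_BIT>::apply(data, result);
        ByteLoop<N, Byte - 1>::apply(data, result);
    }
};

template <std::size_t N>
struct ByteLoop<N, 0> {        // all bytes processed
    static void apply(const unsigned char *, std::bitset<N> &) {}
};

template <std::size_t N>
void bytesToBitsetTMP(const unsigned char *data, std::bitset<N> &result) {
    ByteLoop<N, N / CHAR_BIT>::apply(data, result);
}
```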
Client code:
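(A hypothetical example, assuming the sketches above are in scope; the byte values are arbitrary.)

```cpp
#include <bitset>
#include <iostream>

int main() {
    const unsigned char data[4] = { 0xDE, 0xAD, 0xBE, 0xEF };

    std::bitset<32> bits;
    bytesToBitsetTMP<32>(data, bits);          // TMP version
    // bytesToBitsetRunTime<32>(data, bits);   // or the run-time version

    std::cout << bits << std::endl;
    return 0;
}
```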
Well, let's be honest, I was bored and started to think there had to be a slightly faster way than setting each bit.
This is indeed slightly faster, at least as long as the byte array is smaller than 30 elements (depending on the optimization flags passed to the compiler). With a larger array than that, the time used by shifting the bitset makes setting each bit faster.
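A minimal sketch of the shift-and-OR idea (the function name is an assumption): shift the whole bitset left by one byte, then OR the next byte into the low bits. Each shift touches all N bits, which is why setting individual bits wins for larger arrays.

```cpp
#include <bitset>
#include <climits>
#include <cstddef>

// Insert bytes most significant first: shift the bitset, then OR in the next byte.
template <std::size_t N>
std::bitset<N> bytesToBitsetShifting(const unsigned char *data, std::size_t numBytes) {
    std::bitset<N> result;
    for (std::size_t i = 0; i < numBytes; ++i) {
        result <<= CHAR_BIT;                 // O(N) shift on every iteration
        result |= std::bitset<N>(data[i]);   // byte lands in the low bits
    }
    return result;
}
```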
Something like this? (Not sure if template magic works here as I'd expect. I'm rusty in C++.)
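(A hedged guess at what that template magic could look like; the name is mine. The byte count is deduced from an array reference so the bitset size follows automatically.)

```cpp
#include <bitset>
#include <climits>
#include <cstddef>

// Deduce NumBytes from the array, producing a bitset of NumBytes * CHAR_BIT bits.
template <std::size_t NumBytes>
std::bitset<NumBytes * CHAR_BIT> makeBitset(const unsigned char (&data)[NumBytes]) {
    std::bitset<NumBytes * CHAR_BIT> result;
    for (std::size_t i = 0; i < NumBytes; ++i)
        for (std::size_t bit = 0; bit < CHAR_BIT; ++bit)
            result[i * CHAR_BIT + bit] = (data[i] >> bit) & 1;
    return result;
}

// Usage: unsigned char data[16] = { /* random bytes */ };
//        std::bitset<16 * CHAR_BIT> bits = makeBitset(data);
```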