Treating enums as flags works nicely in C# via the [Flags] attribute, but what's the best way to do this in C++?
For example, I'd like to write:
enum AnimalFlags
{
    HasClaws = 1,
    CanFly = 2,
    EatsFish = 4,
    Endangered = 8
};
seahawk.flags = CanFly | EatsFish | Endangered;
However, I get compiler errors regarding int/enum conversions. Is there a nicer way to express this than blunt casting? Preferably, I don't want to rely on constructs from third-party libraries such as Boost or Qt.
EDIT: As indicated in the answers, I can avoid the compiler error by declaring seahawk.flags as int. However, I'd like to have some mechanism to enforce type safety, so someone can't write seahawk.flags = HasMaximizeButton.
Note that if you are working in a Windows environment, there is a DEFINE_ENUM_FLAG_OPERATORS macro defined in winnt.h that does the job for you. So in this case, you can do this.

Maybe like NS_OPTIONS of Objective-C.
The easiest way to do this is as shown here, using the standard library class std::bitset.
To emulate the C# feature in a type-safe way, you'd have to write a template wrapper around the bitset, replacing the int arguments with an enum given as a type parameter to the template. Something like:
You are confusing objects and collections of objects. Specifically, you are confusing binary flags with sets of binary flags. A proper solution would look like this:
I'd like to elaborate on Uliwitness' answer, fixing his code for C++98 and using the Safe Bool idiom, for lack of the std::underlying_type<> template and the explicit keyword in C++ versions below C++11. I also modified it so that the enum values can be sequential without any explicit assignment, so you can have
You can then get the raw flags value with
Here's the code.
Note (also a bit off topic): another way to make unique flags is to use a bit shift. I, myself, find this easier to read.
The enum can hold values up to the size of an int, so that is, most of the time, 32 flags, and each flag's bit position is clearly reflected in the shift amount.