I have the following enum:
enum Enum: UInt8
{
    case A = 0x00
    case B = 0x01
    case C = 0x10
}
I use the following code to convert it to NSData:
var value = Enum.C
let data = NSData(bytes: &value, length: 1)
Naturally, I expect data to contain 0x10; however, it contains 0x02.
For Enum.A it contains 0x00, and for Enum.B it contains 0x01.
To me, it looks like it stores an index of the value instead of its actual raw data. Could someone explain this behavior?
P.S. If I use rawValue it works perfectly. However, I want to understand the reason behind this behavior, since it prevents me from writing a generic function to convert values to NSData.
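For completeness, the rawValue variant that works looks something like this:

var rawValue = Enum.C.rawValue          // 0x10
let data = NSData(bytes: &rawValue, length: 1)
// data now contains 0x10, as expected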
Each enum case is assigned an ordinal tag value in declaration order, and that tag is what you are getting when you don't use .rawValue. For example, suppose you change your enum to the following (the raw value chosen for the new B2 case is illustrative; any unused value works):
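enum Enum: UInt8
{
    case A  = 0x00
    case B  = 0x01
    case B2 = 0x02  // new case; its raw value is illustrative
    case C  = 0x10
}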
Then, when executing:
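var value = Enum.C
let data = NSData(bytes: &value, length: 1)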
data will contain 3, because the tags are A = 0, B = 1, B2 = 2, C = 3.
The Swift ABI is still a work in progress (it is expected to be finalized with Swift 4). The way enums are represented in memory is described in the ABI document in the Swift repository. Your case is a "c-like enum" because it has two or more cases and none of them carries an associated value. The ABI document states, in essence, that such an enum is laid out as an integer tag with the minimal number of bits required to discriminate all of the cases.
The crucial information here is the "minimal number of bits". This means that, in your case, instances must fit into two bits, since two bits suffice to discriminate three cases. The rawValue 0x10 (binary 10000) would need five bits, which would conflict with the ABI. The compiler probably uses static tables to convert between instances of Enum and their rawValue (and back).
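Conceptually, that table-driven conversion behaves like the following hand-written switch (a sketch of the idea, not actual compiler output; the property name conceptualRawValue is made up):

extension Enum {
    // Sketch: what the generated rawValue conversion conceptually does.
    // The stored representation is only the tag (0, 1, or 2); this switch
    // maps each tag back to its declared raw value.
    var conceptualRawValue: UInt8 {
        switch self {
        case .A: return 0x00
        case .B: return 0x01
        case .C: return 0x10
        }
    }
}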
Here's an example that highlights this characteristic of the ABI (a sketch using MemoryLayout and unsafeBitCast from Swift 3; the Wide enum is added for illustration):
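enum Wide: UInt32 {
    case a = 0
    case b = 1_000_000
}

// An instance stores only the discriminating tag, never the raw value:
print(MemoryLayout<Wide>.size)                // 1, although the raw type is UInt32
print(MemoryLayout<Enum>.size)                // 1

// Reading an instance's memory reveals the tag, not the raw value:
print(unsafeBitCast(Enum.C, to: UInt8.self))  // 2 (the tag), not 0x10
print(Enum.C.rawValue)                        // 16 (0x10), via the generated conversion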