Below is a code block that is supposed to test whether a dictionary is null and, if it isn't, pull out the correct object. However, even though the if check evaluates to false, the code still executes. Is there some quirk in how NSNull works that I don't understand, or is this an Apple bug?
    if (svcUser && !(svcUser == (id)[NSNull null])) {
        return [svcUser objectForKey:@"access_level"];
    }
Console response:
    (lldb) print svcUser && !(svcUser == (id)[NSNull null])
    (bool) $0 = false
    (lldb) continue
    -[NSNull objectForKey:]: unrecognized selector sent to instance 0x2b51678
NSNull is a class, and as with all classes, you should use isEqual: rather than == to see whether two objects represent the same value. Simply check for [svcUser isEqual:[NSNull null]] instead of comparing pointers. This is the approach that Apple mentions in its docs.
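A minimal, self-contained sketch of that check. The response dictionary and the "svcUser"/"access_level" keys here are hypothetical stand-ins for whatever the service actually returns:

```objc
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // Hypothetical service response in which "svcUser" came back as JSON null,
        // which NSJSONSerialization represents as the NSNull singleton.
        NSDictionary *response = @{ @"svcUser" : [NSNull null] };
        id svcUser = response[@"svcUser"];

        // Guard with isEqual: before treating the value as a dictionary.
        if (svcUser && ![svcUser isEqual:[NSNull null]]) {
            NSLog(@"access level: %@", [svcUser objectForKey:@"access_level"]);
        } else {
            // With the hypothetical response above, this branch is taken
            // and the unrecognized-selector crash is avoided.
            NSLog(@"svcUser is missing or null");
        }
    }
    return 0;
}
```

Because [NSNull null] is a singleton, isEqual: and a pointer comparison against it behave the same here, but isEqual: expresses the intent and matches the documented idiom.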