The following shows that "0" is false in JavaScript:
>>> "0" == false
true
>>> false == "0"
true
So why does the following print "ha"?
>>> if ("0") console.log("ha")
ha
Tables displaying the issue: the JavaScript equality tables for === and == (images not reproduced here).
Moral of the story: use ===.
Table generation credit: https://github.com/dorey/JavaScript-Equality-Table
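For instance, a quick sketch you can paste into a browser console (the values here are standard JavaScript behavior, not tied to any library):

// Loose equality (==) coerces its operands before comparing.
console.log("0" == false);        // true
console.log(0 == "");             // true
console.log(null == undefined);   // true

// Strict equality (===) checks type first, then value; no coercion.
console.log("0" === false);       // false
console.log("0" === 0);           // false
console.log("0" === "0");         // true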
The "if" expression tests for truthiness, while the double-equal tests for type-independent equivalency. A string is always truthy, as others here have pointed out. If the double-equal were testing both of its operands for truthiness and then comparing the results, then you'd get the outcome you were intuitively assuming, i.e.
("0" == true) === true
. As Doug Crockford says in his excellent JavaScript: the Good Parts, "the rules by which [== coerces the types of its operands] are complicated and unmemorable.... The lack of transitivity is alarming." It suffices to say that one of the operands is type-coerced to match the other, and that "0" ends up being interpreted as a numeric zero, which is in turn equivalent to false when coerced to boolean (or false is equivalent to zero when coerced to a number).It is all because of the ECMA specs ...
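To make that coercion chain concrete, here is a minimal sketch using the standard Number() and Boolean() conversions; each line just spells out one step:

// Step 1: == converts the boolean operand to a number.
console.log(Number(false));       // 0
// Step 2: it then converts the string operand to a number.
console.log(Number("0"));         // 0
// So "0" == false reduces to 0 == 0, which is true.
console.log("0" == false);        // true

// The if test instead converts its condition to a boolean,
// and any non-empty string is truthy.
console.log(Boolean("0"));        // true
console.log(Boolean(""));         // false
if ("0") console.log("ha");       // prints "ha"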
"0" == false
because of the rules specified here http://ecma262-5.com/ELS5_HTML.htm#Section_11.9.3 ...Andif ('0')
evaluates to true because of the rules specified here http://ecma262-5.com/ELS5_HTML.htm#Section_12.5
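As a rough, simplified sketch of the relevant branches of that algorithm: the looseEquals function below is hypothetical and covers only the string/number/boolean cases at play here; the real Section 11.9.3 algorithm has many more.

// Hypothetical, simplified rendering of ECMA-262 5th ed., Section 11.9.3.
function looseEquals(x, y) {
  if (typeof x === typeof y) return x === y;                      // same type: compare directly
  if (typeof x === "boolean") return looseEquals(Number(x), y);   // step 6: ToNumber(x)
  if (typeof y === "boolean") return looseEquals(x, Number(y));   // step 7: ToNumber(y)
  if (typeof x === "string" && typeof y === "number")
    return looseEquals(Number(x), y);                             // step 5: ToNumber(x)
  if (typeof x === "number" && typeof y === "string")
    return looseEquals(x, Number(y));                             // step 4: ToNumber(y)
  return false;                                                   // other cases omitted
}

console.log(looseEquals("0", false)); // true: false -> 0, then "0" -> 0, then 0 === 0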