Booleans can be declared either as Boolean objects or as Boolean primitive types. To correctly evaluate a value from any of these types, we can safely use the following:
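Namely (as derived in full at the end of this article), JSON.parse combined with the !! double-negation shortcut:

```javascript
!!JSON.parse("true")  // true
!!JSON.parse("false") // false
!!JSON.parse(1)       // true
```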
The long-winded version
Assessing a Boolean can be unnecessarily complicated. But a Boolean is either true or false, isn’t it? Well, yes and no. A Boolean is a data type defined with one of two states: true or false. Casting Boolean literals converts as expected:
Boolean(false) // false
Boolean(true) // true
Similarly, casting integers converts without trouble:
Boolean(0) // false
Boolean(1) // true
But we get into murky waters when dealing with Strings:
Boolean("false") // true
Boolean("true") // true
An empty String evaluates to false when cast as a Boolean, and a non-empty String, with length > 0, evaluates true.
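For example:

```javascript
Boolean("")    // false — empty String
Boolean("abc") // true  — any String with length > 0
```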
This can be problematic if, for example, the concrete type of a provided Boolean is out of our control. ReactJS uses pseudo type checking via its PropTypes, rather than enforcing concrete types.
With Number primitive types, evaluating whether they are truthy or falsey is easy. We can just wrap our value in Boolean(value), or take a convenient shortcut to a Boolean by using the not-not (!!) syntax.
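For instance, with Numbers:

```javascript
!!0   // false
!!42  // true
!!NaN // false — NaN is falsey, like 0
```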
But with a String boolean, is there a better way of evaluating the result than first asserting the type of the primitive and then handling it in the appropriate fashion?
var boo = typeof val === 'string' ? val === 'true' : !!val
Well this works, but it ain’t pretty. Or concise.
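Wrapped in a function (the name toBoolean is my own), the type-check approach behaves as intended:

```javascript
// Handles String booleans explicitly, and falls back to
// plain truthiness for everything else
function toBoolean(val) {
  return typeof val === 'string' ? val === 'true' : !!val;
}

toBoolean("true")  // true
toBoolean("false") // false
toBoolean(1)       // true
toBoolean(0)       // false
```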
The eval() global correctly converts a String "true" to a Boolean. But having been taught that eval() is the root of all evil, I persisted in looking for a better solution.
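For the record, it does work:

```javascript
eval("true")  // true
eval("false") // false
```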
It turns out that JSON.parse("true") also has the desired effect on Strings, but given an Int it returns a Number rather than a Boolean. So simply prefixing with our !! covers all bases.
!!JSON.parse(true) // true
!!JSON.parse(1) // true
!!JSON.parse("true") // true
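One caveat: JSON.parse throws a SyntaxError on input that isn’t valid JSON (an empty String, "yes", undefined, and so on), so a defensive wrapper may be prudent. A minimal sketch, with a function name of my own choosing:

```javascript
// Coerces String booleans, Numbers and Booleans; falls back
// to plain truthiness when the input isn't valid JSON
function parseBoolean(val) {
  try {
    return !!JSON.parse(val);
  } catch (e) {
    return !!val; // "", "yes", undefined etc. won't parse as JSON
  }
}

parseBoolean("true")  // true
parseBoolean("false") // false
parseBoolean(0)       // false
parseBoolean("yes")   // true (falls back to truthiness)
```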