A brief history of Null and Undefined in JavaScript

Stephen Curtis
Jan 13, 2018 · 2 min read


Null and undefined in JavaScript are actually values and types, created to simulate the errors and keywords that serve a similar purpose in other programming languages.

When a variable is `undefined`, or uninitialized, in most programming languages it means that a space in memory has been assigned to a variable name, but the programmer has not yet done anything with that space in memory. Using such a variable usually results in a compile-time error.
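
JavaScript takes a different route: there is no compile-time error, and a declared but unassigned variable (or a property that was never created) simply reads back as `undefined`. A minimal sketch, with illustrative names:

```js
let score;                       // declared, never assigned

console.log(score);              // undefined
console.log(typeof score);       // "undefined"

const player = { name: 'Ada' };
console.log(player.level);       // undefined (the property was never created)
```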

When a variable is `null` in other programming languages, `null` is typically a keyword indicating that the space in memory is a pointer (reference), and that the pointer points to an invalid memory address (usually 0x0). It is usually used when a programmer is done with the value of a variable and wants to purposefully clear it by literally pointing it at nothing.

In JavaScript, `null` and `undefined` are values and types. Just like numbers and characters, `null` has a specific configuration of 1s and 0s that indicates its type is `null` and that its value is `null`. The same goes for `undefined`. These are used in JavaScript to act as placeholders to let the programmer know when a variable has no value.
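
Because they are ordinary values, you can assign them, compare them, and inspect them like anything else. A quick illustration:

```js
let a = null;
let b;                           // implicitly undefined

console.log(a === null);         // true
console.log(b === undefined);    // true
console.log(typeof a);           // "object" (more on this below)
console.log(typeof b);           // "undefined"

// Loose equality treats them as equivalent; strict equality does not.
console.log(a == b);             // true
console.log(a === b);            // false
```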

What's the difference?

Undefined is supposed to mean a variable has no value (or a property does not exist) because the programmer has not yet assigned it a value (or created the property).

Null is supposed to signal that a variable has no value because the programmer purposefully cleared the value and set it to `null`.

So you should only ever set variables to `null`. Never to `undefined`.
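
Put together, the convention looks something like this (the variable names are just illustrative):

```js
let currentUser;                         // undefined: nothing assigned yet

currentUser = { name: 'Ada' };           // now it holds a value

currentUser = null;                      // deliberately cleared by the programmer

console.log(currentUser === null);       // true
console.log(currentUser === undefined);  // false
```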

...UNLESS you are working with existing code, or with a specific API or library that explicitly checks for `undefined`; then, of course, you’d have to make an exception to force the behavior you want.
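
One built-in case of such a check is default function parameters, which are applied only when an argument is `undefined`, never when it is `null`:

```js
function greet(name = 'stranger') {
  return `Hello, ${name}`;
}

console.log(greet());           // "Hello, stranger" (undefined triggers the default)
console.log(greet(undefined));  // "Hello, stranger"
console.log(greet(null));       // "Hello, null" (null counts as a real value)
```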

What about typeof?

But why does `typeof null === 'object'`? Well, all reference types (pointers) in JavaScript are objects. In early JavaScript, `null` was meant to simulate a null pointer (reference), ergo it was hard-coded to return 'object' for its type. At least that’s my best guess; according to Brendan Eich, it’s a bug. I think he just forgot.
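
Whatever the reason, the practical consequence is that `typeof` alone can’t tell `null` apart from a real object, so object checks usually need an explicit null guard:

```js
const value = null;

console.log(typeof value);      // "object" (misleading)
console.log(typeof undefined);  // "undefined"

// The usual guard when you really want "is this a non-null object?"
console.log(value !== null && typeof value === 'object'); // false
```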

