Immutability in React and Redux

Nitish Kumar
8 min read · Dec 29, 2017


According to Redux, you should never mutate the state tree directly. Instead return new state objects by passing the intent (actions) through pure functions (reducers).

Already a mouthful? Let’s take a step back and understand how we compare data structures in JavaScript. Comparing primitive types is straightforward: just put a === between the variables you want to compare. If the type and value match, they are equal.

var x = 5,
y = 5
x === y // true

It is a bit different in case of objects though.

var object1 = { x: 1, y: 2 },
object2 = { x: 1, y: 2 }
object1 === object2 // false

But they look the same. Why are they not equal? 🤔 When you think about equality of things, there are in fact two separate questions:

  • does a thing mean the same as the other thing?
  • is a thing exactly the same thing as the other thing?

For the above objects, the keys and values are the same, but since they are initialised separately, they are not the same in terms of reference. The result would have been different had they been pointing to the same address.

var object1 = { x: 1, y: 2 }
var object2 = object1
object1 === object2 // true

This has some interesting consequences though.

object1.x = 12
object1.x // 12
object2.x // 12

In this case, we assigned a new value to the x key of object1. But since object1 and object2 point to the same reference, the change is visible through both variables. Every complex data structure in JavaScript follows reference equality. This includes arrays and objects. In fact, arrays are objects too: typeof([1,2,3]) // "object"

Comparing objects based on values

Value equality answers the question: Does a thing mean the same as the other thing?

var object1 = { x: 1, y: 2 }
var object2 = { x: 1, y: 2 }
valueEqual(object1, object2) === true

Implementing such equality is harder for nested data structures. Objects can have arbitrary keys and values, and can contain other objects within them. To implement value equality for objects, you’d need to follow an algorithm like this:

// Input: object1 and object2
// Output: true if object1 is equal in terms of values to object2

valueEqual(object1, object2):
  object1keys = <list of keys of object1>
  object2keys = <list of keys of object2>

  return false if length(object1keys) != length(object2keys)

  for each key in object1keys:
    return false if key not in object2keys
    return false if typeof(object1[key]) != typeof(object2[key])

    if object1[key] is an object:
      return false if valueEqual(object1[key], object2[key]) == false

    if object1[key] is a primitive:
      return false if object1[key] != object2[key]

  return true

For deeply nested structures, it can make thousands of equality checks to compare two objects. Such equality checks are commonly called deep equality checks.
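A direct JavaScript translation of the pseudocode above might look like the sketch below (valueEqual is our own helper here, not a library function):

```javascript
function valueEqual(object1, object2) {
  const keys1 = Object.keys(object1)
  const keys2 = Object.keys(object2)

  // Different number of keys: cannot be value-equal
  if (keys1.length !== keys2.length) return false

  for (const key of keys1) {
    if (!keys2.includes(key)) return false

    const v1 = object1[key]
    const v2 = object2[key]
    if (typeof v1 !== typeof v2) return false

    if (v1 !== null && typeof v1 === 'object') {
      // Recurse into nested objects (and arrays, which are objects too)
      if (v2 === null || !valueEqual(v1, v2)) return false
    } else if (v1 !== v2) {
      return false
    }
  }
  return true
}

valueEqual({ x: 1, y: { z: 2 } }, { x: 1, y: { z: 2 } }) // true
valueEqual({ x: 1 }, { x: 2 })                           // false
```

Note how every nested object triggers another full walk of its keys — this is exactly where the thousands of checks come from.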

Get rid of mutations

Whenever your code is about to mutate an object, don’t. Instead, create a changed copy of it.

You can achieve this in a number of ways. Here are a few:

  • Object.assign({}, …)
  • [].concat
  • Array.prototype.slice()
  • … operator (spread)
  • External libraries (ImmutableJS, immutability-helper, icepick, etc.)
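A quick look at what the built-in options from that list look like in practice:

```javascript
const state = { count: 1, items: [1, 2] }

// Object.assign: copy into a fresh object, then override keys
const s1 = Object.assign({}, state, { count: 2 })

// Spread operator: same idea, terser syntax
const s2 = { ...state, count: 2 }

// concat returns a new array instead of pushing in place
const items1 = state.items.concat(3)

// slice copies a range without touching the original
const items2 = state.items.slice(0, 1)

console.log(state.count)        // 1 — original untouched
console.log(state.items.length) // 2 — original untouched
```

In every case the original object survives unchanged, and the result is a new reference.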

Using ImmutableJS

From the docs:

Immutable.js provides many Persistent Immutable data structures including: List, Stack, Map, OrderedMap, Set, OrderedSet and Record. These data structures are highly efficient on modern JavaScript VMs by using structural sharing via hash maps tries and vector tries as popularised by Clojure and Scala, minimising the need to copy or cache data.

Seems like something we could use. But let’s weigh some pros and cons first.

Pros

  • Guaranteed immutability — data can’t be mutated accidentally
  • Performant for large collections, thanks to structural sharing
  • Rich API with many utility methods

Cons

  • Difficult to interoperate with. For example, instead of myObj.prop1.prop2.prop3, you would use ImmutableMap.getIn(['prop1', 'prop2', 'prop3'])
  • Can spread throughout the codebase, as the codebase must know what is, and what is not, an Immutable.JS object
  • Difficult to debug (dev tools are available though)
  • No destructuring or spread operator support (a drawback only if you use them)
  • Not suitable for small values that change often

There are other really good immutability helpers out there (immutability-helper, icepick, and friends). They have their own benefits and trade-offs:

Pros

  • Seamless interoperability with JavaScript
  • Suitable for small and simple JavaScript objects
  • Easy to debug

Cons

  • Performance issues for big datasets
  • The APIs may take time to get familiar with, and code gets harder to read when updating or deleting values using array-like path syntax

Can we create a pure JS library of our own where immutable operations look like mutable ones? This might sound familiar if you use async/await to perform asynchronous actions with synchronous-looking code, or if you know goroutines in Go. Ok, coming back to the topic: yes, we can try to create something. But before that, I would like to discuss the importance of immutability in a React application.

Immutability and ReactJS

  • When state changes, a React component re-renders. React can’t assume anything about your state. That’s why setting state always re-renders the component — even if it’s not necessary at all, as React doesn’t deeply compare props/state by default. (The implementation is different for Pure components though)
  • For a complicated state (nested objects), you’d be forced to make a deep equality check, because when objects are compared using reference equality you can’t be sure whether the next state has changed.

If you want to optimise React to perform updates only when necessary, you would need to make hundreds of value equality checks if your state is huge.

shouldComponentUpdate

It is a lifecycle optimisation method that allows your component to exit the update lifecycle when there is no reason to apply a new render.

  • Out of the box, shouldComponentUpdate() is a no-op that returns true
  • Implementation for Pure React components
/**
 * Performs equality by iterating through keys on an object and returning false
 * when any key has values which are not strictly equal between the arguments.
 * Returns true when the values of all keys are strictly equal.
 */
function shallowEqual(objA: mixed, objB: mixed): boolean {
  if (objA === objB) {
    return true
  }

  if (typeof objA !== 'object' || objA === null ||
      typeof objB !== 'object' || objB === null) {
    return false
  }

  var keysA = Object.keys(objA)
  var keysB = Object.keys(objB)

  if (keysA.length !== keysB.length) {
    return false
  }

  // Test for A's keys different from B
  var bHasOwnProperty = Object.prototype.hasOwnProperty.bind(objB)
  for (var i = 0; i < keysA.length; i++) {
    if (!bHasOwnProperty(keysA[i]) ||
        objA[keysA[i]] !== objB[keysA[i]]) return false
  }

  return true
}

function shallowCompare(instance, nextProps, nextState) {
  return (!shallowEqual(instance.props, nextProps) ||
          !shallowEqual(instance.state, nextState))
}

function shouldComponentUpdate(nextProps, nextState) {
  return shallowCompare(this, nextProps, nextState)
}

Performing deep equality check within shouldComponentUpdate is not a great idea in case you have a large enough state and props to compare. Sometimes re-rendering can be faster than comparing and escaping re-renders.

Shallow comparison is easy on the eyes, but it uses === to compare objects. If the current state is mutated in place, it will return true because the reference didn’t change, resulting in no re-render.

This is why Redux requires pure methods for reducers. If you need to change nested data you have to clone the objects and make sure a new instance is always returned. This allows for shallowCompare() to see the change and update the component. Usage of ImmutableJS and friends enforces immutability at any level in the object. We can leverage the modest === within shouldComponentUpdate() and have it verify that props and state have changed.
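To see why mutation defeats shallow comparison, consider this minimal sketch (a simplified stand-in for React’s check, not React itself):

```javascript
// Simplified shallow equality: compares top-level keys with ===
function shallowEqual(objA, objB) {
  if (objA === objB) return true
  const keysA = Object.keys(objA)
  const keysB = Object.keys(objB)
  if (keysA.length !== keysB.length) return false
  return keysA.every(key => objA[key] === objB[key])
}

const state = { user: { name: 'Ann' } }

// Mutation: the "new" state is the same reference as the old one
const mutated = state
mutated.user.name = 'Bob'
shallowEqual(state, mutated) // true — change goes unnoticed, no re-render

// Immutable update: new references at every changed level
const updated = { ...state, user: { ...state.user, name: 'Cid' } }
shallowEqual(state, updated) // false — the change is visible, re-render happens
```

The mutation is invisible precisely because === only sees references; a fresh object at each changed level is what makes the cheap check reliable.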

Going back to Redux docs

The key to updating nested data is that every level of nesting must be copied and updated appropriately.

  • Mistake 1 — New variables that point to the same objects
  • Mistake 2 — Only making a shallow copy of one level
  • Correct Approach — Copying All Levels of Nested Data
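Both mistakes are easy to reproduce (the state shape here is hypothetical, for illustration):

```javascript
const state = { first: { second: { value: 1 } } }

// Mistake 1 — a new variable still points to the same object
const alias = state
alias.first === state.first // true: nothing was copied

// Mistake 2 — a shallow copy only clones the top level
const shallow = { ...state }
shallow !== state             // true: the top level is new…
shallow.first === state.first // …but nested objects are still shared!

// Correct — copy every level along the path being updated
const correct = {
  ...state,
  first: { ...state.first, second: { ...state.first.second, value: 2 } }
}
correct.first.second === state.first.second // false: fully separated
state.first.second.value                    // 1 — original untouched
```

With Mistake 2, writing to shallow.first.second would silently mutate the original state — the bug shallow copies are notorious for.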

The process of correctly applying immutable updates to deeply nested state can easily become verbose and hard to read. Here’s what an example of updating state.first.second[someId].fourth might look like:

function updateVeryNestedField(state, action) {
  return {
    ...state,
    first: {
      ...state.first,
      second: {
        ...state.first.second,
        [action.someId]: {
          ...state.first.second[action.someId],
          fourth: action.someValue
        }
      }
    }
  }
}

Coming back to creating an immutability helper

The issue we had with some of the helpers out there was the not-so-obvious API. We wanted something mutable-looking but immutable 😎. Lodash provides a method called toPath which converts a property path string into an array of keys.

import { toPath } from 'lodash'

toPath('a.b.c')    // => ['a', 'b', 'c']
toPath('a[0].b.c') // => ['a', '0', 'b', 'c']

If we can leverage this and create a package which can help us implement the above requirement in the below manner, it would be a good start.

For the bundle-size purists, Lodash’s impact on bundle size can be aggressively controlled using babel-plugin-lodash and lodash-webpack-plugin.

import { setIn } from "some-package"

function updateVeryNestedField(state, action) {
  const newState = setIn(
    state,
    `first.second[${action.someId}].fourth`,
    action.someValue
  )
  console.log(state === newState) // false
  return newState
}

import { deleteIn } from "some-package"

function deleteVeryNestedField(state, action) {
  const newState = deleteIn(
    state,
    `first.second[${action.someId}].fourth`
  )
  console.log(state === newState) // false
  return newState
}

Let’s try to design the setIn method.

As setIn expects a state (which can be an Object or an Array), a path string and a value, the signatures can look like this:

Flow is used to add type annotations

const setIn = (state: Object | Array<*>, field: string, value: any) =>
  setInWithPathArray(state, value, toPath(field), 0)

const setInWithPathArray = (
  state: Object | Array<*>,
  value: any,
  path: string[],
  pathIndex: number
): Object | Array<*>

setInWithPathArray takes the path values as an array and the current path index, which are used to traverse the object and reach the final node that needs to be updated. We can use a recursive approach: take the head of the path and call the function again with the next object (looked up via the first key) and the remaining path. The object to be returned is built by spreading the current object and setting the key being updated to the state returned by the recursive call.

if (pathIndex >= path.length) {
  return value
}

const first = path[pathIndex]
const firstState = state &&
  (Array.isArray(state) ? state[Number(first)] : state[first])
const next = setInWithPathArray(firstState, value, path, pathIndex + 1)
return { ...state, [first]: next }

There are a few edge cases and early returns that need to be handled. Other basic methods like deleteIn and getIn can be implemented following a similar approach. For brevity, I won’t explain them here.
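Putting the pieces together, a self-contained sketch of setIn might look like this. The toPath here is a simplified stand-in for Lodash’s implementation, and the handling of missing intermediate nodes and arrays is one possible choice, not the library’s exact behaviour:

```javascript
// Simplified stand-in for lodash's toPath: 'a[0].b' -> ['a', '0', 'b']
function toPath(field) {
  return field.replace(/\[(\w+)\]/g, '.$1').split('.').filter(Boolean)
}

function setInWithPathArray(state, value, path, pathIndex) {
  // Past the end of the path: this is the node to replace
  if (pathIndex >= path.length) {
    return value
  }

  const first = path[pathIndex]
  const firstState = state &&
    (Array.isArray(state) ? state[Number(first)] : state[first])
  const next = setInWithPathArray(firstState, value, path, pathIndex + 1)

  if (!state) {
    // Missing intermediate node: create an array for numeric keys, else an object
    if (/^\d+$/.test(first)) {
      const array = []
      array[Number(first)] = next
      return array
    }
    return { [first]: next }
  }

  if (Array.isArray(state)) {
    // Copy the array so the original is never mutated
    const copy = [...state]
    copy[Number(first)] = next
    return copy
  }

  // Copy the object, overriding only the key on the path
  return { ...state, [first]: next }
}

const setIn = (state, field, value) =>
  setInWithPathArray(state, value, toPath(field), 0)

const state = { first: { second: { a: { fourth: 1 } } } }
const newState = setIn(state, 'first.second[a].fourth', 2)
console.log(newState === state)               // false — new reference
console.log(newState.first.second.a.fourth)   // 2
console.log(state.first.second.a.fourth)      // 1 — original untouched
```

Every object along the path gets a fresh copy, so shallow comparison at any level sees the change, while untouched branches keep their old references.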

If you feel inspired, an immutability helper that solves your use case is not that hard to make. If you’re feeling lazy instead, take a look at the library I published:

redux-immutable-ops

API

  • setIn(state: Object | Array<*>, path: string, value: any): Object | Array<*>
  • getIn(state: Object | Array<*>, path: string): any
  • deleteIn(state: Object | Array<*>, path: string): ?(Object | Array<*>)
  • deleteInRecursive(state: Object | Array<*>, path: string): ?(Object | Array<*>)
  • pop(array: Array<*>): Array<*>
  • other array utilities

If the library helped you, please 🌟 the repository. If the article helped you, make sure to clap👏 , follow me on twitter, and share with your friends!


Nitish Kumar

Full-Stack Engineer at @nutanix | Red Devils | CrossFitter