Should I use ImmutableJS with Redux?

Redux is a state management library that often gets paired with React. What’s less talked about is how ImmutableJS can benefit Redux — and even React. I’ll focus on the pros and cons of using Immutable first and then I’ll go into how to integrate it into a Redux/React project in another article.

The Good

  • Gets rid of unnecessary data copying. Vanilla Redux reducers use copying to achieve functional purity, often in the form of something like newState = Object.assign({}, oldState, { newThing: true });. While copying may seem innocuous, it can significantly increase the amount of memory the app needs at runtime. (Note: shallow copying via Object.assign() only copies the top level of the object, so nested objects aren’t copied. However, shallow copies don’t isolate nested objects from accidental mutation.) Luckily for us, the javascript engine’s garbage collector will eventually de-allocate those copies, but you can see how this becomes problematic as the application scales. Immutable data structures give the same safety benefits as deep copying while only creating new nodes for the parts of the structure that changed, sharing the rest.
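To make that shallow-copy caveat concrete, here’s a minimal sketch (the state shape is made up for illustration) showing how a nested object survives Object.assign() by reference:

```javascript
// Hypothetical state shape, just for illustration.
const oldState = { user: { name: 'Ada' }, loading: true };

// Object.assign() copies only the top level...
const newState = Object.assign({}, oldState, { loading: false });

// ...so the nested object is shared by reference, not copied.
newState.user.name = 'Grace';

console.log(oldState.user.name); // 'Grace', the "old" state was mutated too
console.log(oldState.loading);   // true, top-level keys were copied safely
```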
  • Efficient and effective change detection. Normally, if you test javascript object equality using the triple equals operator oldObject === newObject you’re only testing whether the object reference changed, which says nothing about whether the actual nested values are the same. In contrast, ImmutableJS updates the object references to reflect the changes made to the nested values. That means oldObjectImmutable !== newObjectImmutable really indicates that one or more of the nested values are not equal — and oldObjectImmutable === newObjectImmutable means the values are equal. The only reliable way to do deep change detection in native javascript is through expensive methods like calling JSON.stringify() on both objects and performing a string comparison.
  • Enforces immutability in the language versus achieving it by convention. Redux requires that reducers never mutate the previous state object, but this is only by convention. There is nothing in the language or library preventing a developer from accidentally doing so. You can implement various processes and tools to help mitigate unintentional mutation, like unit tests, thorough peer review, and even static analysis, yet wouldn’t it be nice to ensure that never happens in the code itself? With ImmutableJS it’s impossible to mess up. Every operation returns a new collection — collection is the base structure that maps, lists, records, etc. inherit from — that is completely isolated from the original.
  • Provides convenient ways to modify deeply nested properties. Updating the state tree at the top level is pretty easy with Object.assign(), but nested updates are more complicated and error-prone. Usually, it’s just easier to create a potentially expensive deep copy of the state, mutate the copy, and return it at the end of the reducer function. That doesn’t sit well with me. Immutable provides .setIn() and .getIn() methods that let you walk down the tree to the exact node you’re trying to fetch or update. What’s great is ImmutableJS always returns a relatively cheap copy of the entire collection, so you can chain the updates in a pretty clean way — once you get past the string usage, which I present as a downside below.
[SOME_ACTION]: (state, action) =>
  state
    .set('loading', false)
    .setIn(['down', 'we', 'go'], action.payload.value) // value elided in the original snippet
    .setIn(['this', 'is', 'easy'], action.payload.response),
  • Can be slowly introduced into an existing codebase*. I’m putting an asterisk next to this because with Redux this is trivial, but with React not necessarily so — for reasons I’ll get to down below. With Redux, all you need to do is initialize the state to an Immutable record or map — preferably a record.
import { handleActions } from 'redux-actions';
import Immutable from 'immutable';

import {
  SHOW_MODAL_DIALOG,
} from './modalDialogActions.js';

const ModalDialogRecord = new Immutable.Record({
  show: false,
  titleText: '',
  bodyComponent: undefined,
  busy: false,
});

const initialState = new ModalDialogRecord();

const actions = {
  [SHOW_MODAL_DIALOG]: (state, action) =>
    state.set('show', true),
};

export default handleActions(actions, initialState);

The Bad

  • Increases the intellectual load required for developers to implement a feature. This is real. As frontend developers we already have to juggle dozens (hundreds?!) of libraries and tools. Do we need to add another one to bring on complete analysis paralysis? I used to latch onto the latest and greatest <insert new library>.js, but after a while it starts to become untenable to maintain. That’s why the benefit of the new shiny thing has to outweigh the cost of managing lots of shiny things.
  • Syntax for getting and setting is kind of ugly. To access different properties in an Immutable map you need to use strings for all the keys. If one of them is mistyped, undefined is returned. Trying to create string constants for all keys seems impractical as well. If you buy into using Immutable records then this is mitigated, because records support dot notation to access properties — and inherit all the methods and properties of maps. But let’s say you need to get a deeply nested value in a list of records. You’re probably gonna use strings anyway: state.getIn(['aListOfRecords', 0, 'someProp'], 7). So, in order to use Immutable, you kinda have to accept the string syntax as a reality. (Note: with a static typing tool like Flow or TypeScript, the keys that are referenced using the string syntax are validated.)
  • A little more cumbersome to debug code. The main contributor here is that Immutable structures wrap the values they store in extra internal structure. So when you’re debugging and hit a breakpoint, the values aren’t immediately readable. You have to call .toJS() in the console to see them. Now, there are plugins and tools to overcome this, but they add more intellectual load and you don’t always have access to them.
  • Often incompatible with existing React components without some refactoring. Any component that was dealing with native javascript structures will probably need to be refactored to some degree. Using Immutable records mitigates the refactor effort a lot, but you can’t get around Immutable lists being different. Thankfully, Immutable embraces ES6 conventions, so if you’re already following them then this isn’t too bad. If you use libraries like Lodash, the refactor effort is greater, but still tractable. Usually, it’s just stuff like this:
// Lodash
const foundSomething = _.find(stuffInArray, o => o.id === someId);
// versus
// ImmutableJS/ES6
const foundSomething = stuffInList.find(o => o.id === someId);
  • May not have significant performance improvement and can actually slow down the application in some cases. Remember when I was saying copying is a bad thing? Well, it doesn’t really have a perceivable impact if the size of the data structure is small. This may be most of the cases you’re working with. So is the overhead of Immutable worth it? Probably not. Additionally, there will be cases when you’re forced into using .toJS() — this should not be the norm however — to make it compatible with another library or something. Using .toJS() is quite expensive and will be way slower than copying.

Honestly, most downsides of ImmutableJS could be overcome by building immutable data structures into the javascript language. If there were easy ways to log values and have most libraries handle immutable data then there wouldn’t be any mental context switching. (There are times when purely functional transpile-to-js languages look tempting, but there are still issues with compatibility and support.) Frontend engineers are doomed to live a life of constant, controlled chaos.

At work, we have a large enterprise web app that manages a lot of state and scalability was a big concern for us. We also wanted the ability to optimize the performance of rendering components using cheap change detection. That’s why we chose ImmutableJS, but it may not be right for you and your team.

Edit: Added more clarification on data copying.

Edit: Provided correction and clarification of the effects of shallow copying via Object.assign().