How to use React Suspense and Time Slicing to visualize large datasets

Make your slow apps better

Plus, it smells like React 17 is coming at React Conf in October 🤞

Time to learn what the new React Suspense and Time Slicing features are all about!

You can listen to this email 🎧

React Suspense and Time Slicing are supposed to be all about optimizing heavy workloads for slow devices. Slow networks, bad CPUs, keeping things smooth and working well.

What better way to test if that’s true than a visualization of 199,222 San Francisco bike share rides from July 2018? It’s a public dataset, it’s got a lot of data, rendering all those DOM nodes is gonna make even your $3500 MacBook Pro cry.

👌

PS: A more in-depth version of this will be a new chapter in React + D3 2018, which you can still preorder

NOTE: This is a cross-post from my newsletter. I publish each email two weeks after it’s sent. Subscribe to get more content like this earlier right in your inbox! 💌

There’s a lot of data in there, but I figured two dimensions are all we need: trip duration and rider birth year.

Older riders take shorter rides?

Eh, not a strong correlation. But you can see that younger riders take more rides. Maybe because San Francisco skews young? Median age is 36 according to Google.

Hard to say. Not the question we care about anyway.

Our question is this: Can your computer and browser handle that scatterplot?

Yes! Here’s how 👇

Don’t believe me? Try it: https://dist-exhowcijhf.now.sh/

The visualization uses a few tricks:

  1. Use React Suspense to show a loading screen while 40MB of CSV data downloads to the browser
  2. Split the dataset into 4 shards and display them as overlapping scatterplots to avoid one huge component
  3. Use async rendering to progressively display more and more data

Why and how React Suspense?

The first new feature is React Suspense. Its goal is to help you build interfaces that work well on slow networks.

You can think of it as an easy way to build loading states into your components. Promise didn’t return fast enough? Okay, let’s show a loading state while we wait.

<React.Placeholder
  delayMs={500}
  fallback={<div>🌀 Loading like 40 megs of CSV</div>}
>
  <LazyViz />
</React.Placeholder>

React.Placeholder is the placeholder component you've always wanted. Now built right into React itself.

No more messing about with componentDidMount and convoluted loading procedures or complicated Redux shenanigans.
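For contrast, here’s roughly what that old dance looks like. This is a sketch from memory, not code from the demo; the DatavizContainer name, the file path, and the shape of the loading check are mine:

// Sketch: the manual loading-state pattern that Placeholder replaces
class DatavizContainer extends React.Component {
  state = { data: null };

  componentDidMount() {
    // kick off the fetch by hand and stash the result in state
    d3.csv("/my-data.csv", d => d).then(data => this.setState({ data }));
  }

  render() {
    // roll your own loading state
    if (!this.state.data) return <div>🌀 Loading…</div>;

    return <Dataviz data={this.state.data} />;
  }
}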

The final API might still change. We’ll see. You can read the Placeholder snippet above as “Dear Placeholder, if LazyViz doesn’t resolve in 500 milliseconds, render the fallback.”

delayMs specifies how long you wanna wait, fallback is what to render while you’re waiting, and <LazyViz> is a React component that relies on a Promise to resolve before it can render.

That’s the fun new part. Components that suspend rendering while they wait for a Promise to resolve.
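The mechanism, as far as I understand it, is that reading data that isn’t ready yet throws the pending Promise, React catches it, and the nearest Placeholder shows its fallback until the Promise resolves. Conceptually it’s something like this (not the real simple-cache-provider code):

// Conceptual sketch of a suspending read, not the real implementation
function read(cache, key, loader) {
  if (cache.has(key)) {
    // data already loaded: return it and let rendering continue
    return cache.get(key);
  }

  // data not ready: throw the Promise so React pauses this subtree
  // and the nearest Placeholder renders its fallback in the meantime
  throw loader(key).then(value => cache.set(key, value));
}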

Right now the best (only?) way to try it out is using Dan Abramov’s experimental simple-cache-provider package. It works as a cache of loadable resources so that subsequent requests render faster than initial requests. I think.

I’m sure more implementations and cool ideas are gonna come of this soon.

For now, here’s how you can use it when loading data for your visualization 👇

import React from "react";
import * as d3 from "d3";
import { createCache, createResource } from "simple-cache-provider";

// one cache instance that simple-cache-provider resources read from
const cache = createCache();

const getData = createResource(
  () =>
    d3
      .csv(
        "https://s3-us-west-1.amazonaws.com/swizec-datasets/201807-fordgobike-tripdata.csv",
        d => ({
          duration_sec: Number(d.duration_sec),
          member_birth_year: Number(d.member_birth_year)
        })
      )
      .then(data => props => <Dataviz data={data} {...props} />),
  key => key
);

const LazyViz = props => {
  return getData.read(cache)(props);
};

We create a new resource called getData. The createResource function takes a loading function, which is a d3.csv call in our case, and a hashing function. Ours returns the key unchanged, which seems pointless, but simple-cache-provider doesn't work without it. ¯\_(ツ)_/¯

Our loader is meant to return a Promise, which resolves with a React component.

So we take the loaded data and return a functional component that takes props and returns our main Dataviz component. That's this part 👇

data => props => <Dataviz data={data} {...props} />

The key is to pass data into our visualization component, not try to load it within a componentDidMount lifecycle. Separating loading and presentation like this is a good idea anyway.
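In practice that means the presentational layer can be a dumb component that renders whatever data it’s handed. A sketch of what that looks like; the Scatterplot name and the x/y scale props are mine, not from the demo:

// Sketch: a purely presentational scatterplot that just renders props.data
const Scatterplot = ({ data, x, y }) => (
  <g>
    {data.map((d, i) => (
      <circle
        key={i}
        cx={x(d.member_birth_year)}
        cy={y(d.duration_sec)}
        r={1}
      />
    ))}
  </g>
);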

Now, to use all this in a React.Placeholder, we have to wrap it in another functional component called LazyViz.

const LazyViz = props => {
  return getData.read(cache)(props);
};

It calls getData.read(cache), which kicks off loading on the first render and reads from the instantiated cache after that, and it passes any props into the component the Promise resolved with.

All this is a little convoluted, and it took me a while to grok, and I’m still not sure I fully get it.

Fundamentally, it means you have a <Dataviz> component that takes data and renders stuff. You can load that data any way you want, but if you wrap it in Promises and this cache machinery, React knows how to handle that for you via React.Placeholder.
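Wired together, the whole Suspense side of the experiment boils down to something like this (a sketch based on the snippets above; the App wrapper and the width/height props are my own):

// Sketch: the pieces from above, assembled
const App = () => (
  <React.Placeholder
    delayMs={500}
    fallback={<div>🌀 Loading like 40 megs of CSV</div>}
  >
    <LazyViz width={800} height={600} />
  </React.Placeholder>
);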

Why and how Time Slicing?

After we’ve loaded our data, it’s time to render it all in one big scatterplot because scatterplots are easy.

The problem with scatterplots, however, is that when you have a lot of data, your browser starts struggling. A few hundred thousand DOM nodes on your page is… hard.

In my experiments, it could take up to a few minutes to render the whole dataset. That’s if the browser tab didn’t just crash instead.

Our problem is twofold:

  1. Rendering many nodes at once is slow
  2. We block the UI thread while rendering, making the whole page feel slow

We can solve both those problems with Time Slicing! 🤘

Time Slicing lets us move React rendering work into the background. Async renders.

Don’t worry, you’re not taking over the job of React’s scheduler. You’re just letting it know that, hey, this stuff isn’t so important, you can do it whenever. The scheduler will figure out the rest.

I sharded the dataset into 4 shards and displayed them using overlapping scatterplots. This makes each React component a little smaller and seems to improve performance.
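The sharding code itself isn’t in this email, so here’s a sketch of how you could split the data and stack the scatterplots; the modulo split, the Shards wrapper, and the reuse of the Scatterplot sketch from above are my own choices:

// Sketch: split the dataset into 4 shards by index, then stack one
// scatterplot per shard in the same SVG so they overlap visually
const makeShards = data =>
  d3.range(4).map(i => data.filter((_, j) => j % 4 === i));

const Shards = ({ data, x, y, N }) => (
  <svg width={800} height={600}>
    {makeShards(data).map((shard, i) => (
      <Scatterplot key={i} data={shard.slice(0, N)} x={x} y={y} />
    ))}
  </svg>
);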

But the time slicing part is this 👇

class Dataviz extends React.PureComponent {
  componentDidMount() {
    this.showMoreData();
  }

  componentDidUpdate() {
    this.showMoreData();
  }

  showMoreData = () => {
    const { N, chunkSize } = this.state;

    if (N < chunkSize) {
      requestAnimationFrame(() => this.setState({ N: N + 1000 }));
    }
  };

  // render() draws the first N datapoints; omitted here
}

showMoreData uses this.setState to increase the amount of data we're showing in increments of 1000. The bit that creates time slicing is that this.setState is wrapped in a requestAnimationFrame call, which makes it async.

I also experimented with requestIdleCallback, which is what the React team recommends, but that makes the visualization slower. Not sure why.
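If you want to reproduce that experiment, it’s a one-callback swap inside the Dataviz class above; a sketch of the slower variant, not code from the demo:

// Sketch: same showMoreData, scheduled with requestIdleCallback instead
// (this is the variant that turned out slower in my tests)
showMoreData = () => {
  const { N, chunkSize } = this.state;

  if (N < chunkSize) {
    requestIdleCallback(() => this.setState({ N: N + 1000 }));
  }
};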

Calling showMoreData in componentDidUpdate creates a loop that keeps running as long as showMoreData triggers updates. Calling it in componentDidMount starts the process.
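For that loop to start and eventually stop, the component needs initial values for N and chunkSize. Those aren’t shown above, so here’s a hedged guess at the setup; the starting values are mine:

// Sketch: the state the showMoreData loop relies on
// N starts at 0 and grows by 1000 per animation frame; chunkSize caps it
state = {
  N: 0,
  chunkSize: this.props.data.length
};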

If you look at the animation, you can see that progress slows down as more and more nodes are added to the screen. But the UI never blocks 💪

You can also try it live.

For good measure, I also wrapped the entire app in <React.unstable_AsyncMode>, but I have no evidence that changed anything. Async time sliced updates seemed to work just as well without it.

Try it yourself

The best way to play with React Suspense and Time Slicing right now is to use @palmerhq’s react-suspense-starter. It gives you a basic Parcel setup and a future version of React.

Start messing with the code and see what happens.

You can also look at the full code of my experiment on GitHub.

In other news…

Progress on the new React + D3 2018 edition has been… good? It’s hard to say. Time slips between my fingers a lot.

Like, a lot. How is it September already? Wasn’t it just July? O.o

But the research part is going well. I promised we’d cover new React features, and here they are. The VR stuff is not quite there yet, and I have to tweak some React Native examples.

Also gotta update the canvas and Konva example because that API changed, and I think I want to add a section on using Gatsby for server-side rendering. 🤔

A lot of work ahead, but once the research is done, the writing is fast. You’ll start getting the new content soon :)

In the meantime, here’s a fun little story about debugging email layouts where none of your usual tools can do the job 👇

The most frustrating debugging experience I’ve had all year

Cheers,

~Swizec


P.S. If you like this, make sure to subscribe, follow me on Twitter, buy me lunch, and share this with your friends 😀
