Code Cracked for Code-Splitting + SSR in Reactlandia: React Universal Component + Webpack Flush Chunks and more

James Gillmore
Published in Reactlandia · 13 min read · Jun 8, 2017
webpack-flush-chunks: use this shit!

The code has been cracked for a long time now for server-side rendering and code-splitting individually. Until now, bar Next.js, there has been no easy-to-use common abstraction to achieve both simultaneously for the greater NPM community. This article presents several packages I’ve worked on for a long time that I hope will be the final simple + idiomatic solution to this problem.

import ReactDOMServer from 'react-dom/server'
import { flushModuleIds } from 'react-universal-component/server'
import flushChunks from 'webpack-flush-chunks'

// render the app, then cross-reference the module IDs that were rendered
// with your webpack stats (`webpackStats` comes from your build or dev middleware)
const app = ReactDOMServer.renderToString(<App />)
const { js, styles } = flushChunks(webpackStats, {
  moduleIds: flushModuleIds()
})

res.send(`
  <!doctype html>
  <html>
    <head>
      ${styles}
    </head>
    <body>
      <div id="root">${app}</div>
      ${js}
    </body>
  </html>
`)

UPDATE (July 26th): the official demo repo is now https://github.com/faceyspacey/flush-chunks-boilerplate-webpack-chunknames. We will continue to support flushing module IDs, but will focus on flushing chunk names.

THE STORY

About two months ago, James Kyle laid the groundwork for what needed to be done in his now-famous article ending with “use this shit,” which even inspired Arunoda’s “dynamic” component contribution to Next.js 3.0. And so React Loadable was born.

React Loadable ultimately offered a solution that many had probably built something similar to in one way or another, and which had the same shortcoming: it only solved the problem of having an async client-side component. What mattered most was James Kyle’s discovery near the end of the article: even though you have an async component, you need it to render both asynchronously and synchronously, and buffering/flushing the IDs of modules that were synchronously rendered on the server can tell you which initial chunks to send to the client.

As such, it turned out that the asynchronous aspect in fact was the easier task.

Rendering synchronously on the client, as you did on the server, requires lots of labor.

You have to render synchronously on the server (in either a Babel or Webpack server environment) AND synchronously again on the client on initial load of the page, so that React checksums match what was received from the server and an additional render doesn’t occur (and so you don’t lose precious milliseconds, or seconds, loading modules asynchronously). Even this turns out to be just the beginning of the problem. What about actually discovering which chunks the module IDs or paths correspond to? What about serving them? What about stylesheets? Are you just going to always render your stylesheet as one big main.css instead of breaking it up into chunks like your JS, OR reach for the new and exciting CSS-in-JS tools, which come with their own caveats: they don’t generate static cacheable sheets and they eat up cycles at render time on both the client and the server? What about HMR? The list goes on. To do code splitting plus SSR right requires checking off a laundry list of things, with ease, and more importantly: idiomatically.
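To make that concrete, here is a simplified sketch of the trick such components use under the hood (the file path and variable names are illustrative; the real packages wrap this inside a component):

const weakId = require.resolveWeak('./Foo') // records the module ID without bundling it here

// On the server, and on the client when the chunk was already embedded in the page,
// the module can be required synchronously; otherwise fall back to the async import.
const FooOrPromise =
  typeof __webpack_modules__ !== 'undefined' && __webpack_modules__[weakId]
    ? __webpack_require__(weakId).default // synchronous: no request, checksums match
    : import('./Foo').then(module => module.default) // asynchronous fallback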

AND SO WEBPACK-FLUSH-CHUNKS WAS BORN

And so I studiously went on my way to solve a problem I had put up with in the past: getting only some of the benefits of code-splitting, without SSR + SEO. If you’ve done this, you know the drill: create a map of possible components you want to load asynchronously, where each key is the name of a component and the value is an object containing an asynchronous loader and a synchronous one (for the server). Then you dynamically call require.ensure to choose which component to load, and more importantly toggle between rendering the synchronous one on the server and the asynchronous one on the client. And when you were done, you/I just settled for no server-side rendering and no SEO benefits.
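For those who haven’t lived it, here is a hedged sketch of that old drill (the component names, paths, and the renderWith helper are all illustrative):

// a hand-maintained map: an async loader for the client, a sync require for the server
const asyncComponents = {
  Foo: {
    async: cb => require.ensure([], require => cb(require('./Foo').default)),
    sync: () => require('./Foo').default
  },
  Bar: {
    async: cb => require.ensure([], require => cb(require('./Bar').default)),
    sync: () => require('./Bar').default
  }
}

// server: render synchronously in one pass
const Foo = asyncComponents.Foo.sync()

// client: request the chunk, then render when it arrives (renderWith is illustrative)
asyncComponents.Foo.async(Foo => renderWith(Foo))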

Basically, after hearing all the excitement about code-splitting, I couldn’t believe how little we all were getting for it, and how much farther it had to go.

Anyway, everything is a progression, and ultimately that’s the really exciting thing about everything that’s happening in Reactlandia. I’m not even gonna say the words “Javascript F***gue” to paint the picture, because to me it’s the exact opposite. It’s all of us evolving the perfect [software] world in a decentralized way. The future is here. Or rather, it’s just begun.

MOTIVATION: THE PAST PROBLEMS W/ CODE SPLITTING

So what were the precise problems? What have been the shortcomings of code-splitting?

Webpack long ago introduced the capability of code-splitting. However, it’s been an enigma for most (me at least). Firstly, just grokking the original require.ensure API and how to create your Webpack configuration to support it wasn't a natural thing for many. More importantly, if you were just to read how developers were using it, you'd think it was a done deal. But anyone who's tried to take this feature full circle and incorporate server-side rendering was left scratching their head (to me, it was surprising how little-talked-about this was).
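For reference, the client-side config needed just to get split points emitting their own files looks roughly like this (a minimal sketch from the webpack 2/3 era, not any project's exact config; paths are illustrative):

const path = require('path')

module.exports = {
  entry: { main: './src/index.js' },
  output: {
    path: path.resolve(__dirname, 'buildClient'),
    filename: '[name].js',
    chunkFilename: '[name].js', // each split point gets its own file
    publicPath: '/static/'
  }
}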

What's the point of code-splitting when, on your initial page load, the chunks whose code was evaluated server-side require an additional request from the client to fetch them?

Sure, as your users navigate your single-page app it came in handy, but what about SEO? What about not showing loading spinners after your initial page loaded? If you're like me, you ended up only using code-splitting for a few areas of your app where SEO wasn't important, often lesser-used portions of your app. There have been no off-the-shelf solutions to handle this besides Next.js, which requires committing to a “framework.” Coming from the Meteor world, and having left it a year and a half ago, I was not going back to Framework land.

So, React can synchronously render itself in one go on the server. However, to do so on the client requires all the chunks used to perform that render, which obviously differ for each unique URL, authenticated user, etc. While additional asynchronous requests triggered as the user navigates your app are what code-splitting is all about, it’s sub-optimal to have to load additional chunks in the initial render. Similarly, you don’t want to just send all the chunks down to the client for that initial request, as that defeats the purpose of code-splitting. In addition, if you go with the first approach (asynchronously loading the extra chunks on initial load), checksums won’t match and an additional unnecessary render will happen on the client.

As a result, the goal became to get to the client precisely those chunks used in the first render, no more, no less.

SOLUTION

By now you probably get that the solution revolves around somehow triangulating the data you have available, such as the rendered module IDs, to determine what chunks to spit out from the server. In general I’m breaking the whole solution down into 2 parts:

  • frontend
  • backend

James Kyle pioneered the “frontend”: React Loadable, when used on the server, skips the loading phase and synchronously renders your contained component, while recording the ID of its corresponding module.

React Loadable may be used multiple times and therefore may record multiple split points.

What Webpack Flush Chunks (i.e. the “backend”) does is cross-reference those module IDs (or paths if using a Babel server) with your Webpack stats to determine the minimal set of “chunks” required to re-render those modules/components on the client. The “chunks” themselves contain ALL the files corresponding to the chunk (JS, CSS, source maps, etc). From there, Webpack Flush Chunks outputs strings, React components, or plain arrays containing the precise JavaScript files (and CSS files) to embed in your HTML response. Though it has a lower-level API available to you, it also automatically handles your main, vendor and possible bootstrap chunks, putting them in the correct order. It even creates React components you can pass to renderToStaticMarkup. Perhaps the most important thing it provides is simply the precise and detailed laundry list of things you must do to set up your Webpack config.
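To give a feel for those output formats, here is a hedged sketch (see the webpack-flush-chunks readme for the authoritative API; the file names shown are illustrative):

import { flushModuleIds } from 'react-universal-component/server'
import flushChunks from 'webpack-flush-chunks'

const { js, styles, scripts, stylesheets } = flushChunks(webpackStats, {
  moduleIds: flushModuleIds()
})

scripts     // e.g. ['bootstrap.js', 'vendor.js', '0.js', 'main.js'], already in the correct order
stylesheets // e.g. ['main.css', '0.css']
js          // the same scripts as one ready-to-embed string of <script> tags
styles      // the stylesheets as a string of <link> tags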

It’s not the hardest thing (though for whatever reason, it remained arcane for so long). What it takes is getting extremely familiar with all the stats spit out by Webpack. It also requires figuring out how to match Babel paths to Webpack module IDs if you’re using a Babel server. In short, the code isn’t too complex and is very maintainable, but it required a lot of thought and trial and error to figure out.

That’s the end of the story. Use Webpack Flush Chunks. “Use This Shit” in the words of James Kyle.

NEXT: CSS

Well not quite. There’s also CSS. I have a lot of opinions on this one. But I’ll save most for the readme to Extract CSS Chunks Webpack Plugin.

I also have answers. It boils down to:

  • the fact that using CSS Modules is already “CSS-in-JS”
  • Wasting cycles on both the client and server rendering CSS is just that — a waste!
  • Cacheable Stylesheets are just that — cacheable!
  • HMR is a must
  • And: guess what? If you can chunkify your CSS just like your JS you don’t need to generate “render-path” CSS to send the least amount of bytes over the wire. In fact, you send less!
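For context, wiring the plugin up looks roughly like this (a sketch based on the plugin’s readme at the time; the loader options are illustrative):

const ExtractCssChunks = require('extract-css-chunks-webpack-plugin')

module.exports = {
  module: {
    rules: [
      {
        test: /\.css$/,
        use: ExtractCssChunks.extract({
          use: {
            loader: 'css-loader',
            options: {
              modules: true, // CSS Modules
              localIdentName: '[name]__[local]--[hash:base64:5]'
            }
          }
        })
      }
    ]
  },
  plugins: [new ExtractCssChunks()]
}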

See, with truer CSS-in-JS solutions like Styletron, Aphrodite, etc., all your CSS is represented in code anyway, aka JavaScript. So you may be sending the smallest amount of CSS possible, but you’re sending it all down in the form of JavaScript IN ADDITION and NO MATTER WHAT.

It also turns out that if you can statically chunkify your CSS, you’ve achieved the 80-20 rule: the sweet spot of 80% optimization in how little CSS you send over the wire. See, the real problem is, for example, sending the CSS for your private user panels to your public-facing site, or vice versa. If you have many sections/panels, your CSS grows quickly and gets sent everywhere. However, if you have a straightforward mechanism to break up your CSS into chunks by section, you’ve solved 80% of the problem, if not more.

Again, you can read the Extract CSS Chunks Webpack Plugin readme for a lot more thoughts on this. It boils down to static determination of what CSS to send being a damn good solution. Going after that last 20% by sacrificing render cycles and having to use custom HoCs is, IMHO, nitpicking that results in diminishing returns.

Did I mention that when you do in fact request async chunks, those chunks have the CSS embedded in the JavaScript, ready for injection? See, it creates two JS chunks per split point: one without CSS, which is sent in initial requests along with real stylesheets, AND another for async requests which has CSS injection as usual! This gives you the smallest possible initial JS bundles.

Did I mention that, unlike the original Extract Text Webpack Plugin, it supports HMR! “USE THAT SHIT!”

MORE

Yes, I got more. You wanna make your own Async Component because neither React Loadable nor React Universal Component serves your needs?

Well, the core aspect of all this “universal rendering” goodness has been abstracted/extracted into its own package.

And what you make with it will flush chunks along with Webpack Flush Chunks just as easily as React Loadable and React Universal Component.

This time, I really won’t say anymore. USE THAT SHIT!

REACT UNIVERSAL COMPONENT

But let’s not get ahead of ourselves. A lot has been put into React Universal Component to make it the be-all-end-all — what I’m calling “Universal” — component. React Loadable kicked ass and this is its spiritual successor after all!

Basically everything under the sun (from PRs, issues, other packages, etc) has been included in it. With discretion of course ;)

I’m trying to think of a few noteworthy capabilities to point out (as just reading its readme is probably the best thing to do once again). Well, let’s look at some code:

import path from 'path'
import universal from 'react-universal-component'

const UniversalComponent = universal(() => import('./Foo'), {
  loading: Loading, // your own spinner + error components
  error: Error,
  timeout: 15000,
  minDelay: 300,
  chunkName: 'myChunkName',
  onLoad: module => replaceReducers({ ...reducers, bar: module.bar }),
  key: 'Foo', // or: module => module.Foo
  path: path.join(__dirname, './Foo'),
  resolve: () => require.resolveWeak('./Foo')
})

export default ({ isLoading, error }) =>
  <div>
    <UniversalComponent isLoading={isLoading} error={error} />
  </div>

et voila!

A 2-argument API like Next.js’s dynamic, and an options argument with a super clean surface like Vue’s, which itself was also inspired by React Loadable.

Not all those options are required. In fact, they are all optional. You could even have an async-only component which, as you’ll read at the end of the readme (thanks to Async Reactor), may very well be the basis for an even further evolution in universal rendering.

There’s still a lot more than meets the eye. For one, you can wrap the resulting <UniversalComponent/> in an HoC that does data-fetching (i.e. some separate async work) and re-uses the Loading component (DRY) via the isLoading prop, etc. It makes a perfect match for Apollo and the like.

Both promises will run in parallel, and the same loading spinner will show.
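Here is a hedged sketch of such a HoC (the HoC, the endpoint, and the data shape are illustrative app code, not part of the package; UniversalComponent is the one from the snippet above):

import React from 'react'

const withData = Component =>
  class WithData extends React.Component {
    constructor(props) {
      super(props)
      this.state = { isLoading: true, error: null, data: null }
    }

    componentDidMount() {
      fetch('/api/foo') // illustrative endpoint
        .then(res => res.json())
        .then(data => this.setState({ data, isLoading: false }))
        .catch(error => this.setState({ error, isLoading: false }))
    }

    render() {
      const { isLoading, error, data } = this.state
      // forwarding isLoading/error re-uses the universal component's Loading/Error components
      return <Component {...this.props} isLoading={isLoading} error={error} data={data} />
    }
  }

export default withData(props => <UniversalComponent {...props} />)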

You can use onLoad to utilize other exports from the module to do related work: replacing reducers, updating sagas, perhaps something with animation, etc.
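For instance, a sketch assuming a Redux store (store, staticReducers and the module's fooReducer export are all illustrative):

import { combineReducers } from 'redux'
import universal from 'react-universal-component'

const UniversalFoo = universal(() => import('./Foo'), {
  onLoad: module => {
    // swap in the reducer that shipped with the chunk
    store.replaceReducer(combineReducers({ ...staticReducers, foo: module.fooReducer }))
  }
})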

If the 15000 millisecond timeout is reached, the error component will show. Thank you, Vue.

The minDelay is different from what React Loadable has, and it results in a more responsive component. Instead of waiting a few milliseconds to see anything, it always shows the spinner immediately, and you set the minimum amount of time before the async component can show. This also helps with animations. Say your page with the spinner in it slides in, and the sliding animation takes 500ms: now you can avoid rendering jank from messing up your sliding animation by prolonging the page update until the sliding animation is done. It also better solves the original problem of avoiding a flash between the loading spinner and the component, since without a minimum delay you control, they could appear at around the same time no matter what. The readme has you covered there as well. This is just off the top of my head.

It has support for HMR which React Loadable has yet to attain. Same with all your async CSS if you’re using Extract CSS Chunks Webpack Plugin.

Lastly, instead of being restricted to promises with import(), you can use a function that calls require.ensure with a callback, which gives you require.ensure’s additional capabilities. You can actually do all the stuff Async Reactor does, including data-fetching, in it. More importantly, the props are passed as an argument, so you can determine what data to fetch dynamically. This is a story for another day, but check out Async Reactor as you review this stuff. Even if you’re very familiar with React Loadable, if you haven’t checked that out, it will likely throw you for a loop [in a very good way].

The interface proposed by Async Reactor has a lot of potential for becoming the idiomatic future of combination async/sync “universal” rendering.

Basically it has the potential to be the greater NPM community’s answer to Next.js’s getInitialProps, if paired with a recursive promise resolution system like Apollo’s. Read the end of the readme to hear me go off on what I think is the future.

And did I mention: you don’t have to use it in a server-rendered scenario. That’s just where it shines. If you read the readme (and compare it to Async Reactor), you can do some pretty cool things with the async component argument. Async-only is a primary use case for this package as well.

CONCLUSION

USE ALL THIS SHIT. THERE’S FAR MORE YOU CAN DO AS YOU’LL READ IN THE READMES. SO MANY BASES ARE COVERED. GOODBYE.

PS.

Did I mention that Webpack’s “magic comments” feature, which just came out, is fully supported as well? Just name your chunks, call flushChunkNames instead of flushModuleIds, and pass the chunkName option to universal(asyncWork, { chunkName }) to make it work. It will save your server some cycles, since it skips all the hoops I had to jump through to cross-reference module IDs with stats.
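A hedged sketch of the chunk-names flow (the chunk name is illustrative):

import universal from 'react-universal-component'
import { flushChunkNames } from 'react-universal-component/server'
import flushChunks from 'webpack-flush-chunks'

// name the chunk via webpack's magic comment and tell the component about it
const UniversalFoo = universal(() => import(/* webpackChunkName: "foo" */ './Foo'), {
  chunkName: 'foo'
})

// on the server, after renderToString:
const { js, styles } = flushChunks(webpackStats, { chunkNames: flushChunkNames() })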

Richard Scarrott’s Webpack-Hot-Server-Middleware (HMR on the Server) is world class. Examine its usage in the boilerplates. It’s important and related because this is supposed to be the most modern [non-restricting] React/NPM developer experience for serious apps. The boilerplates themselves — I might add — are pristine. Developer experience goodness everywhere, hopefully you find it all to be idiomatic. Enjoy!

> For more idiomatic javascript in Reactlandia, read:

Tweets and other love are much appreciated. Find me on twitter @faceyspacey Want to stay current in Reactlandia? Tap/click “FOLLOW” next to the FaceySpacey publication to receive weekly Medium “Letters” via email 👇🏽
