Eliminating Next.js ISR Builds with Client Side Rendering

Sandeep Dinesh
7 min read · Mar 7, 2021


One of the best Next.js features is Incremental Static Regeneration. ISR combines the performance benefits of static generation with the flexibility of server rendered content. In a nutshell, ISR lets you create new static pages on the fly. As new requests come in, content on the page is refreshed on the server and the cached static page is updated. The first time an ISR page is visited, the user is shown a static “fallback” page until the first version of the page is built. After that, users are served a static page that automatically updates in the background!

At Kbee, we let users turn their Google Docs into an ultra fast wiki. A big part of making it fast is making sure our users are served static HTML from a CDN. Our users might have thousands of documents, and Kbee doesn’t know when content changes, so Kbee uses ISR to build static snapshots of the docs and dynamically refreshes content as new requests come in.

It’s a perfect fit for ISR, and Vercel makes all of this as simple as a git push! The rest of this blog post assumes you are using Vercel as other providers might have different quirks and nuances.

(Note: We are live on ProductHunt today! Come check it out!)

The Downside of ISR

But like all good things in life, there is a price to pay for this magical experience. The initial load time of ISR content is significantly slower than SSR or even CSR content. Like orders of magnitude slower!

Here you can see two versions of the same page loading. One is SSR, the other the initial ISR page.


Wow that’s a HUGE difference!

Like HUGE!

At this point, you might be thinking: “Yes, but with ISR pages only have to be built once! After that they are cached and are super fast!”

Unfortunately, Vercel completely wipes the cache on every deployment. If you had thousands, tens of thousands, or even millions of static pages, all of those pages would need to be rebuilt. If you are deploying multiple times per day, users are definitely going to run into slow loads all the time!

Solution attempt #1: Deploy time builds

The solution to this problem is to rebuild these pages at deploy time and pre-cache them. To do this, you use getStaticPaths to query your database, get all the current pages, and build them all. Now pages that have already been built stay fast, and new pages go through the normal ISR process!
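A minimal sketch of that getStaticPaths setup might look like the following. Here `fetchAllDocIds` is a hypothetical stand-in for a real database query (the names are my own, not Kbee's actual code):

```javascript
// Hypothetical stand-in for querying your database for every known page.
async function fetchAllDocIds() {
  return ['getting-started', 'billing', 'faq']
}

export async function getStaticPaths() {
  const ids = await fetchAllDocIds()
  return {
    // Pre-build every known page at deploy time...
    paths: ids.map((id) => ({ params: { id } })),
    // ...and let brand-new pages fall through to the normal ISR fallback flow.
    fallback: true,
  }
}
```

With `fallback: true`, any path not returned here still gets built on demand the first time it is requested.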

So we’re done, right? Well, not so fast. This might be fine if you have a few hundred pages, but if you have millions of pages, or pages that are expensive to render, your build times can quickly balloon to hours.

For us, builds went from 6 minutes (which is slow but not terrible) to 20 minutes! And we only had a handful of customers!

This method was not going to scale. We had to do something different.

Why is it slow?

To make something fast, we need to understand why it is slow. The obvious place to look was the actual building of the pages. Kbee makes multiple calls to Google Drive and Docs APIs to build the page, all of which can add up to 100s of milliseconds. Still, each page was taking a maximum of half a second to build. This was in line with the SSR load times, but the ISR load times were significantly slower.

Even stranger, calls to navigational pages which only made a single fast query into Firestore were still taking multiple seconds to render in ISR mode.

So why is ISR so slow? The clue was a single line in the build logs:

This uploading step was taking longer and longer the more pages were statically built at deploy time. And then it all made sense. The initial ISR load time was slow because of this upload step!

With server side rendering, the process looks something like this:
1. Client makes request
2. Server renders page
3. Client displays page

With client side rendering, the process looks something like this:
1. Client makes request
2. Server sends static html
3. Client runs javascript to fetch data
4. Page is rendered

With ISR, the initial process looks like this (I am making an educated guess here):
1. Client makes request
2. Server sends fallback page
3. Server renders page
4. Server uploads page to CDN
5. Server tells client the CDN is ready
6. Client makes request to CDN
7. Client replaces fallback with content from the CDN

So it’s not the building of the page that’s slow, it’s the back and forth with the CDN!

Solution attempt #2: Client Side Rendering on Fallback

If the fetching of the data is relatively fast, can we build the page faster on the client side than waiting for the CDN dance to complete?

The answer is a resounding yes.

The way this works: you create an API route that returns the same data that getStaticProps uses to build the ISR page. If the page is in fallback mode, you call the API route and render the page using traditional client side rendering. Once the CDN dance is complete, the page automatically switches to the static html. Any new requests load data from the CDN directly and don’t do any client side rendering.
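The key is sharing one data-fetching function between getStaticProps and the API route, so both paths render from identical data. Here is a hedged sketch; `loadPageData` and the `/api/getPageData` route are my own illustrative names, not Kbee's actual implementation:

```javascript
// Shared fetching logic, used by both getStaticProps and the API route.
// Hypothetical stand-in for the real data source.
export async function loadPageData(slug) {
  return { slug, title: `Docs for ${slug}` }
}

// pages/api/getPageData.js — called by the client while isFallback is true,
// returning the same shape of data that getStaticProps produces.
export default async function handler(req, res) {
  const data = await loadPageData(req.query.slug)
  res.status(200).json(data)
}
```

Because both code paths call `loadPageData`, the client-rendered fallback and the eventual static page can’t drift apart.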

For Kbee, I split the data fetching into two calls. The first one quickly gets metadata like the style, theme, and logos stored in Firestore to build the page scaffolding, and the other makes the slower request for the actual Google Docs content.
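That two-stage fetch can be sketched like this, decoupled from React for clarity. The `/api/meta` and `/api/content` endpoints and the callback names are hypothetical, not Kbee's actual routes:

```javascript
// Fast call first (theme, logos) so the scaffolding paints immediately,
// then the slower call for the actual document content.
async function loadPage(slug, onMeta, onContent) {
  const meta = await fetch(`/api/meta/${slug}`).then((r) => r.json())
  onMeta(meta) // render the page scaffolding right away

  const content = await fetch(`/api/content/${slug}`).then((r) => r.json())
  onContent(content) // fill in the document body when it arrives
}
```

In a React component you would drive two pieces of state from these callbacks, so the user sees a styled shell instead of a blank page while the slow request runs.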

Here is a basic reproduction of the code to render a page like this:

import { useRouter } from 'next/router'
import { useEffect, useState } from 'react'

export async function getStaticPaths() {
  return { paths: [], fallback: true }
}

export async function getStaticProps({ params }) {
  // Run your data fetching code here
  const data = 'something cool'
  return {
    props: { data },
    revalidate: 30,
  }
}

export default function Page({ data }) {
  const [clientData, setClientData] = useState(null)
  const { isFallback } = useRouter()

  useEffect(() => {
    if (isFallback && !clientData) {
      // Get Data from API
      fetch('/api/getPageData').then(async (resp) => {
        setClientData(await resp.json())
      })
    }
  }, [clientData, isFallback])

  if (isFallback || !data) return <RenderPage data={clientData} />
  else return <RenderPage data={data} />
}

There is a price to pay here because you are loading the same data twice. Once for the ISR page, and once for the CSR page. In my opinion, this is a small price to pay, and much smaller than the price you would pay for traditional SSR.

But there is still an issue, there always is 🙄

Next.js has a really cool <Link> component that looks and feels like a traditional anchor tag (think <a href=''>) but instead of doing a full page reload, it does SPA style client side routing which makes it super fast and efficient. It also pre-fetches pages which lets you “warm up” the cache before the user visits the page, which is awesome.

Except for one small problem. When routing to an ISR page, Next.js does not show the fallback and instead waits for the CDN dance to finish before showing the new page!

As pages take 5–6 seconds to load, the user thinks clicking the link did nothing and the website is broken! You can see folks complaining about this in this GitHub issue.

Solution attempt #3: The Fast SPA Abort

The first thing to do here is to let the user know that something is happening and the website isn’t broken. The easiest way to do this is to add a loading bar, and you can follow the steps in this Stack Overflow answer to add a nprogress loading bar to your pages.

While a progress bar looks nice, it still takes a long time to navigate to the new page. What we really want to do is force Next.js to show the fallback page.

The naive solution is to replace all the <Link>s with standard <a> tags. This will remove the client side routing and force a full page reload on every navigation. This solution really sucks, especially for folks that don’t have great internet. It feels janky and slow, and you lose one of the really powerful features of Next.js.

So, I came up with a wacky solution I’m calling the “Fast SPA Abort”. It’s super simple. If a page navigation takes longer than 100ms to complete, abort the SPA routing and do a full page refresh. That’s it!

In your _app.js, add the following code:

import NProgress from 'nprogress'
import Router from 'next/router'

Router.onRouteChangeStart = (url) => {
  if (url !== window.location.pathname) {
    window.routeTimeout = setTimeout(() => (window.location = url), 100)
    NProgress.start()
  }
}

Router.onRouteChangeComplete = (url) => {
  clearTimeout(window.routeTimeout)
  NProgress.done()
}

When the route starts to change, we create a timeout that will refresh the page in 100ms. When the route change finishes, we cancel this timeout. 99% of the time when doing a SPA navigation, the ISR pages are cached in the CDN so the routing will be instant and this timeout will never get called. However, if the page is not cached in the CDN, then the routing will be aborted and the page will be refreshed, showing the fallback!
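The pattern underneath is just "race the navigation against a timer." Here is the same idea extracted into a tiny framework-free helper so it's easy to reason about; `hardNavigate` is a hypothetical stand-in for assigning `window.location`:

```javascript
// Minimal sketch of the Fast SPA Abort pattern: if a route change hasn't
// completed within `limitMs`, fall back to a full page navigation.
function createRouteGuard(hardNavigate, limitMs = 100) {
  let timer = null
  return {
    // Call when SPA routing starts.
    start(url) {
      timer = setTimeout(() => hardNavigate(url), limitMs)
    },
    // Call when SPA routing completes in time — cancels the hard navigation.
    complete() {
      clearTimeout(timer)
    },
  }
}
```

In the _app.js code above, `Router.onRouteChangeStart` plays the role of `start` and `Router.onRouteChangeComplete` plays the role of `complete`.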

Boom, now you get the best of all worlds. Fast SPA routing, dynamic static pages with ISR, and zero deploy-time builds with CSR!

P.S. If you read this far, check out Kbee on ProductHunt! I’d love your support ♥
