<Lazy> rendering in Vue to improve performance
Last year, a lot has been happening on the web in terms of performance. Unsurprisingly, everyone agrees the web should be fast. Google introduced its Web Vitals metrics to stress that web applications should load fast even on slower devices and slow networks. Svelte showed that web apps can be significantly smaller in their bundle size. React introduced the Fiber algorithm, which allows pausing the rendering process so that the app stays interactive and doesn’t lag, even if it means making the rendering take longer overall. Vue didn’t stay behind in this area at all — Vue 3 got very slim and performant without sacrificing the developer experience.
Yet in this day and age, these libraries still cannot do all the performance optimisations for you and make sure your app is fast in all circumstances. If you want next-level performance, you have to optimise at the application level yourself.
Improving performance in our apps basically comes down to three things:
- Identifying and dealing with specific performance bottlenecks — expensive 3rd party JS plugins, specific issues found via performance profiling, and so on.
- Reducing the amount of work. Effective code-splitting so that relevant work is done only on relevant pages, using `v-if` instead of `v-show` when possible (as shown below), and simplifying your components so that they don’t use too many elements.
- Scheduling work to happen at the right time. Delaying lower-priority work to be done in the background.
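On the `v-if` point: `v-show` always renders the element and only toggles its CSS `display` property, while `v-if` skips rendering the subtree entirely until the condition becomes true. A tiny illustration, with a hypothetical `<HeavyChart>` component:

```html
<!-- Stays in the DOM, just hidden; the component still renders and updates -->
<HeavyChart v-show="isChartVisible" />

<!-- Not rendered at all until isChartVisible becomes true -->
<HeavyChart v-if="isChartVisible" />
```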
In this article, I’ll focus on the third category, and I’ll try to do it by introducing a universal `<Lazy>` component.
Let’s get <Lazy>
I’ve been dealing a lot with perf optimisations lately. Over time I found out I have several recurring performance topics:
- Fast page transitions (in a single page app)
- Smooth (infinite) scrolling
- Fastest boot possible
It turned out I could deal with all of these with one configurable higher-order component.
First, let’s deconstruct how we can save and schedule work in each of these areas. In page transitions, we want to render above-the-fold content as soon as possible and delay rendering of the off-screen content for later. In infinite scrolling, we want to render content just in time when it’s about to enter the viewport. At boot, we can identify less critical parts of the layout to be rendered a bit later — footer, some parts of sidebar and navigation menu, and so on.
It’s all about delaying and scheduling rendering. Let’s start with a very simple example of a higher-order component that can delay rendering.
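A minimal sketch of such a component (using `<script setup>`; the exact markup here is just for illustration) could look like this:

```vue
<!-- Lazy.vue -->
<script setup>
import { ref, nextTick } from 'vue'

const shouldRender = ref(false)

// Defer rendering of the slot content to the next DOM update cycle.
nextTick(() => {
  shouldRender.value = true
})
</script>

<template>
  <div>
    <slot v-if="shouldRender" />
  </div>
</template>
```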
A few things are happening here. A ref `shouldRender` is declared and is set to true very soon afterwards in `nextTick`. The callback passed to `nextTick` is called in the next DOM update cycle, so the rendering of this component is delayed to the next rendering cycle. That means the other content can display sooner on the page and does not have to wait for this delayed part to finish. This lazy component can be used like so:
```html
<Lazy>
  <SomeLessCriticalContent />
</Lazy>
```
In this form it’s very simple, but it can already cover certain cases, for example wrapping off-screen content to speed up page transitions. On a transition from page to page, you render the top part of the page in the first render cycle so the user sees something ASAP, and delay the rest a little bit.
But there’s at least one problem. Although we can render something in the first render cycle and delay other things to the next one, if there’s too much work scheduled for that next render cycle it can still cause jank. There are just two levels of scheduling: ASAP and “a bit later”, and for the “a bit later” everything starts happening at the same time. Surely this can be scheduled better.
Intersection Observing
IntersectionObserver is an amazing browser API that lets us know when a certain DOM element has entered the viewport. It’s not available in ancient browsers like IE11 but it can be polyfilled.
You can quickly use it with the help of libraries like VueUse, or you can create your own helper function thanks to the flexibility of the composition API.
For simplicity, I’ll showcase it with the VueUse composable:
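Here’s a rough sketch of the component rewritten with `useIntersectionObserver` from `@vueuse/core` (the template ref name `root` and the exact structure are choices made for this sketch):

```vue
<!-- Lazy.vue -->
<script setup>
import { ref } from 'vue'
import { useIntersectionObserver } from '@vueuse/core'

// IntersectionObserver needs a real DOM element, so we point
// a template ref at the wrapping div.
const root = ref(null)
const shouldRender = ref(false)

const { stop } = useIntersectionObserver(
  root,
  ([{ isIntersecting }]) => {
    if (isIntersecting) {
      shouldRender.value = true
      // Once rendered, the content stays; no need to keep observing.
      stop()
    }
  },
  // Fire the callback some time before the element actually reaches the viewport.
  { rootMargin: '600px' },
)
</script>

<template>
  <div ref="root">
    <slot v-if="shouldRender" />
  </div>
</template>
```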
A few things are happening here:
- We set up a template ref to be used by the `useIntersectionObserver` utility. `IntersectionObserver` needs to observe a real DOM element, so we pass this template ref to the wrapping div element of our `<Lazy>` component.
- `useIntersectionObserver` calls the provided callback function when the element has entered the viewport. With some destructuring we access `isIntersecting`, and if it’s true we change `shouldRender` to true. We also immediately call `stop` to tell the intersection observer to stop observing from that point onwards. So far, once the element enters the viewport, we render the content and leave it that way. No need to observe anymore.
- We set `rootMargin` to 600px. Usually it’s good to have some extra margin for a good user experience. This root margin makes sure the callback fires some time before the content enters the viewport, so users shouldn’t be left waiting for content to render unless they scroll very fast.
Like this, we can use our `<Lazy>` component several times for different sections of the page:
```html
<Lazy style="min-height: 400px"><FirstSection /></Lazy>
<Lazy style="min-height: 600px"><SecondSection /></Lazy>
<Lazy style="min-height: 400px"><ThirdSection /></Lazy>
```
Now the content renders lazily as the user scrolls.
You probably noticed the `min-height` setting. That’s a slightly annoying part of lazy rendering. Since the content might not render sequentially and different parts of it might render in a different order (especially if several of these sections fit into the viewport), it’s important to reserve space ahead of time so that the lazily rendered content doesn’t jump around in the HTML document.
Fortunately, these numbers often don’t have to be precise, especially with prerendering.
Prerendering
With prerendering, we’re coming back to our initial example where the rendering was just slightly delayed. It had some downsides but what if both approaches could be combined in a smart way?
If the content is scrolled to, it’s rendered ASAP, but if not, it’s still rendered in the background with a delay. Only this time we’ll use a lazier callback than `nextTick`: we’ll use `requestIdleCallback`.
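A sketch of the combined component could look like this (the `onIdle` helper name and the 300ms fallback delay are arbitrary choices for this sketch):

```vue
<!-- Lazy.vue -->
<script setup>
import { ref } from 'vue'
import { useIntersectionObserver } from '@vueuse/core'

// Run the callback when the browser is idle; fall back to a timeout
// with a noticeable delay in browsers without requestIdleCallback.
function onIdle(callback) {
  if ('requestIdleCallback' in window) {
    window.requestIdleCallback(callback)
  } else {
    setTimeout(callback, 300)
  }
}

const root = ref(null)
const shouldRender = ref(false)

const { stop } = useIntersectionObserver(
  root,
  ([{ isIntersecting }]) => {
    if (isIntersecting) {
      shouldRender.value = true
      stop()
    }
  },
  { rootMargin: '600px' },
)

// Even if the content never gets near the viewport, prerender it
// in the background once the browser has some idle time.
onIdle(() => {
  shouldRender.value = true
  stop()
})
</script>

<template>
  <div ref="root">
    <slot v-if="shouldRender" />
  </div>
</template>
```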
We’ve added a simple function `onIdle` that either uses `requestIdleCallback` or `setTimeout` with a noticeable delay.
The difference from the previous example is that `requestIdleCallback` by itself will be called later, when the browser is truly ready to do extra work, which should prevent jank. Also, when it’s called several times, the browser might schedule these callbacks in a smart way to keep the FPS high.
In this case, if the user starts scrolling very quickly after the first render, the intersection observer might fire before the idle callback. But usually the idle callback fires before the user starts scrolling. This improves UX, as it prevents waiting for content even at faster scrolling speeds. But you don’t always want to prerender content like this. If there’s a very large number of these lazy sections, it could simply be too much work to do in the background, making the web app consume unnecessary CPU power and memory. In that case, it’s better to render just in time, right before the content enters the viewport. And in cases like infinite scrolling, you might even need to start dealing with unrendering content.
Large lists and Unrendering content
Imagine you have a very large amount of content. It might be a case of infinite scrolling where additional content loads as you hit the bottom of the page (which might also be implemented via an IntersectionObserver), but it might also simply be a very large dataset or something like that.
At some point there might be so much content rendered that keeping everything present in the DOM is no longer suitable. There are two aspects of this:
Memory: Each component has its own state. It also uses various watchers, computed values, and perhaps other data structures that accumulate. At some point, it might be too much memory. The most memory-intensive case would probably be video or audio (think of Twitter, Facebook, or especially TikTok). So at a certain point, the device can run out of memory and start lagging or crashing.
Performance (cleanup work): Cleanup can actually be costly. This is less true of Vue 3 than Vue 2, but even in Vue 3, unmounting components is still work that can add up. If you’re doing too much cleanup at the same time, like leaving a page that has lots of content loaded and navigating to a different page, the unmounting and cleaning can cause jank. So from a certain point, it’s good to do the cleanup gradually, so it doesn’t accumulate into so much work that it takes more than 300ms and causes jank.
The most common way to deal with this is to use a virtual scroller. Virtual scrollers are cool, and the algorithm is designed specifically to deal with these issues. They can also recycle DOM elements so that it takes less work to render additional items in the list. But virtual scrollers can also be a bit inflexible in certain ways, and if you set something up a bit wrong, it can glitch nicely. I found out I can get quite far with my `<Lazy>` component without the need for a virtual scroller. And as opposed to a virtual scroller, which is a bit of a black box, the `<Lazy>` component is quite simple and tweakable.
So let’s say we have a large list that’s being rendered like this:
```html
<Lazy v-for="user in users" style="min-height: 300px">
  <User :user="user" />
</Lazy>
```
If there are thousands of users, only a couple of them will render in the beginning, until the viewport is filled. As the user scrolls, more and more of them render. So far so good.
But at a certain point, they’ll have to unrender. Let’s adjust our component to take care of it.
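Here’s a sketch of what the adjusted component could look like. The exact timings, prop names, and structure are choices made for this sketch; they match the usage shown further below:

```vue
<!-- Lazy.vue -->
<script setup>
import { ref } from 'vue'
import { useIntersectionObserver } from '@vueuse/core'

const props = defineProps({
  // When true, the content is removed from the DOM again a while
  // after it has left the viewport.
  unrender: { type: Boolean, default: false },
  // Estimated height (in px) used to reserve space before the content renders.
  minHeight: { type: Number, default: 0 },
  // How long the content stays rendered after leaving the viewport.
  unrenderDelay: { type: Number, default: 10000 },
})

const root = ref(null)
const shouldRender = ref(false)
// Once we know the real height of the rendered content, we use it
// instead of the estimate so unrendering doesn't make the page jump.
const fixedMinHeight = ref(0)

let renderTimer
let unrenderTimer

const { stop } = useIntersectionObserver(
  root,
  ([{ isIntersecting }]) => {
    if (isIntersecting) {
      // The user scrolled back before the unrender delay passed: keep the content.
      clearTimeout(unrenderTimer)
      // Delay rendering slightly, so components the user scrolls past
      // very fast never get rendered at all.
      renderTimer = setTimeout(
        () => (shouldRender.value = true),
        props.unrender ? 200 : 0,
      )
      if (!props.unrender) stop()
    } else if (props.unrender) {
      // Leaving the viewport cancels a render that hasn't started yet...
      clearTimeout(renderTimer)
      if (shouldRender.value) {
        // ...and schedules unrendering of content that has already rendered.
        unrenderTimer = setTimeout(() => {
          // Measure the content just before removing it, so the reserved
          // space is exact and nothing below it jumps around.
          fixedMinHeight.value = root.value.clientHeight
          shouldRender.value = false
        }, props.unrenderDelay)
      }
    }
  },
  { rootMargin: '600px' },
)
</script>

<template>
  <div ref="root" :style="{ minHeight: `${fixedMinHeight || minHeight}px` }">
    <slot v-if="shouldRender" />
  </div>
</template>
```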
Because the component is growing in size, I added a few comments to the code. But I’ll summarize the final changes here anyway:
There are two timers: `renderTimer` and `unrenderTimer`. Both rendering and unrendering are delayed. Unrendering is delayed significantly — by default by 10s — so 10 seconds after the component leaves the viewport, it’s unrendered.
But rendering is delayed too, only by 200ms. This means the component has to stay at least 200ms in the viewport (+ `rootMargin`) to start rendering, which prevents rendering components when the user is scrolling very fast. The render timer is cancelled on leaving the viewport, and the unrender timer is cancelled if the user scrolls back up to a component that was set to unrender.
Another thing that had to be added was the handling of `minHeight`. In the previous examples, `min-height` was set on the `<Lazy>` component via style from the outside. If we’re unrendering content, the handling of `min-height` has to be more precise. Once elements are removed from the DOM, the reserved space has to be exact, otherwise the content would start jumping around as soon as components start being removed. Removing content below the viewport is no big deal, but removing components above the viewport changes the position of all the elements below them.
But we can do one trick: at first, we provide a `minHeight` estimate via a prop. This is used as the initial min-height of the component to reserve space below the viewport. After the component renders and is about to unrender, just before we set `shouldRender` to false, we measure the content and save its height to a ref. That way any content jumps are prevented.
In the end we render our list like so:
```html
<Lazy v-for="user in users" :unrender="true" :min-height="300">
  <User :user="user" />
</Lazy>
```
And that’s it! Such a `<Lazy>` component can be used all over the place for various purposes, and it can be tweaked further to fit particular needs. If you want to see it in action, feel free to play with it in this sandbox.