Speeding up next.js application (server side in-memory caching via LRUcache) — PART 1

Adam Liberski
2 min read · Sep 10, 2018


Haven’t heard about next.js? next.js is a really flexible, extensible, robust SSR React framework with a gentle learning curve and a constantly growing community, and it is very well maintained and actively developed. (We chose it to build our event website.)

“cougar on brown rock formation” by Priscilla Du Preez on Unsplash

This is the first episode of a fairly long (as I currently imagine it) series about how to build an i18n app that supports SSR with pre-loaded external data.

Disclaimer: there might be some ready-to-go solutions out there (I look forward to reading the comments), but since I decided to use an external i18n SaaS/API provider (localise.biz) together with a per-language cache, I chose my own way of doing things.

Our goals:

  • we will host our app via zeit.co, as it just works :)
  • we will use SSR because we care about SEO
  • we will pre-fetch some API data to really benefit from server-side rendering
  • when using a server-side cache, we need cached versions of each page/view in all supported languages

Consequences:

  • zeit.co is an auto-scaling, distributed infrastructure with no permanent, built-in storage we could use, so we will rely on an in-memory cache only
  • SSR is a pretty standard thing for a next.js app
  • all pre-fetched requests to external API endpoints would add about 500–800 ms of client response overhead, so we must use a server-side cache
  • it’s quite typical for SPA/JS/React apps to generate all user-specific content client-side… but if we can really enumerate all the content variants, we can get away with a limited set of server-side cache entries, right?
  • in order to serve different cached views/pages depending on the user’s language preference, we have to retrieve that preference server-side (so no local storage); this way we prevent an additional client-side re-render when there is a language mismatch
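To make that last point concrete, here is one way a per-language cache key could be derived server-side. This is a sketch of the idea only: the cookie name, the supported-language list, and both helper names are my own illustrative choices, not from the article.

```javascript
// Sketch: build a per-language SSR cache key so each supported language
// gets its own cached copy of a page. All names here are hypothetical.
const SUPPORTED_LANGUAGES = ['en', 'pl'];
const DEFAULT_LANGUAGE = 'en';

function resolveLanguage(req) {
  // Prefer an explicit "lang" cookie; fall back to the Accept-Language header.
  const langCookie = (req.headers.cookie || '')
    .split(';')
    .map((pair) => pair.trim().split('='))
    .find(([name]) => name === 'lang');
  const candidate = langCookie
    ? langCookie[1]
    : (req.headers['accept-language'] || '').slice(0, 2);
  return SUPPORTED_LANGUAGES.includes(candidate) ? candidate : DEFAULT_LANGUAGE;
}

function getCacheKey(req) {
  // Prefixing the URL with the language keeps cached pages separated per language.
  return `${resolveLanguage(req)}:${req.url}`;
}
```

Because the language is resolved from the request itself (cookie or header), the server can pick the right cached version before any client-side code runs.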

ready? let’s start!

(In part 1 we will just cover the basics… sorry if you feel disappointed.)

Let’s create a custom server implementation (we need it not only for dynamic routes (= routes with params) but also for server-side caching).
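A custom server along these lines might look as follows. This is a minimal sketch: Express is one common choice for next.js custom servers, and the `/posts/:id` route is purely illustrative, not part of the article’s app.

```javascript
// server.js — minimal next.js custom server (sketch, assuming Express)
const express = require('express');
const next = require('next');

const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev });
const handle = app.getRequestHandler();

app.prepare().then(() => {
  const server = express();

  // Example of a dynamic route: map /posts/:id onto pages/post.js,
  // passing the id as a query parameter
  server.get('/posts/:id', (req, res) =>
    app.render(req, res, '/post', { id: req.params.id })
  );

  // Everything else falls through to next.js's default request handler
  server.get('*', (req, res) => handle(req, res));

  server.listen(3000, () => {
    console.log('> Ready on http://localhost:3000');
  });
});
```

Running `node server.js` (instead of `next start`) serves the app through this custom server.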

Nothing exciting above… and nothing to explain, really. This implementation doesn’t give you any benefits compared to the built-in server yet…

Let’s use an LRU cache for caching… create an instance of it and add routing that handles our homepage in a totally different manner…
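The caching logic could be sketched like this. In the real setup the store would be an instance of the npm `lru-cache` package and the render function would be next.js’s `app.renderToHTML`; here both are passed in as parameters so the caching behaviour is visible on its own, and the function names are my own illustration.

```javascript
// Sketch of a render-and-cache helper. `cache` is any store with
// get/set/has (e.g. an lru-cache instance); `renderPage` renders a page
// to an HTML string (e.g. next.js's app.renderToHTML).
function createRenderAndCache(cache, renderPage) {
  return async function renderAndCache(req, res, pagePath, queryParams) {
    const key = req.url; // later: prefix with the user's language

    // Serve straight from the in-memory cache when we can
    if (cache.has(key)) {
      res.setHeader('x-cache', 'HIT');
      return res.send(cache.get(key));
    }

    // Otherwise render, cache successful responses, and send
    const html = await renderPage(req, res, pagePath, queryParams);
    if (res.statusCode === 200) cache.set(key, html);
    res.setHeader('x-cache', 'MISS');
    res.send(html);
  };
}
```

Wired into the custom server, the homepage route would then look something like `server.get('/', (req, res) => renderAndCache(req, res, '/'))`.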

We are now supporting caching of our homepage (pages/index.js).

In the next parts we will dive into the details of how to retrieve user preferences in middleware, pass them to the route declarations, etc.

Thank you! Stay tuned.
