Optimize Performance By 10X For A React Single Page Application


In this article, I would like to share how we optimized the frontend (React + MobX + webpack) performance for Cominsoon. This matters a lot for a small SaaS like us who can't afford high-performance, expensive servers.

Previously, it took more than 25s to load the landing page, which is rendered by Pug and React components. After a series of optimizations, it takes only 2s under the same network conditions.

Steps we will talk about

Just in case you already know them:

  1. Update webpack config for compression and caching
  2. Compress images by TinyPNG
  3. Update the backend to work with gzip

Webpack configuration

We are using react-router and webpack to build our SPA and have already implemented code splitting. For production, we still need a specific webpack config to take advantage of HTTP caching and a CDN.

1. Load large third-party libraries from CDN

You can always find CDNs for widely used libs (e.g., React) from reliable hosts. Loading them as externals will dramatically reduce the size of your webpack output.

For your users, loading assets from a CDN is usually much faster than loading them from your server, no matter where they are.

externals: {
  'react': 'React',
  'react-dom': 'ReactDOM',
  'd3': 'd3',
  'lodash': '_' // lodash's UMD build exposes the global `_`
}
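For externals to work, the corresponding scripts must be loaded as globals before your bundles, e.g. in the HTML template. The unpkg URLs and versions below are illustrative, not necessarily the ones we used:

```html
<!-- Illustrative CDN script tags; hosts and versions are examples only -->
<script src="https://unpkg.com/react@16/umd/react.production.min.js"></script>
<script src="https://unpkg.com/react-dom@16/umd/react-dom.production.min.js"></script>
<script src="https://unpkg.com/d3@4"></script>
<script src="https://unpkg.com/lodash@4"></script>
<!-- your webpack bundles come after the externals -->
<script src="/vendor.a1b2c3d4.js"></script>
<script src="/app.e5f6a7b8.js"></script>
```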

2. Extract small, widely used libraries into the common chunk

You don't have to load every third-party lib from a CDN, because that would mean a large number of HTTP requests. Packing the widely used libs into one file, let's call it vendor.js, and loading it up front may increase the initial loading time, but it is usually acceptable: since the large libs already come from the CDN, vendor.js won't be huge.

On the other hand, thanks to the caching strategy we will cover next, vendor.js will be cached on the client side, which means it won't be requested from the server the next time users open the app.

entry: {
  app: ...,
  vendor: ['mobx', 'mobx-react', 'classnames', 'prop-types', 'moment', 'superagent']
},
plugins: [
  new webpack.optimize.CommonsChunkPlugin({
    name: ['vendor', 'manifest'],
    minChunks: Infinity
  })
]

3. Keep chunk file names unchanged if their content didn't change

The browser caches assets by URL. For example, if the browser has already cached a file named a.js and the next build emits the same content under a new name b.js, the browser treats b.js as a brand-new asset and fetches it from the server, even though its content is exactly the same as a.js. Hashing file names by content (and stabilizing module ids) ensures a chunk's name changes only when its content does, so unchanged chunks stay cached across deployments.

output: {
  filename: '[name].[chunkhash].js',
  ...
},
plugins: [
  new webpack.HashedModuleIdsPlugin(),
  ...
]

4. Compress JS and extract the CSS file

We use UglifyJS for JS compression and extract-text-webpack-plugin to pull CSS out of the JS bundle, which is worthwhile because a separate CSS file can load in parallel with JS.

const autoprefixer = require('autoprefixer')
const ExtractTextPlugin = require('extract-text-webpack-plugin')

const extractSass = new ExtractTextPlugin({
  filename: '[name].css',
  allChunks: true,
  disable: process.env.NODE_ENV === 'development'
})

module: {
  rules: [{
    test: /\.scss$/,
    use: extractSass.extract({
      fallback: 'style-loader',
      use: [{
        loader: 'css-loader',
        options: {
          minimize: process.env.NODE_ENV !== 'development'
        }
      }, {
        loader: 'postcss-loader',
        options: {
          plugins: () => [autoprefixer]
        }
      }, {
        loader: 'sass-loader'
      }]
    })
  }, ...]
},
plugins: [
  extractSass, // the plugin instance must be registered for extraction to run
  new webpack.optimize.UglifyJsPlugin({
    mangle: true,
    compress: {
      warnings: false,
      pure_getters: true,
      unsafe: true,
      unsafe_comps: true,
      screw_ie8: true
    },
    output: {
      comments: false
    },
    exclude: [/\.min\.js$/gi]
  }),
  new webpack.IgnorePlugin(/^\.\/locale$/, /moment$/),
  ...
]
Tip: use webpack-bundle-analyzer to analyze the bundle output and webpack-merge to compose your webpack configs.
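As a rough sketch of how webpack-merge keeps a shared base config separate from production-only settings (the file names webpack.common.js and webpack.prod.js are assumptions, not our actual layout):

```javascript
// webpack.prod.js — hypothetical production config composed with webpack-merge
const merge = require('webpack-merge')
const common = require('./webpack.common.js') // assumed shared base config

module.exports = merge(common, {
  output: {
    filename: '[name].[chunkhash].js'
  },
  plugins: [
    // production-only plugins go here
  ]
})
```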

Image compression

The best suggestion for image compression is to not use images, kidding :P. We reduced the dimensions of the images to fit their specific HTML elements instead of using background-size: contain/cover everywhere, and used TinyPNG to compress almost every image.

Work with gzip

You can find the definition of gzip on Google. In practice, gzip can significantly reduce HTTP transfer time by compressing the assets.
Gzip can be enabled in Nginx or in the backend server. We added it to our Express.js backend by simply adding the compression middleware:

const express = require('express')
const compression = require('compression')

const app = express()
app.use(compression()) // gzip every compressible response


You never want your users staring at a loading spinner while they are trying your product.

Optimization of the backend and database queries may be needed as the product gets more and more popular, but improving the frontend is the first step to get there. Good luck :)