Client-side rendering vs. server-side rendering

Performance

Diagram of how server-side rendering works
Diagram of how client-side rendering works
  • With client-side rendering, the initial page load is going to be slow, because communicating over the network is slow and it takes two round trips to the server (one for the HTML skeleton and JavaScript, one for the data) before you can start displaying content to the user. After that, though, every subsequent page load will be blazingly fast.
  • With server-side rendering, the initial page load won’t be terribly slow. But it won’t be fast. And neither will any of your other requests.
<html>
  <head>
    <script src="client-side-framework.js"></script>
    <script src="app.js"></script>
  </head>
  <body>
    <div class="container"></div>
  </body>
</html>
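To make the client-side flow concrete, here is a sketch of what an app.js like the one above might do, with plain objects standing in for the DOM and the API call. The names renderInto and the data shape are hypothetical, not from any particular framework:

```javascript
// Round trip one delivers the skeleton and scripts above; only then can
// app.js make round trip two (a data fetch) and fill the empty container.

// Render fetched data into a container element.
function renderInto(container, data) {
  container.innerHTML = '<h1>' + data.title + '</h1>';
}

// Stand-in for the empty <div class="container"></div>:
var container = { innerHTML: '' };

// Stand-in for the result of the second round trip (e.g. an API call):
var data = { title: 'Hello' };

renderInto(container, data);
console.log(container.innerHTML); // content exists only after both trips
```

The point is that nothing is visible to the user until both round trips have completed.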
var pages = {
  '/': '<html> ... </html>',
  '/foo': '<html> ... </html>',
  '/bar': '<html> ... </html>',
};
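The server-side approach above amounts to a lookup on every request: find (or render) the full HTML for the path and send it back. A minimal sketch, with placeholder markup standing in for real pages:

```javascript
// Every navigation is a full round trip to the server for full HTML.
var pages = {
  '/': '<html> home </html>',
  '/foo': '<html> foo </html>',
  '/bar': '<html> bar </html>',
};

// Look up the full page for a path; the 404 fallback is a hypothetical
// detail, not part of the original example.
function handle(path) {
  return pages[path] || '<html> 404 </html>';
}

console.log(handle('/foo'));
```

This is why no single request is terribly slow, but none of them is free either: each one pays the network cost again.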

SEO

var request = require('request');

request.get('reddit.com', function (error, response, body) {
  // body looks something like this:
  // <html>
  // <head> ... </head>
  // <body>
  //   <a href="espn.com">ESPN</a>
  //   <a href="news.ycombinator.com">Hacker News</a>
  //   ... other <a> tags ...
  // </body>
  // </html>
});
var request = require('request');

request.get('reddit.com', function (error, response, body) {
  // body looks something like this:
  // <html>
  // <head> ... </head>
  // <body>
  //   <a href="espn.com">ESPN</a>
  //   <a href="news.ycombinator.com">Hacker News</a>
  //   ... other <a> tags ...
  // </body>
  // </html>

  request.get('espn.com', function () { ... });
  request.get('news.ycombinator.com', function () { ... });
});
var request = require('request');

function crawlUrl(url) {
  request.get(url, function (error, response, body) {
    var linkUrls = getLinkUrls(body);
    linkUrls.forEach(function (linkUrl) {
      crawlUrl(linkUrl);
    });
  });
}

crawlUrl('reddit.com');
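The crawler above calls getLinkUrls without defining it. Here is one minimal way it could be sketched, using a regex over the HTML; a real crawler would use a proper HTML parser, since regexes miss plenty of valid markup:

```javascript
// Pull href values out of <a> tags in an HTML string. This is a
// simplified sketch: it only matches double-quoted hrefs.
function getLinkUrls(body) {
  var urls = [];
  var re = /<a\s+[^>]*href="([^"]+)"/g;
  var match;
  while ((match = re.exec(body)) !== null) {
    urls.push(match[1]);
  }
  return urls;
}

var html = '<body><a href="espn.com">ESPN</a>' +
           '<a href="news.ycombinator.com">Hacker News</a></body>';
console.log(getLinkUrls(html)); // ['espn.com', 'news.ycombinator.com']
```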
<html>
  <head>
    <script src="client-side-framework.js"></script>
    <script src="app.js"></script>
  </head>
  <body>
    <div class="container"></div>
  </body>
</html>

Prerendering

https://webmasters.googleblog.com/2009/10/proposal-for-making-ajax-crawlable.html
  • When the request comes from a crawler, we could serve <div class="container"> ... </div>.
  • When the request comes from a regular human, we could just serve <div class="container"></div> and let the JavaScript insert content inside.
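One common way to implement that split is to sniff the User-Agent header and serve prerendered HTML to known crawlers. A sketch, where the bot pattern is a deliberately abbreviated, hypothetical list:

```javascript
// A tiny, incomplete list of crawler User-Agent substrings.
var BOT_PATTERN = /googlebot|bingbot|yandex/i;

function isCrawler(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// Crawlers get prerendered content; humans get the empty container
// that client-side JavaScript will fill in.
function containerFor(userAgent) {
  return isCrawler(userAgent)
    ? '<div class="container"> ... </div>'
    : '<div class="container"></div>';
}

console.log(containerFor('Mozilla/5.0 (compatible; Googlebot/2.1)'));
```

Serving different markup to crawlers and humans has to be done carefully, since search engines penalize cloaking that changes the actual content.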

Smarter crawlers

<div class="container"></div>
<div class="container">
  ...
  ...
  ...
  ...
  ...
</div>

Less smart crawlers

https://www.netmarketshare.com/search-engine-market-share.aspx?qprid=4&qpcustomd=0

Best of both worlds

  1. Use server-side rendering for the first page load.
  2. Use client-side rendering for all subsequent page loads.
  • For the first page load, it doesn’t take two round trips to the server before the user sees content.
  • Subsequent page loads are lightning fast.
  • Crawlers get their simple HTML. Just like the old days. No need to do the work of running JavaScript. Or dealing with _escaped_fragment_.
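The two-step hybrid above can be sketched as a single request handler: full HTML for the first load, and just the data for client-side navigations afterwards. handleRequest, renderContent, and the data shape here are all hypothetical stand-ins:

```javascript
// Shared rendering logic (in real setups, the same component code can
// run on both server and client).
function renderContent(data) {
  return '<h1>' + data.title + '</h1>';
}

function handleRequest(path, isClientNavigation) {
  var data = { title: 'Page ' + path }; // stand-in for a real data fetch

  if (isClientNavigation) {
    // Subsequent loads: a small JSON payload, rendered in the browser.
    return JSON.stringify(data);
  }

  // First load: full HTML, so the user (and crawlers) see content after
  // a single round trip.
  return '<html><body><div class="container">' +
    renderContent(data) + '</div></body></html>';
}

console.log(handleRequest('/foo', false));
console.log(handleRequest('/foo', true));
```

This is essentially what frameworks that support "universal" or "isomorphic" rendering automate for you.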

Discussion

  • Roughly 2% of users have JavaScript disabled, in which case client-side rendering won’t work at all.
  • About 1/4 of web searches are done with engines other than Google.
  • Not everyone has a fast internet connection.
  • People on their phones usually don’t have a fast internet connection.
  • A UI that is too fast can be confusing! Suppose the user clicks a link. The app takes them to a new view. But the new view is only subtly different from the previous view. And the change happened instantaneously (as client-side rendering folks like to brag about). The user may not notice that a new view actually loaded. Or maybe the user did notice, but since it was relatively subtle, the user had to apply some effort to detect whether or not the transition actually happened. Sometimes it’s nice to see a little loading spinner and full page re-render. It prevents us from having to squint to see changes.
  • To some extent, it makes sense to program to where the performance puck is going to be. Your users are going to be a mix of people who live in the year 2017, 2019, 2020, etc. It doesn’t make sense to pretend that they all live in the year 2017. Yes, you could update your app next year once the improvements in speed happen… but taking the time to do so certainly isn’t free.
  • Caching is a thing. So with server-side rendering, oftentimes the user doesn’t actually have to go all the way out to the server. And sometimes they only need to go to a nearby server, rather than the “official” one across the ocean.
  • #perfmatters.
https://www.slideshare.net/phaithful/seo-and-js-new-challenges
  • Actually, with regards to performance, sometimes It Just Doesn’t Matter. Sometimes the speed is Good Enough, and marginal increases in speed don’t really make life any better.

Adam Zerner