Social Sharing With React and Vue Without Pre-Rendering or SSR

Send crawlers simplified HTML files with relevant meta-tags

Axel Wittmann
Mar 22 · 6 min read
Photo by Patrick Tomasso on Unsplash

You just built your Vue.js or React.js app, including a nice API, deployed it, and want your users to share a URL on Facebook, Twitter, etc.? Whoops, you are in for a letdown if you think the crawlers used by popular social media sites can crawl your JavaScript-built single-page application.

Unfortunately, they cannot. Googlebot is the only major crawler that doesn’t stop at your index.html file. All other crawlers don’t execute your JavaScript as of early 2020 and end up with a “blank page” and just the link that is shared.

A screenshot from the Facebook URL checker, loading the main image, title, and link… Facebook can only crawl information from an SPA if you provide it in a static HTML file with relevant meta-tags

Pre-Rendering and SSR Are Not the Only Solutions to This Problem

Server-side rendering

Once you google this problem, you will find that the solution most bloggers and tutorials suggest is a framework such as Next.js or Nuxt.js and server-side rendering (SSR).

SSR is a technique in which any URL of your app is rendered on your server whenever a user or a crawler/bot asks your back end for that specific URL.

Instead of sending the index.html that references your JavaScript bundles and CSS, your back end sends static HTML that is rendered on the fly on the server.

A crawler stops at this point, as it just wanted information from one single URL. A normal user sees the static HTML first; then their browser loads your app and, from that point onwards, uses the JavaScript-run version of your site.

Sounds easy? Well, it can be, in particular for lightweight apps and websites that are built from scratch in Next.js or Nuxt.js. For more complex apps or apps originally built outside such frameworks, you can, however, run into various issues implementing SSR.

Pre-rendering or dynamic rendering

Hence, you might look into the other option typically employed to solve the problem: pre-rendering or dynamic rendering. For static pages that never change, pre-rendering them before deploying your app is a very simple solution.

As these pages don’t change unless you do so manually and redeploy, you can pre-render them every time you deploy and serve them just as if they were server-side rendered.

Unfortunately, users who share content on social media sites typically don’t share simple static pages such as your terms and conditions or your landing page.

Users want to share content they found: a blog post, an article, or anything else interesting. If these pages are user content, their URLs are necessarily created dynamically, which means pre-rendering whenever you deploy your app is not a viable solution.

The answer for user-generated dynamic content is a variation on pre-rendering: dynamic rendering. Third-party services such as prerender.io allow you to pre-render tens of thousands of URLs and add newly created URLs on the fly.

These third-party services also regularly re-crawl all these URLs to see if they have been updated. It's not a perfect solution compared to SSR (pre-rendered content inevitably lags behind the current state of a page by a few days), but for most SPAs it is easy enough to set up and works well to make social sharing work without problems.

As an added bonus, you can direct search engine crawlers to these pre-rendered pages, specifically to cover search engines such as Bing and Yandex, which, unlike Google, do not execute JavaScript when indexing sites.


A Third Solution That Does Not Involve SSR Frameworks or Pre-Rendering/Dynamic Rendering

When I searched this topic to implement our first iteration of Epiloge, I came across a reference on Stack Overflow that pretty much said: “Just give social media crawlers what they want — the meta tag information.”

I was intrigued that this might be an even easier solution to the problem. However, it has both drawbacks and benefits.

If you do not worry about search engines other than Google, and only want social sharing on large social media sites such as Facebook and Twitter, this solution might be for you.

It is pretty much a simplified server-side rendering solution that does not require Next.js or Nuxt.js. You create HTML pages with a <head> tag, a simplified body, and all required meta tags that the social sharing crawler can read.
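To make the idea concrete, here is a sketch of the kind of minimal HTML a crawler could receive (all titles, descriptions, and URLs are placeholders, not Epiloge's actual markup):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>My Article Title</title>
    <!-- Open Graph tags read by Facebook and most other social crawlers -->
    <meta property="og:title" content="My Article Title" />
    <meta property="og:description" content="The first sentences of the article." />
    <meta property="og:image" content="https://example.com/cover.jpg" />
    <meta property="og:url" content="https://example.com/article/123" />
    <!-- Twitter-specific card tag -->
    <meta name="twitter:card" content="summary_large_image" />
  </head>
  <body>
    <h1>My Article Title</h1>
    <p>The first sentences of the article.</p>
  </body>
</html>
```

The body can stay this bare; the crawlers only scrape the head and a little visible content.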


Implementing This Third Solution in a Node Back End

We are using Node.js with Express as our back-end server at Epiloge. If you are not using Node.js, you should still be able to use the principles described below. They are straightforward to integrate into any back end.

1. Serve your index.html from your back end for this solution to work

If you have your Vue.js, React.js, or other SPA deployed and accessed by users directly from a cloud platform such as Amazon S3 without having the user first access your back end, the solution presented here won’t work.

You need all traffic accessing your website, in our case, our domain www.epiloge.com, to hit your Node.js server first. This way, you can check the user agent of incoming traffic for crawlers — while you serve your index.html as a static file to all other users.
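A minimal sketch of that user-agent check, assuming Express (the route name nonSPArouter, the paths, and the exact user-agent pattern are illustrative):

```javascript
// Matches the user agents of the social crawlers we care about here;
// extend the pattern for any other bots you want to serve.
const BOT_UA = /facebookexternalhit|Facebot|Twitterbot/i;

function isCrawler(userAgent) {
  return BOT_UA.test(userAgent || '');
}

// Express wiring, assuming `app` is your Express app and `nonSPArouter`
// is an express.Router() holding the crawler-only routes:
//
//   app.use((req, res, next) => {
//     if (isCrawler(req.headers['user-agent'])) {
//       return nonSPArouter(req, res, next);
//     }
//     next();
//   });
//
//   // all other traffic gets the SPA entry point as a static file
//   app.get('*', (req, res) => {
//     res.sendFile(path.join(__dirname, 'dist', 'index.html'));
//   });
```
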

The approach is easily explained: insert a new Express route, calling it something like nonSPArouter, that only deals with crawlers hitting your server, and check the user agent of incoming requests against the most common crawlers.

In the example above, I only included Facebook and Twitter, but you could add a lot more, such as LinkedInBot, Quora, Pinterest, Slackbot, WhatsApp, Telegram, etc.

2. For static pages, hardcode the page and serve it with your preferred template engine

We are using Jade (since renamed to Pug) as a templating engine. In our Node.js server file, we install Jade with npm install jade and add app.set('view engine', 'jade') to the server.js file.

This is all you need to integrate Jade as a template engine. Create a bot.jade file under /views in your back-end structure and you are ready to serve static HTML files.
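A bot.jade along these lines would do; the tag set shown is a typical minimum for Facebook and Twitter, and every value is a placeholder passed in from the route as a local:

```jade
//- views/bot.jade (sketch; locals title, description, image, url come from the route)
doctype html
html
  head
    title= title
    meta(property='og:title', content=title)
    meta(property='og:description', content=description)
    meta(property='og:image', content=image)
    meta(property='og:url', content=url)
    meta(name='twitter:card', content='summary_large_image')
  body
    h1= title
    p= description
```

A crawler route can then serve a static page with something like res.render('bot', { title: 'About us', description: 'What Epiloge is about', image: '', url: 'https://www.epiloge.com/about' }).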

3. For dynamic pages, you need to get data from your database first and then include it in the HTML you render with Jade

For our app, we wanted social sharing to particularly work for URLs of projects, papers, and articles people share on Epiloge.

For this to work, we first need to get the title, the start of the text included in the body, and the relevant cover image, if any, to set meta tags in our bot.jade template file.

The Jade template engine renders the template with the information we get from the database. The server then sends the rendered static HTML to the crawler, which can parse any required information for social sharing from its content.
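As a sketch, assuming Express and a Mongoose-style model (the model name Article and its fields title, text, and coverImage are illustrative assumptions, not Epiloge's actual schema), the locals for bot.jade can be built like this:

```javascript
// Turn a database record into the locals rendered into bot.jade.
function buildBotLocals(article, pageUrl) {
  return {
    title: article.title,
    // crawlers only need the start of the body text as a description
    description: (article.text || '').slice(0, 160),
    image: article.coverImage || '',
    url: pageUrl,
  };
}

// A crawler-only route using it:
//
//   nonSPArouter.get('/article/:id', async (req, res) => {
//     const article = await Article.findById(req.params.id);
//     if (!article) return res.status(404).end();
//     res.render('bot', buildBotLocals(
//       article, `https://www.epiloge.com/article/${req.params.id}`));
//   });
```
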

And that’s it, no further code or changes to your site necessary (Photo by Yerlin Matu on Unsplash)

And We Are Already Done

And that's it: without a lot of code and without any third-party framework, plugin, or service, we can make social sharing sites access data from the URLs that are shared. The trick is simply to hand them a rudimentary HTML file with meta tags from which they can scrape information.

Is this solution suitable and recommended for every app you build and deploy?

Probably not. If you have a more complex app in mind with social sharing and search engine indexing for all search engines as a critical part of success, you should probably look into frameworks such as Nuxt.js or Next.js.

If you have only a few hundred or at most a few tens of thousands of dynamic URLs and don't want to use SSR, look into dynamic rendering.

Oh, and by the way, dynamic rendering is something Google specifically endorses, so don't worry that you aren't allowed to do it; you are.

If you already have a Vue.js or React.js app built, don't want to invest a lot of time and effort to integrate SSR into your back end, don't want to bother with dynamic rendering, or need crawlers to access current data at all times, this third solution may be for you.

Better Programming

Advice for programmers.

Thanks to Zack Shapiro

Written by Axel Wittmann

Co-founder and CTO of epiloge.com — interested in all things tech, startups and political & cultural changes in society

