Googlebot is the new IE

Googlebot is stuck on Chrome 41, which lacks support for many modern JavaScript features. SPA and library authors haven’t escaped the legacy browsers yet.

Matt Perry
DriveTribe Engineering
5 min read · May 11, 2018


One of my favourite new features in JavaScript is Proxy. As a library author, it’s exciting to think of the possibilities it opens for new APIs.

In Pose, I use it to make a simple syntax for defining animated versions of any DOM element (e.g. posed.div) without maintaining a comprehensive list of tag names.
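
Roughly, the trick looks like this. This is a sketch of the idea only, not Pose's actual source, and createPosedComponent is a hypothetical stand-in for whatever builds the animated component:

```js
// A sketch of the idea only, not Pose's actual source. createPosedComponent is
// a hypothetical stand-in for whatever builds the animated component.
const createPosedComponent = tagName => ({ tagName /* ...animation config */ });

// The Proxy target is the factory itself, so posed stays callable, while the
// get trap resolves posed.div, posed.circle, etc. on demand.
const posed = new Proxy(createPosedComponent, {
  get: (target, tagName) => target(tagName)
});

const Div = posed.div;       // no hard-coded list of tag names required
const Circle = posed.circle;
```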

I use it in Vekta to make a vector unit type that can be used as a normal array while supporting GLSL-style swizzling (e.g. pos.xy = pos.yz).
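
Again as a rough sketch rather than Vekta's real implementation, the swizzling hinges on Proxy's get and set traps:

```js
// A rough sketch of the swizzling trick, not Vekta's real implementation.
const axes = { x: 0, y: 1, z: 2, w: 3 };
const isSwizzle = prop =>
  typeof prop === 'string' && prop.length > 0 && [...prop].every(axis => axis in axes);

const vector = (...values) =>
  new Proxy(values, {
    // pos.yz reads components by name...
    get: (target, prop) =>
      isSwizzle(prop) ? [...prop].map(axis => target[axes[axis]]) : Reflect.get(target, prop),
    // ...and pos.xy = [a, b] writes them, so the value still behaves like a plain array.
    set: (target, prop, value) => {
      if (!isSwizzle(prop)) return Reflect.set(target, prop, value);
      [...prop].forEach((axis, i) => (target[axes[axis]] = value[i]));
      return true;
    }
  });

const pos = vector(1, 2, 3);
pos.xy = pos.yz; // pos is now [2, 3, 3]
// (Single-component reads like pos.x would need their own case; omitted for brevity.)
```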

There are other ways of achieving these capabilities, but they all require a compromise on filesize, syntax, or both.

Unlike most new JavaScript features, Proxy can’t be transpiled, and it can’t be effectively polyfilled. So, browsers that don’t support Proxy don’t support these libraries.

Until yesterday, I thought this was no problem! For many websites, IE usage is practically a statistical error (check your stats!). All major browsers support Proxy, so I felt confident using it and letting consumers decide whether or not to support legacy browsers.

Except…

Of course, in web development things are rarely that simple. The other day, I received this GitHub issue. It read:

I learn that googlebot is using chrome 41 for rendering websites and if it encounters not supported syntax without polyfill it won’t render the page if it’s a SPA without prerenderd content.

The implication being that sites that rely exclusively on client rendering and wish to be indexed by Google are blocked from using any JavaScript feature not supported in Chrome 41, unless it can be transpiled or polyfilled.

By extension, if you’re a library author who wants your library to be widely consumed, you can’t use these features either.

This means no Proxy, at all. And it has wider implications, still.

For instance, the popular Babel env preset transpiles code according to the feature sets of target browsers, using rules like "last 2 versions". This has always felt, to me, like a reasonable setting, but at the time of writing Chrome 41 is 25 versions behind the current stable release!
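
As an illustration (assuming a Babel 6-era setup with babel-preset-env), a .babelrc targeting "last 2 versions" quietly drops Chrome 41; if Googlebot matters to you, it has to be named explicitly, something like:

```json
{
  "presets": [
    ["env", {
      "targets": {
        "browsers": ["last 2 versions", "chrome >= 41"]
      }
    }]
  ]
}
```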

There’s a long list of popular features that Chrome 41 doesn’t support:

  • Proxy
  • IntersectionObserver
  • class
  • Rest parameters
  • Arrow functions
  • Array methods find, findIndex and includes
  • Object.values, Object.assign

My heart.

If your SPA relies on any of the above, and these features aren’t transpiled or polyfilled, Google will be unable to index it.
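
Everything on that list except Proxy can be transpiled or polyfilled. As a sketch (assuming a React app with a Babel 6-era toolchain; the file paths are placeholders), the fix is to pull the polyfills in at the very top of your entry point and let Babel handle syntax like class and arrow functions:

```js
// index.js — the app's entry point. Polyfills have to load before anything else runs.
import 'babel-polyfill';        // Object.values, Object.assign, Array includes/find, etc.
import 'intersection-observer'; // the W3C IntersectionObserver polyfill
// There is no equivalent line for Proxy: nothing you can import gives Chrome 41 Proxy.

import React from 'react';
import { render } from 'react-dom';
import App from './App'; // placeholder path

render(<App />, document.getElementById('root'));
```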

🙋🏽 I write an SPA, what can I do?

First, check your site with Fetch as Google. If you write an SPA and it isn’t being indexed properly, you can use this tool to render your site as Google does.

If your site looks broken, it’s possible your site isn’t transpiling or polyfilling a JavaScript feature and Chrome is throwing an error. You can debug this by downloading Chrome 41 from Google.
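
A quick way to see what’s missing is to paste a feature check into Chrome 41’s DevTools console:

```js
// Paste this into Chrome 41's DevTools console to see what's missing.
// It's written in ES5 on purpose: Chrome 41 can't even parse arrow functions.
var checks = {
  'Proxy': typeof Proxy !== 'undefined',
  'IntersectionObserver': typeof IntersectionObserver !== 'undefined',
  'Array.prototype.find': typeof Array.prototype.find === 'function',
  'Array.prototype.includes': typeof Array.prototype.includes === 'function',
  'Object.values': typeof Object.values === 'function',
  'Object.assign': typeof Object.assign === 'function'
};

Object.keys(checks).forEach(function (name) {
  console.log(name + ': ' + (checks[name] ? 'supported' : 'missing'));
});
```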

Server-side rendering

If your SPA is dependent on Google indexing, then it’s likely to be a content site. Personally, I think that content sites should be rendered server-side (SSR). It’s better for performance and reliability, so it’s better for users.

SSR was once a black art, but since the advent of Next.js there’s almost no excuse. If you write a React app, Next.js makes it trivial to write and deploy a version that can be rendered on the server and then enhanced by the client.
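
For a flavour of how little it takes, here’s a minimal page as a sketch (assuming Next.js 5/6-era APIs; the URL and data shape are placeholders):

```js
// pages/index.js — a minimal server-rendered page, assuming Next.js 5/6-era APIs.
// The first request is rendered on the server, so Googlebot receives complete HTML;
// the client then hydrates and behaves like a normal SPA.
import React from 'react';
import fetch from 'isomorphic-unfetch';

const Home = ({ posts }) => (
  <ul>
    {posts.map(post => (
      <li key={post.id}>{post.title}</li>
    ))}
  </ul>
);

// Runs on the server for the initial request, in the browser for client-side navigation.
Home.getInitialProps = async () => {
  const res = await fetch('https://example.com/api/posts'); // placeholder URL
  const posts = await res.json();
  return { posts };
};

export default Home;
```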

If your site doesn’t rely on sessions or dynamic data, you can even use Next.js to output a static site, which will be more performant still.
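
(With that era of Next.js, that’s a next build followed by next export, which writes out plain HTML files you can host anywhere.)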

🙋 I write a library, what can I do?

People who write libraries have a choice to make. You, as an author, need to think about who your audience is.

I want Pose to be enjoyed by as many people as possible, so after receiving this GitHub issue I made the filesize compromise and added an array of all valid DOM elements. Over time, I’ll revise this list to remove the more esoteric elements and signpost the existing-but-undocumented posed('caption') syntax for lesser-used DOM elements.
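
Very roughly, that compromise has this shape (createPosedComponent and the element list are stand-ins, not Pose’s actual source):

```js
// Hypothetical stand-in for whatever actually builds the animated component.
const createPosedComponent = tagName => ({ tagName /* ...animation config */ });

const supportedElements = ['div', 'span', 'a', 'ul', 'li', 'img', 'svg', 'circle', 'rect'];

// Generate a factory for every known tag name up front: bigger bundle, no Proxy.
// Starting from the factory itself keeps posed callable, so posed('caption')
// still covers anything missing from the list.
const posed = supportedElements.reduce((acc, tagName) => {
  acc[tagName] = createPosedComponent(tagName);
  return acc;
}, createPosedComponent);
```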

On the other hand, Vekta is more of a personal project that I made simply to probe the capabilities of Proxy. Its unique syntax is difficult to replicate without it. So I’m happy to leave it as-is.

Whatever you choose, knowledge of Googlebot’s limitations with SPAs isn’t as commonplace as knowledge of browser compatibility. So, if you use a feature like Proxy, or a library that relies on an optional polyfill, mention this in your readme.

Conclusion

I thought that with the impending burial of IE11 we were close to leaving behind some of the limitations around adopting new JavaScript technologies.

Sadly it seems like SPA and library authors will be stuck in 2015 for another year or more. I searched for any reference to when a Googlebot upgrade might occur, and I could only find this tweet from Ashley back in March:

Wah indeed.

Recommendations recap:

If you’re an SPA author and Google indexing is important to you, ensure Googlebot can see your site properly by using Fetch as Google, and debug any problems with Chrome 41. In the longer term, consider SSR and the benefits it could bring your users on slower connections.

If you’re a library author, think about the degree of adoption you want your library to enjoy, and how that might affect your choices in how you implement features. If you choose to use unsupported features, signpost this in your documentation so developers of SPAs don’t have to find out via the scenic route.

Finally, if you work near Whitecross St in London, the extra spicy pork, chorizo and guac burrito from Luardos is an absolute banger. Thank me later.
