JavaScript & SEO Crawlability: Solutions

In the beginning, JavaScript and SEO were at odds, but now they can be best friends with the right planning.

The use of JavaScript is generally accepted as one way to speed up the perceived performance of a site, and there’s an overwhelming body of data and studies showing that performance has significant business ramifications. Just follow #perfMatters for a few hours.

So from developers to business stakeholders, there’s a strong desire to use a modern JS-heavy framework like Angular, Ember, React/Flux, Backbone, and others. But even with all the benefits to revenue, user experience, and code maintainability, many companies that generate much of their revenue online are slow to change. They’re afraid to risk their large and valuable organic search traffic because of past advisories from search engines and SEOs that search engine crawlers could not render JavaScript-generated content.

However, Google has been gaining confidence in their ability to render JavaScript at scale. They’re so confident that they’ve recently announced they’re deprecating the heavily used AJAX crawlability specification they first introduced back in October of 2009. Although this is exciting news, there are caveats. Primarily, Googlebot still isn’t unpacking an entire JS-heavy framework to discover all interactions, state, and content changes yet; but more importantly, this is only Google. Even though Bing and Yahoo! combined have a smaller market share, there’s still significant engagement, audience, and revenue to be had through those search engines, as well as other engines in other markets; and we still have to consider social behemoths like Facebook, Twitter, and Pinterest.
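For context on what’s being deprecated: under the old AJAX crawling scheme, a site exposed stateful URLs with a “hashbang” (`#!`), and a participating crawler rewrote them into an `_escaped_fragment_` query parameter so the server could return a pre-rendered HTML snapshot. Below is a small sketch of that URL mapping; the function name and example URL are my own, and this is a simplified illustration of the scheme rather than a complete implementation of it.

```javascript
// Sketch of the URL rewrite from the (now deprecated) AJAX crawling scheme:
// a crawler that sees a "#!" URL requests the "ugly" equivalent with the
// fragment moved into an _escaped_fragment_ query parameter, and the server
// is expected to answer that request with a pre-rendered HTML snapshot.
function toEscapedFragmentUrl(hashbangUrl) {
  const [base, fragment] = hashbangUrl.split('#!');
  if (fragment === undefined) return hashbangUrl; // not an AJAX-crawlable URL
  const separator = base.includes('?') ? '&' : '?';
  return base + separator + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

console.log(toEscapedFragmentUrl('https://example.com/products#!category=shoes'));
// → https://example.com/products?_escaped_fragment_=category%3Dshoes
```

The deprecation means Google would rather crawl and render the pretty URL directly than rely on sites maintaining these parallel snapshot endpoints.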

Below is a 22-minute talk I recently gave at Pubcon 2015 in Las Vegas entitled “SEO and JS: New Challenges” (slides), where I dive into the core issues and review potential solutions.

For the impatient, the TL;DR: you’ll want to select a JS-heavy framework that makes it easy to server-side render your content. But if you’re in a rush, or only care about a subset of crawlers, check out our test site, JSCrawlability.com, to understand how Google and other crawlers perceive popular JS-heavy frameworks.
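To make the server-side rendering point concrete, here’s a stripped-down sketch with no framework involved: the server bakes the content into the initial HTML response, so a crawler that never executes JavaScript still sees it, while the client-side app can boot afterward and take over. The `renderProductPage` function and the product data are invented for illustration; in a real app your framework would do this step (for example, React’s `ReactDOMServer.renderToString`).

```javascript
// Hypothetical sketch of server-side rendering from a crawler's perspective:
// the meaningful content is already present in the HTML payload, so no
// JavaScript execution is required to discover it.
function renderProductPage(product) {
  return [
    '<!DOCTYPE html>',
    '<html><head><title>' + product.name + '</title></head>',
    '<body>',
    '<h1>' + product.name + '</h1>',
    '<p>' + product.description + '</p>',
    // The client-side app still loads and can enhance the page from here.
    '<script src="/app.js"></script>',
    '</body></html>'
  ].join('\n');
}

const html = renderProductPage({
  name: 'Trail Runner',
  description: 'A lightweight shoe for rocky terrain.'
});

// Content is visible in the raw response, before any script runs.
console.log(html.includes('Trail Runner')); // → true
```

The design choice this illustrates is the one the TL;DR recommends: pick a framework where producing this server-rendered payload is a supported, low-friction path rather than a bolt-on.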

Following Up

The next steps for JSCrawlability.com are to test more commonly used design patterns within React, Angular, and Backbone, as well as to expand into other frameworks like Ember and Meteor, or interesting Flux-like flavors such as redux or catberry. Additionally, it would be interesting to understand emerging projects like Accelerated Mobile Pages (AMP), which utilize web components and enforce design constraints to improve the performance of sites.

Lastly, if SEO is something you think is important and performance is a priority, consider this.

SEO is NOT a marketing discipline. SEO is product management, where the use cases and job stories begin with “When a customer uses a search engine …”. If you keep to these two core principles:

  1. Build the best user experience for users who are navigating to your site through a search engine
  2. Make all content crawlable, indexable, and understandable by search engine crawlers

You’ll find organic search growth and the audience you’re looking for, so long as you have worthwhile products, content, and/or services.