Rendering, Two Waves, and the Future of JavaScript SEO

Ziemek Bućko
Onely
Sep 11, 2019

In August, Onely’s CEO, Bartosz Góralewicz, a leading authority on all things JavaScript SEO, went to Zurich to meet with Google’s John Mueller and Martin Splitt. Together, they hosted an English Google Webmaster Central office-hours Hangout.

After answering a couple of intriguing questions asked by the developers who tuned in, they spent some time discussing the details of modern JavaScript SEO.

And what I mean by that is, things got really geeky.

Not that we could have expected anything else when these three experts get together.

Here are the key takeaways from their conversation about the near future of JavaScript SEO.

1. Every new domain goes through rendering by default

The specifics of how Google is indexing the web and particularly new domains have always been a bit of a mystery.

During the Hangout, Splitt revealed that every new website gets rendered by default in the process of indexing.

He specifically stated: “There’s no indexing before it hasn’t been rendered.”

After the initial rendering of the HTML file, the algorithm decides whether to keep rendering the domain or turn the rendering off.

The next takeaway goes into more detail about how this decision is made.

2. Google looks for differences in the DOM pre- and post-rendering

The Document Object Model (DOM) is a programming interface that represents the structure of an HTML or XML document as a logical tree. As it turns out, Google compares the DOM before and after rendering to decide whether it needs to keep rendering turned on long-term for a particular website.

Bartosz Góralewicz: So one of the factors for you is, like, the difference between the –

Martin Splitt: — initial crawl –

BG: — initial HTML, whatever, and then the rendered DOM.

MS: Yeah, crawled DOM and rendered DOM.

In other words, Google compares the DOM built from the crawled HTML with the DOM after rendering, and if the rendered DOM contains content that wasn’t there before, that’s a reason for Google to keep rendering turned on for the site.
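To picture what that difference looks like from a site’s perspective, here is a minimal client-side rendering sketch (the endpoint and markup are hypothetical): the HTML Google crawls contains only an empty container, and the links exist only in the rendered DOM, after the script has run.

```typescript
// Minimal client-side rendering sketch (hypothetical endpoint and markup).
// The crawled HTML contains only <div id="app"></div>; everything below
// appears in the DOM only after rendering, i.e. after this script runs.
async function renderApp(): Promise<void> {
  const app = document.getElementById("app");
  if (!app) return;

  // Content fetched at runtime: invisible in the initial HTML response.
  const response = await fetch("/api/products.json");
  const products: { name: string; url: string }[] = await response.json();

  app.innerHTML = products
    .map((p) => `<a href="${p.url}">${p.name}</a>`)
    .join("");
}

renderApp();
```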

3. Two waves of indexing are “less of a factor”

At Onely, we have been experimenting a lot with JavaScript websites to determine why Google indexes some of them in two waves and others in just one. Bartosz wanted to get some insider knowledge on the topic and was curious to hear what Splitt had to say.

Asked about the factors which determine two waves of indexing, Splitt mentioned the difference between the content detected in the initial HTML and what is visible after rendering:

MS: What we do is we do an HTTP request, and we get something back, right — some HTML, maybe it’s a barebone HTML and all it does is load the JavaScript and run the JavaScript. Then, this HTML that we got from the original HTTP GET request from the crawl, goes into rendering. Rendering runs JavaScript — boom!, a lot of content happens that wasn’t there before — so we’re like, aha! Ok, so this needs to be rendered.

In short, if running the JavaScript produces new content in the DOM, the website is more likely to be indexed in two waves.
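You can approximate that comparison yourself. The sketch below, assuming Node 18+ (for the built-in fetch) and the puppeteer package, fetches a page’s raw HTML and then the DOM produced by a headless browser; a large gap between the two is roughly the kind of signal Splitt describes.

```typescript
// A rough sketch of comparing the crawled HTML with the rendered DOM.
// Assumes Node 18+ and the puppeteer package.
import puppeteer from "puppeteer";

async function compareCrawledVsRendered(url: string): Promise<void> {
  // 1. "Crawled" DOM: the raw HTML returned by a plain HTTP GET request.
  const rawHtml = await (await fetch(url)).text();

  // 2. "Rendered" DOM: the same page after a headless browser runs its JavaScript.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle0" });
  const renderedHtml = await page.content();
  await browser.close();

  // A naive signal: how much markup exists only after rendering.
  console.log(`Raw HTML length:      ${rawHtml.length}`);
  console.log(`Rendered DOM length:  ${renderedHtml.length}`);
  console.log(`Added by JavaScript:  ${renderedHtml.length - rawHtml.length} characters`);
}

compareCrawledVsRendered("https://example.com").catch(console.error);
```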

Additionally, it turns out that in order to make indexing less resource-intensive, more and more websites that don’t use JavaScript at all are also going through the render phase.

MS: [Skipping rendering] is not as frequently happening anymore. For many websites, even if they do not run JavaScript, they might still go through the render phase, because it doesn’t make that much of a difference. (…) It’s cheaper for us.

. . .

The internals of that are very complicated, and I still haven’t fully, like, grasped what exactly triggers the [two-waves indexing] heuristics.

Splitt also mentioned Google’s plans to make the whole indexing process more seamless, which in the end could make the two waves of indexing a non-issue.

MS: I expect [that] eventually, rendering, crawling, and indexing will come closer together.

You can watch Bartosz’s specific thoughts on this takeaway in this video:

“Is Google’s Two Waves of Indexing Over?” from Onely’s YouTube Channel.

4. JavaScript SEO is here to stay

The discussion continued with an exploration of the future of JavaScript SEO.

Since Google is getting better at rendering modern websites, and indexing JavaScript-generated content is less and less of an issue, JavaScript SEO is likely to evolve in a different direction. As Mueller said:

JM: With normal, technical SEO, it’s already hard, and it’s something that a lot of people struggle with, whether it’s, like, the internal linking and you have unique URLs and all of these things. And, with JavaScript it’s all hidden away, so you really have to know how JavaScript works, and when something goes wrong, that’s really not gonna be trivial to find, and new frameworks, like, new elements in Chrome, all of these things, they kind of come together . . .

Splitt gave a great example of how modern web solutions may need to be optimized for crawling and indexing algorithms, regardless of whether they are viewed in the most advanced browsers or not:

MS: We have to make a decision [on] what to index . . . let’s say I go to a website that has a web component, and there’s something in the shadow DOM, then I see the shadow DOM content. [But] if I run Internet Explorer 10, I see light DOM content, which gets overwritten . . . That’s still something that you need to know and be aware of.

Mueller added:

JM: . . . things like shadow DOM and light DOM, it’s like, how are you gonna figure that out unless you already know that this is a thing, or things like, you’re using <canvas> to put content out there and we think, oh, <canvas> is an image so we index it as an image . . .

Wait — Shadow DOM?

Since shadow DOM and web components are relatively new concepts, let’s take a moment to understand what they are, and why these features matter for the future of JavaScript SEO.

Shadow DOM is one of the technologies behind web components. Using shadow DOM allows developers to create self-contained elements that are not affected by rules or styles defined in the main document.

It lets developers work on independent parts of an HTML document without worrying about conflicting global settings. These elements are highly reusable because they don’t depend on any external code, and tinkering with them does not affect other parts of the document.

Browsers have long used the shadow DOM to encapsulate the inner structure of elements like <video> or <textarea>.

For instance, when a browser renders the <video> tag, it automatically attaches a shadow DOM to it in order to display the browser’s default video controls.

Similarly, a developer can use JavaScript to attach a shadow root (the top element of the shadow DOM tree) to any element and apply scoped CSS along with other properties.

Since elements in the shadow DOM are self-contained, they can be easily used across many projects. They never get overwritten by global CSS properties, nor do they trigger naming conflicts. This allows for a convenient, segmented web design process.
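As a rough illustration (the element name and styles are made up), here is how a developer might attach a shadow root with scoped CSS, and how the light DOM fallback that Splitt mentioned comes into play for browsers or crawlers that never run the script:

```typescript
// A rough shadow DOM sketch; the element name and styles are hypothetical.
class ProductBadge extends HTMLElement {
  constructor() {
    super();
    // Attach a shadow root: the top of this element's own, encapsulated DOM tree.
    const shadow = this.attachShadow({ mode: "open" });

    // Scoped styles: they never leak out, and global CSS never leaks in.
    const style = document.createElement("style");
    style.textContent = `p { color: darkgreen; font-weight: bold; }`;

    const content = document.createElement("p");
    content.textContent = "In stock";

    shadow.append(style, content);
  }
}

customElements.define("product-badge", ProductBadge);

// In the page's HTML:
//   <product-badge>Availability unknown</product-badge>
// Modern browsers render the shadow DOM content ("In stock"); a browser or
// crawler that never runs this script shows the light DOM text inside the tag.
```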

But what about JavaScript SEO?

JavaScript SEO will get even more technical, as it will have to address how modern web platform features behave in search, and not only rendering issues.

MS: . . . Right now, it’s more figuring out what’s going wrong probably, and helping troubleshooting, and it’s gonna turn more into, like, there’s 10 ways of doing this in JavaScript, 9 of them are terrible because — it’s like, while developers are trying to figure out the right way –

That’s one of the reasons why I want developers and SEOs to sit at the same freaking table. Because developers are like, “OK, so this is really hard for us, this is making everything slower,” and they are not necessarily thinking about, “Can Google index this, or can, can search engines see this.”

You can watch Bartosz’s specific thoughts on this takeaway in this video:

“JavaScript SEO is Dead, Long Live JavaScript SEO!” from Onely’s YouTube Channel.

Wrapping Up

To sum up, here’s what Googlers think: JavaScript SEO is not going away, but it’s bound to evolve along with the technical possibilities of web development.

With various web design solutions available, SEOs will have to work more closely with developers to strive for web design that works well with search engines and, at the same time, provides a quality user experience.
