More Proof That Longer Clicks Matter for SEO & How to See It Yourself

What’s the real impact of machine learning — and specifically Google RankBrain — on SEO? This has been one of the biggest debates within SEO over the last year.

I won’t lie: I’ve been a bit obsessed with machine learning over the past year, because my theory is that RankBrain and/or other machine learning elements within Google’s core algorithm are increasingly rewarding pages with high user engagement.

Basically, Google wants to find unicorns — pages that have extraordinary user engagement metrics like organic search click-through rate (CTR), dwell time, bounce rate, and conversion rate — and reward that content with higher organic search rankings.

Happier, more engaged users mean better search results, right?

So, essentially, machine learning is Google’s Unicorn Detector.

Machine Learning & Click-Through Rate

Many SEO experts and influencers have said that it’s totally impossible to find any evidence of Google RankBrain in the wild.

That’s ridiculous. You just need to run SEO experiments and be smarter about how you conduct them.

That’s why, in the past, I ran an experiment that looked at CTR over time. I was hoping to find evidence of RankBrain (or other machine learning elements).

What I found: results that have higher organic search CTRs are getting pushed higher up the SERPs and getting even more clicks.

Click-through rate is just one way to see the impact of machine learning algorithms. Today, let’s look at another important engagement metric: long clicks.

Time on Site Acts as a Proxy for Long Clicks

Not convinced that long clicks impact organic search rankings (whether directly or indirectly)? Well, I’ve come up with a super easy way you can prove to yourself that the long click matters, while also revealing the impact of machine learning algorithms.

In today’s experiment, we’re going to measure time on page. To be clear: time on page isn’t the same as dwell time or a long click (that is, how long people stay on your page before they hit the back button to return to the search results where they found you).

We can’t measure long clicks or dwell time in Google Analytics. Only Google has access to this data.

Time on page itself doesn’t really matter to us. We’re only looking at it because it’s a reasonable proxy for those metrics: pages that hold visitors longer tend to earn longer dwell times and more long clicks.

Time on Site & Rankings (Before RankBrain)

To get started, go into your Google Analytics account. Pick a time frame from before the new algorithms were in play (e.g., 2015).

Segment your content report to show only your organic traffic, and then sort by pageviews. Then run a Comparison analysis that measures each page’s average time on page against the site average.
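If you’d rather work from a CSV export than click through the GA interface, here’s a minimal sketch in Python with pandas. The file name and column names (page, medium, pageviews, avg_time_on_page) are my assumptions about the export, not a standard Google Analytics format:

```python
import pandas as pd

# Hypothetical CSV export of the Google Analytics "All Pages" report,
# one row per (page, medium), with assumed columns:
#   page, medium, pageviews, avg_time_on_page (seconds)
df = pd.read_csv("all_pages_2015.csv")

# The "segment" step: keep only organic search traffic
organic = df[df["medium"] == "organic"].copy()

# Site-wide average time on page, weighted by pageviews
site_avg = (organic["avg_time_on_page"] * organic["pageviews"]).sum() / organic["pageviews"].sum()

# Percent difference from the site average (what GA's Comparison view shows)
organic["vs_site_avg_pct"] = 100 * (organic["avg_time_on_page"] - site_avg) / site_avg

# Top pages by organic pageviews, with their deviation from the average
top = organic.sort_values("pageviews", ascending=False).head(32)
print(top[["page", "pageviews", "avg_time_on_page", "vs_site_avg_pct"]])
```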

You’ll see something like this:

These 32 pages drove most of my organic traffic in 2015. Time on site is above average for about two-thirds of these pages, but it’s below average for the remaining third.

See all those red arrows? Those are donkeys — crappy pages that were ranking well in organic search but, in all honesty, had no business ranking well. They were out of their league. Time on page was half or a third of the site average.

Time on Site & Rankings (After RankBrain)

Now let’s do the same analysis, but for a more recent time period when we know Google’s machine learning algorithms were in play (e.g., the last three or four months).
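If you scripted the first report, the same sketch wraps neatly into a function so both time windows can be compared side by side (again assuming the hypothetical export format from before):

```python
import pandas as pd

def organic_time_vs_avg(csv_path: str) -> pd.DataFrame:
    """Organic pages ranked by pageviews, with avg. time on page vs. the site average."""
    df = pd.read_csv(csv_path)
    organic = df[df["medium"] == "organic"].copy()
    site_avg = (organic["avg_time_on_page"] * organic["pageviews"]).sum() / organic["pageviews"].sum()
    organic["vs_site_avg_pct"] = 100 * (organic["avg_time_on_page"] - site_avg) / site_avg
    return organic.sort_values("pageviews", ascending=False)

before = organic_time_vs_avg("all_pages_2015.csv")   # pre-RankBrain window
after = organic_time_vs_avg("all_pages_recent.csv")  # last three or four months

# Did the 2015 donkeys (below-average time on page) survive into the recent data?
merged = before.head(32).merge(after, on="page", suffixes=("_2015", "_recent"))
print(merged[["page", "vs_site_avg_pct_2015", "vs_site_avg_pct_recent"]])
```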

Do the same comparison analysis. You’ll see something like this:

Look what happens now when we analyze the organic traffic. All but two of my pages have above-average time on page.

This is kind of amazing to see. So what’s happening?

Does Higher Dwell Time = Higher Search Rankings?

It seems that Google’s machine learning algorithms have seen through all those pages that used to rank well in 2015, but really didn’t deserve to be ranking well. And, to me, it certainly looks like Google is rewarding higher dwell time with more prominent search positions.

Google detected a lot of donkeys (about 80 percent of them!) and terminated them. Now nearly all the pages with the most traffic are unicorns.

I won’t tell you which pages on the WordStream site those two remaining donkeys are, but I will tell you that they shouldn’t be ranking for the terms they currently do. Those pages were created purely to bring in traffic (mission: successful), but honestly their alignment with search intent isn’t great.

This report also revealed something ridiculously important for us: these are our two most vulnerable pages in terms of SEO. In other words, these two pages are the most likely to lose organic rankings and traffic.

What’s so great about this report is that you don’t have to do a lot of research. Just open up your analytics and look at the data yourself. Compare a time frame from well before the algorithm change to recent history (the last three or four months).

What Does It All Mean?

This report is basically your donkey detector. It will show you the content that is most vulnerable to future incremental losses in organic traffic and search rankings.

That’s how machine learning works. It doesn’t eliminate all your traffic overnight (like a Panda or Penguin update). It’s gradual.

It also doesn’t seem to matter how many links point at these donkey pages or how well they’re optimized for keywords. In the new ranking system, dwell time seems to act as a final arbiter of rank (like a quality assurance checker), vetoing rankings if users return to the search results page too frequently.
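To make that “final arbiter” idea concrete, here’s a toy model. This is entirely my illustration of the concept, not Google’s actual algorithm; the threshold and scores are invented:

```python
def toy_rerank(results):
    """Toy sketch of engagement acting as a veto on traditional relevance."""
    for r in results:
        if r["long_click_rate"] < 0.3:               # made-up veto threshold
            r["final_score"] = r["relevance"] * 0.5  # demoted despite strong links/keywords
        else:
            r["final_score"] = r["relevance"]
    return sorted(results, key=lambda r: r["final_score"], reverse=True)

serp = [
    {"url": "/donkey",  "relevance": 0.9, "long_click_rate": 0.1},  # well-linked, poor engagement
    {"url": "/unicorn", "relevance": 0.7, "long_click_rate": 0.8},  # fewer links, great engagement
]
print([r["url"] for r in toy_rerank(serp)])  # ['/unicorn', '/donkey']
```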

What should you do if you have a lot of donkey content?

Prioritize the pages that are most at risk: those with below-average or merely average time on page. Put these at the top of your list for rewriting or fixing up so they align better with user intent.
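Continuing the hypothetical pandas sketch from earlier, that priority list practically writes itself; the +10 percent cutoff below is an arbitrary choice of mine, not a known Google threshold:

```python
# `after` is the recent-period DataFrame from organic_time_vs_avg() above
at_risk = after[after["vs_site_avg_pct"] < 10]     # below or barely above the site average
priority = at_risk.sort_values("vs_site_avg_pct")  # worst pages first = rewrite first
print(priority[["page", "pageviews", "vs_site_avg_pct"]].head(20))
```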

Now go look at your own data and see if you agree that time on site plays a role in your organic search rankings.

Don’t just take my word for it. Go look! Run your own reports and let me know what you find.

Originally published on The WordStream Blog.

About The Author

Larry Kim is the CEO of Mobile Monkey and founder of WordStream. You can connect with him on Twitter, Facebook, LinkedIn and Instagram.