
Why you should tell Google not to index a lot of your page URLs

If you have a lot of URLs on your site, you should think carefully about which pages Google should index

You want Google to index all the pages with relevant content you have, not every last URL.

Making sure Google finds all URLs for indexing
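
A dependable way to make sure Googlebot discovers every URL you care about is to submit an XML sitemap in Google Search Console. As a minimal sketch following the sitemaps.org protocol (the URL and date below are placeholders, not from our site):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/relevant-landing-page</loc>
    <lastmod>2020-06-01</lastmod>
  </url>
</urlset>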

How Googlebot works — it indexes every page it finds links to, unless the page specifies ‘noindex’ or the links pointing to it are marked ‘nofollow’
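
To illustrate the link side of this (a generic example, not markup from our site), an individual link can be tagged so crawlers don't pass through it:

<a href="https://www.example.com/internal-search?q=shoes" rel="nofollow">internal search results</a>

With rel="nofollow", Googlebot generally won't follow the link, though the target page can still be discovered through other links or a sitemap.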

Search volume and impressions went up, but our ‘site authority’ suffered

Changing our mindset — adding noindex to all pages that aren’t relevant as landing pages

<!-- this is the default; you don't need to add it to any page -->
<meta name="robots" content="index,follow">
<!-- this prevents the page from being indexed, but its links will still be followed (recommended) -->
<meta name="robots" content="noindex">
<!-- use this if you're sure the page is irrelevant, including its links -->
<meta name="robots" content="noindex,nofollow">
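One more note: for files where you can't add a meta tag at all (PDFs, for example), Google honors the same directives sent as an HTTP response header. A sketch of such a response (headers abbreviated):

HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex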

The criterion for having Google index a page should be as follows: for an outside user, is this page a relevant landing page?
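
Applied to a hypothetical example: a user profile with substantial content passes that test, while page 7 of a filtered list does not, so the latter's head should carry the noindex directive:

<head>
  <title>Posts filtered by tag, page 7</title>
  <meta name="robots" content="noindex">
</head>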

Our coverage curve — from now on we only want relevant pages indexed, and over time we want Google to deindex the less relevant URLs

Google’s crawl budget


Site or domain authority

To sum up: having Google index all your URLs is great, but having it index only your relevant content is greater.
