What is Cloaking in SEO & Should You Do It in 2018? (Whitehat Edition)

The moment we hear the word “cloaking”, it triggers a negative perception in our minds. Like in the old times, it conjures the image of an assassin in a white cloak trying to con someone 😄.

Don’t try to hide

And if you are a digital marketer, it is easily perceived as one of the well-known blackhat strategies in the SEO world. But what if I told you that we can still use certain aspects of SEO cloaking to our benefit? In this blog post I am not only going to explain SEO cloaking, but also share my outlook on the problems we face as publishers and how we can use certain aspects of SEO cloaking in 2018 to our benefit. So stay tuned!

What is SEO Cloaking?

To put it simply, cloaking is a technique in which the content presented to the search engine crawler is different from the content presented to the user’s browser, in order to index and rank better.

So how is this made possible?

Each visitor to a website arrives with an IP address determined by their location and internet service provider. Using reverse DNS lookups (available through cPanel with most hosting companies), we can identify which IP addresses belong to search engine crawlers and set up an .htaccess redirect for them. Typically, users are redirected to the desired destination page through a page that already has a good SERP ranking, with optimized content, headings, title, etc.
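To make the crawler-identification side concrete: Google documents that a genuine Googlebot IP can be verified with a reverse DNS lookup followed by a forward confirmation. Here is a minimal Python sketch of that check — the hostname suffixes follow Google’s published guidance, while the function name and error handling are illustrative:

```python
import socket

def is_googlebot(ip: str) -> bool:
    """Verify a crawler IP via reverse DNS plus a forward confirmation,
    the method Google recommends for identifying real Googlebot traffic."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS (PTR lookup)
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP,
        # otherwise anyone could fake a PTR record.
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:  # no PTR record, DNS failure, etc.
        return False
```

A cloaking setup would branch on a check like this to serve the crawler one page and everyone else another — which is exactly what makes it detectable, as we will see below.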

Why redirect users through a page with a good SERP ranking? Because a good ranking brings a high volume of traffic, and that traffic is then redirected to another desired page — one that is not ranked or indexed by search engine robots.

Page 1: indexed with a good SERP ranking; Page 2: not indexed

So the search engine sees page 1, while the user sees page 2.

The above process can also be reversed to redirect either the robot or the user, and it can be implemented with other kinds of filters, such as the user agent, scripts, etc. So how does Google detect this? Google’s algorithm now works in near real time, which means Google crawls your website many times. Across multiple crawls it can use different IP addresses and user agents, which makes the crawler very difficult for redirect filters to identify reliably.
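For completeness, here is what the user-agent flavor of such a filter looks like — a naive Python sketch (the token list is illustrative) — and it also shows why the approach is fragile: the moment Google crawls with an unlisted user agent, the filter misclassifies it and the cloaking is exposed.

```python
# Illustrative list of crawler tokens; real setups maintain longer lists.
CRAWLER_TOKENS = ("Googlebot", "Bingbot", "DuckDuckBot")

def looks_like_crawler(user_agent: str) -> bool:
    """User-agent sniffing: the filter classic cloaking setups rely on.
    Easy to fool in both directions, since user agents are just strings
    that any client (including Google's crawlers) can change at will."""
    return any(token in user_agent for token in CRAWLER_TOKENS)
```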

Using this technique is a violation of Google’s Webmaster Guidelines and can result in a permanent ban for your domain.

So what is the problem we face as a publisher and how is cloaking related to it?

The growth of internet services has increased daily content consumption. With rising demand, publishers are compelled to cater to it, which gives rise to personalization. Personalization gets results, but it is expensive: imagine serving personalized content to thousands or millions of readers every day. Delivering content through a management system with recommendations, reviews, support, etc. requires solid backing, so publishers usually put a subscription model in place. A monthly fee is charged for a premium subscription, in return for which the user receives content tailored to his likes and dislikes. So what’s the problem with this?

From an SEO perspective, the premium content meant for paying users ends up accessible through SERPs. Here are some examples:

[SERP result: Le Monde.fr premium article surfacing in search results]

Another good example of premium content leaking through SERPs is magazines:

[SERP results: National Geographic Magazine content surfacing in search results]

Possible solutions

Free First Read

Keep the first premium read free, and require a paid subscription for every read after that. This can be enforced using cookies or IP addresses. It not only lets the user read the article, but also lets the publisher rank for keywords based on the content of a premium article.
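The cookie-based variant can be sketched in a few lines of Python. This is a minimal illustration only — the cookie name, limit, and return shape are assumptions, and a real metered paywall also needs server-side tracking, since cookies are trivially cleared by the reader:

```python
from http.cookies import SimpleCookie

FREE_READS = 1  # assumed limit: the first premium read is free

def can_read_full_article(cookie_header: str) -> tuple[bool, str]:
    """Decide whether to serve the full premium article based on a
    read-count cookie, and return the updated Set-Cookie value."""
    cookie = SimpleCookie(cookie_header)
    reads = int(cookie["reads"].value) if "reads" in cookie else 0
    allowed = reads < FREE_READS
    if allowed:
        reads += 1
    return allowed, f"reads={reads}; Max-Age=2592000; Path=/"

# Usage: the first visit gets the article, the second hits the paywall.
ok_first, set_cookie = can_read_full_article("")            # no cookie yet
ok_second, _ = can_read_full_article(set_cookie.split(";")[0])
```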

Summary Read

Provide a summary of the article and keep the full version available only with a paid subscription. This is another great solution, because a well-crafted summary can hook the user into a premium read. It can feel unsatisfying to some readers, but it still helps you rank for keywords using the summary content.
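A crude way to generate such a teaser automatically is to cut the article after its first few sentences. The sketch below uses a naive regex sentence split — a real site would use a hand-written summary or a proper sentence tokenizer, and the function name and defaults here are assumptions:

```python
import re

def teaser(article: str, sentences: int = 2) -> str:
    """Return the first few sentences of an article as a free teaser,
    appending an ellipsis marker when content was cut off."""
    # Naive split: break after ., ! or ? followed by whitespace.
    parts = re.split(r"(?<=[.!?])\s+", article.strip())
    summary = " ".join(parts[:sentences])
    return summary + (" […]" if len(parts) > sentences else "")

text = "Cloaking is risky. Google bans it. Subscribe to read more."
print(teaser(text))  # first two sentences plus the ellipsis marker
```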

Landing Page

Keep the premium content unindexed and serve a subscription landing page instead. It’s an outdated approach, but worth considering if you regard your content as truly precious!
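In practice, “unindexed” means explicitly telling crawlers not to index the premium URLs — for example with a robots meta tag (the directive values follow Google’s robots meta documentation; placement shown is illustrative):

```html
<!-- In the <head> of each premium article page -->
<meta name="robots" content="noindex, nofollow">
```

The equivalent `X-Robots-Tag: noindex` HTTP response header works for non-HTML files such as PDFs.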

With these techniques, you are “cloaking” in an ethical manner: the content stays consistent and is available to both users and crawlers.

Thank you for reading! If you have any questions, please don’t hesitate to leave a comment below — I will reply as soon as I can.

Cheers 😉