Understanding SEO (Part II): SEO techniques that are useful while setting up a website (Technical SEO)

An article explaining what technical SEO is, which techniques it includes, and how to use them while setting up a new project

We are back with the second article of the SEO series. In the last article, we got acquainted with the term Search Engine Optimization: we understood the logic behind it, how it works, and why it is important for your website. In this article, we will take a step forward, and I will introduce you to some basic but highly effective SEO techniques that are useful while setting up a website. We will also look at the SEO best practices that will benefit your website traffic.

I have divided this article into 5 main sections that are important for SEO and that we can start working on as soon as we set up our website.

P.S. Even if you have already started a project, you should go through this article and fill in whatever is missing, at any point in time. I assure you it will definitely benefit your web ranking.

So let’s get started.

Website’s Canonical URL: www or no www

While setting up a website, the first thing you do is decide on the website's domain name. One important part of that decision is whether your domain name should have www in front of it or not.

I know what you are thinking: "Is that really the case? Does it even matter? Every website has www by default, doesn't it?"

You will surely get answers to these questions. Many of us don't even notice the difference between the two. Look at the following links to understand the difference:

http://sample.com | https://sample.com | http://www.sample.com | https://www.sample.com

Though these all point to the same website, Google can treat them as four different websites with duplicate content. This will lead to poor ranking and bad SEO.

To solve this problem, you need to decide which URL you are using: the one with www or the one without. Both of them work the same way, so the choice is completely yours, but sticking to that choice for the lifetime of your website is good practice for optimizing its search engine performance.

Now, once you have selected your URL, this preferred URL is called the canonical domain. When we select our canonical domain, the next step is to tell the search engines about it. This is done by adding a canonical tag to your source code. The canonical tag goes into the head tag of every page of your website, and it looks something like this:

<link rel="canonical" href="https://www.sample.com/"/>

The href attribute contains your canonical domain. Make sure the link points to exactly the URL you have decided on.
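If your pages are generated from a template or script, it helps to build this tag in one place so every page stays consistent. Below is a minimal Python sketch of that idea; the domain and the helper name are placeholders I made up for illustration, not part of any real library:

# Assumed canonical domain; replace with the one you chose.
CANONICAL_DOMAIN = "https://www.sample.com"

def canonical_tag(path: str) -> str:
    """Build the canonical <link> tag for a given page path."""
    return f'<link rel="canonical" href="{CANONICAL_DOMAIN}{path}"/>'

# Example: the tag that would go into the <head> of the /products page.
print(canonical_tag("/products"))
# <link rel="canonical" href="https://www.sample.com/products"/>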

It is highly NOT recommended to change the canonical domain once it is decided. But if, in some case, you do want to change it, make sure to use server-side 301 redirects and bring your whole traffic to one point.
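In practice, you would configure the 301 in your web server or hosting panel. Just to make the mechanics concrete, here is a minimal, hypothetical Python sketch of what such a server-side redirect does, using only the standard library:

from http.server import BaseHTTPRequestHandler, HTTPServer

# Assumed canonical domain that all traffic should end up on.
CANONICAL = "https://www.sample.com"

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 = "Moved Permanently": browsers and crawlers update to the new URL.
        self.send_response(301)
        self.send_header("Location", CANONICAL + self.path)
        self.end_headers()

if __name__ == "__main__":
    # Any request to this server is permanently redirected to the canonical domain.
    HTTPServer(("", 8080), RedirectHandler).serve_forever()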

Robots.txt: Which pages of the site are available for search engines?

The next part of the process is adding a file called robots.txt. But what function does this file perform? Let's look into that.

Search engine spiders try to access all the pages of your website that are publicly available. But, at times, you may want some pages to be available only to a certain group of people, maybe to your subscribers. In such cases, you need to tell the spiders or crawlers not to read those pages, and this task is done by robots.txt.

Robots.txt is a text file that you need to add to the root directory of your website. This file gives instructions to web crawlers about which pages they should include in their crawling and indexing. It includes a list of all the pages that you want to keep away from crawlers.

While creating a robots.txt for your website, you need to be very careful. A problem or misconfiguration in your robots.txt can cause critical SEO issues that negatively impact your rankings and traffic. Often we don't want to block any webpage from crawling at all, but it is still good practice to add a default robots.txt to your website.

Robots.txt can only list directives; it does not actually hide any content from crawlers. Also, the file is case-sensitive, so be very careful when writing the paths in your directives. For example, disallowing Sample.html will not block the sample.html file from crawling.

Below is an example of a robots.txt file:

User-agent: *
Allow: /products
Allow: /careers
Allow: /services
Allow: /research
Disallow: /test
Disallow: /following
Disallow: /subscribers
Sitemap: https://sample.com/blog-sitemap.xml

Let's understand what all of this means. Here, User-agent specifies which crawlers should be considered; "User-agent: *" says that we are addressing all search engine crawlers. Allow specifies which paths the crawlers are allowed to access, and Disallow shows which paths the crawlers should skip. The Sitemap directive is supported by all the major search engines and is used to specify the location of your XML sitemap. You can generate your own sitemap, or some websites will do it for you for free.
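As a quick illustration of generating one yourself, here is a small Python sketch that builds a minimal XML sitemap; the page list and domain are made up for the example:

from datetime import date

# Hypothetical pages you want search engines to know about.
PAGES = ["/", "/products", "/services", "/research"]

def build_sitemap(domain: str) -> str:
    """Build a minimal XML sitemap for the given pages."""
    entries = "\n".join(
        f"  <url><loc>{domain}{page}</loc><lastmod>{date.today()}</lastmod></url>"
        for page in PAGES
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

print(build_sitemap("https://sample.com"))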

We have understood what robots.txt is. But now the question is, how is a robots.txt created?

To create a robots.txt, just use any text editor like Notepad and create a file with the name robots and the extension .txt. You can add a default robots.txt as follows:

User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml

Add this file to the root directory (the www or public_html folder) of your website, and you are done with robots.txt.
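If you want to double-check how crawlers will read your rules, Python's standard library ships a robots.txt parser. A minimal sketch, assuming the example file from above were live at https://sample.com/robots.txt:

from urllib.robotparser import RobotFileParser

# Assumes the example robots.txt above is reachable at this URL.
parser = RobotFileParser("https://sample.com/robots.txt")
parser.read()  # fetch and parse the file

# can_fetch(user_agent, url) answers: may this crawler read this page?
print(parser.can_fetch("*", "https://sample.com/products"))     # True
print(parser.can_fetch("*", "https://sample.com/subscribers"))  # False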

SEO-friendly URLs

A URL, i.e. Uniform Resource Locator, of a web page is the link that we use to go to that page. For every web page, this URL is a combination of two parts: the first part is the domain name of the website, and the second part is the URL of that page. For example, in the following URL

http://www.sample.com/this-page-is-a-sample-page.html 

the first part is the website domain, i.e. http://www.sample.com, and the second part is the page URL, /this-page-is-a-sample-page.html.

When web crawlers visit different pages, they use this URL as a reference to understand the content within the page. Hence, when we try to make a URL SEO-friendly, we should include the relevant keywords in it. Using keywords in the URL helps you in several ways. The first benefit is that your users will be able to understand the context of your webpage. And when you add a link to this page in anchor tags, these keywords will help in indexing your webpage.

When you create a URL, make sure that it is neither too long nor too short. Search engines prefer shorter URLs, so use the minimum words required. Also try to place the keywords at the beginning of your URL, so that web crawlers can pick them up easily and effectively.

While using keywords, many people tend to use the same keyword more than once to get ranked by search engines. But trust me, this is a terrible practice. Search engines will consider it keyword stuffing, and you will not get ranked!

While creating any URL, use hyphens instead of underscores or white spaces. When you use underscores, Google cannot separate the keywords; for example, Google will read the link this_is_a_url as thisisaurl. White spaces get replaced by %20, so Google will see the URL "this is a url" as this%20is%20a%20url. Neither of these is user friendly or search engine friendly. To avoid this, use hyphens in URLs rather than anything else.
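To tie these rules together, here is a small Python sketch (a hypothetical helper I wrote for illustration, not a standard function) that turns a page title into a lowercase, hyphen-separated slug:

import re

def slugify(title: str) -> str:
    """Turn a page title into a lowercase, hyphen-separated URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # spaces, underscores, punctuation -> hyphen
    return slug.strip("-")                   # drop leading/trailing hyphens

print(slugify("This Page Is a Sample Page!"))
# this-page-is-a-sample-page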

Following these practices, you can easily create URLs that will benefit your search rankings.

User Experience, Page Speed, and Mobile First Approach

User experience, page speed, and the mobile-first approach: though these are 3 different aspects of your website, they are all intertwined. Following their best practices will make a significant difference to your SEO.

To understand these three, we will start with user experience (UX). How long a user stays on your website is decided by its UX. With a bad user experience, the bounce rate of your website becomes high, and no search engine will ignore that. To avoid a high bounce rate, your website should feel engaging to users. How your website handles errors, how it serves users, how easy it is to understand: these are all the basics of good web design. Think, for example, of websites with creative 404 error pages.

Many such small factors that create a good user experience will help your website get indexed better.

Page speed is one of the known ranking factors in Google's ranking algorithm. If your website loads fast, taking less than 3–5 seconds, you will get the advantage of that in your ranking.

But how is page speed improved?

This question has a vast scope, but to keep it short, let's look at the factors affecting page speed. A good website structure, less inline styling, JavaScript kept in separate files, optimized images, and minified stylesheets and HTML markup: these are some simple factors that can improve your page speed dramatically.
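Minification itself is usually handled by build tools, but the idea is simple enough to sketch. Below is a deliberately naive Python example that strips comments and extra whitespace from CSS; a real minifier does much more:

import re

def minify_css(css: str) -> str:
    """Naively minify CSS: strip comments and collapse whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                        # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)          # trim space around punctuation
    return css.strip()

print(minify_css("body {  color : red ; /* brand color */ }"))
# body{color:red;}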

Improving page speed is mostly a developer's job. If the structure of the website is well maintained, it will take less time to load, which improves the speed and, in turn, the rankings.

We all know that more than 60% of web browsing is performed on mobile devices. This means that your website has to be mobile friendly and offer a great experience to users on mobile. For a better SEO ranking, test your website on different mobile devices and ensure that it loads fast and is easy to use.

SSL-Secured Websites

Web security is a term that is trending heavily these days, and an SSL-secured website is closely related to it. SSL stands for Secure Sockets Layer. It is a security protocol that enforces encrypted communication between the web browser and the web server.

Well! If this sounds too complicated, just think of SSL as a security layer for your website.

This means that any information transmitted between the user's browser and your web server, such as usernames and passwords, credit card details, and any other data submitted by users, is encrypted and secure.

To make your website secure with SSL, you need to purchase an SSL certificate and integrate it with your site. Websites that have an SSL certificate installed and configured can be accessed via https://www.sample.com instead of the traditional non-secure http://www.sample.com.
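Once the certificate is in place, you can verify it quickly from Python's standard library. A minimal sketch, assuming the hypothetical www.sample.com actually serves HTTPS:

import socket
import ssl

hostname = "www.sample.com"  # hypothetical host; use your own domain

# A default context verifies the certificate chain and hostname for us.
context = ssl.create_default_context()
with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        # If the handshake succeeded, the certificate is valid; print its expiry.
        print("Certificate valid until:", cert["notAfter"])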

In 2014, Google started an initiative to make the web more secure and introduced the slogan “https everywhere”. They began by serving all Google searches over https and also announced that https websites would gain a very small ranking boost.

It is something everyone has to do sooner or later, but the sooner you do it, the easier your migration from HTTP to https will be. Hence, adding an SSL certificate to your website always comes highly recommended.

That’s it for this article!

Conclusion

The techniques we studied in this article belong more to technical SEO than to on-page SEO. But they are very important when you are starting off a new project or when you are starting SEO for an existing one. They will create a base for your on-page SEO work.

In the next article, we will learn on-page SEO techniques: the best practices you should follow while writing the markup of your website to boost your search engine ranking, some do's and don'ts of on-page SEO, and much more.

Till then, stay tuned!