A Simple Guide to Robots.txt

John Allen
Aug 9 · 3 min read

If you are a blogger, a web developer, or simply a website owner, the primary goal of your website is to rank as high as possible when a customer types a relevant query into a search engine. To achieve that, you want to make your website SEO (Search Engine Optimization) friendly.

But wait, there’s more to it than SEO!

There are times when you want to tell the search engine what to show and what it shouldn’t. For example, you may have an important file hosted on your web server, or some other page, that you don’t want search engines to show in their results. This is where robots.txt comes into play!

With robots.txt configured on your site, you can communicate directly with search engine crawlers about which pages to ignore and which to show. In other words, all your files remain on your web server, but only the ones not barred by robots.txt can appear in search engine results. Keep in mind that well-behaved crawlers honor these rules voluntarily; robots.txt is not an access-control mechanism.
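
As a minimal sketch of the idea (the folder name here is just a placeholder), a robots.txt that hides one folder while leaving the rest of the site crawlable looks like this:

    User-agent: *
    Disallow: /private-files/

Every crawler that respects the file (the * matches all of them) will skip anything under /private-files/ and index the rest of the site as usual.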

How to create robots.txt?

So here comes the real question: how do you create the robots.txt file? Well, since the file has a .txt extension, you can easily guess that it’s just a simple text file. Yeah, that’s right!

So here’s how to create the robots.txt file:

1- Go to your cPanel

Log in to the cPanel account where all your website’s resources are served, then go to your root directory (typically public_html). Click the “File” option at the top left of the screen.

2- Create a new file named robots.txt

After clicking the “File” option, a popup will appear on your screen asking for a filename; type robots.txt in the field. The name must be all lowercase, and the file must sit in your site’s root so crawlers can find it at yoursite.com/robots.txt.

And congrats! You’ve created your robots.txt file.

3- Edit and configure robots.txt

Once you’ve created the robots.txt file, right-click it and choose the “Edit” option, which opens the file editor in another tab where we’ll edit, configure, and save the file.

So let’s go…

The example below has multiple lines; don’t worry if you don’t understand them yet. I am here to make you understand.
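
Here is what a typical three-line robots.txt looks like (example.com stands in for your own domain, and the folder is a placeholder):

    Sitemap: https://www.example.com/sitemap.xml
    User-agent: *
    Disallow: /private-files/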

Line 1: If you want to reference your sitemap in the robots.txt file, simply write “Sitemap:” followed by the URL where your sitemap is served.

Line 2: “User-agent:” targets a specific bot, crawler, or other user agent, and the rules below it apply to that agent. The asterisk (*) matches every user agent; you can also name a single crawler, and lists of common crawler names are easy to find online.
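
For instance, here is a sketch of a rule aimed at one crawler only (Googlebot is Google’s crawler; the path is a placeholder):

    User-agent: Googlebot
    Disallow: /drafts/

Rules in this block apply to Googlebot alone; every other crawler ignores them.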

Line 3: Below the user-agent line, it’s up to you to allow or disallow any path. All you have to do is write “Allow:” or “Disallow:” followed by the path, and the crawler will surface or skip those links in the search results accordingly.
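
Putting it all together, a complete robots.txt might look like this (all the paths here are placeholders for your own):

    Sitemap: https://www.example.com/sitemap.xml

    User-agent: *
    Disallow: /admin/
    Disallow: /private-files/
    Allow: /

    User-agent: Googlebot
    Disallow: /drafts/

Once saved, the file is publicly visible at yoursite.com/robots.txt, so you can open that URL in any browser to verify your changes.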

Thanks for reading!

Don’t worry if you don’t want to deal with this fuss yourself! Here at Logo Sight, we offer SEO as well as Video Animation, Web Development, and Digital Marketing services. So don’t forget to hire a top digital company!

This post was originally published at Logosight.com.

Written by John Allen
Director at Logo Sight
