SEO guide from a developer's perspective

Ljamov · Published in Codeart · 12 min read · Jun 15, 2021

My journey with SEO started before I became a web developer. In fact, the reason I decided to become a web developer is tied to my knowledge of and work experience with SEO. I know that in the world of web development, developers are not required or even encouraged to have SEO knowledge, but in the era of the modern web, a modern web developer is much more than a coder, and a website is much more than a URL: it is the foundation and most important part of your business. And SEO has a significant role in making your business stand out. SEO and web development are deeply connected: just as a solid foundation is key to building a house, a solid, SEO-friendly website design is key to building a successful site.

Before moving further, let’s have a brief introduction to the SEO process.

What is SEO?

*Please don't confuse SEO with CEO.

“SEO stands for Search Engine Optimization, which is the practice of increasing the quantity and quality of traffic to your website through organic search engine results.” — Moz

Why is SEO important?

  • SEO enables your site to earn a better ranking on the web
  • Organic results get more clicks than PPC, and with SEO you don't need to pay for ads
  • SEO work improves site speed, and with that, the user experience

Technical SEO

“Technical SEO is the process of ensuring that a website meets the technical requirements of modern search engines with the goal of improved organic rankings. Important elements of Technical SEO include crawling, indexing, rendering, and website architecture.” — Backlinko

Technical SEO signals to search engines like Google that your website is of high value. It's really important because it can prompt the search engines to rank you higher.

SEO ABC

There are plenty of things you can do to optimize your website for search engines, but I will focus on the things that every developer who builds websites needs to know.

HTML Elements

Title Tag

The title tag is an HTML element that gives a web page its title. This title appears in the browser title bar, as well as on search engine results pages (SERPs). It's crucial to add and optimize your website's title tags, as they play an essential role in organic ranking. There is no official length limit for this tag, but best practice is 50–60 characters.

TIP Don’t forget to add the most important keywords at the beginning of the title.
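A minimal sketch of a title tag in the <head> (the page name and brand are placeholders):

    <head>
      <!-- Most important keywords first; around 50-60 characters total -->
      <title>SEO Guide for Developers | Example Brand</title>
    </head>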

Meta Description Tag

The meta description tag is the second most important element for how your result appears in the SERP. It is an HTML tag that provides a brief summary of a web page. Best practice is around 160–180 characters, written to be sufficiently descriptive. Keep in mind that the "optimal" length will vary depending on the situation, and your primary goal should be to provide value and drive clicks.

TIP A well-written title and meta description will influence CTR (click-through rate)
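For illustration, a meta description in the <head> (the copy is a placeholder):

    <!-- Around 160-180 descriptive characters, written to earn the click -->
    <meta name="description" content="Learn the technical SEO basics every web developer should know: title tags, meta descriptions, robots directives, sitemaps, and performance.">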

Images

Images are a big part of how we experience a web page. That matters for SEO because Google’s algorithm pays attention to behavioral metrics that reflect user experiences, like bounce rates and the amount of time visitors spend on a web page.

After we set the image's src path, we need to provide a good description of the image so search engines know what it is about. That is done with the alt attribute: it specifies alternate text for an image in case the image cannot be displayed.

The alt attribute provides alternative information for an image if a user for some reason cannot view it (because of slow connection, an error in the src attribute, or if the user uses a screen reader).

TIP Put a full stop at the end of the alt sentence so that screen readers know where to stop reading.

Providing the image's width and height lets the browser reserve space for it, which improves your page speed and reduces layout shift.
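Putting this together, a sketch of an SEO-friendly image tag (the file name, dimensions, and alt text are placeholders):

    <!-- alt describes the image; width/height reserve space and reduce layout shift -->
    <img src="/images/red-running-shoes.jpg"
         alt="Pair of red running shoes on a wooden floor."
         width="800" height="600">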

Hyperlink

Text Link

A normal text link consists of just the anchor's destination (href) and the anchor text.
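For example:

    <!-- A plain text link: an href plus the anchor text -->
    <a href="https://example.com/">Anchor text</a>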

rel="nofollow"

The nofollow attribute has multiple uses, mostly for paid links or distrusted content (e.g. comment systems, user-generated content, embeds, or any link we don't want to be seen as endorsing).

But as SEO evolves, new attribute values have been introduced to replace the general-purpose nofollow with more specific ones:

rel="sponsored"

Links that are advertisements or paid placements (paid links) can be marked with the sponsored value.

rel="ugc"

The rel="ugc" attribute is recommended for user-generated content (UGC) links, such as comments and forum posts.

TIP The rel="nofollow" attribute was previously recommended for these types of links, and it's still an acceptable way to flag them. You can also use the nofollow value when the other values don't apply. For internal links, best practice is to use robots.txt instead.
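For illustration, the three values in use (all URLs are placeholders):

    <!-- Paid placement -->
    <a href="https://advertiser.example.com/" rel="sponsored">Sponsored link</a>

    <!-- Link inside a comment or forum post -->
    <a href="https://user-site.example.com/" rel="ugc">User-submitted link</a>

    <!-- General-purpose: don't pass equity, don't endorse -->
    <a href="https://unknown.example.com/" rel="nofollow">Untrusted link</a>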

There is some misunderstanding around a pair of often-confused technical terms that are in fact completely different. Let's talk about noreferrer and noopener.

rel="noreferrer"

The noreferrer value can be added to a link when, for some reason, you don't want the destination site to know that you are linking to it. No referrer information is passed, so in Google Analytics all traffic from this link will show up as Direct Traffic instead of Referral Traffic.

TIP Do not use the rel="noreferrer" attribute on internal links; it can cause confusion in your analytics reports.

Noreferrer also has no impact on affiliate links. The reason is that the majority of affiliate programs do not rely on referral traffic to award a conversion, but on the affiliate ID included in every link.

rel="noopener"

rel="noopener" is an attribute that can be added to external links. It prevents the opened page from gaining any kind of access to the original page (through window.opener), for security reasons.

How do these attributes affect SEO?

The difference between nofollow and noreferrer is that noreferrer stops referrer information from being passed to the destination site, but the link is still followed. With nofollow, the referrer information is passed, but search engines are told not to follow the link.

So, they are not the same thing. Use nofollow on links that you don’t trust and use noreferrer if you don’t want the other site to know that you have linked to them.

On the other hand, noopener has zero impact on your SEO so you can safely use it to enhance the security of your website.

TIP If you make an external link to another website and want it to be both secure and not show up as a referral, you can combine the attributes, as in the sketch below.
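A minimal sketch (the URL is a placeholder):

    <!-- Multiple rel values are space-separated -->
    <a href="https://example.org/" target="_blank" rel="nofollow noreferrer noopener">External link</a>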

Robots Standard

There are three ways to apply the robots standard: robots.txt, meta robots tags, and the X-Robots-Tag HTTP header.

Robots.txt

Location of file: https://yourdomain.com/robots.txt

Robots.txt always needs to live in your root directory. It supports a few directives: User-agent, Disallow, Allow, Crawl-delay, and Sitemap.

Technical syntax:

  • User-agent: The specific web crawler to which you’re giving crawl instructions (usually a search engine).
  • Disallow: The command used to tell a user-agent not to crawl a particular URL. Only one “Disallow:” line is allowed for each URL.
  • Allow (Only applicable for Googlebot): The command to tell Googlebot it can access a page or subfolder even though its parent page or subfolder may be disallowed.
  • Crawl-delay: How many seconds a crawler should wait before loading and crawling page content. Note that Googlebot does not acknowledge this command, but the crawl rate can be set in Google Search Console.
  • Sitemap: Used to call out the location of any XML sitemap(s) associated with this URL. Note this command is only supported by Google, Ask, Bing, and Yahoo.
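A sketch of a robots.txt using these directives (the paths and values are illustrative):

    # Applies to all crawlers
    User-agent: *
    Disallow: /admin/
    # Allow is honored by Googlebot
    Allow: /admin/public/
    # Ignored by Googlebot; set its crawl rate in Google Search Console instead
    Crawl-delay: 10

    Sitemap: https://yourdomain.com/sitemap.xml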

Pro tips:

Now, there is a big difference between “Disallow:” and “Disallow: /”.
For non-SEO practitioners, these directives can seem confusing.

  • Disallow: (without the forward slash) means that all search engine spiders and user agents can crawl the entire site, from the root down, without issue.
  • Disallow: / (with the forward slash) means that everything from the site root down is blocked from search engine crawling.

TIP Only Meta Robots and X-Robots-Tag can remove URLs from Search Results.

Meta Robots

The <meta> robots tag, commonly known as "meta robots" or colloquially as a "robots tag," is part of a web page's HTML code and appears within the page's <head> section.
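A minimal sketch (the directive values are illustrative):

    <head>
      <!-- Applies to all crawlers -->
      <meta name="robots" content="noindex, nofollow">
    </head>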

You can also be more specific: instead of the general "robots" name, use "googlebot" to target rules at Google's crawler only.
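For example:

    <!-- Applies only to Google's crawler -->
    <meta name="googlebot" content="noindex">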

Below is a list of User Agents and Parameters that you can use.

Important User Agents:

  • Googlebot (can be used as the default for most Google crawlers)
  • Googlebot-News
  • Googlebot-Image
  • AdsBot-Google
  • Mediapartners-Google (Mobile AdSense) or Mediapartners
  • Googlebot-Video
  • Bingbot
  • Yandexbot
  • Baiduspider
  • FacebookExternalHit
  • Applebot
  • Slurp
  • Twitterbot
  • Rogerbot
  • Dotbot
  • Wildcard for all robots: *

Not enough? Here is a bigger list where you can find more user-agents

Important Parameters:

  • Index: Tells a search engine to index a page. Note that you don't need to add this <meta> tag; it's the default.
  • Noindex: Tells a search engine not to index a page.
  • Follow: Even if the page isn’t indexed, the crawler should follow all the links on a page and pass equity to the linked pages.
  • Nofollow: Tells a crawler not to follow any links on a page or pass along any link equity.
  • Noimageindex: Tells a crawler not to index any images on a page.
  • None: Equivalent to using both the noindex and nofollow tags simultaneously.
  • Noarchive: Search engines should not show a cached link to this page on a SERP.
  • Nocache: Same as noarchive, but only used by Internet Explorer and Firefox.
  • Nosnippet: Tells a search engine not to show a snippet (i.e. meta description) of this page on a SERP.
  • Unavailable_after: Search engines should no longer index this page after a particular date.

If the robots <meta> tag is not defined, the default is "index, follow".
Don't block noindex URLs in robots.txt; they need to be crawled for the noindex to be respected.

X-Robots-Tag

While the meta robots tag allows you to control indexing behavior at the page level, the x-robots-tag can be included as part of the HTTP header to control indexing of a page as a whole, as well as very specific elements of a page.

To use the x-robots-tag, you'll need access to your website's header.php, .htaccess, or server configuration files.
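For instance, on an Apache server (with mod_headers enabled) the header could be sent from .htaccess; this is a sketch, and the PDF rule is purely illustrative:

    # Send noindex, nofollow with every response
    Header set X-Robots-Tag "noindex, nofollow"

    # Or target specific file types, e.g. PDFs
    <FilesMatch "\.pdf$">
      Header set X-Robots-Tag "noindex, noarchive"
    </FilesMatch>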

How to Create SEO-Friendly URLs

  • Choose shorter, human-readable URLs with descriptive keywords
  • Exclude dynamic parameters when possible (see “Canonicalization” and “Pagination”)
  • When possible, place content on the same subdomain to preserve authority
  • Leave out stop words. “Stop words” are common words that search engines often filter or ignore (such as “and,” “but,” “the,” etc.)
  • Avoid automated numeric labels
  • Connect the URL to the page title. Match the title of the page and the URL structure as closely as you can.

TIP Don't include special characters. Special characters and symbols can create problems in URLs and cause links to break, so don't use unsafe characters when writing an SEO-friendly URL. For reference, check the characters on this list by Perishable Press. For example, an ampersand may appear in a page title ("Terms & Conditions") but should be left out of the URL (/terms-and-conditions) to fit this best practice.

Recommended: https://example.com/blog

Less ideal: https://blog.example.com

Subdomain

Any time you use a subdomain, you are in essence telling Google that it is a second, separate web property. Compared to a subfolder, which is typically treated as an extension of a site, a subdomain is considered a separate web property and can be harder to manage for certain SEO tasks like link acquisition.

Canonicalization

Common duplicate homepage URLs include variants such as:

  • http://example.com
  • http://www.example.com
  • https://www.example.com
  • https://example.com/index.html

The preferred URL will be https://example.com/, so what we need to do is put a rel="canonical" tag in the <head>.

A canonical tag (aka “rel canonical”) is a way of telling search engines that a specific URL represents the master copy of a page. Using the canonical tag prevents problems caused by identical or “duplicate” content appearing on multiple URLs.
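A sketch of the tag, placed in the <head> of every duplicate variant:

    <link rel="canonical" href="https://example.com/">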

Sitemap

A sitemap is a file that contains information about the pages, videos, and other files on your website, and the relationships between them.

Search engines read this file to crawl the website more intelligently.

<loc> holds the page's URL (its location) and <lastmod> the date the page was last modified.
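A minimal XML sitemap sketch (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/blog</loc>
        <lastmod>2021-06-15</lastmod>
      </url>
    </urlset>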

A sitemap cannot contain more than 50,000 URLs. If you have a large website, you should use multiple sitemaps listed in a single sitemap index file.

TIP Don't forget to submit your sitemap to search engines, for example to Google via Google Search Console.

Mobile

Since July 2018, Google has prioritized mobile page load speed as a key metric when determining a website’s search result ranking. If your mobile website is slow, it comes with a hefty penalty to your overall ranking across all devices.

For Best Practices you can follow these few things:

  • Your mobile version should display the same content as your desktop site
  • Page title tags & meta descriptions should be the same
  • Use the meta name="viewport" tag in the head of your page to tell the browser how to adjust the content (see the sketch below)
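The standard viewport tag looks like this:

    <head>
      <!-- Match the device width and set the initial zoom level -->
      <meta name="viewport" content="width=device-width, initial-scale=1">
    </head>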

TIP Increase speed by optimizing the critical rendering path, using HTTPS & HTTP/2, eliminating render-blocking resources, removing unused CSS, and deferring off-screen images.

Tool: Mobile-Friendly Test

Performance

When it comes to the performance of our website, we need to take extra steps to make sure it is optimized. There are three things to keep in mind.

  1. Page Speed
  • Compress and minify your code
  • Reduce page redirects
  • Remove render-blocking JavaScript
  • Use Tree Shaking
  • Leverage browser caching
  • Use a CDN (Content Delivery Network)
  • Leverage preconnect, prefetch, and preload (see the sketch below)
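To illustrate the resource hints, a sketch for the <head> (the hosts and file names are placeholders):

    <!-- Open the connection to a third-party origin early -->
    <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>

    <!-- Fetch a resource likely needed for the next navigation, at low priority -->
    <link rel="prefetch" href="/js/next-page.js">

    <!-- Fetch a critical resource for the current page, at high priority -->
    <link rel="preload" href="/fonts/main.woff2" as="font" type="font/woff2" crossorigin>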

2. Test your page speed with tools such as Google PageSpeed Insights, Lighthouse, or WebPageTest

3. Image Optimization

  • Compress your images and experiment with quality settings ( Lossless / Lossy )
  • Remove unnecessary image metadata
  • Lazy Loading
  • Leverage srcset for different screen sizes (see the sketch after this list)
  • Ensure that your images have alt text
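A responsive image sketch combining srcset, lazy loading, and alt text (the file names and widths are placeholders):

    <img src="/images/hero-800.jpg"
         srcset="/images/hero-400.jpg 400w,
                 /images/hero-800.jpg 800w,
                 /images/hero-1600.jpg 1600w"
         sizes="(max-width: 600px) 100vw, 800px"
         alt="Hero image showing the product in use."
         loading="lazy"
         width="800" height="450">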

Modern JavaScript Sites

  • Keep JavaScript bundles small (especially for mobile devices). Small bundles improve speed, lower memory usage, and reduce CPU costs.
  • Use server-side or pre-rendering to improve site speed, user experience, and crawler accessibility.
  • Stuck with client-side rendering? Try pre-rendering to help Googlebot get a more immediate HTML snapshot of your page.
  • Use Chrome Dev Tools “Performance” tab to test your runtime performance and network “throttling” to simulate different device capabilities.

Tools you can use to optimize, compile, and minify code and images: Grunt, Gulp, and Webpack.


From an SEO perspective, it can be difficult to communicate the value and importance of search-related initiatives and to get them prioritized in development pipelines. From a developer’s point of view, SEO can seem like a never-ending source of tickets and annoyance that delays them from delivering their work on time.

As web technologies become more advanced, SEO is becoming more technically sophisticated, which means it is increasingly important that we as developers actively look for ways to work more harmoniously with SEO practices, all for the benefit of the website.
