Let’s bust the myth: SEO is a non-technical practice

Bhavya Saggi
Engineering @ Housing/Proptiger/Makaan
7 min read · Oct 25, 2018

Search Engine Optimisation is often pigeonholed as a task for the Marketing Division & has historically been branded as a non-technical activity. However, this sentiment is far from the truth.

Of course, SEO requires some off-page activities like Link Building, Social Media Marketing, Social Bookmarking, etc., which boil down to spreading your word (aka your links) & gaining popularity, nuanced as “if people are talking about you, you must be important.”
But these ‘off-page’ activities can only take you so far. To achieve more, further actions are necessary that are technical at heart, like performing Web Performance Optimisations, enhancing accessibility, and reducing site errors.

With the emergence of intelligent crawlers & AI-driven search engines, let’s go through how SEO evolved, and how websites need to evolve to match the needs of modern search engines.

I am a tech guy. Do I need to know about SEO?

The need for Search Engine Optimisation arose when people started using popular search engines (like Google, Yahoo, Bing, etc.) to find content rather than visiting websites directly.
The results on a Search Engine Results Page (SERP) are populated by the search engine’s crawlers/spiders, which keep navigating the world wide web and churning through each page they visit.

To make the crawlers’/spiders’ job easier, websites were asked to provide some metadata about themselves, which would be consumed by crawlers to present the results appropriately: titles, descriptions, canonical links, keywords, headings, alt text, etc. Very little technical know-how was needed for this: just put up appropriate content, provide relevant keywords, & do not try to cheat (via keyword stuffing, false back-links, and other malicious activities).
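
Most of this early-era metadata lives in the page’s `<head>` and image tags. A minimal sketch is below; the domain, copy and file paths are made-up placeholders, not markup from any of our sites:

```html
<head>
  <title>2 BHK Flats for Sale in Pune | example.com</title>
  <!-- Shown on the SERP below the title; keep it short and relevant -->
  <meta name="description" content="Browse verified 2 BHK flats for sale in Pune with photos, prices and locality details.">
  <!-- Largely ignored by modern engines, but harmless when kept honest -->
  <meta name="keywords" content="2 bhk flats pune, flats for sale pune">
</head>

<!-- Alt text describes the image for crawlers and for screen readers -->
<img src="/images/listing-101.jpg" alt="Living room of a 2 BHK flat in Baner, Pune">
```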

But the crawler requirements do not end there, as search engines keep updating their algorithms. Now, website owners are asked to make sure that there is no duplicate content, to check for broken links, and to work on speed optimisation, accessibility, and the other activities that make up Modern Search Engine Optimisation.

Modern Search Engine Optimisation

People familiar with SEO must know of Google’s Search Console (previously known as Google Webmasters), which reports the errors and invalidations that Google’s crawler encounters when it visits your website. The fixes for most of the issues reported there are not always trivial or purely textual, & somehow they never seem to end, e.g. Mobile Usability errors, 4XX/5XX crawl errors, etc.

Standard view of Google `Search Console`, showcasing “Structured Data”, “Mobile Usability” & “Crawl Errors” sections.

It is necessary to fix the issues reported in the Search Console as soon as possible, and to mitigate future issues as well, since they directly impact the ranking and performance of your webpages in Google’s search results.

Ergo, in the modern scenario SEO is a multifaceted activity, requiring both technical folks and content/marketing folks to work in harmony to support the website.
To give an example, let me discuss what needs to be done to get noticed by the “Search Engine Senpai” by walking through what we do at my workplace across Makaan.com, Proptiger.com & Housing.com.

  1. Maintain URL structure & canonicalise URLs
    A canonical tag (`<link rel="canonical" href="***"/>`) is a way of telling search engines that the present page should be identified by the mentioned URL. Using the canonical tag prevents problems caused by identical or “duplicate” content appearing on multiple URLs (see the first sketch after this list).
  2. Validate Redirections
    It is often the case that we need to redirect users when they navigate to an expired page; since there are multiple ways to do so (301, 302, 307…), it is necessary to know how these redirects affect a page’s ranking on the search engine.
    One of the sections that is often skipped or misunderstood is the ‘Soft 404’. A Soft 404 error is reported when a page should return a 404 (Not Found) response code but does not, e.g. the content on the webpage says 404-Not-Found but the HTTP response code is 200, or the webpage redirects to the home page.
    It causes major confusion for search engines & must be mitigated.
  3. Link pages with multiple Translations & Paginations
    Often a page has child pages (e.g. when showcasing multiple paginated results) or has multiple translations; it is necessary to link these pages together so that crawlers understand the relationship (see the hreflang/pagination sketch after this list).
  4. Provide valid Structured Data Markup
    By marking up content on the webpage, we gain search result enhancements and content-specific features, such as rich search results (cards & carousel previews), breadcrumbs, & featured snippets (see the JSON-LD sketch after this list).
    Using the Structured Data Testing Tool, the validity of the markup can be verified; & since it has a direct impact on how the webpage looks in Google’s search results, making it as appealing as possible to gain user clicks is a priority.
  5. Generate & Submit Sitemap
    A Sitemap is an XML file that lists the URLs of a site & may include additional information about each URL (when it was last updated, how often it changes, and how important it is in relation to other URLs on the site); it is submitted to search engines so they can crawl the site more intelligently (see the sitemap sketch after this list).
    The most common SEO issues with sitemaps are “Malformed XML” (broken/invalid XML markup), “Dirty sitemaps” (containing links that shouldn’t be crawled) & “Broken Links” (links that no longer exist). Hence, updating the sitemaps is a crucial & regular activity.
  6. Update Robots.txt
    robots.txt is a URL-exclusion mechanism through which we instruct crawlers which paths we do not want crawled (see the robots.txt sketch after this list).
    The ‘robots.txt’ file should be present at the root of the site and can have multiple subsections based on the ‘User-Agent’.
  7. Resolve Crawl Errors
    The Site Errors section shows errors for your website as a whole, whereas the Crawl Errors section lists the most recent 1000 URLs in each subcategory (Server Error, Soft 404, Not Found, etc.) where Google’s crawler encountered an error. Site errors are generally high-level errors that affect the website in its entirety, so they must not be skipped.
    Using the Fetch as Google tool, it is possible to check whether Google can currently crawl your site. Fixing these errors is a top priority: if Google cannot fetch a webpage correctly, that webpage won’t appear in Google’s search results.
  8. Provide User Accessibility
    Another section of the Search Console is “Mobile Usability”, which describes whether the webpage follows a certain set of standards, like whether the content of the webpage overflows the width of a mobile device, whether the clickable elements are too close together, and many more which you can read about here.
    And ever since Google rolled out its ‘Mobile-First Indexing’, it is very important to keep webpages mobile-friendly; a correctly configured viewport is the usual starting point (see the viewport sketch after this list). The Mobile-Friendly Test is an excellent tool provided by Google to verify whether a webpage passes all the necessary criteria for a mobile device.
  9. Speed Optimisation
    For a while now, Google has indicated that site speed is one of the signals used by its algorithm to rank pages. Site speed can be verified through Google TestMySite, which provides a comprehensive report by running a webpage through multiple categories like the AMP test, WebPageTest, & PageSpeed Insights.
    The test scores the webpage on multiple criteria like first response time, page size, compression, caching, etc., & also indicates how the webpage performs against its competitors in the industry (a few markup-level quick wins are sketched after this list).
  10. [NEW] Build & Update an AMP version of each page.
    The Accelerated Mobile Pages (AMP) Project is a Google-run open-source website publishing technology designed to make it easy for publishers to create mobile-friendly content and have it load instantly.
    AMP bakes common optimisations & accessibility practices into the page as inherent features, making pages faster & more comprehensive. These features come with restrictions (e.g. you can’t run custom JavaScript, all CSS has to be inlined, …) which require quick & skilful solutions to work around.
    Each webpage may have an “AMP” version of it, which is typically surfaced via Google’s Search Results; the regular and AMP pages point to each other so crawlers can pair them (see the AMP sketch after this list). Also, Google maintains a cache of the “AMP” version of the webpage, which makes the page seem to open almost instantaneously.
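
The sketches below illustrate a few of the items above. They are deliberately minimal and use a made-up example.com domain with placeholder paths rather than our production markup.

For canonicalisation (item 1), every crawlable variant of a page declares the one URL we want indexed:

```html
<!-- Served on /flats-in-pune?sort=price, /flats-in-pune/ and other variants of the same listing page -->
<link rel="canonical" href="https://example.com/flats-in-pune" />
```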
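
For translations and paginated results (item 3), hreflang alternates and rel="prev"/rel="next" links (the convention Google supported at the time of writing) tell crawlers that the pages belong together:

```html
<!-- Language variants of the same page -->
<link rel="alternate" hreflang="en-in" href="https://example.com/flats-in-pune" />
<link rel="alternate" hreflang="hi-in" href="https://example.com/hi/flats-in-pune" />

<!-- On page 2 of a paginated listing -->
<link rel="prev" href="https://example.com/flats-in-pune?page=1" />
<link rel="next" href="https://example.com/flats-in-pune?page=3" />
```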
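
For structured data (item 4), breadcrumbs are one of the simpler enhancements; this JSON-LD block is a hand-written illustration of the schema.org BreadcrumbList type, not our actual markup:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Pune",
      "item": "https://example.com/pune" },
    { "@type": "ListItem", "position": 2, "name": "Flats in Pune",
      "item": "https://example.com/flats-in-pune" }
  ]
}
</script>
```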
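
A sitemap (item 5) is plain XML in the sitemaps.org format; lastmod, changefreq and priority are the optional hints mentioned above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/flats-in-pune</loc>
    <lastmod>2018-10-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- ...one <url> entry per indexable page... -->
</urlset>
```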
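
robots.txt (item 6) is a plain-text file at the site root, grouped by User-Agent; the paths here are illustrative, not our real exclusion rules:

```
User-agent: *
Disallow: /search/
Disallow: /api/

User-agent: Googlebot-Image
Disallow: /images/private/

Sitemap: https://example.com/sitemap.xml
```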
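
For mobile usability (item 8), the first thing to get right is a responsive viewport; without it, mobile browsers fall back to rendering a desktop-width layout and Search Console reports errors such as “Content wider than screen”:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```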
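
For speed (item 9), compression and caching are configured on the server, but a few markup-level habits also show up in the audits; a rough sketch:

```html
<!-- Warm up the connection to a third-party origin that is used early in the page -->
<link rel="preconnect" href="https://cdn.example.com">
<!-- Do not block HTML parsing on non-critical scripts -->
<script src="/js/analytics.js" defer></script>
<!-- Explicit dimensions avoid layout jumps while the image loads -->
<img src="/images/hero.jpg" width="640" height="360" alt="Project exterior">
```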
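
For AMP (item 10), the regular page and its AMP counterpart reference each other, and the AMP page loads the AMP runtime instead of custom JavaScript. This sketch shows only the cross-linking; the full required AMP boilerplate is omitted:

```html
<!-- On the regular (canonical) page -->
<link rel="amphtml" href="https://example.com/amp/flats-in-pune" />

<!-- On the AMP page -->
<link rel="canonical" href="https://example.com/flats-in-pune" />
<script async src="https://cdn.ampproject.org/v0.js"></script>
```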

The culmination of the above activities was visible in the Google Analytics data for Makaan.com, as shown below:

From the above snapshot, it is evident that through the continuous and rigorous efforts put in by our team, we grew our organic traffic (clicks from search-engine results).
This shows that the above activities did provide a significant boost to traffic, which in turn helped the product grow & reach a wider audience.

Conclusion

Image Credit: Anuvaa

After more than 2 years of being the technical branch of the SEO team, we detached ourselves and rebranded as a high-functioning, technologically progressive, intuitive & innovative ‘Platform Growth’ team.

Our responsibilities extend to catering to all the technological needs, so that the “SEO guys” can keep tracking our competitors, provide relevant page meta / keywords & perform their branding magic.
