Search Engine Optimization for hyper-local service pages

Karan Ahuja
Urban Company – Engineering
5 min read · Jul 31, 2017

What is SEO?

SEO is everything you do to get your website ranked high in search results without paying for it: optimizing a web page with the end goal of ranking high (ideally on page one) on a search engine like Google.

Google’s number one goal is to show its users the most relevant content. A company’s goal is usually to rank at the top of search results. If the company helps meet Google’s goal, Google will help meet the company’s goal! Simple enough?

Yes, SEO could be simple to crack if one really understands what Google needs.

“If done right, SEO can scale companies and kill competition.” - Abhiraj Singh Bhal, co-founder, UrbanClap

Urbanclap and SEO

UrbanClap is a services marketplace aimed at the Indian market, whose purpose is to connect service professionals with customers.

More about UrbanClap here.

SEO is a major source of organic traffic at UrbanClap, so it is important to optimize SEO pages for Google’s crawler.

UrbanClap’s content is unique in a lot of ways: its inventory consists of people themselves, ranging from an engineer who freelances as a photographer, to plumbers, to fitness trainers.

UrbanClap has over 50,000 active professionals/partners on the platform, split into smaller buckets/categories.

UrbanClap’s SEO pages are spread across partner buckets and hyper-localities.

Typically, a bucket’s pages are divided into:

  • Hyper-local pages
  • City pages

UrbanClap strives to achieve the following for its SEO pages:

  • Speed
  • Mobile friendly
  • High quality
  • Rich content

SEO hyper-local/city page architecture

UrbanClap SEO pages are divided into:

  • Product (here, professionals/partners) listing
  • Meta data
  • Historic data
  • Quick links

Partner Listing

Product listing: a page’s partner listing is generated by algorithms that keep evolving. Every page is tagged with its hyper-local details, and using this information the algorithm gathers all relevant partners in and around a location. The ever-increasing data set poses a few challenges, so the algorithm keeps going through new enhancements.
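Purely as an illustration (the `partners` collection, its fields and the radius are assumptions, not UrbanClap’s actual schema or algorithm), a geospatial lookup like the one below can gather partners tagged around a page’s locality:

```js
// Hypothetical sketch: fetch active partners of a category around a locality.
// Assumes a connected Mongo `db` handle and a 2dsphere index on `location`.
async function partnersNear(db, category, lng, lat, radiusMetres = 5000) {
  return db.collection('partners').find({
    category,                                   // e.g. 'salon-at-home'
    active: true,
    location: {
      $near: {
        $geometry: { type: 'Point', coordinates: [lng, lat] },
        $maxDistance: radiusMetres              // partners in and around the location
      }
    }
  }).limit(20).toArray();
}
```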

Meta-Data

Meta-data: a page’s meta data is static or dynamic information related to a location and bucket. It is essential to meet Google’s standards when creating this data. Meta data not only provides relevant information to customers, but also holds ranking keywords for Google’s crawler. The meta data gets richer through weekly analysis of Webmaster Tools reports.
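A minimal sketch of what such per-bucket, per-locality meta data might look like; the template and field names are illustrative assumptions, not UrbanClap’s actual copy:

```js
// Illustrative only: build title/description tags from the bucket and locality
// so every hyper-local page carries its own ranking keywords.
function buildMeta({ category, locality, city, partnerCount }) {
  const title = `${category} in ${locality}, ${city} | UrbanClap`;
  const description =
    `Hire from ${partnerCount}+ verified ${category.toLowerCase()} professionals ` +
    `in ${locality}, ${city}. Compare reviews, check prices and book online.`;
  return { title, description };
}
```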

Historic Data

Historic data: past interactions of customers with the platform in a specific location-bucket are collected, curated and presented on the SEO pages. Any such data is unique to the platform and is a good source of keywords too.
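For illustration only (the collection and field names are assumptions), curating such historic interactions could be as simple as an aggregation over past bookings:

```js
// Hypothetical sketch: surface recent customer reviews for a locality-category
// bucket so pages carry unique, platform-specific content.
async function historicSnippets(db, category, locality) {
  return db.collection('bookings').aggregate([
    { $match: { category, locality, review: { $exists: true } } },
    { $sort: { createdAt: -1 } },   // most recent interactions first
    { $limit: 10 },
    { $project: { _id: 0, review: 1, rating: 1, createdAt: 1 } }
  ]).toArray();
}
```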

Quick Links

Quick links: for a crawler to efficiently crawl and rank pages, the website should have a spider-web-like structure. Such a structure is possible through smart and intensive interlinking. Linking to nearby hyper-local pages, across the same or different category buckets, helps with user navigation as well as page interlinking and indexing.
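A sketch of how such quick-links might be generated; the URL scheme and the `nearbyLocalities` / `relatedCategories` inputs are hypothetical:

```js
// Illustrative only: build interlinks to nearby hyper-local pages in the same
// bucket and to related buckets in the same locality.
const slug = (s) => s.toLowerCase().replace(/\s+/g, '-');

function quickLinks({ city, locality, category, nearbyLocalities, relatedCategories }) {
  const sameBucketNearby = nearbyLocalities.map((loc) => ({
    text: `${category} in ${loc}`,
    href: `/${slug(city)}/${slug(loc)}/${slug(category)}`
  }));
  const otherBucketsHere = relatedCategories.map((cat) => ({
    text: `${cat} in ${locality}`,
    href: `/${slug(city)}/${slug(locality)}/${slug(cat)}`
  }));
  return [...sameBucketNearby, ...otherBucketsHere];
}
```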

All requests to the server are served by Node.js, and MongoDB is used to store the data. The MEAN stack scales well, and Node’s async model provides good processing speed for SEO.

To achieve speed, responses are pre-processed and stored in a Redis cache so they can be served faster on future hits.
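A minimal sketch of that pre-process-and-cache pattern, assuming the `ioredis` client (the post does not say which client is used) and an illustrative key scheme:

```js
// Hypothetical sketch of the pre-process-and-cache pattern.
const Redis = require('ioredis');
const redis = new Redis();                        // connects to localhost:6379 by default

async function getSeoPage(bucket, locality, buildPage) {
  const key = `seo:${bucket}:${locality}`;
  const cached = await redis.get(key);
  if (cached) return JSON.parse(cached);          // fast path: already pre-processed

  const page = await buildPage(bucket, locality); // heavy processing happens only once
  await redis.set(key, JSON.stringify(page));
  return page;
}
```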

SEO caching — speed vs content

Google assigns a variable interval of crawl time (a crawl budget) to every web domain, say t. One of the primary aims is to let Google crawl the maximum number of pages within t. The more pages Google can crawl in t, the better it can rank them (assuming good quality, relevant data and proper interlinking).

For simple GET requests this is not a challenge. However, when complex algorithms run on huge data sets, speed takes a hit.

To optimize for speed, any API that does heavy processing is pre-processed and cached, ready to serve. This, however, introduces another issue: keeping the content dynamic.

Well-ranking SEO pages have time-dependent content. Example: partners on the pages are updated as new ones are added to the platform.

To solve this, all data entry points are hooked into the services that pre-process and cache responses. Example: if a new partner is added to “salon at home for women” in Delhi, all hyper-local pages near that partner will incorporate the change.

Any change in the data flows through these exposed services, which refresh the cache as well.
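For illustration only (`pagesNear` and `buildPage` are hypothetical helpers, and `redis` is the client from the earlier sketch), such a hook might look like:

```js
// Hypothetical sketch: when a partner is added, re-process and re-cache every
// affected hyper-local page so cached content never goes stale.
async function onPartnerAdded(db, partner) {
  const buckets = await pagesNear(db, partner.location, partner.category); // affected pages
  for (const { category, locality } of buckets) {
    const fresh = await buildPage(category, locality);                     // heavy work redone once
    await redis.set(`seo:${category}:${locality}`, JSON.stringify(fresh)); // refresh the cache
  }
}
```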

This solves for optimal speed and latest relevant content.

Keeping up with business (Fault tolerance)

Google marks down pages that return 404 (not found) errors. UrbanClap is an ever-changing ecosystem, with locality or category buckets constantly being added to or disabled from the platform.

The challenge in such an ecosystem is to elegantly inform Google that a bucket has been disabled, or to make sure Google does not crawl those pages at all.

Generating new sitemaps on a daily basis and plugging broken links is one way to do it. However, a disabled bucket’s pre-indexed pages will still show up in searches. For such scenarios, a soft 404 is returned: a fancy way of showing a 404 page with alternate options for the customer to navigate to. Google doesn’t penalize a soft 404. Redirection (302) is also used for such pages and is acceptable to Google, up to a point.

Disabled buckets are handled with sitemaps, soft 404s or 302 redirects.
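As a sketch only (the route shape and the `findBucket` / `nearbyBuckets` helpers are hypothetical, and `getSeoPage` / `buildPage` refer to the earlier caching sketch), the handling in an Express route could look like this:

```js
// Illustrative Express handler for a hyper-local SEO page. Disabled buckets
// get a 302 to a live page or a "soft 404" page with alternative links.
const express = require('express');
const app = express();

app.get('/:city/:locality/:category', async (req, res) => {
  const bucket = await findBucket(req.params);            // hypothetical lookup
  if (!bucket) return res.status(404).render('not-found');

  if (bucket.disabled) {
    if (bucket.redirectTo) return res.redirect(302, bucket.redirectTo);   // temporary redirect
    // soft 404: an error-style page that still offers nearby buckets to navigate to
    return res.render('soft-404', { alternatives: await nearbyBuckets(bucket) });
  }

  res.render('seo-page', await getSeoPage(bucket.category, bucket.locality, buildPage));
});
```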

This solves for the ever-changing business.

Dynamic page generation (Scalability)

As discussed above, new category-locality-city buckets are added to UrbanClap every day. Example: UrbanClap recently went live in Kolkata. To cope with high business demand, the challenge is to generate SEO-relevant data on demand.

The algorithms that cook up this data are written in Python and Node.js. New entries are created in the database, and page contents are simultaneously pre-processed and cached.

UrbanClap’s fresh-page generation logic has been abstracted down to a single click in the internal UI for creating a new bucket.
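A rough sketch of what that single-click flow might wire together; the collection name and the `localitiesFor` / `buildPage` helpers are assumptions, with `redis` reused from the earlier sketch:

```js
// Hypothetical sketch of the one-click new-bucket flow: create the bucket,
// generate its hyper-local pages, and warm the cache in one go.
async function createBucket(db, { city, category }) {
  const { insertedId } = await db.collection('buckets')
    .insertOne({ city, category, active: true });

  const localities = await localitiesFor(db, city);     // hyper-local coverage for the city
  for (const locality of localities) {
    const page = await buildPage(category, locality);   // listing, meta data, quick-links
    await redis.set(`seo:${category}:${locality}`, JSON.stringify(page));
  }
  return insertedId;
}
```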

To the future

[Chart: UrbanClap SEO requests growth rate]

UrbanClap is working on enhancements and additions to its SEO pages. Some of them are listed below:

  • Front End — Mobile and Desktop
  • Separate server units for SEO
  • Scaling cache
  • HQ Media

The SEO team continuously hustles for better content, richer quality, fast rendering and good speed. The battle for SEO service pages is fought every day from the UrbanClap desks!

— Karan Ahuja (Engineer, UrbanClap)
