Analyze and optimize: How to make an SEO audit

How to make a primary SEO audit. Step-by-step instructions by Danny Dover

Who is Danny Dover?

Danny Dover is a passionate online marketer, influential writer, and obsessed life-list completer. He is the author of the best-selling book Search Engine Optimization Secrets and the founder of Intriguing Ideas LLC. Before starting his own company, Danny was the Senior SEO Manager at AT&T and the Lead SEO at SEOmoz.org. Thank you for your books and recommendations, Danny! They have helped many webmasters, SEO specialists, and website owners improve their organic rankings.

How to make a Primary SEO audit

The result of an SEO audit is a report that contains statistical information, a description of the errors found, and recommendations for fixing them, along with general recommendations for improving every aspect of the site's optimization. The report covers areas such as the technical state of the site, the search audit, internal optimization, the link profile, and behavioral factors. Note that the resulting report is not a ready-made technical specification for the specialists who will do the work, but it contains the information needed to prepare such specifications (e.g. for a designer, content manager, webmaster, or other technical specialists).

Technical issues analysis

The purpose of the primary technical SEO audit is to reveal the site's main technical errors, such as server availability issues, malicious code in the site's pages, misconfigured server redirects, broken links, etc. The report also contains final recommendations for fixing the errors found, with examples of specific implementations. If the technical problems are fairly simple, we limit ourselves to general recommendations and links to existing sites with correct implementations; if a problem is nontrivial, additional manuals may be developed to solve it, such as the topics listed below:

  • HTML improvements (duplicate titles and descriptions)
  • 404 errors
  • Duplicated content issues
  • Index health checkup
  • Category/subcategory and product URL improvements
  • Implementing social sharing for products
  • Schema.org markup for product star ratings (see the sketch after this list)
  • Canonical issues
  • 301 redirect setup
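
As an illustration of the Schema.org point above, here is a minimal sketch of product markup with an aggregate star rating in JSON-LD. The product name, rating value, and review count are made-up placeholders, so adapt them to your own catalog:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
</script>

A structured data testing tool can then be used to confirm that search engines pick up the rating.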

The robots.txt file

Check that the directives in the file www.site.com/robots.txt are written correctly.

Recommendation: the robots.txt file is intended for search engine bots and must follow a specific structure to be processed correctly. In this file the webmaster can set indexing parameters for the site, either for all bots at once or for each search engine separately.

Please note that it is not recommended to delete folders; instead, hide them from the index with Disallow directives in robots.txt.

A file named robots.txt should be added to the root directory of the website with the following content:

User-agent: *
.....
Sitemap: https://www.site.com/sitemap.xml
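
For example, a minimal sketch of such a file that hides a couple of service folders from indexing might look like this (the /admin/ and /tmp/ folder names are placeholders, not a recommendation for any particular site):

User-agent: *
Disallow: /admin/
Disallow: /tmp/
Sitemap: https://www.site.com/sitemap.xml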

Server response and .htaccess

  • Check “www” redirects;
  • Check “404” server responses for nonexistent pages.

Use an online server response checker to verify both.
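
As a sketch of what these fixes can look like on an Apache server with mod_rewrite enabled, the .htaccess rules below force the www version of the domain with a 301 redirect and serve a custom page for 404 errors. The domain and the error page path are placeholders, so test any changes on a staging copy first:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
RewriteRule ^(.*)$ https://www.site.com/$1 [R=301,L]

ErrorDocument 404 /404.html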

Website sitemap

Check www.site.com/sitemap.xml. The sitemap (sitemap.xml) is intended specifically for search engine bots and contains additional information about the pages to be indexed. There are many free programs for creating sitemaps (for example, http://www.xml-sitemaps.com).

After that, it is recommended to add the corresponding line to robots.txt so that bots can find the sitemap: Sitemap: http://www.site.com/sitemap.xml. It is advised to update the sitemap at least once every two months.
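
For reference, a minimal sitemap.xml with a single URL entry looks roughly like this (the page URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.site.com/example-page/</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
</urlset>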

Broken links

Broken links are links from a page on your site to a page that has moved or no longer exists. They can reduce your site’s search engine rankings and degrade your visitors’ experience. If a link points to a resource that no longer exists, remove or fix it. To find broken links, use Screaming Frog SEO Spider.

Check for HTML errors: Duplicate meta descriptions

In fact, it is better to have unique meta descriptions, or even none at all, than to show duplicate meta descriptions across pages. If you have used the same meta description tag on all your pages and have therefore accumulated a duplicate meta description issue on your site, the smart thing to do is either rewrite the tags so each one is unique, or delete them and let Google generate the snippet for you.

Using <meta name="robots" content="noindex, nofollow">

It is recommended to rewrite the titles and descriptions of each page using unique and relevant content, or to add the rel="canonical" attribute pointing to the preferred version of the page.
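
A sketch of how these options might look in the <head> of a page is shown below. The description text and URLs are invented placeholders; use noindex for pages that should stay out of the index and rel="canonical" for duplicates that should consolidate to a preferred URL:

<!-- A unique, relevant description written for this specific page -->
<meta name="description" content="Hand-made blue widgets with free shipping: sizes, prices and reviews.">

<!-- Option 1: keep a low-value or duplicate page out of the index entirely -->
<meta name="robots" content="noindex, nofollow">

<!-- Option 2: point a duplicate page to its preferred version -->
<link rel="canonical" href="https://www.site.com/widgets/blue-widget/">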

Short meta descriptions

The meta description won’t impact your search ranking, but it can definitely impact whether people click on your page in search or not. Generally, a meta description should be under 155 characters.

Duplicate title tags

Your title provides users and search engines with useful information about your site. Text contained in title tags can appear on search results pages, and relevant, descriptive text is more likely to be clicked on. We recommend reviewing the titles and updating the title tags wherever possible so that each page has a unique, descriptive title.
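
For instance, on an e-commerce site a unique, descriptive title might follow a pattern like the one below (the product, category, and store names are invented placeholders):

<title>Blue Widget Pro | Widgets | ExampleStore</title>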

Duplicated content

Duplicate content is identical content found on two or more pages of your site. Search engines may penalize your site if too much duplicate content is found. You may want to either remove the duplicate content so it appears on only one of your pages, or modify the content of the various pages to ensure that each one is unique. You can check for duplicates with a search query such as: site:http://www.site.com

Recommendations for all web pages: Index health checkup

The purpose of the index health audit is to reveal the main problems with site indexing, check compliance with search engine requirements, and collect statistical information about the site. An important part of the search audit is analyzing the quality of the pages that are in the search engine indexes, looking for affiliated sites, checking the site for elements prohibited by search engine guidelines, analyzing the text optimization of the pages, and drawing up recommendations for all the points mentioned above. For example, queries like the following show which parameterized URLs are currently in the index:

site:www.site.com inurl:query
site:www.site.com inurl:spartn

Google Webmaster Tools health checkup

Analyze the errors reported in the Google Webmaster Tools panel.

URL Parameters

In the URL Parameters section of Google Search Console, the parameters found on the site are listed and their handling can be adjusted to suit your needs. For any website with a significant number of indexed pages, this is one of the most useful tools for excluding large numbers of duplicate content pages from the Google index: you tell Google which parameterized pages you want kept out of the index. So please go through the URL parameter settings and configure the list of parameters to eliminate the negative impact of duplicate indexed content in Google.

Books to read

  1. Search Engine Optimization Secrets — Danny Dover
  2. Search Engine Optimization: An Hour a Day — Jennifer Grappone, Gradiva Couzin

Sources: here and here. Case Study for Magento websites: here

If you enjoyed this post, please hit the “recommend” button below. Thanks!

Kindly, Olga Litvinets