SEO Toolkit 101

The single most important factor in great SEO is great content. That is, if you need to redo your website because it sucks, bite the bullet. If not, though, you have the option of organizing your content to be more crawlable by Google's bots.

This tutorial will show you how to tweak your website using three popular SEO tools, improving your site's searchability. I'll start by listing some basic SEO pitfalls you may encounter, then follow with examples of how to use SEMrush, Google Analytics, and Screaming Frog to understand your website's performance and start to address these issues.

Most Frequent Faux Pas

Low-hanging fruit in your website's organization may include:

Menu Issues

Controlling the menu controls what choices people can make. At the same time, people rarely stop to question what they're looking at on a menu and why. Make sure the navbar you offer serves a clear purpose rather than acting as a distraction; people will straight up bounce off your site if it's confusing. For example, the menu bar of one of my clients (shown below) doesn't provide any clear avenue to make a purchase, even though that was the entire point of the website.

One item on the homepage labeled “designs” dropped down to three choices: “Exclusives”, “Historic / Nottingham” and “Designs” (again). “Designs” is not specific enough: it says nothing about actual “curtains” or “purchasing,” and instead sounds like an artist advertising different styles. Users cannot easily travel the route PRODUCT → SHOPPING CART without spending lots of mental energy.

Internal Structure

On the back end, things were strangely laid out. URLs were full of dashes and keywords with no clear hierarchy. His typical URL structure, https://www.cottagelace.com/The-Eastlake-Panel-and-Sidelight-Lace-Curtains, indicated that every product he sold sat in one home folder. Can you imagine the mess of dumping all of your company's files into a single drawer? Not only is this confusing to anyone who happens to read URLs; your content is also much less clear to the bots crawling your website. Instead one might expect https://oldeworldelace.com/product-category/lace-curtains/cotton-lace-curtains-scotland/, which at least shows what you're looking at and where it lives.
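To see the difference concretely, here is a minimal Python sketch that splits each URL's path into its folder segments. Both URLs are the examples from above; nothing here touches the live sites:

```python
from urllib.parse import urlparse

# The two example URLs above: one flat, one hierarchical.
flat = "https://www.cottagelace.com/The-Eastlake-Panel-and-Sidelight-Lace-Curtains"
nested = "https://oldeworldelace.com/product-category/lace-curtains/cotton-lace-curtains-scotland/"

def path_segments(url):
    """Split a URL's path into its non-empty folder segments."""
    return [seg for seg in urlparse(url).path.split("/") if seg]

# One segment: every product lives straight in the root folder.
print(path_segments(flat))
# Three segments: category -> subcategory -> product, a clear hierarchy.
print(path_segments(nested))
```

A crawler sees the same thing this script does: the flat URL gives it no hint of how the product relates to the rest of the catalog.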

Find and Fix Problems

SEMrush

SEMrush lets you look at your website's current rank on a number of keywords, as well as determine which keywords drive traffic. It also lets you look at your competitors' keywords and how many clicks they get. You can collate, organize, and display the data in different ways to create reports. Here is the basic output from a SEMrush search on cottagelace.com:

This mess answers the following question: “of the people who ended up on this website, what words did they search to get there?”

The “victorian lace curtains” keyword was the biggest driver of traffic here: 13.06% of visitors searched it to arrive. It sits at position 7, meaning it shows up 7th in Google's search results; Google displays up to rank 10 on its first results page.

When people search for “lace curtains,” however, Cooper Lace can be found in the middle of the 3rd page of results (rank 34), which is horrible. Other info, such as the “Volume” column, helps you target your advertising toward more frequently used keywords; here we see an average of 9,900 people search for “lace curtains” monthly in the US.

Google Analytics

Google Analytics helps you look at traffic numbers, bounce rates, and conversion rates:

We see the percentage of mobile, desktop, and tablet traffic hitting the website monthly. You can get a report of the number of sessions (naturally higher than the number of individuals who visited the site). Here, the bounce rate is too high: this is the share of users who hit the website and immediately left (perhaps too frustrated to stay?). “Conversion Rate” tells us that only 2.04% of sessions resulted in purchases, which is relatively low. Information like this lets you know whether to employ strategies like making your website more mobile-friendly.
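Both rates are simple ratios over sessions. A quick Python sketch with hypothetical numbers (chosen only so the conversion rate lands on the 2.04% figure above; real values come from your own Analytics reports):

```python
# Hypothetical monthly totals for illustration only.
sessions = 5000
bounces = 3100
purchases = 102

# Bounce rate: share of sessions that left after viewing a single page.
bounce_rate = bounces / sessions * 100
# Conversion rate: share of sessions that ended in a purchase.
conversion_rate = purchases / sessions * 100

print(f"Bounce rate: {bounce_rate:.2f}%")        # Bounce rate: 62.00%
print(f"Conversion rate: {conversion_rate:.2f}%")  # Conversion rate: 2.04%
```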

You can also group your data by traffic source: organic, direct, referral, and social media. This tells you where your streams of traffic are coming from.

Most of Cooper Lace's traffic is from organic searches; no one was linking to it from social media or any other secondary source. A heightened social media presence would therefore help drive traffic to the site.

A last interesting piece from Google Analytics indicates which parts of his website were common landing pages:

In rebuilding the site you might redirect traffic from those more successful links to their closest analogs in your new version. In this case “/Historic-Nottingham-Lace-Curtains” was the most important landing page (after the home page).
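On an Apache host, one common way to do that is a 301 (permanent) redirect in .htaccess, which also passes most of the old page's ranking signal to the new one. A sketch; the old path is the real landing page named above, while the target path is made up for illustration:

```apacheconf
# Permanently redirect the old flat URL to its closest analog on the new site.
# The target path here is hypothetical; substitute your actual new URL.
Redirect 301 /Historic-Nottingham-Lace-Curtains /product-category/lace-curtains/nottingham-lace-curtains/
```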

Screaming Frog

Screaming Frog crawls your site and checks for duplicate content, bad meta tags, and other things that can lower SEO. To rattle off a few issues: his meta descriptions (the short blurbs seen while browsing through Google search results) were overly long and thus cut off. This lowered click-through rate, as partial descriptions don't catch people's eyes. There was also duplicate content, which lowers SEO. Interestingly, the website had been updated from http to https, but both addresses seemed to still be functioning. The current webmaster did a workaround by canonicalizing the https version of each page, but having both up can still be detrimental.
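Spot-checking description length yourself is straightforward. Here is a sketch using Python's standard-library HTML parser; the 155-character limit is an approximation, since Google truncates by pixel width rather than a documented character count:

```python
from html.parser import HTMLParser

# Rough cutoff: descriptions much past ~155 characters tend to get truncated.
MAX_LEN = 155

class MetaDescriptionChecker(HTMLParser):
    """Collect the content of every <meta name="description"> tag."""
    def __init__(self):
        super().__init__()
        self.descriptions = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.descriptions.append(attrs.get("content", ""))

# A synthetic page with a 200-character description, for demonstration.
page = '<head><meta name="description" content="' + "x" * 200 + '"></head>'
checker = MetaDescriptionChecker()
checker.feed(page)
for desc in checker.descriptions:
    if len(desc) > MAX_LEN:
        print(f"Too long ({len(desc)} chars): likely cut off in results")
```

In practice you would feed this the HTML of each page you crawl rather than a synthetic string.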

WTH’s wrong with our h2's

This analysis tells us that 100% of his h2's were repetitive (and thus non-descriptive). When h2's don't relate to the content on the page, crawlers can't accurately map the site and rank pages. And lo, on every page the h2 indeed read “Free U.S. Ground Shipping!”, a phrase that says nothing about the page's content.
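The fix is mechanical: give each page an h2 that names its actual content. A before/after sketch (the product name is borrowed from one of the site's real URLs, purely for illustration):

```html
<!-- Before: the same h2 on every page says nothing about the page -->
<h2>Free U.S. Ground Shipping!</h2>

<!-- After: a heading that describes this page's actual content -->
<h2>Eastlake Panel and Sidelight Lace Curtains</h2>
```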

It was also interesting to note that the robots.txt file included in the top-level directory of most sites was preventing his blog and a few other pages from being crawled at all:

response code analytics

Perhaps the previous webmaster forgot to unblock these. A rebuild might focus on much clearer URL layouts with descriptive h2 tags, removing crawl blocks, and getting rid of “dead weight” (unvisited pages) that tends to drag SEO down.
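For reference, a Disallow rule like the "before" sketch below is all it takes to hide a section from well-behaved crawlers, and an empty Disallow (or deleting the rule) opens it back up. The /blog/ path is an assumption based on the blocked blog mentioned above; check the site's actual file at /robots.txt:

```
# Before (blocking): any compliant crawler (*) skips everything under /blog/
User-agent: *
Disallow: /blog/

# After (open): an empty Disallow blocks nothing
User-agent: *
Disallow:
```

These are two alternative files shown side by side; a real robots.txt would contain only one of the two groups.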

I’ve covered just a few diagnostic tools that can help you get an understanding of your website's searchability. As every website is different in terms of history, structure, and all that other junk, your own findings will doubtless differ from mine. All you can do is research what you see after running your analyses and build an overall picture of your SEO situation to inform next steps.