How to Fix Recurring Technical SEO Problems Once and for All

Matt Solar
Published in nDash
5 min read · Aug 16, 2018

One of the most frustrating things for any SEO is spending hours fixing an issue with a site, only to see the exact same error pop up in your crawl reports weeks or even months later. A recurring issue is often a sign that something in the code or website configuration is wrong, but in other cases it's a conflict between the way your site is set up and the way the crawler sees it. The good news is that even if you aren't a developer, you can figure out a permanent fix for these problems.

Why It’s Important to Stop Crawl Errors

Technical SEO problems cause crawl errors, which can have a direct impact on a site's search ranking. There are a few good reasons to resolve as many of these issues as possible:

  • Crawl errors prevent Google and other search engines from indexing the full website. Picture the sitemap as a series of tunnels, and each crawl error as a wall that stops the crawler from traveling any farther. Any child pages might not get indexed because of a technical error on the parent pages leading to them.
  • Technical errors can disrupt the user experience and lead to lower conversions. Missing meta tags and descriptions are a prime example. Without the hook that a meta description provides, potential customers might see you in the search results but fail to click. Every meta description is a chance to hook a customer, and you should take every shot you get (a quick way to spot missing descriptions is sketched below).
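As a first diagnostic, a short script can surface pages that ship without any meta description at all. This is a minimal sketch, assuming the requests and beautifulsoup4 packages are installed; the URLs are placeholders for pages from your own sitemap or crawl export.

```python
# Minimal sketch: flag pages missing a meta description.
# The URLs below are placeholders; swap in your own pages.
import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/pricing/",
]

for url in PAGES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    # Flag pages where the tag is absent or its content is empty.
    if tag is None or not tag.get("content", "").strip():
        print(f"Missing meta description: {url}")
```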

Canonical Tags

Some marketers don’t worry about placing a basic canonical tag because they don’t believe anyone will bother stealing their content, or because they aren’t worried about a duplicate content penalty. But there’s more than one reason to put a canonical tag in place. Aside from ensuring that the content you write and publish isn’t scraped and republished on another site for someone else’s gain, canonical tagging can solve other SEO problems that might plague you. For example, if you run multiple locations of a business and want to stop one location’s page from competing with another, a canonical tag tells Google which URL to give preference over the others.
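The tag itself is a single line in the page’s head, e.g. `<link rel="canonical" href="https://example.com/preferred-page/">`. Once it’s in place, a short script can confirm that each page declares the canonical URL you intend. This is a hedged sketch, not a definitive implementation; the page-to-canonical mapping below is hypothetical.

```python
# Hedged sketch: verify each page declares the expected canonical URL.
# The EXPECTED mapping is hypothetical; replace with your own pairs.
import requests
from bs4 import BeautifulSoup

EXPECTED = {
    "https://example.com/locations/boston/": "https://example.com/locations/",
    "https://example.com/locations/denver/": "https://example.com/locations/",
}

for url, expected in EXPECTED.items():
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    link = soup.find("link", rel="canonical")  # <link rel="canonical" href="...">
    found = link.get("href") if link else None
    status = "OK" if found == expected else f"MISMATCH (found {found!r})"
    print(f"{url}: {status}")
```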

Duplicate Content

Certain areas of your site are more likely to flag as duplicates than others. To understand why, you must understand how the crawler sees your site. This infographic gives a visual breakdown of how Google indexes web content to determine what information is relevant to a query.

Image Source: Summit Dutta

For example, your blog may have ten pages of high-quality articles with unique content, but if the title and meta description of each of those ten pages still says “Your Site Blog,” the crawler can flag the pages as duplicates. Instead of trying to assign different titles and descriptions to each of these pages, which may be impossible depending on your CMS, you can use a nofollow tag to exclude the pages beyond the first one. Doing so, however, will discourage the crawler from reaching and indexing those child pages, which might hurt your rankings.

This is a case where you may want to experiment with each setup to see which one offers the best results. If you place the nofollow tag and your rankings go up, you can reasonably conclude that the crawl errors were a large problem. However, if you place the tag and your rankings go down, it stands to reason that having the extra content in place outweighed the negative impact of the crawl errors.
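Before running that experiment, it’s worth confirming the duplication actually exists. Here’s a minimal sketch that fetches paginated blog URLs and flags shared titles; the /blog/page/N/ pattern is an assumption about your CMS’s pagination scheme, so adjust it as needed.

```python
# Minimal sketch: flag paginated blog pages that share a <title>.
# The /blog/page/N/ URL pattern is an assumption about the CMS.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

titles = defaultdict(list)
for n in range(1, 11):  # check blog pages 1 through 10
    url = f"https://example.com/blog/page/{n}/"
    resp = requests.get(url, timeout=10)
    if resp.status_code != 200:
        break  # ran out of pages
    soup = BeautifulSoup(resp.text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    titles[title].append(url)

for title, urls in titles.items():
    if len(urls) > 1:
        print(f"{len(urls)} pages share the title {title!r}:")
        for u in urls:
            print("   " + u)
```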

DNS Errors

DNS errors come from issues with your hosting that block the crawler from accessing the site. A one-off issue here or there might be no big deal, but repeated DNS errors can be a sign that your host is experiencing more downtime than you realize or hasn’t configured your site properly for crawling. If the name server is not set up properly, or if your host has blocked crawlers, you’ll need to contact them to have it resolved. Once this is done, you can ask Google to recrawl and re-index the site through Search Console (formerly Webmaster Tools).
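You can also do a quick first-pass check yourself before contacting your host. The sketch below uses only Python’s standard library to confirm the domain resolves at all; the domain name is a placeholder for your own.

```python
# Quick sketch: confirm the domain resolves, using only the standard
# library. DOMAIN is a placeholder for your own hostname.
import socket

DOMAIN = "example.com"

try:
    infos = socket.getaddrinfo(DOMAIN, 443, proto=socket.IPPROTO_TCP)
    addrs = sorted({info[4][0] for info in infos})
    print(f"{DOMAIN} resolves to: {', '.join(addrs)}")
except socket.gaierror as exc:
    print(f"DNS lookup failed for {DOMAIN}: {exc}")
```

Repeated failures here mirror the DNS errors a crawler would report, which makes a stronger case when you escalate to your host.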

Long Load Times

With user experience and great visuals always at the forefront of design, it’s easy to forget about or ignore long load times in favor of having every element you need on the page. Unfortunately, with the average attention span shrinking fast, pages that take a long time to load get left behind. Google recommends a load time of three seconds or less for optimal user experience, but research shows that most sites take at least twice that long to load. Believe it or not, those extra three or four seconds are all it takes to kill a conversion.

Image Source: Blogging.org

There are a few steps you can take to reduce page load times without sacrificing key elements, including reducing HTTP requests, minifying code, compressing images, and upgrading your hosting plan. This guide from Crazy Egg offers step-by-step instructions for doing these tasks yourself.
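To confirm your changes are paying off, time the page before and after each optimization. The sketch below is a rough baseline only, since requests measures the server response and download rather than a full browser render; the URL is a placeholder for the page under test.

```python
# Rough sketch: time a page fetch before and after optimizing.
# This measures server response and download, not browser rendering.
import time

import requests

URL = "https://example.com/"  # placeholder for the page under test

start = time.perf_counter()
resp = requests.get(URL, timeout=30)
elapsed = time.perf_counter() - start

print(f"Status: {resp.status_code}")
print(f"Time to response headers: {resp.elapsed.total_seconds():.2f}s")
print(f"Downloaded {len(resp.content) / 1024:.1f} KiB in {elapsed:.2f}s total")
```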

Once these technical SEO problems are permanently fixed, you can spend less time week over week on them and devote more time and energy to finding new and interesting ways to make your site even better.

Editor’s note: This post was written by nDash community member, Melissa Samaroo. Melissa writes a variety of business articles and website copy on topics such as SEO, inbound content marketing and more. To learn more about Melissa, or to have her write for your brand, sign up for nDash today!
