How to build links at scale with SEO SpyGlass

It is no secret that organic traffic from search engines is immensely valuable to online businesses (Duh! right?). This has led to businesses scrambling to race to the top of SERPs to get potential customers to their sites (who, by the way, probably have their wallets out as well).

It’s an arms race of sorts. And the one who truly understands how search engines rank websites is the one most likely to win it.

Google ranking factors

Google reportedly takes into account over 200 factors when ranking websites in the organic search results. However, not all ranking factors are created equal; some move the needle more than others. As an entrepreneur, you already have plenty on your plate, and ranking well on search engines goes a long way towards getting free organic customers to your business. It is therefore important to focus your efforts on the factors that actually move the needle rather than obsessing over those that barely do.

Of all the ranking factors that Google takes into consideration, there are two that Google engineers have confirmed that matter the most. These are:

  • The content itself and
  • External links pointing to that content

So if you are to focus on improving the ranking of your website on SERPs, you are better off working on these two before you start obsessing over other factors.

These two are so much more important than the others that you could optimise everything else but them and still never rank. On the other hand, if you do these two well and nothing about the other factors, there is a strong probability that you’d rank well.

The importance of high-quality content

While this article is primarily focused on link building, I want to make sure I highlight the importance of high-quality content, because it is a precondition for acquiring links. I mean, who wants to link out to mediocre content? Why would anyone link to your content when their own is already better?

Would you link out to someone who put out a bare minimum 500-word article that doesn’t say anything that hasn’t already been said by hundreds of other people already?

Your content not only needs to be better, but it also needs to be significantly better for anyone to take the initiative to edit their content and give you a link.

I cannot stress enough the importance of good content, because all your efforts to build links to your site are going to be futile if you don’t have content that people find worth linking to in the first place.

So know that having very high-quality content is not just a “good to have”; it’s a “must have” in order to acquire any worthwhile backlinks.

There is no universal definition of “high-quality content”, and it varies from industry to industry. Besides, plenty of other people have written good articles on producing content your readers will love, so I’m not going to go into much detail on that subject here.

However, I will show you a neat way to create what can be called very “thorough” content using a free tool.

The advantages of writing a thorough, high-quality article are many-fold. Not only will it be easier to convince others to link to your content, but you’ll also rank for a lot of long tail keywords bringing you even more traffic.

Besides that, your readers will appreciate having all their potential queries answered in a single article. It also improves dwell time and builds trust when readers see you know your stuff well. For the same reasons, they are more likely to do business with you, as they will see you as an expert on the subject.

In order to create a thorough and detailed article you need to ensure you cover your topic from various angles that readers are looking for. An easy way to do that would be to look at what other content is already ranking on search engines for your keyword. This is useful because Google has already applied its algorithms and filtered the results to what most users are looking for.

All you need to do now is to study the top results and make sure your content covers all the topics that the top results in Google are talking about. This will essentially make your content the “sum total” of all the top ranking results for a given keyword.

While you could do this manually, it can become tedious. So I prefer to use the free Website Auditor tool. Specifically, the feature that we are interested in is the TF*IDF analysis. This feature works without any limitation even in the free version of the tool. Some sites want to charge you over $100/month for this feature. Yet others are so expensive that they won’t even list the price on their websites.

On the other hand, you could use the same feature in the Website Auditor tool without paying a dime, without any limitations and it works just as beautifully! How awesome is that?

Without going into the technical details (and I’m simplifying things here), what TF*IDF analysis does is, for a given keyword,

  • It scrapes the top results from Google (or other search engines)
  • Analyses the content to see what topics/keywords occur frequently on the top ranking pages
  • Shows you the topics/keywords that appear across your SERP competitors, along with their frequency.

All you need to do now is to make sure your content also covers the same topics that appear in the top results.

In some sense, you could see this as reverse engineering what Google already likes to rank for a given keyword :)
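To make the idea concrete, here is a minimal sketch of TF*IDF scoring in plain Python, using nothing but the standard library. This is a naive simplification for illustration only; Website Auditor’s actual implementation is certainly more sophisticated (proper tokenisation, phrases, stop words and so on):

```python
import math
from collections import Counter

def tfidf_terms(documents, top_n=10):
    """Rank terms in each competitor page by TF*IDF.

    `documents` is a list of plain-text strings, one per top-ranking page.
    Returns, for each page, its top terms with their TF*IDF scores.
    """
    tokenized = [doc.lower().split() for doc in documents]
    n_docs = len(tokenized)

    # Document frequency: in how many of the pages does each term appear?
    df = Counter()
    for tokens in tokenized:
        df.update(set(tokens))

    results = []
    for tokens in tokenized:
        tf = Counter(tokens)
        scores = {
            term: (count / len(tokens)) * math.log(n_docs / df[term])
            for term, count in tf.items()
        }
        results.append(sorted(scores.items(), key=lambda kv: -kv[1])[:top_n])
    return results
```

Terms that appear on every competing page score low (they are “expected”), while terms that a given page emphasises more than the rest score high; the tool essentially aggregates this picture across all the top results for you.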

I won’t go into the exact details of how to use Website Auditor, as you can get those from their help docs. But in short, you enter one or more keywords and choose the page you’re trying to optimise. The tool will then do its analysis and present you with a table and graph like this:

For the above example, I entered the keyword “gaming laptops” and chose a page on my site I am trying to optimise.

It even tells you when you’re under- or overusing a keyword on your page. While helpful, I suggest you apply those recommendations with your own judgement, as there can be some false positives.

As you can see from the screenshot, it analysed the top results that rank on Google, identified the keywords that your competitors are using and how your own page fares against each of the keywords.

Armed with this information, all you need to do is make sure your page covers these topics and keywords as well. This will make your article very comprehensive and could potentially also rank for a lot of long-tail keywords in the process.

More importantly, this will set the foundations for acquiring backlinks which I will cover in detail in the next section.

The importance of backlinks

If you’re in any niche that is even remotely competitive, you’ll know that it’s virtually impossible to rank in SERPs without backlinks. And there is a good reason for this: virtually all the other parameters Google takes into account in its ranking algorithm are things you have full control over. Backlinks pointing to your content, on the other hand, serve as a sort of external validation.

Now, sure, you could potentially rank without any backlinks for brand names or other keywords with little or no competition. But for the vast majority of competitive keywords, it’s just not going to happen without enough backlinks.

In this section, I’m going to show you how to find potential link partners and build hundreds and hundreds of backlinks with relative ease. This technique isn’t particularly new. It has been done several times before with many variations, but it originates from the Skyscraper technique made popular by Brian Dean and further optimised by the guys at AuthorityHacker. I have merely adapted and tweaked it to suit my workflow and get a bit of scale in the process.

In a nutshell, the process boils down to finding potential link partners and asking them to link to your content. All variations of this basic technique are about bringing scale to that fundamental task.

We will break down this task into several steps.

Step 1: Find related keywords

This is where you find keywords that closely match the primary keyword you’re trying to rank for. There are plenty of tools on the internet that allow you to find related keywords. However, of late, Ubersuggest from Neil Patel has become my go-to tool. Ubersuggest is easy, fairly accurate and, best of all, completely free!

To do this, go to Ubersuggest, enter your keyword and choose the market where your primary audience resides. Ubersuggest will then show you a number of keywords that are related to the main keyword you just entered as shown below:

Vet this list carefully and make a list of all keywords that are very closely related to your primary keyword. At this point, I must warn you not to get greedy and grab every single keyword you see. If you need more keywords, use one of the keywords you just got from your first search and use that as the seed keyword. Repeat this process until you have 20–30 very closely related keywords.

Step 2: Find & filter sites that rank for each of these keywords

If you already have a tool like Scrapebox, you could use that to scrape SERPs for each of the keywords you just discovered. However, if you don’t have Scrapebox, there is no need to buy it just for this task; you can also do this manually using Chris Ainsworth’s excellent bookmarklet.

Just head to Chris’s website and add the bookmark to your browser. Then change your Google settings to show 100 results per search as described on his website.

Then enter your keywords one by one and copy all the SERP URLs into a list using the bookmarklet as shown above.

So assuming you had 20 keywords, you should now have about 20 x 100 = 2000 URLs in your list. However it’s common that there will be several duplicate URLs in your list.

You should now remove the dupes using Excel, Google Sheets or one of the many list deduplication tools online. Personally, I prefer this tool from Orangefox. After you’ve removed the duplicates, you may end up with, say, 1000–1200 unique URLs.
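If you’d rather not paste a couple of thousand URLs into an online tool, the deduplication itself is only a few lines of Python. Note that lower-casing the whole URL here is a deliberate simplification to catch near-duplicates; strictly speaking, URL paths are case-sensitive:

```python
def dedupe_urls(urls):
    """Remove duplicate URLs while keeping the original order.

    Trailing slashes and letter case are normalised for comparison only;
    the first-seen original form of each URL is what gets kept.
    """
    seen = set()
    unique = []
    for url in urls:
        key = url.strip().rstrip("/").lower()
        if key not in seen:
            seen.add(key)
            unique.append(url.strip())
    return unique
```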

Once you’ve removed the duplicates, manually go through each of these to make sure each of the URLs are actually closely related to the topic you’re trying to rank for. Brutally remove any pages that are not directly related to your topic. This is very important for two reasons:

  • If you don’t filter and remove irrelevant content at this point, you’ll end up with a lot of garbage in the next step.
  • When you do outreach to people who have nothing to do with the topic of your content, they will start marking your emails as spam. This will negatively affect your overall inbox delivery rates.

So make sure your URL list is as spot on as it can possibly get.

Step 3: Find out how many websites link to these URLs

It’s now time to talk about the star of the show here!

Backlink tools are the holy grail of building links at scale; you’d really struggle without them. As it stands, Ahrefs is the undisputed king when it comes to backlink databases, beating Majestic by a long shot according to a comparison carried out by Matthew Woodward. This has been my personal experience too: for the quality and quantity of backlinks, Ahrefs is hard to beat. Most other backlink data providers just resell Majestic data under a white label.

Unfortunately, Ahrefs is very expensive at $100/month even for their cheapest plan. But understandably so: to build a backlink database comparable to Google’s, you would have to essentially crawl and parse the entire internet (well, the web/HTML portion of it, anyway).

As you can probably imagine, building such an exhaustive backlink database requires an immense amount of computational and bandwidth resources. And they need to do this on a continuous basis! I wouldn’t be surprised if Ahrefs server costs run into several millions of dollars a month. It is no wonder then that there are not many companies who are willing to invest so many resources into it.

Among the other tools I tried was SEO SpyGlass, maybe a year ago, but I soon abandoned it because the results were rather lacking, to say the least. So when I first heard they were building a new backlink database from scratch that would rival Ahrefs, I was intrigued. I immediately signed up for their closed beta and have been using it for the last couple of days.

I’m not going to make any comparisons with Ahrefs yet, because it would be unfair to compare a pre-release product with the market leader. What I will say is that I am excited by what I see, and I’m hopeful that for once there will be real competition to Ahrefs’ backlink database supremacy.

A quick look at their beta announcement page suggests that they have made a breakthrough innovation which allows them to crawl the internet with far fewer resources as compared to their competition which in turn allows them to pass on the cost benefits to their users and offer their product at their regular prices. Sweet :)

Why a fresh index matters

Another important aspect of the new version of SEO SpyGlass that I like is the focus on the freshness of the backlink database. Many backlink providers tout the size of their historical database. That size serves very little practical purpose for the average link builder but offers immense bragging rights to the vendor. It is similar to how some dating sites brag about the number of users who have ever registered on their platform. For a new user, the total user base is largely irrelevant because 99.99% of those users are long gone. What matters is the number of active users they can potentially match with.

It’s the same with backlinks. In a majority of cases, what matters are the links that exist as of today, unless you want to see the links gained and lost over a significant period of time.

I cannot overstate the importance of a fresh, continuously updated backlink database. In the past, I’ve had situations where a tool reported hundreds of backlinks to my domain, but those backlinks were from pages hosted on parked domains. Parked domains come and go by the millions, and they link to random sites all the time because it generates a ton of revenue for them. But for someone looking to build backlinks, these false positives are a huge waste of time.

I’m sure you’ve run into these yourself:

Similarly, if you’re an agency or an individual doing SEO work for clients, you need to be extra careful when you generate backlink reports. If you rely on a historical index (because showing thousands of backlinks sounds impressive), it’s quite possible that a lot of those links are long gone. If your client does some spot checks only to find that the backlinks in your report don’t exist, it will reflect quite badly on you.

Now that you understand the importance of a fresh index, let us put the data we collected in the previous steps to good use. Note that I’ll be using the beta version of SEO SpyGlass for the remainder of this article, and it’s quite possible some of the options you see in the screenshots will change significantly by the time the final release arrives.

Next, from the list of URLs we have, we will narrow down to the ones that have a significant number of backlinks pointing to them, using the Pareto principle. While this step is optional, I recommend you go through with it as it will save you a lot of time.

To do this, fire up SEO SpyGlass and start a new project. You can use any random URL to analyze as it really does not matter for this particular task. When it loads, click on Domain Comparison menu on the left panel. Then start entering URLs from your list as shown below. Enter as many URLs as the tool allows.

SEO SpyGlass allows you to compare 10 URLs at a time with the current version. So if you have plenty of URLs you may have to repeat this several times.

Once you’ve entered all the URLs, click OK and wait for it to fetch information for them. This is fairly quick and takes only a couple of seconds. When it’s done, you’ll see a report that looks like this:

The key information that we’re interested in from this report is the number of Dofollow backlinks pointing to those specific pages.

Open a spreadsheet and make a list of the URLs and the corresponding number of Dofollow backlinks for each.

Repeat the above process until you’ve processed all your URLs.

Once you’re done, sort your spreadsheet by the Dofollow column in descending order. In a majority of cases, you’ll find that only 20–30% of the domains account for 70–80% of all the backlinks. This is the Pareto principle at work.
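If you’d rather script this step than sort in a spreadsheet, the Pareto cut is straightforward. This sketch takes (URL, Dofollow count) pairs and keeps the smallest top slice that covers a given share of all backlinks:

```python
def pareto_cut(url_counts, share=0.8):
    """Return the smallest set of URLs that together account for
    `share` (e.g. 0.8 = 80%) of all Dofollow backlinks.

    `url_counts` is an iterable of (url, dofollow_count) pairs.
    """
    ranked = sorted(url_counts, key=lambda pair: -pair[1])
    total = sum(count for _, count in ranked)
    picked, running = [], 0
    for url, count in ranked:
        picked.append(url)
        running += count
        if running >= share * total:
            break
    return picked
```

Anything past the cut is where the law of diminishing returns lives; you can always lower `share` if you want an even smaller outreach list.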

These top URLs are the ones we are going to work with in the next steps. If you’re desperate, you could work with all the domains, but know that after the top 20–30% of URLs the law of diminishing returns kicks in and you’ll be spending a lot of time for smaller and smaller gains.

Note: I understand that doing this for hundreds of URLs is very time-consuming, so I’ve already made a feature request on their beta feedback forums to allow more URLs to be compared at a time and to make the results exportable. I’m hoping they can include this in a future version. Until then, you have a couple of other options. Instead of taking all 100 Google results in Step 2, you can limit it to the top 20 or 30. In general, I’ve found that results beyond the top 20–30 on Google don’t have many backlinks anyway (thanks, Pareto). This would limit your total list to fewer than 100 URLs, which is much easier to manage but still accounts for a majority of backlinking domains.

Step 4: Find the exact backlinks using SEO Spyglass

Once you’ve narrowed down your URL list to the ones that matter, you should be left with far fewer URLs than in the original list, typically only about a fourth of the original size.

The next step is to find the exact pages that link to each of our URLs. To do this, open a new project in SEO SpyGlass for each URL in your list. Enter the URLs one by one, making sure you have selected “Exact URL” in the drop-down.

The “Exact URL” option ensures we get pages that link to the specific URL. On top of that, it also finds links that link to either the HTTP or HTTPS versions of the page. This is cool because we also want backlinking pages that may have been created before the target site moved to HTTPS.

Also, enable the Expert options and click next twice to go to step 3 of the wizard. On step 3, select “Limit backlinks from one domain” and set its value to 1.

Because we will be doing outreach to these backlinking domains, we only need to see each domain once. We don’t really need to know every single page on a domain that links to our target page.

What I find cool about this setting is that it treats different subdomains as individual domains (except for www.) This is especially helpful because some public blogging platforms such as Blogspot create each user blog under a different subdomain.

So for example, let’s say we’re trying to find all backlinks to the exact URL https://www.google.com/ and there are two links pointing from siteA.blogspot.com and siteB.blogspot.com then this setting would pick 1 link each from both of them. I think this is pretty smart and cool.

Click Finish and let the tool do its thing. In just a few seconds, it should come back with a list of all pages pointing to this specific URL. This is what the result typically looks like:

As you can see from the screenshot above, it fetched both Dofollow and Nofollow links. In our case, we’re only interested in sites that offer Dofollow links, so apply a filter on this column. To do so, click on the advanced filter option, choose “Links back” -> Contains, then enter “dofollow” in the text box. This keeps only the rows where the “Links back” column contains the text “dofollow”.

At this point, you can apply additional filters to only select pages with a certain Domain inlink rank. I don’t bother with it and choose all rows.

Next, you’d want to get SEO SpyGlass to fetch the contact information for these domains so you can reach out to them. This is a gem of a feature, but it’s hidden behind a few button clicks. To get the contact info, click on the “Edit visible columns” icon next to the advanced search text box. This opens a pop-up window where you can select all the advanced data you’d like to see.

Scroll down and select

  • Linking domain info -> Linking Domain
  • Linking domain info -> Contact Info

This will add two new columns, “Contact info” and “Linking domain”, to your results. We want the linking domain because we will use it to filter out duplicates later. Even though we limited the results to one link per domain, we might still end up with duplicates when we combine the results from multiple URLs; this column will help us filter out those dupes.

To update the contact info of the linking domains, click on the refresh icon left of the column title (the refresh icon will reveal itself when you hover your mouse over it):

This probably won’t be able to fetch contact info for every single domain on the list. No tool will.

Most tools that can fetch email addresses for websites cost a lot, while this feature is included with SEO SpyGlass at no additional cost, which I think is awesome!

We’re almost there! Just bear with me a little bit longer…

Next, export the filtered rows in CSV format by clicking on the export icon and selecting “From project workspace” from the pop-up menu.

Repeat this for every URL in your list. Create a new folder and save all the CSV files there. We want all the files in one folder with no other files in it, because we will merge them into one in the next step.

I know this can get repetitive and tedious if you have several hundred domains to fetch, but there is currently no way around it. Perhaps if they make an API available, it could be automated. But let’s make do with what we have for now.

Step 5: Merge all exported CSV files & remove dupe domains

Once you’ve exported all data to CSV files, it’s time to merge them. For this step, I’m assuming you’re using Windows. If you’re on another operating system or are uncomfortable doing basic command line operations you can merge the CSV files online as well.

On Windows, open the command prompt and navigate to the folder where you have saved the .csv files. Type the following at the command prompt and press enter.

c:\my_folder>copy *.csv all.csv

This will create a new file with the name all.csv and this file will have the combined contents of all .csv files in the current folder, which will include repeating headers from each file.

We will now use Excel to remove duplicate domains and headers from the merged file. This step is required regardless of whether you merged the files at the command prompt or using the online tool linked above.

To do this, open the merged all.csv file in Excel. Click on the Data menu and then on the Remove Duplicates icon. This opens a popup where you can select exactly how you want Excel to remove the duplicates. Follow the clicks exactly as shown below:

It is important that you follow the click order as shown above. The moment you unselect “My data has headers”, the column names change to “Column A, Column B, etc.” and you wouldn’t know which column holds the linking domains.

The reason we want Excel to disregard the headers even though we have them is that Excel will then treat the header row like any other row of data, which removes all the duplicate repeating headers as well :)
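If you’d rather skip the command prompt and Excel entirely, the merge and dedupe can be done in one short Python script. The column name “Linking Domain” here is an assumption based on the columns we enabled earlier; check the headers in your actual export and adjust accordingly:

```python
import csv
import glob
import os

def merge_and_dedupe(folder, domain_column="Linking Domain", out_file="all.csv"):
    """Merge every exported .csv in `folder` into one file, keeping a
    single header row and only the first row seen per linking domain."""
    seen_domains = set()
    header = None
    rows = []
    for path in sorted(glob.glob(os.path.join(folder, "*.csv"))):
        # Skip a previously generated output file on re-runs.
        if os.path.basename(path) == os.path.basename(out_file):
            continue
        with open(path, newline="", encoding="utf-8") as f:
            reader = csv.DictReader(f)
            if header is None:
                header = reader.fieldnames
            for row in reader:
                domain = (row.get(domain_column) or "").strip().lower()
                if domain and domain not in seen_domains:
                    seen_domains.add(domain)
                    rows.append(row)
    with open(out_file, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=header)
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

Because `csv.DictReader` consumes each file’s header row separately, the repeated headers never make it into the merged output, and the domain set takes care of cross-file duplicates in one pass.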

Step 6: Add missing contact info

You may have noticed that the contact info is missing for some of the domains. That’s understandable; no tool I know of can fetch contact details for every single domain. Some of the best email-finding tools out there get emails for 40–50% of domains at best. You still have to figure out the rest yourself.

There are a few options you can explore to update the missing emails.

The cheapest option, though it takes a little time, is to manually visit every domain and see if they have listed a contact email on their site. Typically this would be on the Contact or About us pages. In my experience, if you cannot easily find the contact information there, check their Facebook page, as many websites list their email there. If you still cannot find an email, give that site a miss; they probably don’t want to be contacted. Just move on to the next URL when you cannot find an email within 3 or 4 clicks.

The other option you can try is to have a VA do this for you. They will not only find the emails but can also send the emails for you from whatever mailbox you share with them.

The third option, which may cost some money, is by far the most efficient in terms of time saved. The idea is to use a third-party email-finding service to find the missing emails for you. Hunter.io, FindThatLead.com and EmailCrawlr.com are the more popular ones. Either way, know that no tool is going to find emails for each and every website you throw at it.

If you really want to gather as many emails as you possibly can, try running your list through multiple email-finding tools: run the list through one tool first, take the domains where it could not find an email, run those through the next tool, and so on. This works quite well, up to a point, because the databases of these tools are not exactly the same; emails one of them cannot find may well be available from another provider. Using all three providers above, you might get emails for as many as 70–80% of the domains in your list. It probably doesn’t make sense to use more than three providers, though, as the law of diminishing returns kicks in after the second service. You’d be better off finding more keywords in Step 1 instead.
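This waterfall approach generalises nicely into code. In the sketch below, `finders` is a list of callables, one per provider; the callables themselves are hypothetical placeholders that you would implement against each provider’s real API (no actual provider API is shown here):

```python
def fill_missing_emails(domains, finders):
    """Run domains through email-finding services in order, only
    querying the next service for domains that are still missing.

    `finders` is a list of callables, each taking a domain string and
    returning an email string or None. These are placeholders; you would
    write one per provider (Hunter.io, FindThatLead, ...) yourself.
    """
    emails = {domain: None for domain in domains}
    for find in finders:
        missing = [d for d, e in emails.items() if e is None]
        if not missing:
            break
        for domain in missing:
            emails[domain] = find(domain)
    return emails
```

Only the leftovers get passed down the chain, so you pay each subsequent (and typically more expensive) service for as few lookups as possible.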

Step 7: Clean your email list

This step is critical. Regardless of how you collected your emails, there is a good chance some of them are no longer active or non-existent. This can happen for several reasons — people leave organizations, typos, etc. Sometimes automatic tools give out “guessed” emails as well.

It is therefore imperative that you validate the email addresses before blasting out those emails. If you don’t, and your bounce rate crosses a certain threshold, mail providers may start putting your emails in the spam folder, where no one will ever find them.

Ideally, you should aim to keep your bounce rate below 3%. If it crosses 5%, chances are your mail provider may even suspend your account.

There are many ways you can verify the email addresses you have gathered. Some of the tools I mentioned above already have this feature (at an additional cost usually) but it’s totally worth it.

There are also many third-party providers that can help clean your email list. The cheapest one I’ve found so far is BulkEmailChecker.

There really is no substantial difference between the cheapest and the most expensive services, as they all use the same fundamental techniques to check the deliverability of any given email. So go for the cheapest one you can find.

If you’re sending emails from your custom domain, make sure you’ve configured the SPF and DKIM records correctly. This will improve the odds of your emails landing in the inbox versus the spam folder, although it’s not guaranteed by any means.

Step 8: Send the emails and follow up

There are really two schools of thought when it comes to sending outreach emails for link building.

The first school of thought is you should heavily personalise every email you send.

The other school of thought is to use a templated email (without being spammy) as this is a numbers game essentially.

I’m really not going to suggest which route you should take. I leave that up to your individual judgement.

I’m not even going to tell you what you should write in your emails, although most people either offer to guest post or simply ask the recipients to link back from one of their existing pieces, usually from the backlinked-page list we created in the previous steps.

What approach you take is entirely your choice. Besides, plenty of other SEOs and marketers have shared on their blogs what an outreach mail should look like, so you can take tips from them on how to compose yours.

If you decide to go the template route, you will need a mail merge tool. You create a templated email and have the tool replace certain sections with values from the CSV file we created in the previous steps.
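As a rough illustration of what a mail merge tool does under the hood, here is a sketch using Python’s `string.Template` against the CSV from Step 5. The column names “Linking Domain” and “Contact Info”, as well as the email copy itself, are placeholder assumptions; substitute your own export headers and message:

```python
import csv
from string import Template

# Hypothetical outreach template. Placeholders must match CSV column
# names after spaces are replaced with underscores (see below).
TEMPLATE = Template(
    "Hi,\n\n"
    "I noticed $Linking_Domain links to an article on this topic. "
    "I've published a more in-depth guide and thought it might be "
    "worth a mention.\n\nBest,\nMe"
)

def render_emails(csv_path):
    """Yield (contact_email, message_body) pairs from the merged CSV."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Template identifiers can't contain spaces, so normalise keys.
            fields = {k.replace(" ", "_"): v for k, v in row.items()}
            yield fields.get("Contact_Info", ""), TEMPLATE.safe_substitute(fields)
```

The specialised outreach tools add scheduling, follow-ups and open tracking on top, but the substitution step itself is exactly this.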

While any mail merge tool will work, some have been specialised for outreach. These tools can send follow-up emails automatically on your behalf if the recipient does not respond within a certain period of time. They also capture additional details such as open and bounce rates. MailShake and LemList are the ones most popular with SEOs for this task.

That’s about it! That is all it takes to speed up your link building activities. Building backlinks is no longer a chore thanks to SEO SpyGlass!

Happy link building!