I finally discovered what might be a holy grail (the mother lode) of content practices that should dramatically improve organic search PageRank for any site. Sometimes I feel like a dumbass for not "getting it" sooner. But then again, not a single person I have talked to over the past 18 months, including self-anointed web marketing gurus, suggested that we try what I will explain in this article. I stumbled on it myself, and then asked some SEO folks whether the approach would work. All of them said yes, it should, if implemented correctly. Why no one suggested this to me in the first place, when I had already told them what we were trying to do, is beyond me.
Everyone knows that PageRank is about relevant content and linking domains
Take a look at the image to the left. This is a screenshot of our HubSpot Competitors page, showing numbers for companies we are tracking. Now look at three key metrics: traffic rank, indexed pages, and linking domains. Notice that traffic rank is dramatically lower (lower is better) for companies whose indexed-page and linking-domain counts are high. Naturally, the more indexed pages your site has that Google can crawl, and the more high-quality domains that link back to your site (a.k.a. backlinks), the better you will rank for relevant keywords. There are many, many variables that affect your PageRank value, but those two are quite important, and for the purposes of this article we will stick to them.
So if relevant content and high quality linking domains are crucial, how do you acquire a ton of it quickly?
Easy. Instead of spending years writing blog articles, become the yellow pages of whatever it is that you do. In other words, harvest public information on the internet that is highly relevant to your business and scattered around widely, aggregate it onto your website in a crawlable fashion (i.e., unique pages that can be found via links), and let Google index it for you. Then ask the same people whose information you indexed to link back to you. F***ing genius! This is what a ton of firms are doing, and they aren't exactly publicizing their approach.
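To make "aggregate it in a crawlable fashion" concrete, here is a minimal sketch in Python. Everything in it is hypothetical (the sample records, the slugs, the output directory); it simply shows the shape of the technique: one unique HTML page per harvested record, plus an index page that links to all of them, so a crawler can reach every page by following links.

```python
import os
import tempfile

# Hypothetical sample of aggregated public records. In practice these
# would be harvested from many scattered sources around the internet.
records = [
    {"slug": "jane-doe", "name": "Jane Doe", "city": "Raleigh"},
    {"slug": "john-smith", "name": "John Smith", "city": "Durham"},
]

def build_pages(records, out_dir):
    """Emit one unique, linkable HTML page per record, plus an index
    page that links to each of them so a crawler can discover every
    page by following links."""
    links = []
    for rec in records:
        page_path = os.path.join(out_dir, f"{rec['slug']}.html")
        with open(page_path, "w") as f:
            f.write(
                f"<html><head><title>{rec['name']}</title></head>"
                f"<body><h1>{rec['name']}</h1><p>{rec['city']}</p></body></html>"
            )
        links.append(f'<a href="{rec["slug"]}.html">{rec["name"]}</a>')
    index_path = os.path.join(out_dir, "index.html")
    with open(index_path, "w") as f:
        f.write("<html><body>" + "<br>".join(links) + "</body></html>")
    return index_path

out_dir = tempfile.mkdtemp()
index_path = build_pages(records, out_dir)
```

A real implementation would obviously sit behind a web server with one URL per profile, but the principle is the same: unique page, unique URL, discoverable by links.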
Who uses this content strategy?
Avvo does it with lawyers. They scrape the internet for lawyer websites, create a "free" profile for each lawyer using content copied from the lawyers' own websites, and then ping the lawyers and ask them to "claim" their profile for free and link to it from their own sites. This gives Avvo tons of SEO juice, facilitated by the lawyers themselves, sometimes unknowingly. Now they are doing the exact same thing with user-generated content: users ask questions, and lawyers answer them. That puts a ton of Q&A content on the site that Google can also index, and Avvo doesn't even pay the lawyers for it; lawyers who answer questions see their Avvo ratings increase accordingly. Again, a stroke of genius! It's no wonder that Avvo is the 4,700th most heavily-visited site on the internet.
This content strategy is not new or magical. It obviously takes a lot of work, but the payoff is potentially far larger than the short-term, temporary boost you get with paid ads.
Need more examples?
BandsinTown does it with music artists and concerts. AllRecipes does it with recipes. Wine.com does it with wines. TripAdvisor does it with hotel destinations and user reviews. In the book Founders at Work, Steve Kaufer, the founder of TripAdvisor, describes in some detail how it took them two years, on their own, to scrape destination reviews from other sites and aggregate them onto their own website. The user-generated reviews now posted on their site dwarf that initial two-year scraping effort, but it was the aggregation effort that put them on the map in the first place and enabled them to send (and get paid for) high-quality leads to Expedia. The rest is history.
Follow Google’s guidelines, or else…
There are some things you need to do to stay within Google's rules of conduct. For example, mark the preferred version of each page that contains your dataset with a canonical link tag (<link rel="canonical" href="…">) so that Google doesn't flag your site for duplicate content or as a data farm. Consult an SEO expert and Google Webmaster Tools about technical aspects such as these.
The bottom line
So, as long as you follow Google's rules of conduct, give each piece of content its own unique URL, and make every page accessible to Google's robot via links, you should be good to go.
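Beyond internal links, a common complement for making every unique URL discoverable (not discussed above, but widely used alongside this strategy) is an XML sitemap submitted to Google. A minimal sketch, assuming purely hypothetical example.com profile URLs:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml string listing every unique content
    URL, so a search crawler can find each aggregated page directly."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical URLs for aggregated profile pages.
sitemap_xml = build_sitemap([
    "https://example.com/profiles/jane-doe",
    "https://example.com/profiles/john-smith",
])
```

The resulting string would be served as sitemap.xml at the site root and referenced from robots.txt or submitted via Google Webmaster Tools.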
Our lessons learned
I’ve learned a couple of things:
- Self-anointed web marketing gurus don't know as much as you think they do (or as much as they say they do). They are also still trying to figure all of this out themselves.
- We at BernieSez need to embrace the above-described content strategy.
So how is BernieSez going to capitalize on this approach to content? Stay tuned.