You’re probably thinking “Great… ‘New SEO’, just what we need, another meaningless marketing buzzword.”

I get it. You’re a seasoned blackhat link-building ninja on the bleeding-edge of semantic analysis. Your domains are aged, you’re immune to caffeine, you’ve never been to Florida. You’re panda-proof, penguin-safe, resilient. “There’s nothing new in SEO,” you reassure yourself, “It’s all keywords, rankings, anchor-text, and backlinks. An XRumer blast here, a little ScrapeBox there, mix in a few link wheels, spun blog posts, fake Google +1s. SEO mannnnnnn… I got this.”

Search Engine Optimization is difficult to write about: it’s a quickly moving target, often misunderstood, constantly in a state of flux. We’ve got white-hats, black-hats, grey-hats, and blue-hats, and if that’s not confusing enough, 2013 was a particularly volatile year, with Google staging a full-on offensive that left many proclaiming “SEO is Dead”.

I just spent the past week at PubCon with some of the brightest minds in search and I can assure you, SEO is doing fine - just don’t count on Google being the most transparent business partner.

All of this got me thinking, if SEO is dead as we know it, what exactly is the new SEO and how can we start taking advantage of it today?

Use Schema.org Semantic Markup In Your Content

In the modern marketing environment you need amazing content to succeed; content that provides value, gets people’s attention, makes them stop and think. Publishing great text on the web used to be enough, but with the adoption of semantic web protocols like RDFa, JSON-LD, microdata, microformats, and schema.org, the Internet is transforming from a web of linked pages into a web of linked concepts, or as some like to say “From strings to things”. Semantic markup helps Google and “the machines” better understand your content.

If you’re not doing so already, go to schema.org, check out the schemas, and start using semantic metadata to enhance your content. Schemas allow you to wrap your unstructured content in tags that signify the semantic meaning of the content they contain. Semantic metadata is added directly to HTML using attributes. You can use schema today to mark up events, people, reviews, products, local businesses, and much more. Google will index this data and reward you with enhanced SERP listings, wider distribution, and potentially increased keyword rankings.
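For example, a local business review marked up with schema.org microdata might look something like this (the business, author, and ratings are placeholder values; check schema.org for the authoritative property names):

```html
<!-- A minimal sketch of schema.org Review microdata; all values are placeholders. -->
<div itemscope itemtype="http://schema.org/Review">
  <div itemprop="itemReviewed" itemscope itemtype="http://schema.org/LocalBusiness">
    <span itemprop="name">Acme Coffee Roasters</span>
  </div>
  <span itemprop="author" itemscope itemtype="http://schema.org/Person">
    Reviewed by <span itemprop="name">Jane Doe</span>
  </span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of <span itemprop="bestRating">5</span> stars
  </div>
  <p itemprop="reviewBody">Great espresso, friendly staff, fast Wi-Fi.</p>
</div>
```

Run markup like this through Google’s structured data testing tool to confirm it parses the way you intended before you ship it.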

Keep an eye on schema.org going forward as new content types are added. If you need help getting started, check out this awesome Schema Creator tool from Raven Tools. The recent Google Hummingbird update has made semantic search a reality, affecting 90% of all queries. In the coming months features like persistent context, computed query results, and serendipitous search will become increasingly apparent.

*As a caveat, only use schema.org markup where your content truly fits a specific content type. There is such a thing as “over-optimization” and you may be penalized for it.

Google+ Is Not A Social Network

You’ve heard it’s a ghost town; no one hangs out there except photographers, internet artist types, eccentrics, Robert Scoble - the lonely fringe of the Interwebs. You may log on every once in a while to discover a notification that you have new followers, “How the hell? Who are these people?!?”

Many people think of Google+ as a social network like Facebook or Twitter. It’s easy to mistake it for one, but if you dig deeper you’ll realize that Google+ is the social spine powering all of Google’s services. Google+ is Google.

Here’s why Google+ matters, Google Glass jokes aside. Google is forcing value on Google+ by rewarding publishers with social signals, traffic, and faster indexing. That’s right folks, a post shared on Google+ will be indexed by the Google search engine within 6 seconds. How’s that for real-time SEO?

It gets even better. Although Google claims there’s no SEO benefit from a Google +1 or a Google+ share, we know better than to trust what Matt Cutts tells us. Numerous studies have shown Google +1s and other social signals from Google have a direct correlation with increased search engine traffic and rankings.

Google+ recently reported 540 million active users worldwide; that’s a pretty active ghost-town if you ask me. Regardless of your opinion of Google the company or Google+ the service, using Google+ as a channel for social signaling, content distribution, and traffic is a requirement going forward. Check out the Google+ API for more detailed integration instructions.

People Are The New Links

Back before people openly identified themselves online the only relationships that could be easily mapped were hyperlinks between pages. Web pages were the atomic units of the web, everything beneath that level could be considered an unstructured or semi-structured resource. Hyperlinks became the main signal for search engine relevance with PageRank and the distribution of backlink anchor-text powering the SERP rankings for all queries.

Facebook was the first large-scale consumer web service to enforce true identity online by requiring users to authenticate with an official university email address. Then came Twitter, OAuth, FullContact, and now Google+. Along with concepts like recipes, businesses, and products, search engines now identify people. Google uses its concept of a person to attach a semantic link between authors and the content they publish across the web, resulting in a new ranking factor called Author Rank, which many believe will soon be more important than PageRank.

To take advantage of this today you’ve got a few options. As an author, create a Google+ profile and link to it from every site you publish on using the rel="author" attribute. Then, through your “about” page on Google+, you can reference the domains you publish to, completing the Google+ authorship link.
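A byline link carrying the authorship attribute might look something like this (the profile ID and name are placeholders):

```html
<!-- Byline link to the author's Google+ profile; the profile ID is a placeholder. -->
<a href="https://plus.google.com/112233445566778899000" rel="author">Jane Doe on Google+</a>
```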

As a publisher you’re going to create a Google+ page for your site or business, use the rel="publisher" attribute to link to your Google+ page from your domain, and finally, as with authorship, add a link to your domain from your Google+ page to complete the link.
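In your site’s head, the publisher link might look something like this (the Google+ page URL is a placeholder):

```html
<!-- Site-wide link from your domain to your business's Google+ page; the URL is a placeholder. -->
<link rel="publisher" href="https://plus.google.com/+ExampleBusiness"/>
```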

If you’re a publisher or business with multiple authors underneath your umbrella, the next step is to implement authorship across all content on your site so that individual authors are associated with the content they create. This way, you can benefit from the added Author Rank your writers bring to the table.

Environmental Linking

This one took me a few seconds to grok, but once it sunk in, it made a ton of sense. Environmental linking is as simple as including 3 to 5 (or more, based on the length of your content) natural links to relevant sources referenced within your content. Is this just a complicated way of saying “include external hyperlinks within your copy”? Kind of.

But it’s more than that. We all know we should be linking to other sites when our content references a concept. That’s the beauty of hypertext and the web; you can instantly be transported to a new resource at the click of a link, and that relationship is logged in the link graph. But how often do we actually do it? I find myself being more concerned with internal linking and linking to my own sites rather than giving my precious link juice away for free.

The problem is if you hoard your links for yourself, your link profile ends up looking highly unnatural. Environmental linking obfuscates your links by including them within content, embedded alongside contextually relevant links. If you think about it from the perspective of the link graph, you’re associating your site and content with the content you link to, feeding topic relevancy signals to search engines in the process.

After you write your content, scan through it and look for opportunities to use hyperlinks to your advantage. Your content will look a little more like a link-rich Wikipedia article, and that’s a good thing (ever wonder why Wikipedia ranks for so many terms?).
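As a sketch, an environmentally linked paragraph might read like this, mixing one internal link with contextual external links (all URLs and names below are made up for illustration):

```html
<!-- One internal link plus contextual external links; every URL is a placeholder. -->
<p>Our <a href="/guides/cold-brew">cold brew guide</a> draws on the
<a href="http://example.org/extraction-study">extraction research</a> published by
Example Labs and the brewing standards maintained by the
<a href="http://example.com/standards">Example Coffee Association</a>.</p>
```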

Open Graph, Twitter Cards, and Google+ Markup

Similar to schema.org, Facebook, Twitter, Google+, and other social networks provide special metadata for you to include within your markup. Reference the documentation for Facebook Open Graph, Twitter Cards, and the Google+ API for specific implementation details.
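For instance, a typical article page might carry Open Graph and Twitter Card tags along these lines (the titles, URLs, and handle are placeholders):

```html
<!-- Minimal Open Graph + Twitter Card metadata; all values are placeholders. -->
<meta property="og:type" content="article"/>
<meta property="og:title" content="The New SEO"/>
<meta property="og:url" content="http://example.com/the-new-seo"/>
<meta property="og:image" content="http://example.com/img/cover.jpg"/>
<meta name="twitter:card" content="summary"/>
<meta name="twitter:site" content="@example"/>
<meta name="twitter:title" content="The New SEO"/>
```

Note that Open Graph tags use the property attribute while Twitter Cards use name; each network’s validator will catch mismatches.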

It’s always a good idea to play nicely with social networks and feed them the yummy metadata that they crave. KnowEm has put out an awesome social media optimizer tool to help you validate your social network metadata. Use this tool to verify that you’ve correctly implemented this code on your site.

Disavow Backlinks

There are a few scenarios that would lead you down this path. The obvious one is that you’ve received a penalty from Google and you have no choice. I’ve been reluctant to use the Google and Bing disavow link tools for some time, thinking that by using them I was somehow raising my hand and saying, “Hey Google, hey Bing, I broke the rules, but I’m a good boy now. Please don’t make your punishment too severe.”

After some thought and talking to experts with way more experience using these tools I’ve come to realize that for the right site, disavow can be a powerful SEO tool.

Say you have a client, and this “client” paid for some not-so-legitimate backlinks a few years back. Or worse, say your client was a full-on link-buying bandit, à la JC Penney, shelling out millions of dollars for paid links. Either way, it doesn’t matter how you got them, whether you bought them for yourself, or whether a competing SEO is trying to get you de-indexed using negative SEO. If this is your lot in life, disavow is your friend.

The nitty-gritty of disavow involves auditing a list of all backlinks pointing to your site - an inbound link quality audit. You can start with link data sourced directly from Google and Bing webmaster tools (Bing has recently added new features and more backlink data to their tools; it’s worth checking out). You can also use data from a paid service like Link Detox from Link Research Tools (I really like this tool; they source their backlink data from Moz, MajesticSEO, and Ahrefs). You’ll use this data to weed out a list of the bad links you’ll report to Google and Bing through their respective webmaster tools.
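The disavow file itself is just a plain text file, one URL or domain per line, with #-prefixed comments (the domains below are made up for illustration):

```text
# Example disavow file - all domains are placeholders.
# Disavow a single spammy page:
http://spammy-directory.example.com/links/page1.html
# Disavow every link from an entire domain:
domain:paid-link-network.example.net
```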

Link-Building Isn’t Dead

If someone tells you link building is dead, ask them for some data to back it up. A more accurate statement would be “illegitimate link building is dead”. The key is to always cultivate and maintain a natural link profile and the easiest way to do that is to stop buying links and start earning them.

Easier said than done, you say? Not possible? Oh, but it is, and many people much smarter and more creative than me have turned organic, non-paid link building into an art form.

I probably shouldn’t say this, as it goes against all the advice you’ll hear from the most respected names in the SEO industry, but you can still buy backlinks, and that can still work as a strategy (I’m not recommending you do this). Just like financial investing, SEO is all about risk management. Backlinks and anchor text remain a major search ranking factor and they can still be manipulated, in a natural-looking way, to increase your ranking and traffic, although I’d argue this is a short-term, high-risk strategy that is becoming increasingly difficult to pull off. Are you smarter than the army of PhDs at Google? All the power to you.

Speed Is Everything

No one likes a slow website, especially mobile users, especially Google. This SEO tactic is as old as search engines and it remains more true today than ever. In terms of user experience, engagement, bounce-rate reduction, and how Google views your site, speed is unquestionably a major factor.

Roughly 80% of performance optimization happens on the client side, which means you need to work with a front-end developer who can audit, streamline, and optimize the delivery of your site’s images, CSS, and JavaScript assets. There’s too much to cover in this post, but here are a few key items to point you in the right direction:

  • Install Google mod_pagespeed on your server.
  • Optimize image file size using compression and new formats like WebP.
  • Set up caching within your CMS (e.g. W3 Total Cache or WP Super Cache in WordPress).
  • Use tools like Pingdom Tools, WebPageTest.org, GTmetrix, and Google PageSpeed Insights to benchmark your website’s performance.
  • Use a CDN to host static assets like CSS, JavaScript, and images.
  • Optimize SQL queries on the backend.
  • If you’re using PHP, activate APC opcode cache on your server.
  • Host your DNS records with a robust, distributed provider such as CloudFlare, Amazon Route 53, etc.
  • Set up uptime monitoring with a tool like Pingdom to track the speed and availability of your site over time.
  • And please don’t host your beautiful website on a commodity web host (e.g. DreamHost, Bluehost, GoDaddy, HostGator, etc.). You’re better than that.
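To make the caching bullet concrete, here’s a sketch of browser-caching and compression rules for an Apache .htaccess file (the lifetimes are example values; tune them to how often your assets actually change):

```apache
# Example browser caching + gzip rules for Apache; lifetimes are illustrative.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

Far-future expiry headers let returning visitors skip re-downloading static assets entirely, which is often the single cheapest speed win available.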

Conclusion

  1. Use schema.org semantic markup for things like events, people, locations, businesses, reviews, etc.
  2. Use Google+ for quick indexing, social signals, and traffic. Reference the Google+ API docs for tighter integrations.
  3. Link your content to your Google+ pages using rel="author" and rel="publisher".
  4. Use environmental linking within your content, both internal and external to your site, to create a more natural looking link profile.
  5. Implement code for Facebook Open Graph, Twitter Cards, Google+ integration, and other social networking metadata tags.
  6. Use the disavow link tool along with backlink data pulled from SEO tools to identify and disavow any negative or low quality backlinks. Communicate directly with Google and Bing using webmaster tools.
  7. Links will always remain an indicator of relevance in search engine ranking algorithms. The key is to cultivate and maintain a natural backlink profile.
  8. A faster loading website leads to an enhanced user-experience, longer time on site, a lower bounce rate, and increased engagement. Both Google and your users will reward you for increasing your site’s performance.

I hope you found this post useful. Follow me on Twitter @devevangelist for web development and internet marketing content curated daily. Thanks for reading!