As SEOs, we often spend a lot of time digging into our clients’ websites to uncover a myriad of technical issues. However, web security is quite often forgotten and left out of these deep-dive audits entirely. This article outlines a few basic web security checks all SEOs should consider when performing technical audits. I’ll be updating it on an ongoing basis to add more basic web security checks SEOs can perform.
Many of the ideas covered in this article were originally covered in a fantastic web security article by State of Digital’s Barry Adams, so be sure to check his article out for even more tips and info I won’t be covering.
Testing SSL Certificate Strength
All SEOs, and probably digital marketers in general, have heard about HTTPS and know the importance of switching your website to it. It matters for SEO because Google has made a concerted effort to reward sites that have switched and to mark those that have not as insecure, but it matters most as another layer of security for sites that support online transactions.
However, not all SSL certificates are created equal. Certificates vary in validation level, and the overall strength of a site’s encryption depends heavily on how the certificate and server are configured, which can differ greatly from issuer to issuer. Many free certificates give marketers the desired padlock symbol, but the underlying setup could still be lacking heavily in the security department. In fact, Google has publicly stated that its ultimate goal is to remove the padlock symbol altogether, because HTTPS will become the “new standard” of the web.
What does this mean? The coveted padlock symbol will eventually disappear, leaving your website with only its underlying level of encryption, which may be weak and ineffective.
Organizations that do not conduct transactions directly on their website may be able to get away with a low-cost, basic SSL certificate. Those that do, however, need significantly more robust certificates.
How Do I Test My SSL?
There is a fantastic free tool from Qualys SSL Labs called SSL Server Test that will score the SSL configuration for any domain you input, calculating the score across a few chief security categories.
Once you input the domain whose SSL you wish to test, you’ll need to wait a few minutes while the analysis is compiled. You should then see a scorecard in the format below.
As you can see from the example above, this domain has a very strong SSL rating. It’s always best to aim for an “A” rating if possible, especially if you handle transactions on your website. If you run this test on a client website and a low score is returned, it would be a good idea to include it in your technical audit so they are aware and can take action if necessary.
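To build an intuition for what drives the letter grade, the core idea can be sketched in a few lines. The rubric below is a simplified, hypothetical illustration, not Qualys’s actual scoring: accepting broken protocols tanks the grade, legacy TLS drags it down, and modern-only support earns the “A”.

```python
# Toy grading rubric (illustrative only -- NOT SSL Server Test's real scoring):
# a server is graded on which SSL/TLS protocol versions it still accepts.
def protocol_grade(supported):
    """Return a rough letter grade for a server's protocol support."""
    supported = set(supported)
    if supported & {"SSLv2", "SSLv3"}:
        return "F"  # broken protocols enabled: automatic fail
    if supported & {"TLSv1.0", "TLSv1.1"}:
        return "B"  # legacy TLS still accepted
    if supported & {"TLSv1.2", "TLSv1.3"}:
        return "A"  # modern protocols only
    return "F"      # nothing usable at all

print(protocol_grade({"TLSv1.2", "TLSv1.3"}))  # A
print(protocol_grade({"SSLv3", "TLSv1.2"}))    # F
```

The real tool also weighs key exchange, cipher strength, and certificate validity, but the pattern is the same: the weakest thing the server still accepts caps the score.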
Checking Client Software for Vulnerabilities
Software vulnerabilities are a constant concern in the race to stay ahead of hackers. New vulnerabilities are constantly being found and patched, leading to new versions being released frequently. This is great; however, what if a client’s website lapses and doesn’t keep up with the latest releases? That can pose a real threat to the site’s security.
So what can SEOs do to help? Admittedly we probably aren’t privy to all software a client is currently using, however, we do have a few tools to perform some basic checks.
Using both the BuiltWith and Wappalyzer Chrome extensions, simply visit the domain of your choice and you should get a sizable list of the software resources in use. I advocate using both tools because, while BuiltWith generates a much more exhaustive list, Wappalyzer provides the actual version number for select resources. This is invaluable when looking for known vulnerabilities or newer versions to update to.
Below is the Wappalyzer extension in action; you can see that the SharePoint CMS version currently in use is 15.0.
Now that you know the current versions being used:
- First, perform a quick search for the product’s release schedule to see if newer versions are available.
- Second, search for known vulnerabilities in those versions and check whether the latest releases patch them.
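Once you have a handful of detected versions, the comparison itself is easy to script. The software names and version numbers below are made-up placeholders; in practice the detected versions would come from Wappalyzer and the latest ones from each product’s release notes.

```python
# Sketch: flag detected software versions that lag behind the latest
# known release. All names and versions here are illustrative.
def parse_version(v):
    """'15.0.2' -> (15, 0, 2) so versions compare numerically, not as strings."""
    return tuple(int(part) for part in v.split("."))

detected = {"SharePoint": "15.0", "jQuery": "1.11.1"}   # e.g. from Wappalyzer
latest   = {"SharePoint": "16.0", "jQuery": "3.7.1"}    # from release notes

for name, version in detected.items():
    if parse_version(version) < parse_version(latest[name]):
        print(f"{name} {version} is behind the latest release ({latest[name]})")
```

Comparing tuples rather than raw strings matters: as strings, "1.11.1" would sort before "1.2.0", which is the wrong answer.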
While researching the latest versions, keep in mind that “latest” doesn’t necessarily mean “better”. Some software requires a lot of effort to update, and the gain might be minimal with no security benefit. As such, it’s good practice to research the potential vulnerabilities and benefits of an update before recommending it.
Domain Hosting: Dedicated or Shared?
An often overlooked but crucially important data point is whether the domain sits on a “dedicated” or “shared” server. What does this mean? On a dedicated server, your domain essentially has the server all to itself; no stranger domains are housed with you. Shared hosting is the exact opposite: you could be sharing a server with a ton of domains whose web security is a total question mark. If web security is pertinent to your business and keeping information safe is a top priority, you should always aim for dedicated hosting.
Why Does it Matter?
You may be asking yourself what the problem is with shared hosting, so let’s look at it through the lens of web security. On a shared server, the collective security is only as strong as the weakest domain’s security. If even 1 of the 100 domains housed there is infiltrated, the hacker can likely break into the other domains on the server as well. Think of the server as a fortified castle with each domain responsible for guarding a gate: if even one domain’s security is lax, the entire castle can be overrun.
With this in mind, there is a very useful tool called SpyOnWeb for testing whether your domain is hosted on a shared or dedicated server. All you need to do is enter the domain name or IP address you wish to look up.
Once the analysis completes, you should see a page similar to the one below. In this instance, the domain is clearly hosted on a shared server with 117 other domains. As you can see, the security of the entire server, and ultimately of the neighboring domains, is only as strong as the weakest link.
On the other hand, here is an example of a domain hosted on a dedicated server. I’ve obscured the domains themselves, but they consist of subdomains of the main domain as well as a few variations on the main domain name. The main takeaway is that there are no unknown domains operating with complete web security autonomy.
While I cannot attest to the tool’s absolute accuracy, it’s best used as an indicator of the hosting method a domain uses, and of whether you should call it out to a client who may not be aware.
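The check SpyOnWeb automates is essentially “which domains resolve to the same IP?”, which you can sketch yourself. The domain/IP pairs below are made up for illustration (using reserved documentation IPs); in a real check you would resolve each domain with `socket.gethostbyname(domain)`.

```python
from collections import defaultdict

# Group domains by the IP they resolve to; domains sharing an IP are
# likely on shared hosting. These domain/IP pairs are invented examples.
resolved = {
    "example-client.com": "203.0.113.10",
    "unrelated-blog.net": "203.0.113.10",   # same IP -> shared neighbor
    "dedicated-site.com": "198.51.100.7",
}

by_ip = defaultdict(list)
for domain, ip in resolved.items():
    by_ip[ip].append(domain)

for ip, domains in by_ip.items():
    label = "shared" if len(domains) > 1 else "likely dedicated"
    print(f"{ip} ({label}): {', '.join(domains)}")
```

One caveat: a shared IP can also just mean a CDN or a load balancer in front of the site, so, like SpyOnWeb itself, treat this as an indicator rather than proof.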
Symantec SSL Certificates Distrusted in Chrome
There was a lot of buzz earlier this year about Google deprecating its trust in SSL certificates issued by Symantec PKI and the brands it owns:
This announcement made it clear that immediate action was needed from webmasters if they used any of the brands above. If ignored, the resources using these certificates would stop functioning for all Chrome users, which could be quite impactful if the resources are crucial for your site’s performance.
If you were using one of the listed SSL certificates for your own domain’s security, you’ve hopefully already switched to a trusted issuer. If you’re good on this front, the next issue you may notice is third-party companies using these certificates. We’ll cover this next.
As SEOs digging deep in the dev console, you very well may have come across warnings like the ones seen below.
In the two instances above, the certificates are used by third parties the site loads resources from. Obviously, if those third parties ignored Google’s warnings, these resources would no longer load for Chrome users, which could cause huge issues depending on their intended function.
While current versions of Chrome have already distrusted many of these certificates, the full deprecation will come with the release of Chrome 70 in late Q3 2018. If you want to make sure none of your site’s resources will break unexpectedly in 70, you can download Chrome Canary, the earliest publicly available version of upcoming Chrome releases. Canary builds are usually a bit unstable and can crash, so they’re best used purely as a testing tool here.
Referring back to the warnings seen in the Chrome dev console earlier this year, it’s a good idea to review the resources that triggered warnings and check back to make sure their certificates were updated. Chances are the companies took action, but it’s never good to assume; it’s best to open the resource directly and inspect the SSL there. To do this, find the file in the “Sources” tab, open it in a new browser tab, then click “Secure” next to the address bar. In the “Certification Path” tab, you should be able to see who the issuer is.
It looks like the owner of this resource now gets its SSL certificates from DigiCert, an issuer trusted by Google, meaning there are no longer any deprecation issues here.
Using this method, plus checking Chrome 70 (via Canary), you should get a good idea of where you stand and if any action is needed. If you find that resources are no longer loading or if warnings are persisting in the dev console, you may want to reach out to these companies to verify with them that they’re updating.
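If you have many third-party resources to review, the manual issuer check can be scripted. The sketch below works on the issuer structure that Python’s `ssl.SSLSocket.getpeercert()` returns; the certificate dict here is a trimmed, made-up example of that format rather than the output of a real connection, and the brand list is the commonly cited set of Symantec-owned issuer names.

```python
# Sketch: pull the issuer organization out of a cert dict in the format
# ssl.SSLSocket.getpeercert() returns, and flag distrusted issuer brands.
# The cert below is a trimmed, invented example of that structure.
DISTRUSTED = {"Symantec", "Thawte", "GeoTrust", "RapidSSL"}

cert = {
    "issuer": (
        (("countryName", "US"),),
        (("organizationName", "DigiCert Inc"),),
        (("commonName", "DigiCert SHA2 Secure Server CA"),),
    )
}

# Flatten the nested tuple-of-tuples into a plain dict of issuer fields.
issuer = {key: value for pair in cert["issuer"] for (key, value) in pair}
org = issuer.get("organizationName", "")
flagged = any(name in org for name in DISTRUSTED)
print(org, "-> distrusted" if flagged else "-> OK")
```

In a real audit, you would open a TLS connection to each third-party host to obtain the cert dict, then run each one through a check like this.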
Site Admin/Login Pages
One basic web security aspect that is often overlooked is the ease with which a hacker can find your website’s admin login page. Many Content Management System (CMS) platforms include the admin path directly in the robots.txt file by default, essentially advertising it publicly to anyone who may be looking for it. It’s important to note that even if your website is built on a proprietary platform, your login/admin page can still be easy to find if basic web security practices are not followed.
WordPress and Shopify are two such CMS examples, seen below. On the left is the website’s robots.txt file, where the first line item is the admin path. On the right is the admin login page I was able to navigate to in a matter of minutes as a result. From here, a hacker can try a variety of methods to force access to your website.
What’s the solution?
Instead of listing the admin/login folder path in the robots.txt file, implement NoIndex,NoFollow on-page directives on the admin pages themselves. Search engines will then know not to index the pages when they request them, and the path is no longer called out explicitly in robots.txt for anyone to find.
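Concretely, a default WordPress-style robots.txt advertises the admin path like this, whereas the safer alternative is a robots meta tag (or the equivalent X-Robots-Tag HTTP header) on the admin pages themselves:

```
# robots.txt -- publicly advertises the admin path:
User-agent: *
Disallow: /wp-admin/
```

```html
<!-- On the admin/login pages themselves, instead: -->
<meta name="robots" content="noindex, nofollow">
```

Note that the meta tag only works if search engines can request the page, so the robots.txt Disallow line should be removed rather than kept alongside it.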
Conclusion — Why Should You Care?
You may have noticed that some of the topics covered here seem a bit disparate; however, for SEOs they all distill down to two primary concerns:
- Google’s trust in your site.
- Overall site performance.
Allow me to explain further. If your site is hacked and shady activity starts emanating from it, Google may notice and slap you with a manual penalty. A manual penalty can be devastating for any site: pages are de-indexed, rankings are lost, and traffic drops off, causing massive headaches for you as an SEO. Performing basic web security checks can help identify and prevent potential breaches, which is good for SEO and users alike.
The same goes for the SSL certificates being deprecated by Google. If you or the third parties you load resources from haven’t switched to a trusted issuer, the result could be broken functionality or the dreaded “Your connection is not private” warning appearing to users.
Hopefully this article has provided some useful tools and methods for SEOs to start incorporating basic web security checks into their technical audits. As SEOs, we should have a vested interest in the security of a client’s domain, since Google’s manual penalties and bad UX can impact rankings and traffic significantly. As web security continues to evolve and adapt to threats, SEOs should stay vigilant and call out the vulnerabilities we find while auditing.