Digital Discovery 101: How to Technically Evaluate a Website

At the heart of determining what is and what isn’t working on a website is the technical evaluation. While one part of the equation is the heuristic evaluation, which looks at the front end of the site, the technical evaluation is meant to get under the hood to determine whether the infrastructure of the platform is working to benefit the site owner and its visitors, or whether it is hindering the site’s ability to provide a simple, great web experience. A technical evaluation typically covers the following (a minimal spot-check sketch follows the list):
  • Meta Content (Title Tags, Descriptions, Alt Text, H1 — H6)
  • Status Codes (200, 301, 404, 500, etc.)
  • Image Libraries (Source, Destination, Alt Text)
  • Internal and External Linking Structures
  • Site Depth and Width
  • Site Speed
  • Website Content Based on URL Page Source
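Before firing up a full crawler, you can spot-check most of these elements on a single page yourself. Here is a minimal sketch, assuming Python with the requests and BeautifulSoup libraries and a placeholder URL; it illustrates the idea rather than replacing a proper crawl:

import requests
from bs4 import BeautifulSoup

def audit_page(url):
    """Fetch one page and report its status code and basic meta content."""
    resp = requests.get(url, timeout=10, allow_redirects=True)
    soup = BeautifulSoup(resp.text, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else None
    description = soup.find("meta", attrs={"name": "description"})
    h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]
    # Images without alt text are a common accessibility and SEO gap.
    missing_alt = [img.get("src") for img in soup.find_all("img") if not img.get("alt")]

    return {
        "status_code": resp.status_code,
        "final_url": resp.url,  # differs from the input URL when redirects fire
        "title": title,
        "meta_description": description.get("content") if description else None,
        "h1_tags": h1s,
        "images_missing_alt": missing_alt,
    }

if __name__ == "__main__":
    print(audit_page("https://example.com/"))  # placeholder: swap in the page to audit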
Screaming Frog is a consistently excellent tool for understanding what is under the hood of any given website. With that crawl data in hand, the questions to answer include:
  • Is the content infrastructure of the website set up for success? Does the website contain the basic back end content needed — meta — for the site to be indexed properly?
  • Does the site in question provide Google with the file indexes which help its spider map and index the website? Does it contain working HTML and XML sitemaps? Does it make use of logically and thematically related deep external links and contextual internal linking structures?
  • Is the site physically working? Is the status code of each page 200 (OK), or are the pages segmented between 301s, 404s and 500s? If so, what are the underlying issues causing server-side errors, pages that time out, or dead links? Are the permanent redirects set up properly, and are they helping or hurting the site? (A rough sitemap status check is sketched after this list.)
  • Are the file sizes of the site too large for the compute resources (bandwidth, CPU, RAM, disk) which power it? Is the site loading within the optimal time span, or are the size and unstructured depth of the site negatively impacting page and domain load times?
  • Is the signal content of the site aligned to the page location that content lives on? Is there contextual relevancy between page URL, Title Tag, Meta Descriptions, Contextual Keywords, H1 — H6, and CTAs? Does the content of the site send the right signals to Google for it to be indexed in the best fashion?
  • What is the structure of the website? Is the structure — the physical site map — set up in logical content funnels? Is the structure of the website balanced across all navigation content silos, or is it weighted heavily toward two of the silos and light within the remaining three? To support the structure, are there natural linking points to drive and ping pong user traffic within and between content/conversion points?
  • What is the size of each script on the page? Can it be optimized by cleaning up the code, or by deploying the script in a different language?
  • What is the load time of each script on the page? Does the script load after a variety of other technologies, or does it load at the start of the waterfall?
  • Are all tracking implementations firing correctly? Are all tracking implementations held within the correct JS container? Do any tracking implementations cause redundancies or add unnecessary data outputs for the platform?
  • Are the CSS and HTML implementations of each page element working with one another? Are there any errors in the syntax causing a link to time out or a visual element to render incorrectly across hardware platforms?
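To put hard numbers behind the status code and sitemap questions above, here is a rough sketch along the same lines, again assuming Python with requests and a placeholder sitemap URL; it reads an XML sitemap and buckets each listed URL by its HTTP status:

import requests
import xml.etree.ElementTree as ET
from collections import defaultdict

def check_sitemap(sitemap_url):
    """Fetch an XML sitemap and group its URLs by HTTP status code."""
    xml = requests.get(sitemap_url, timeout=10).text
    root = ET.fromstring(xml)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    urls = [loc.text for loc in root.findall(".//sm:loc", ns)]

    buckets = defaultdict(list)
    for url in urls:
        # allow_redirects=False surfaces 301/302 hops instead of silently following them.
        resp = requests.head(url, timeout=10, allow_redirects=False)
        buckets[resp.status_code].append(url)

    for code, members in sorted(buckets.items()):
        print(f"{code}: {len(members)} URLs")
    return buckets

if __name__ == "__main__":
    check_sitemap("https://example.com/sitemap.xml")  # placeholder sitemap URL

A healthy site should come back overwhelmingly 200; a long tail of 301s, 404s or 5xx responses points straight at the redirect and server-side issues described above.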
Inspect Element, available through browser developer tools, is another avenue for understanding the scripting, CSS, padding, HTML and other physical elements of any given website.
  • Are all elements of the site set up to perform under load, or will a spike in traffic break certain containers? Are all elements of the site set up to perform across all hardware profiles (desktop through a variety of mobile devices), or will certain elements negatively impact mobile ranking factors?
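One crude way to sanity check the under-load question is to fire a small burst of concurrent requests and watch response times and 5xx errors. This is a sketch only, assuming Python with requests and a placeholder URL, and it should only ever be pointed at a site you own; a real load test belongs in a dedicated tool:

import time
import requests
from concurrent.futures import ThreadPoolExecutor

def timed_get(url):
    """Return the status code and response time in seconds for one GET."""
    start = time.perf_counter()
    resp = requests.get(url, timeout=30)
    return resp.status_code, time.perf_counter() - start

def burst(url, concurrency=20):
    """Send a small burst of concurrent requests and summarize the results."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(timed_get, [url] * concurrency))
    times = [t for _, t in results]
    errors = [code for code, _ in results if code >= 500]
    print(f"avg {sum(times) / len(times):.2f}s, max {max(times):.2f}s, 5xx errors: {len(errors)}")

if __name__ == "__main__":
    burst("https://example.com/")  # placeholder: only test a site you own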
Page Source. Learn to read and make sense of it. Beyond the raw source, speed and performance testing tools give you a read on the following (a crude timing sketch follows the list):
  • Site speed
  • Page speed load times
  • Waterfall script issues
  • File size libraries
  • Website load time from multiple international locations/servers
  • Overall site health
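Dedicated testing tools do this far better, but a minimal sketch of the same idea, assuming Python with requests and a placeholder URL, times a single fetch and totals the bytes transferred:

import time
import requests

def page_weight(url):
    """Measure time-to-complete and transferred size for a single page fetch."""
    start = time.perf_counter()
    resp = requests.get(url, timeout=30)
    elapsed = time.perf_counter() - start
    size_kb = len(resp.content) / 1024
    # This covers only the HTML document itself; the images, CSS and JS pulled in
    # afterwards are what a full waterfall tool would capture.
    print(f"{url}: {resp.status_code} in {elapsed:.2f}s, {size_kb:.0f} KB of HTML")
    return elapsed, size_kb

if __name__ == "__main__":
    page_weight("https://example.com/")  # placeholder URL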
Website landscape scrape.
3rd Party Website Traffic Analysis.
On a granular level, a tool like BuiltWith provides you access to the real-time functionality of a website. The platform gives you excellent insight into what technologies make a site run. By understanding this, you can directly deduce the capabilities and downfalls of that site given the current market evolution of the digital platform. BuiltWith can surface, among other things (a quick fingerprint sketch follows the list):
  • The type of CMS a website uses
  • The email providers they deploy to carry out messaging/DB management
  • The NameServer and Hosting profile of a website
  • Deployed website frameworks and code bases
  • JS Libraries
  • Website Tracking Implementations
  • Deployed CDNs
  • Website Encoding
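You can approximate a slice of what BuiltWith reports simply by reading response headers and page source. Here is a rough sketch, assuming Python with requests and BeautifulSoup; the headers and tags it checks are illustrative signals, nowhere near the coverage of a dedicated service:

import requests
from bs4 import BeautifulSoup

def fingerprint(url):
    """Collect simple technology signals from response headers and HTML."""
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    signals = {}

    # Server, framework and CDN hints often show up in response headers.
    for header in ("Server", "X-Powered-By", "Via", "CF-RAY"):
        if header in resp.headers:
            signals[header] = resp.headers[header]

    # Many CMSes announce themselves in a generator meta tag.
    generator = soup.find("meta", attrs={"name": "generator"})
    if generator:
        signals["generator"] = generator.get("content")

    # External script sources hint at JS libraries and tracking implementations.
    signals["scripts"] = [s.get("src") for s in soup.find_all("script") if s.get("src")][:10]

    return signals

if __name__ == "__main__":
    print(fingerprint("https://example.com/"))  # placeholder URL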
