Tackle the 4 problem areas of page bloat

Oliver Lindberg
net magazine
May 18, 2016

Tammy Everts reveals four common problem areas you should target to help speed up your ecommerce site

Every content type on the page, from images to Flash, has grown

Three seconds. Case study after case study indicates this is how long most visitors will wait before they bounce if a page is not loading quickly enough. It’s no coincidence that plenty of studies show us that this is also the point at which business metrics — from page views to revenue — begin to be seriously hurt by slow page rendering. Whether your goal is to convert browsers into buyers or to ensure your advertisers’ content is seen by as many people as possible, your eye should be on this three-second target. However, looking at industry-leading sites year over year, the trend is moving in the opposite direction: pages are actually getting slower.

At the end of 2014, the median rendering time of the top 100 ecommerce pages (ranked according to page traffic by Alexa.com) was 6.5 seconds for primary content and 11.4 seconds for complete load.

In just one year, the median top 100 ecommerce page slowed down by more than 20 per cent

Only 12 per cent of the top 100 pages rendered feature content in less than three seconds, while 22 per cent took 10 seconds or longer to become interactive.

These are alarming numbers. When one in five of the world’s leading online brands takes more than 10 seconds to become usable, that’s a sign something is fundamentally wrong with the way modern web pages are being served to visitors. An equally alarming finding is that the median top 100 page has significantly slowed down in just one year. At the end of 2013, the median page took 5.3 seconds to render feature content and 8.6 seconds to fully load. In other words, the median top 100 page has slowed down by more than 20 per cent in a mere 12 months. And if you find these numbers troubling, consider this: the top 100 sites perform much better than the rest of the web.

Page speed and business metrics

Why is the difference between 5.3 seconds and 6.5 seconds such a big deal? While a 1.2-second slowdown may sound negligible, when it comes to page speed, every second counts. Walmart.com found that, for every second of load time improvement, conversions increased by up to 2 per cent. At Staples, one second of improvement increased conversions by a staggering 10 per cent.

When I analyse individual pages, I encounter the same four problem areas. If your goal is to improve your page rendering speed (and I hope it is), these are the problem areas you should scrutinise and fix first. The good news is that there’s a lot of low-hanging fruit here. Unless you’re already aggressively optimising your pages, tackling these areas will help you generate some performance wins.

1. Images

With the advent of Retina displays, consumer expectations of image quality have never been greater. Yet with the explosive growth of mobile usage, consumers expect those same images to render quickly on their smartphones and tablets. To remain competitive, site owners must somehow meet consumers’ demand for large, high-resolution product images, while at the same time ensuring those images don’t clog the pipe to the user’s screen.

Images typically comprise between 50 and 60 per cent of a page’s total weight

Images typically comprise between 50 and 60 per cent of a page’s total weight. In my research, I found that the bulk of images on most sites are not fully optimised for performance. For example, when analysing the top 100 Alexa-ranked retail sites, I found that 35 per cent of sites fail to compress images, and only 13 per cent of sites got the top page-speed score of ‘A’ for image compression. Image compression is a basic optimisation technique, yet many sites are missing out on it.

To make images render more efficiently, ensure they are consolidated (also called ‘spriting’), correctly sized and formatted. You should also ensure you’re fully leveraging the browser cache and local storage, which will help render times on subsequent pages in a transaction, as well as return visits.
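As a sketch of what correct sizing looks like in practice, the markup below uses the srcset and sizes attributes so the browser fetches only the resolution it actually needs; the file names and breakpoints here are hypothetical:

```html
<!-- Hypothetical product image. The browser picks the smallest
     candidate that satisfies the display width, so a phone never
     downloads the 1200px-wide desktop asset. -->
<img src="product-600.jpg"
     srcset="product-400.jpg 400w,
             product-800.jpg 800w,
             product-1200.jpg 1200w"
     sizes="(max-width: 600px) 100vw, 50vw"
     alt="Product photo">
```

Pairing this with far-future Cache-Control headers on the image responses means subsequent pages in a transaction, and return visits, can pull these assets straight from the browser cache.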

An unoptimised hero image on this retail page takes more than six seconds to render

2. Stylesheets

Stylesheets are an incredible boon for modern webpages. They solve a myriad of design problems, from browser compatibility to design maintenance and updating. Without stylesheets, we wouldn’t have great things like responsive design.

When poorly executed, however, stylesheets can create a host of performance problems. These range from stylesheets that take too long to download and parse, to improperly placed stylesheets that block the rest of the page from rendering. Stylesheets should be placed in the document HEAD, which allows the page to render progressively.
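A minimal sketch of that placement (the stylesheet name is hypothetical):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Stylesheets belong in the HEAD: the browser discovers them
       early and can paint the body progressively as it arrives. -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <!-- A stylesheet linked down here instead would be discovered
       late, risking blocked rendering or a flash of unstyled content. -->
  <p>Page content renders progressively.</p>
</body>
</html>
```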

Here, there are three CSS files that take between 5586ms and 7732ms to download. The page relies on these files to define layout and styles

Both the CSS and JavaScript files for RWD resources are not only heavy (requiring up to 6528ms to download), they also block the rest of the page from rendering

3. Custom fonts

Custom fonts allow designers unprecedented aesthetic control over the typefaces used in their designs. This desire for control accounts for the surge in popularity of custom fonts. In 2010, only 1 per cent of the top 1,000 websites used custom fonts. Today, that number has grown to 45 per cent.

This popularity comes with a performance price tag, as some fonts require huge amounts of CSS code, while others have heavy JavaScript or are hosted externally — all of which can dramatically slow down page rendering.
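One hedge against that price tag is to self-host a single compressed font file rather than pulling in an external bundle. A minimal sketch, with a hypothetical font name and path (font-display is a newer CSS descriptor that older browsers may not support):

```html
<style>
  /* Hypothetical self-hosted font: one compressed WOFF2 file in
     place of heavy CSS bundles or external JavaScript loaders. */
  @font-face {
    font-family: 'ShopSerif';
    src: url('/fonts/shopserif.woff2') format('woff2');
    font-weight: 400;
    font-style: normal;
    font-display: swap; /* show fallback text while the font downloads */
  }
</style>
```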

4. Third-party scripts

Third-party scripts — such as those used for ads, recommendations, analytics and tracking beacons — offer a number of benefits, including increased ad revenue, higher conversion rates and better visitor data. These benefits explain the recent proliferation of third-party scripts on ecommerce pages. In fact, third-party calls can make up 50 per cent or more of a page’s total resource requests.

All it takes is one unoptimised third party script to take down an entire site

Cumulatively, these third-party requests can have a huge impact on performance. Not only do these scripts increase page weight and latency, they also represent the single greatest potential point of failure for web pages. All it takes is one non-responsive, unoptimised third-party script to take down an entire site. Despite this, site creators often neglect to measure the impact of third-party content on a site’s usability.

The solution is to defer scripts so they load after critical page content. If deferral is not an option (as with some analytics tools and third-party advertisers), then use an asynchronous version of the script, which loads in parallel with the critical content. You should always know which scripts are live on your site, prune dead scripts, and monitor third-party performance constantly.
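In markup terms, one common way to express this is with the defer and async attributes on the script tag; the URLs below are hypothetical:

```html
<!-- defer: downloads in parallel but executes only after the
     document has been parsed, so critical content is never blocked. -->
<script src="https://example.com/recommendations.js" defer></script>

<!-- async: downloads and executes independently of parsing;
     execution order is not guaranteed, which suits self-contained
     analytics beacons. -->
<script src="https://example.com/analytics.js" async></script>
```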

Faster networks and devices

The growth in size and complexity of pages presents critical web performance problems that can’t be entirely mitigated by technological advancement. In tech, we typically think about things getting better, cheaper and faster. And in an ideal situation — in which users have access to great hardware, great networks and well-optimised pages — this is definitely the case. However, most of us tend to focus on this best-case scenario and ignore the fact that worst-case scenarios are rampant on the web. Site owners design their pages and apps with best-case hardware and networks in mind, to the detriment of every other technological use case.

Our use of the web today is highly situational. In the past, we could expect a relatively consistent user experience, as we used our desktop computers on speedy local networks to browse somewhat dynamic (but mostly static) pages that were hosted on, at most, three or four different servers. Today, we’re more likely to use mobile devices on congested wireless networks to browse highly dynamic pages stuffed with rich content and hosted on dozens of different servers. Because of this, we’re seeing an increase in sub-optimal user experiences.

While we can’t affect the end-user environment, we do have a great deal of control over our pages. Luckily, there is a wealth of opportunities to optimise our pages so we can serve our visitors the user experience they expect and deserve.

Deconstructing ‘page bloat’

According to the HTTP Archive, the median top 100 web page carries a payload of 1335KB (at time of writing). One year ago, that number was 1022KB — the median top 100 page has ballooned by 30 per cent in just 12 months. If page bloat is hurting desktop performance — which it certainly is — just think of the pain it’s causing in the mobile arena.

Page size is just part of the problem. Page complexity is arguably an even greater performance challenge than page size. In the 12 months between the last quarter of 2013 and the last quarter of 2014, the median top 100 Alexa-ranked page grew its total number of page resources by 21 per cent, from 87 to 106 resources.

Each of these resources represents an individual server call. Not only does each server call introduce an incremental performance slowdown, it also increases the risk of page failure. A single improperly placed or slow-loading CSS file, for example, can block the rest of the page from rendering.

106 page resources means 106 potential points of failure for your page

Tammy Everts has spent the past two decades obsessed with the many factors that go into creating the best possible user experience. As a senior researcher and evangelist at SOASTA, she researches the technical, business, and human aspects of web/application performance and shares her findings via countless blog posts, presentations, case studies, articles, and reports.

This article originally appeared in the ecommerce guide that came with issue 266 (May 2015) of net magazine.
