Performance Matters🔥 — Part 1
… and why you should care
The Web has evolved a lot in the last decade. As of 2018, we have reached 4 billion users worldwide. Lots of exciting things are happening as we empower more users to contribute, constantly pushing the boundaries as new frameworks are being created, tools built, and standards drafted to ultimately expand the concept of what we define as “The Web”.
These exciting breakthroughs have pushed the Web in all sorts of directions that we never thought possible. Sites and applications have become richer in functionality and more complex. As developers, we have become accustomed to adding libraries and packages that enrich our users’ experience and breathe life into our applications. Let’s dive deeper into the cost of these heavier sites, and how we can mitigate their drawbacks. That brings us to our topic! So get your popcorn 🍿 ready and prepare for the ride 🎢.
What is performance?
Wikipedia defines “web performance” as:
The speed in which web pages are downloaded and displayed on the user’s web browser.
While this definition is technically correct, it also raises many questions:
- What is considered “displayed”?
- How do you measure speed?
- Which metrics should I use?
- Which web browser should I test this with?
Through this series of blog posts (yes, there are more coming!), we will touch upon all these questions, so let’s dive deeper into why performance matters and what it brings to the table for your product.
Why does performance matter?
Most of us have experienced a slow site, and it can be quite disheartening. Most likely you are no longer using that product, having moved on to a more performant competitor. You are not alone: a user who gets frustrated waiting for your application to load will likely leave before even considering becoming a customer.
Rigor, a company that advocates for web performance, wrote an excellent article titled “Performance Matters for Marketers.” They remark on how performance shapes a user’s experience:
A few hundred milliseconds can make a big difference between users hitting a back button or engaging with your content. Waiting time gives your visitors a chance to switch to another activity, or even worse, drives them to a competitor’s website.
The author also goes on to identify three primary reasons why performance gets neglected:
- Lack of ownership
- Uncertainty about where to start
- Unclear expectations about success
(While I won’t go into each topic, I greatly encourage you to go through their article!)
I want to dig deeper into what “Lack of ownership” means. More often than not, what bogs down a product’s performance is not technical at all; the major hurdle is cultural. Without someone who champions performance in the company, it quickly becomes an afterthought, with excuses thrown left and right like “it’s nice to have but we don’t have the time to address it” or “we’ll fix it when someone complains.”
This champion, unfortunately, can’t be just anyone; the push must come from the top. It can only be someone who leads by example and instigates changes in the company’s culture, so that performance is embraced as an essential aspect of its products.
This not only affects developers as individuals but other teams as well. Marketers, project managers, and designers must understand the significance of performance, since their decisions also have a considerable impact on it.
If it has not become clear how performance affects a product, let’s consider some metrics collected by other companies. The following website collects metrics for multiple companies: https://wpostats.com/.
Some noteworthy case studies include:
- “Rebuilding Pinterest pages for performance resulted in a 40% decrease in wait time, a 15% increase in SEO traffic and a 15% increase in conversion rate to signup.” [Link]
- “Instagram increased impressions and user profile scroll interactions by decreasing the response size of the JSON needed for displaying comments (by 33% for the median and 50% for the 95th percentile for the main endpoint).” [Link]
- “Removing one client-side redirect from Google’s DoubleClick resulted in a 12% improvement in click-through rate.” [Link]
In contrast, poor performance can be a liability:
- “BBC has seen that they lose an additional 10% of users for every additional second it takes for their site to load.” [Link]
- “53% of visits are abandoned if a mobile site takes longer than 3 seconds to load.” [Link]
- “TRAC Research found, in a survey of 300 companies, that the average revenue loss for an hour of downtime was $21,000. For the same set of companies, average revenue loss due to an hour of slow performance (defined as load times exceeding 4.4 seconds) was $4,100. Website slowdowns occurred ten times more often than outages.” [Link]
Google has also started penalizing slow sites in its search rankings. So if SEO is important to you, consider how your application’s performance may be helping (or hurting) your visibility, and ultimately your conversion rates.
Usually, when we talk about measuring performance, people assume that “if this site loads fast for me, it must load as fast for my users!” This is one of the most common mistakes developers make. The machines and networks used during development are often an order of magnitude faster than the ones your end users are working with, so what loads quickly for you may take quite a long time from your consumer’s perspective.
To get a brief idea of where your application is currently standing, you can start using this site https://testmysite.thinkwithgoogle.com/. Alternatively, go a step further and compare your site to your competitors with https://www.thinkwithgoogle.com/feature/mobile/.
Other popular tools for measuring performance include Lighthouse and WebPageTest.
Simply testing performance without understanding the metrics by which it is measured is not very useful. So, some of the most common metrics used to test performance are:
- Page Load: How long it takes for the window’s load event (window.onload) to fire, that is, for the page and all of its subresources to finish loading.
- Time to First Paint (TTFP): The time from the start of navigation (e.g. the user entering the URL) until the first non-blank content is rendered on the page.
- Server Time: How long it takes for the server to respond to the initial request, often measured as Time to First Byte (TTFB).
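To make these definitions concrete, here is a minimal TypeScript sketch of how the three metrics above can be derived from navigation timing data. The interface names and sample numbers are illustrative; in a real browser you would read the underlying values from the Navigation Timing and Paint Timing entries exposed by the `performance` API.

```typescript
// Illustrative shape of the timing data (all values in milliseconds,
// relative to the start of navigation). In a browser, these come from
// performance.getEntriesByType("navigation") and "paint" entries.
interface NavigationTiming {
  startTime: number;      // navigation start
  responseStart: number;  // first byte received from the server
  firstPaint: number;     // first non-blank paint
  loadEventStart: number; // window load event fired
}

interface PageMetrics {
  serverTimeMs: number; // time until the server's first byte (TTFB)
  ttfpMs: number;       // time to first paint
  pageLoadMs: number;   // time until the load event
}

// Each metric is simply the elapsed time from navigation start
// to the corresponding milestone.
function computeMetrics(t: NavigationTiming): PageMetrics {
  return {
    serverTimeMs: t.responseStart - t.startTime,
    ttfpMs: t.firstPaint - t.startTime,
    pageLoadMs: t.loadEventStart - t.startTime,
  };
}

// Example with hypothetical timings:
const metrics = computeMetrics({
  startTime: 0,
  responseStart: 180,
  firstPaint: 950,
  loadEventStart: 2400,
});
console.log(metrics); // { serverTimeMs: 180, ttfpMs: 950, pageLoadMs: 2400 }
```

Note how the three numbers tell different stories: a fast server (180 ms) is no guarantee of a fast page (2.4 s), which is exactly why relying on a single metric can be misleading.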
However, these metrics are often misleading, since they are not tailored to your particular product but were created to be somewhat relevant to as many websites as possible. For example, while knowing your TTFP may be useful, there may be a delay between the first paint and the time to interaction (that is, the moment the user can actually interact with the page), and that delay can have a greater negative impact on your users.
Lastly, I would like to point out that performance is not all about numbers. In future posts, we will look into how we can improve the user’s perceived performance with small changes to how the app works and how data is presented.
How can we improve performance?
There are actually tons of resources and techniques out there (my favorites will be included at the end of this post) and we will be digging deeper into the details in the next blog post.
However, you can start embracing it today by making it part of your company’s culture. Start by treating performance as a feature, so it doesn’t slip through the cracks. Measuring performance should also be on your roadmap; that way you can show that the time you invest in performance is paying off and that no regressions creep in during the development cycle. Companies like Pinterest have made performance testing part of their CI pipeline, even maintaining a dedicated performance team that helps and guides other teams toward their performance goals.
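One common way to wire performance into a CI pipeline, as described above, is a performance budget: each build’s measured metrics are compared against agreed-upon limits, and the build fails on a regression. The sketch below is a hypothetical illustration (the metric names and numbers are made up); real measurements would come from a tool such as Lighthouse or WebPageTest.

```typescript
// A performance budget maps a metric name to its maximum allowed value.
interface Budget {
  metric: string;
  budgetMs: number;
}

// Compare this build's measurements against the budgets and collect
// a human-readable failure message for each metric over budget.
function checkBudgets(
  measured: Record<string, number>,
  budgets: Budget[],
): string[] {
  const failures: string[] = [];
  for (const { metric, budgetMs } of budgets) {
    const value = measured[metric];
    if (value !== undefined && value > budgetMs) {
      failures.push(`${metric}: ${value}ms exceeds budget of ${budgetMs}ms`);
    }
  }
  return failures;
}

// Hypothetical measurements for the current build:
const failures = checkBudgets(
  { pageLoad: 3200, timeToFirstPaint: 1100 },
  [
    { metric: "pageLoad", budgetMs: 3000 },
    { metric: "timeToFirstPaint", budgetMs: 1500 },
  ],
);

if (failures.length > 0) {
  console.error(failures.join("\n"));
  // In a real pipeline you would exit non-zero here, e.g. process.exit(1),
  // so the CI job fails and the regression is caught before release.
}
```

The value of this approach is less in the code than in the agreement it encodes: once the team commits to budgets, a regression becomes a failing build rather than a debate.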
This fast-paced environment has had its consequences on the user’s experience. Hopefully, through this blog post, you were able to see that performance touches the entire company, its culture and its teams alike. Mitigating this problem is easier than it seems, and you can start today!
In the next post in this series, we will cover Webpack 📦, and how you can start using it today to boost your app’s performance.
Special thanks to:
Serge Basile, Collin Cierlak, Ben Koch, J Smith, and Susan Thai.