Visualize the correlation between web performance and product KPI

Shogo Sensui
10 min read · Oct 4, 2017


In this post, I introduce our work on visualizing the correlation between web performance and product KPIs, using metrics such as Speed Index, First Paint, and First Contentful Paint.

Traditional Web Performance Metrics

In the past, the DOMContentLoaded event or the load event was often treated as a good metric of web page performance. However, these are no longer appropriate indicators for measuring the performance of web pages whose architectures are becoming more and more complicated.

DOMContentLoaded — DOM construction is completed

The DOMContentLoaded event signals that construction of the DOM tree is complete, but it does not guarantee that the subresources referenced while parsing the HTML document have finished loading. After this point, the browser still loads CSS files, combines the DOM tree with the CSSOM tree, and finally paints the page, so DOMContentLoaded does not reflect the visible state of the page content that actually matters to the user's experience.

Load — All resources are loaded

The load event fires when all subresources (CSS files, JavaScript files, images, and so on) have finished loading. The display of the content is essentially finished by the time load fires, but in practice most of the content is often already visible well before it.
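As a rough sketch, assuming a browser that supports the Navigation Timing Level 2 API, both classic milestones can be read from the navigation entry (the helper name here is my own):

```javascript
// Extract the classic milestones from a Navigation Timing (Level 2) entry.
// `entry` has the shape of performance.getEntriesByType('navigation')[0].
function classicMilestones(entry) {
  return {
    // Time until the DOM tree was fully constructed
    domContentLoaded: entry.domContentLoadedEventStart,
    // Time until all subresources (CSS, JS, images, ...) finished loading
    load: entry.loadEventStart,
  };
}

// In a browser you would call it like this:
// const [nav] = performance.getEntriesByType('navigation');
// console.log(classicMilestones(nav));
```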

Modern Web Performance Metrics

Recently, new metrics focusing on user experience have been introduced in Leveraging the Performance Metrics that Most Affect User Experience, and some of them are being defined as web standards.

First Paint and First Contentful Paint — Anything displayed

Both First Paint and First Contentful Paint are indicators of how quickly the page starts to display after the browser navigates. First Paint is the moment when anything at all is displayed, and First Contentful Paint is the moment when some content (text, an image, etc.) is displayed. Neither value is directly tied to the user experience by itself, but both are useful for evaluating the performance of a web application, since they reflect the state of subresource loading and the critical rendering path.

The timings of First Paint and First Contentful Paint are clearly defined, and standardization is progressing as the Paint Timing API. The Paint Timing API is already supported in Chrome 60, so both can be measured there.

First Paint and First Contentful Paint

We can use them with the Paint Timing API introduced in Chrome 60.
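For illustration, here is a small helper that picks the two timings out of a list of paint entries; the entry shape matches what `performance.getEntriesByType('paint')` returns in supporting browsers:

```javascript
// Pull First Paint and First Contentful Paint out of a list of
// PerformancePaintTiming-like entries ({ name, startTime }).
function paintTimings(entries) {
  const byName = {};
  for (const e of entries) byName[e.name] = e.startTime;
  return {
    firstPaint: byName['first-paint'],
    firstContentfulPaint: byName['first-contentful-paint'],
  };
}

// In Chrome 60+ the entries come from the Paint Timing API:
// console.log(paintTimings(performance.getEntriesByType('paint')));
```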

First Meaningful Paint — Anything meaningful for user displayed

First Meaningful Paint is the moment when a display that is meaningful to the user appears. It amounts to whether the content the user expects has been delivered, so shortening it can be expected to have a direct positive effect on the product.

However, unlike First Paint and First Contentful Paint, it is difficult to standardize, because what counts as "meaningful" to the user depends on the nature of the product and how it is implemented. To measure First Meaningful Paint, you define the timing according to the nature of the product and record it with the User Timing API or similar.
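As a minimal sketch of that approach, assuming the moment the hero content enters the DOM is what is "meaningful" for the product (the mark name is my own invention):

```javascript
// Record a product-specific "meaningful" moment with the User Timing API.
// In a page, a mark's startTime is measured from navigation start.
function markMeaningfulPaint() {
  performance.mark('hero-rendered');
  const [mark] = performance.getEntriesByName('hero-rendered');
  return mark.startTime; // ms since the time origin
}

// Call this right after the hero content has been inserted into the DOM,
// then report the returned value to your analytics backend.
```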

Speed Index — Score of drawing amount in first view

Speed Index is a score obtained by integrating, over the elapsed time from the start of navigation, the proportion of the visible region that has not yet been rendered. It is an index introduced by the performance measurement tool WebPagetest.

Assume there are a page A and a page B that display the same content, and that page load completes (the load event fires) at the same time on both. If, on page A, 93% of the visible region is displayed after one second, whereas on page B only 18% is displayed after one second, page A clearly performs better.

To compare them, try graphing the elapsed time on the X axis and the visual completeness on the Y axis.

Visually Complete Progress of “A” and “B”

Although you could use completeness over elapsed time as a drawing progress rate directly, integrating it has the drawback that the value grows without bound as the time to reach 100% completeness gets longer. Instead, calculate the area above the curve: the smaller this area is, the better (more is drawn in a short time from the start of navigation). This area is the Speed Index.
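The definition above can be sketched as a small function, assuming visual progress has been sampled as (time, completeness) pairs; treating completeness as a step function between samples is my simplification of WebPagetest's frame-based calculation:

```javascript
// Compute the Speed Index from sampled visual progress: an array of
// { time, completeness } points (time in ms, completeness in 0..1),
// sorted by time and starting at { time: 0, completeness: 0 }.
// The Speed Index is the area above the progress curve, i.e. the
// integral of (1 - completeness) over time.
function speedIndex(samples) {
  let area = 0;
  for (let i = 1; i < samples.length; i++) {
    const dt = samples[i].time - samples[i - 1].time;
    // Completeness is held at the previous sample's value until the
    // next sample (step function).
    area += dt * (1 - samples[i - 1].completeness);
  }
  return area;
}
```

For the earlier example, page A (93% visible at one second, complete at two) scores about 1070, while page B (18% at one second) scores about 1820, so the lower score correctly identifies A as the faster page.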

Remaining to be rendered for “A” and “B”

WebPagetest captures video while the test agent accesses the page being measured, and computes each frame's completeness as its degree of similarity to the final frame at load completion; the Speed Index is calculated from those values. SpeedCurve, which uses WebPagetest as its backend, uses the same calculation method.

There is another approach that calculates the score from the browser's rendering events (Layout, Paint). RUM-SpeedIndex, which calculates it using the Resource Timing API, can measure it directly in the browser, and Lighthouse, bundled with Chrome DevTools, measures it with a tool called speedline.

This time, I decided to use Speed Index as a comprehensive, user-experience-related performance metric for the first view, and to measure it on each of our web products. Although it is not being developed as a web standard, it is easy to adopt because measurement methods such as WebPagetest, SpeedCurve, Lighthouse, and RUM-SpeedIndex already exist.

Synthetic Monitoring and Real User Monitoring

As introduced above, there are already many ways to measure performance. They are broadly categorized as synthetic monitoring and real user monitoring.

Synthetic Monitoring

In synthetic monitoring, we measure performance when accessing the target URL from a controlled environment. Since the browser, server location, network speed, and so on are fixed, variance between measurements is small, which makes it well suited to finding application bottlenecks. The previously mentioned WebPagetest and SpeedCurve's Synthetic belong to this category, and the web products of our media business have been continuously measured with SpeedCurve.

With this environment for pinpointing application bottlenecks in place, synthetic monitoring has become important for detecting regressions, such as loading speed unintentionally degrading in a feature release, and for quantifying how much effect application tuning has.


Real User Monitoring

With synthetic monitoring we can improve application performance with confidence, but we cannot measure how performance affects the user experience. In real user monitoring, performance is measured in real users' environments, where both device performance and network conditions vary, so we can measure the effect of performance on the user experience.

Aggregation of real users’ SpeedIndex for each product

The purpose of real user monitoring is to measure what kind of performance real users are actually experiencing and to learn how it correlates with product indicators. For this project, we introduced the RUM-SpeedIndex measurement script into each product and decided to send the calculated Speed Index to Google Analytics for aggregation.

There are a couple of reasons why I chose Google Analytics as the aggregation destination.

  • It already measures bounce rate and page views per session, so by sending the Speed Index as custom data you can see the correlation with them
  • Google Analytics is already installed on most of our products, so the barrier to measurement is low: simply load RUM-SpeedIndex and send the result to Google Analytics
  • Many people are familiar with the tool, and each product has full-time marketing staff I can ask for detailed analysis
  • Data can be retrieved through add-ons, so you can freely format and visualize it in Google Sheets and the like
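As a sketch of the reporting step, assuming the classic analytics.js snippet is loaded and using a user timing hit (the category and variable names here are my own convention):

```javascript
// Build an analytics.js "timing" hit for a measured Speed Index value.
// GA timing values must be integers (milliseconds).
function buildTimingPayload(speedIndexMs) {
  return {
    hitType: 'timing',
    timingCategory: 'RUM',
    timingVar: 'speed-index',
    timingValue: Math.round(speedIndexMs),
  };
}

// In the page, after RUM-SpeedIndex has computed the value `si`:
// ga('send', buildTimingPayload(si));
```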

After aggregating real users' Speed Index for each product, we can see how it correlates with the bounce rate and page views per session. We received a lot of cooperation from every product team during preparation and rollout (especially Suma-san, our web analyst) 🙏

Correlation between real user Speed Index and bounce rate and pageview number per session

The following graph shows the results for Project A. The horizontal axis is the Speed Index measured in real user environments with RUM-SpeedIndex. The blue line shows the distribution of users, and you can see that the proportion of users with a Speed Index near 1000 is high (cases where the Speed Index is 0 are also plotted but are ignored as outliers). Note that the user counts are sampled and do not accurately represent the actual number of users.

The red lines show the bounce rate (Bounce Rate, top graph) and page views per session (PageViews / Session, bottom graph).

First, looking at the bounce rate, the better the Speed Index score (the smaller the value), the lower the bounce rate tends to be. The curve also flattens out around Speed Index = 1500; on Project A's mobile web, you can see that beyond a certain delay the bounce rate no longer changes.

Bounce Rate distribution map of Project A

Next, let's look at page views per session. The better the Speed Index score, the higher the number of page views, and the curve is almost flat from around Speed Index = 1500.

PV/Session distribution map of Project A

From these two graphs, you can see that the better the Speed Index score, that is, the better the performance, the lower the bounce rate and the more page views per session. This is the result for one particular day of Project A, but the shape stays essentially the same from day to day, with some fluctuation, regardless of feature releases. This confirms that, while user behavior depends on the nature of the product, performance does affect the product's KPIs.

Speed Index and Product KPI, Milestone in Project

Convinced that performance is directly linked to product KPIs, I set performance measurement and improvement as project objectives.

As a project, what matters is simply increasing the proportion of users who experience good performance. So we take the Speed Index value that separates the faster 50% of users from the slower 50% as a threshold, and from then on track, day by day, the ratio of users experiencing performance better than that threshold to those experiencing worse. When performance improves, the proportion of users doing better than the chosen Speed Index threshold should increase.
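The steps above can be sketched as two small helpers; the function names are my own:

```javascript
// 1) Take the median Speed Index from a baseline day as the threshold.
function median(values) {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

// 2) Each day, compute the share of users whose Speed Index beats it.
function fastUserRatio(speedIndexes, threshold) {
  const fast = speedIndexes.filter((v) => v < threshold).length;
  return fast / speedIndexes.length;
}
```

Fix the threshold once from the baseline data; when tuning actually improves real-user performance, the daily ratio should rise above 0.5.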

After putting the monitoring environment in place, we improved performance in Project B, with the following results. The tuning was released on the night of July 20th.

Daily Active Users of Project B
PV/Session and Bounce Rate of Project B

The DAU readable from the top graph shows the usual pattern of increases on Saturdays and Sundays, with what appears to be a slight upward trend. The bottom graph plots page views per session ("PV per visit" in the figure) and the bounce rate together with the index we set for Project B (the gray fill in the figure), and you can see that page views per session improved by about 30% to 40%.

While performance and accessibility are very important for web products, their effects are hard to see, and work on them tends to be postponed. However, if you can visualize their impact on KPIs and show that improving product quality is a benefit to the business, you can start prioritizing that work.

Future of Web performance initiatives

By constantly measuring performance and visualizing how it correlates with the product's KPIs, we have built a foundation on which engineers can improve performance with confidence.

In the future, we plan to add the most important KPI defined for each product to the analysis (purchases for an e-commerce site, views for a video media product, and so on), and to add First Paint, First Contentful Paint, and other metrics as performance indicators, analyzing how they correlate with product KPIs.

I would be pleased if this performance work helps you further improve the quality of your own web product. Thank you for reading.
