How to define a Browser Support Level Matrix

It is important to identify which browsers need to be supported and how. Here's our step-by-step approach for Planet 4.

Anselm Hannemann
Planet 4
5 min read · Jan 27, 2017

Define the Audience

One of the first things to do when researching browser support levels for a project is to define who we want to reach with the website being developed. The key is to define all the people the initiative is aiming to reach, not only the ones already reached. For Greenpeace, it’s important to reach people around the world, whether they live in the Americas, Europe, Asia, Africa or on an island far away in Oceania. With such a distributed public and such a variety of devices, network conditions and costs of Internet access, it’s certainly not easy to build a website properly.

Research User’s Browser Share

Once the key audiences are identified, it’s important to dig into the available data on the public’s behaviour. If data from analytics software is accessible, the first step is to look at the users’ browsers and devices (it’s reasonable to focus on the past three months, since browser versions and preferences change quite quickly these days). The second step is to find out which browsers are the most used ones. For browsers that ship as fixed versions (such as older Internet Explorer), it is useful to write down the usage of each version, to understand, for example, what portion of users is still on Internet Explorer 9. For evergreen browsers like Chrome and Firefox, it is enough to assume the latest version and record the share of all versions as one number.
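As a rough illustration, a small script could aggregate raw analytics rows into exactly that shape. This is only a sketch: the row format and the list of evergreen browsers are assumptions for the example, not the actual Planet 4 analytics export.

    // Sketch: aggregate raw analytics rows into browser usage shares.
    // The row shape and the evergreen list are assumptions for illustration.
    interface AnalyticsRow {
      browser: string;   // e.g. "Chrome", "IE"
      version: string;   // e.g. "55", "9"
      sessions: number;  // sessions in the past three months
    }

    // Auto-updating browsers: one number for all versions is enough.
    const EVERGREEN = new Set(["Chrome", "Chromium", "Firefox", "Edge", "Opera"]);

    function browserShares(rows: AnalyticsRow[]): Map<string, number> {
      const total = rows.reduce((sum, row) => sum + row.sessions, 0);
      const shares = new Map<string, number>();
      for (const row of rows) {
        // Collapse evergreen browsers; keep the version for fixed-version ones like IE.
        const key = EVERGREEN.has(row.browser) ? row.browser : `${row.browser} ${row.version}`;
        shares.set(key, (shares.get(key) ?? 0) + (row.sessions / total) * 100);
      }
      return shares;
    }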

The third step is to write a list of all browsers that are relevant and have a usage share over 1%, and then compare this list to the audience defined before. Marking the browsers that aren’t important or don’t need to be tested separately is quite helpful (Chromium, for example, ships the same engine version as Chrome, so it can be relatively safe not to test it separately).
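Continuing the sketch above, the 1% cut-off and the “not tested separately” marker could look like the following. The alias table is a hypothetical example, not a fixed rule.

    // Sketch: keep browsers above a 1% share and mark those already covered
    // by testing another browser (the alias table is an assumption for illustration).
    const TESTED_VIA: Record<string, string> = { Chromium: "Chrome" };

    function relevantBrowsers(shares: Map<string, number>, threshold = 1) {
      return [...shares.entries()]
        .filter(([, share]) => share >= threshold)
        .map(([name, share]) => ({
          name,
          share: Math.round(share * 10) / 10,
          testedSeparately: !(name in TESTED_VIA),
        }))
        .sort((a, b) => b.share - a.share);
    }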

StatCounter’s Global Stats (Sept — Nov 2016)

The fourth step is to look at global, independent statistics. One option is StatCounter’s global statistics, but there are also other (paid) services that provide similar or even more detailed data. It’s important to look at browser usage in all the regions of the key audience/s, and to treat the global statistics as a generic indicator. Looking at specific countries or geographic areas is helpful, since behavioural data can differ a lot depending on the country or continent the public lives in. For example, while Europe has only a very small share of Opera users, in Africa up to 35% of people use Opera due to unstable network conditions (Opera offers a proxy browser that automatically compresses websites so they are served faster and with less data). People in Asia, on the other hand, tend to use UC Browser (18%), a browser many people in the Americas or Europe have never even heard of. Furthermore, Europe still has many Internet Explorer users, as you can see from the graphs below.

StatCounter’s Africa Stats (Sept — Nov 2016)
StatCounter’s Asia Stats (Sept — Nov 2016)

Set Data Into Perspective

Once both the previously collected data and the global statistics are consolidated, we can focus on the thinking phase: setting all data into perspective, comparing it again with the defined audience/s and figuring out whether it’s worth supporting a specific browser (or a specific version of it) or not.
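One simple way to do this comparison is to blend the different data sources, weighted by how important each region is for the defined audience. The sketch below uses placeholder numbers and weights; they are not real Planet 4 data.

    // Sketch: blend own analytics with regional, StatCounter-style shares,
    // weighted by how important each region is for the defined audience.
    // All numbers are placeholders for illustration.
    type Shares = Record<string, number>; // browser -> usage share in %

    function blendShares(sources: { shares: Shares; weight: number }[]): Shares {
      const totalWeight = sources.reduce((sum, source) => sum + source.weight, 0);
      const result: Shares = {};
      for (const { shares, weight } of sources) {
        for (const [browser, share] of Object.entries(shares)) {
          result[browser] = (result[browser] ?? 0) + (share * weight) / totalWeight;
        }
      }
      return result;
    }

    const combined = blendShares([
      { shares: { Chrome: 55, "IE 11": 6, Opera: 3 }, weight: 2 }, // own analytics (people already reached)
      { shares: { Chrome: 48, Opera: 35, UC: 4 }, weight: 1 },     // regional stats, e.g. Africa
      { shares: { Chrome: 50, UC: 18, Opera: 5 }, weight: 1 },     // regional stats, e.g. Asia
    ]);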

As I described in “The world uses the Internet differently”, each user on the planet visits a website under different network conditions, on a different device and with a different budget for mobile data (which is priced differently across the globe). Even the expectations of how a website must look can vary a lot!

Users who only use proxy browsers like Opera Mini usually don’t care much whether a website uses a web font or the default font, or whether an element has nice transitions, backdrop filters or a fixed position. These users tend to focus on readability and on getting the information, without any fancy effects.
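One way to serve these browsers well is to treat such effects as progressive enhancement and only switch them on where they are supported. A minimal sketch using standard feature detection (the class names are made up for the example):

    // Sketch: opt into decorative effects only where the browser supports them,
    // so proxy browsers like Opera Mini still get the plain, readable baseline.
    // The class names are illustrative, not Planet 4's actual ones.
    const supports = (property: string, value: string): boolean =>
      typeof CSS !== "undefined" &&
      typeof CSS.supports === "function" &&
      CSS.supports(property, value);

    if (supports("backdrop-filter", "blur(4px)")) {
      document.documentElement.classList.add("has-backdrop-filter");
    }
    if (supports("position", "fixed")) {
      document.documentElement.classList.add("has-position-fixed");
    }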

For a useful Planet 4 browser support grading, we can define which browsers need what sort of fixing if differences from the intended experience occur.

Define Support Grades

I personally like defining browser grades in a simple color table from A to D, where A indicates a fully supported browser and D a browser that is not supported at all. The second column is only used to show examples; ideally, every team should come up with its own.

Exemplary browser support grade definition (explained below)

A: Fully supported browser. All functional and visual bugs should be fixed.

B: Partially supported browser. Functionality needs to be tested and guaranteed, but the visual appearance can differ slightly.

C: Rudimentarily supported browser. Functionality should be ensured, although only core features need to work. Visual bugs can appear.

D: Unsupported browser.

Using this methodology, we can apply the grading to the list of browsers previously compiled from our own and external data, listing and grading each browser accordingly.
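In practice, the graded list can also live next to the code as a small, machine-readable structure, so the team and the tests refer to the same source. The grades and browsers below are examples only, not the actual Planet 4 matrix.

    // Sketch: a machine-readable version of the support matrix.
    // Browsers and grades are examples, not the real Planet 4 decisions.
    type Grade = "A" | "B" | "C" | "D";

    const supportMatrix: Record<string, Grade> = {
      "Chrome": "A",
      "Firefox": "A",
      "IE 11": "B",
      "IE 9": "C",
      "Opera Mini": "C",
      "IE 8": "D",
    };

    // Example: only grade A guarantees that purely visual bugs get fixed.
    function mustFixVisualBug(browser: string): boolean {
      return supportMatrix[browser] === "A";
    }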

This process may take some time, and browser grades are likely to be reassigned a lot in the beginning. Within a few hours, though, we will get a solid list from which everyone on the team can easily understand what type of support a browser has for the specific Planet 4 instance (or any other project, of course).

Note: It is by design that the grades leave some room for discussion. If in doubt, whether a fix needs to be developed should be decided by the assigned developer and the Project Owner together.

Continuously Review and Improve

Even the best system is not perfect. Browser support lists or matrices get outdated very quickly and can rarely cover all circumstances, all questions, all issues.

It is very important not to build this matrix once and then forget about it, but to re-evaluate it regularly and adapt the system to specific or new needs. If the development team needs more specific numbers and hints, they should be added. It’s everyone’s matrix, and everyone’s responsibility to grow the audience of the portal.

A well-defined browser matrix can help define the audience and reveal problems in the code: if you need to add fixes for the same browsers again and again, it means your codebase has not been built with these browsers in mind, and the target audience is not clear to everyone on the team. Maybe the audience definition itself was wrong…

If you agree, disagree, or have additional thoughts on this, please let us know and feel free to share the article on your networks. It’s important for us to get feedback and to share our findings publicly.
