Designing a test strategy for updating mobile functionality of p5.js — Part 1

Sithe Ncube
7 min read · Jun 14, 2018

--

A Google Summer of Code 2018 project

Organisation: Processing Foundation
Mentor: Lee Tusman
Project: Test strategy for maintaining and updating mobile functionality of p5.js

The first time I heard about p5.js was during an experimental games workshop by Marie at A MAZE./Johannesburg in 2017. p5.js is a JavaScript library by the Processing Foundation that reinterprets their language Processing for the web, making coding accessible for artists, designers, educators, and beginners. You can play around with p5.js in their web editor here.

I enjoy looking for new ways to express ideas and interaction, so I made sure to note this in my little zine from Marie’s workshop. When I got a chance to check out the p5js.org site and saw interactive doodads dancing over a tutorial video, I made a mental note to explore it properly at the next opportunity.

A Kind of Play: An experimental workshop by Marie Claire LeBlanc Flanagan

This is my first time participating in Google Summer of Code, and I’m happy to be working with the Processing Foundation on p5.js. I’m passionate about education and improving user experiences, so mobile usability for p5.js makes a great first open source project.

If you’d like to know more about what I’ll be doing this summer (winter in South Africa), keep reading to see how I’m designing a test strategy for maintaining and updating mobile usability in p5.js.

Preparations

The first month of my project is focused on gathering and exploring tools for testing. Before getting started, I had a peek at the GitHub issues related to p5.js mobile usability. I took note of the known issues marked area:mobile to get a good idea of what I can expect to address during the summer and what to consider during testing. Most of these issues involve input detection, which will be important to keep in mind when deciding which tools to use during the project.
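Since input detection dominates the issue list, the sketches under test look something like this minimal p5.js example using the standard touch callbacks (the drawing itself is just illustrative):

```javascript
// Minimal p5.js sketch exercising touch input, the area where most of
// the mobile issues cluster. Draws a circle at each active touch point.
function setup() {
  createCanvas(windowWidth, windowHeight);
  background(220);
}

function touchMoved() {
  // touches[] holds the currently active touch points on a mobile device
  for (const t of touches) {
    ellipse(t.x, t.y, 30, 30);
  }
  // Returning false asks p5.js to prevent the browser's default behavior
  // (such as scrolling) — exactly the kind of behavior that needs
  // verifying across mobile browsers
  return false;
}
```

Whether `touches[]` updates correctly and whether returning `false` actually suppresses scrolling are the sorts of things that vary between mobile browsers, which is what makes input detection the focus of testing.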

In preparation for this project, I also spent some time reading about Cross Browser Testing on the Mozilla Developer site. I recommend this as a starting point for those interested in beginning with web testing.

Defining the Testing Priority

The goal of this project is to make sure that users have a much better experience with p5.js in the current versions of their mobile browsers, and that developers and contributors can keep track of mobile issues. Intuitively, we would ensure optimal usability by making sure p5.js works well in all browsers on all devices. But that’s a lot of devices…and a lot of browser versions.

To narrow down the scope I looked at the Google Analytics data for the p5.js site from January 2018 to June 2018 and identified how mobile users accessed it. According to the data, about 44.90% of users access the site on iPhone and iPad, 53.73% on Android, and the remaining 1.37% on various other operating systems.

Users accessing p5js.org on Android, iOS and other operating systems

So we can assume there’s a fairly even split between Android and iOS usage. These will be the two operating systems we prioritize. iOS users access the site on iPhone and iPad. On Android, the most widely represented devices appear to be the Samsung Galaxy S8, Google Pixel 2, Moto G4, Xiaomi Redmi Note 4, and Acer Iconia Tab 8. These details will come in handy when we need to emulate devices for testing.

Looking at browsers, the vast majority of mobile users access the site on Chrome, at 73% of usage. Firefox and Safari each account for approximately 11% of users. The remainder access the site using Opera, IE, and various smaller browsers.

Users accessing p5js.org on various browsers

Even with our browser scope narrowed down, there are collectively hundreds of versions of these browsers to test. But there is no need to test the much older, deprecated versions. For this particular project we’ll focus on the two most recent versions of the top three browsers: Chrome v67 and v66, Firefox v60 and v59, and Safari v11 and v10.

Thus the testing priority for this project is Android and iOS devices running the two most recent versions of Chrome, Firefox, and Safari.
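Spelled out as data, the resulting test matrix is small enough to enumerate. Here is a quick sketch (the structure is my own, not part of p5.js):

```javascript
// The two most recent versions of the top three mobile browsers,
// per the analytics-driven priorities above.
const priorityBrowsers = {
  Chrome: [67, 66],
  Firefox: [60, 59],
  Safari: [11, 10],
};

// Expand the priorities into the flat list of browser/version
// targets that each test case needs to run against.
function testMatrix(browsers) {
  const matrix = [];
  for (const [name, versions] of Object.entries(browsers)) {
    for (const version of versions) {
      matrix.push(`${name} v${version}`);
    }
  }
  return matrix;
}

console.log(testMatrix(priorityBrowsers)); // six targets in total
```

Six browser/version targets per device is a far more tractable scope than the full browser landscape.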

Testing tools

In addition to my own devices available for quick testing (an Infinix Note 4 X572 and a Lenovo Tab S8–50), I would need some way to emulate the devices I do not have, as well as a way to quickly replicate and record tests across multiple browsers.

There are many live and automated browser testing suites out there, and some are free or have limited free plans. A fair amount of my first few weeks was spent trying out various cross-browser testing tools that could suit this project. To make the most of these tools, I figured the best way to decide would be to extensively test out the free plans (which mostly lasted 14 days) and come to a conclusion at the end of the first 3–4 weeks on which tool I would like to use. Here are some of the testing tools I explored in the last few weeks:

  • BrowserStack
  • Sauce Labs
  • TestingBot
  • Browsershots.org
  • Endtest.io
  • Browserling
  • LambdaTest

Here are some of the things I looked for when deciding which testing software to use:

  • Live testing on priority devices
  • Availability of most recent browser versions
  • Ability to record screenshots
  • Automated testing capabilities

With these specifications in mind, and after trying these tools out, I decided on TestingBot. TestingBot is one of the few suites that had my target devices available for live testing, has the most recent versions of the browsers listed, and its basic plan allows unlimited screenshots and unlimited live testing. You can also take screenshots on up to 25 devices at once and view the results simultaneously, which is great for quick cross-browser comparisons.

Screenshots galore
Live testing on an emulated iPhone 6s Plus in TestingBot

Test cases — Defining the scope

Now we get to what exactly we’ll be testing with these tools. Three months may not be enough time to extensively cover all the functionality of p5.js, so I decided the priority within this project would be to ensure the p5.js Examples work well on the listed mobile devices through visual UI testing. By testing the examples, we can look at multiple functions operating at once and also see what learners experience when going through the site on mobile. Once the UI testing of the examples is complete and the GitHub issues are addressed, I would ideally like to create tests from the references listed on the website as well.

To keep track of what I come across during testing and the results, I’ll be using ye olde spreadsheets of yore. Here is a screenshot of what the current spreadsheet looks like.

Snippet of the spreadsheet used to record test cases and results

The spreadsheet format is expected to change as I get into the emulated cross-browser tests.
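For illustration, here is roughly how one of those spreadsheet rows could be modeled in code, with a helper to tally outcomes. The field names are hypothetical and not the actual spreadsheet columns:

```javascript
// Hypothetical shape of a results row: which example was tested,
// on what device/browser pair, and the observed outcome.
const results = [
  { example: 'Touches', device: 'iPhone 6s Plus (emulated)',
    browser: 'Safari v11', pass: true, notes: '' },
  { example: 'Touches', device: 'Galaxy S8 (emulated)',
    browser: 'Chrome v67', pass: false, notes: 'touch not detected' },
];

// Tally pass/fail counts the way a summary column would.
function summarize(rows) {
  const passed = rows.filter((r) => r.pass).length;
  return { passed, failed: rows.length - passed, total: rows.length };
}

console.log(summarize(results)); // { passed: 1, failed: 1, total: 2 }
```

Keeping each row self-describing like this makes it straightforward to regroup results by device, browser, or example as the spreadsheet format evolves.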

Next steps

The first month of this project involved a lot of planning, learning, trial and error, and testing. With the scope now defined and an arsenal of tools to tackle this project with, over the next few weeks I’ll be working on the following:

  • Emulated cross-browser tests and recording results
  • Generating visual results of expected behavior
  • Fixing listed issues on GitHub and adding further details
  • Hosting a p5.js coding session at my university to get user feedback on various mobile devices

We’re 4 weeks into the Summer of Code and have 8 more to go. What will you be working on this summer?

PS: If you’re interested in getting started learning programming with Processing, have a look at these tutorials and play around with the editor.


Sithe Ncube

Sithe [see-teh] Computer Science and Mathematics major. Involved in STEM education and game development. Zambian living in South Africa 🇿🇲 🇿🇦. Loves cake 🍰