A recap of the European Testing Conference Valencia 2019

Andrea Borg
Published in Catena Media RnD
6 min read · Apr 10, 2019

Last February, Dwane Debono and I, two of Catena Media’s QA Engineers, had the opportunity to attend the European Testing Conference 2019 in Valencia, where we took in keynote speeches, networking sessions, workshops and talks. Topics were varied, and with three parallel streams there was something for everyone. To make the most of it, we split up and attended different streams. In this article, we give a brief summary of some of the sessions we attended.

Andrea (left) & Dwane (right) rocking their TestAutonation T-shirts

Something that set this conference apart from other testing conferences was the networking sessions. Three sessions in the conference program were dedicated to networking: a Speed Meet, a Facilitated Discussion and an Open Space.

Networking Sessions

First off was the Speed Meet. The rules were simple: grab a piece of paper and draw a personal mind map about yourself. Then join one of two lines of people facing each other and use your “about me” papers to find common interests. Here’s the catch: after five minutes you rotate to a new person. The idea is to continue the discussion during a coffee break or lunch. As conference ice breakers go, this one was pretty good. Warning: a room with 200 attendees all speaking at the same time can get pretty loud.

The second was a Facilitated Discussion/Lean Coffee with the awesome Lisa Crispin. “Lean Coffee is a structured, but agenda-less meeting. Participants gather, build an agenda, and begin talking.” — http://leancoffee.org/ The idea behind Lean Coffee is for the participants to create their own agenda. Topics are prioritized by vote, so the group discusses the most popular topics first, and each topic gets a 3-minute time box. When the 3 minutes are up, participants vote either to continue discussing the topic for an additional 2 minutes or to move on to the next one.

The last networking session was an Open Space, which you could describe as Lean Coffee on a larger scale. Attendees were invited to propose topics for discussion, and each topic was allocated a 30-minute time slot in a separate room. Anyone interested in a topic could simply show up to its room and contribute to the discussion.

Workshops

Workshops played an important part in the conference. Attendees could pick two out of the five workshops on offer, which was possible because the workshops were integrated into the 2-day program instead of being held on separate days.

Approval Testing workshop by Mark Winteringham

Approval Testing: Superpower your Automation Feedback, by Mark Winteringham, was an interesting workshop. The main takeaway is that approval testing drastically reduces the number of assertions in your tests by replacing them with a single approval check. Approval testing is essentially the comparison of two files: if the files differ, the test fails. It can be used for both API and UI testing. For API testing, a text file stores the response and is compared against future responses. For UI testing, screenshots (images) are compared instead of text files; Applitools is a popular tool that does this. https://github.com/approvals/ApprovalTests.Java

https://applitools.com/docs/topics/overview/overview-visual-testing.html
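To illustrate the idea (this is a minimal sketch of the concept, not the ApprovalTests library’s actual API), an approval check can be written in a few lines of Python: compare the received output against a stored “approved” file, and on any mismatch write the received output to disk for a human to inspect and, if correct, promote to approved.

```python
from pathlib import Path

def verify(name: str, received: str, approved_dir: Path = Path(".")) -> None:
    """Compare received output against the stored approved file.

    On a mismatch (or on the first run, when no approved file exists yet)
    the received output is written to disk so a human can review it and,
    if it is correct, rename it to <name>.approved.txt to approve it.
    """
    approved_file = approved_dir / f"{name}.approved.txt"
    received_file = approved_dir / f"{name}.received.txt"

    approved = approved_file.read_text() if approved_file.exists() else None
    if received == approved:
        return  # one comparison replaces many individual assertions
    received_file.write_text(received)
    raise AssertionError(
        f"{name}: received output differs from approved; see {received_file}"
    )
```

The first run always fails and produces the `.received.txt` file; approving is just renaming that file, which is the human-review step the workshop emphasised.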

Coaching & Learning about API Exploratory Testing — This workshop focused on how to map and test an API using tools such as XMind and Postman. A hotel booking service was used as the example web application. Using mind maps, we mapped out the application’s available forms and listed each field we found. We then turned to the Chrome Developer Tools, specifically the Network tab, to extract valuable information about the API calls being made. Postman was used to hit the discovered endpoints with different HTTP methods and payloads, allowing us to find issues and possible improvements in the booking service. The session ended with useful tips on what to focus on while exploratory testing an API.
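To give a flavour of the kind of probing we did in Postman, here is a small Python sketch that builds one request per HTTP method for a hypothetical booking endpoint (the URL and resource layout are illustrative, not the workshop’s actual service); sending each probe and comparing the responses is the exploratory step.

```python
from urllib.request import Request

# Illustrative base URL; the workshop's real booking service is not shown here.
BASE = "https://booking.example.com/booking"

def build_probes(resource_id: int) -> list[Request]:
    """Build one request per HTTP method to explore how an endpoint responds."""
    probes = []
    for method in ("GET", "POST", "PUT", "PATCH", "DELETE"):
        # POST creates a new booking at the collection URL;
        # the other methods target a specific booking.
        url = BASE if method == "POST" else f"{BASE}/{resource_id}"
        probes.append(Request(url, method=method))
    return probes
```

Unexpected status codes (a 200 from DELETE without auth, say) are exactly the anomalies worth noting on the mind map.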

Talks

Test Automation: Cure vs Prevention (P.U.M.A Tests) was an interesting talk by Jit Gosai. Jit introduced us to P.U.M.A tests, which you may argue are the same thing as smoke tests; but according to Jit, giving them a distinct name and a more clearly defined scope helps engage the developers.

P.U.M.A stands for:

Prove Core Functionality, Understood by all, Mandatory, Automated

This is an interesting take on classifying tests. You can still use the criteria above to reorganise a smoke test suite (no need to call them PUMA tests if you don’t want to). Let’s face it: everyone has one test too many in their smoke test suite.

Source: https://prezi.com/avbunzbzvk7o/tacvp/
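As a sketch of how you might apply the four criteria when trimming an existing smoke suite (the field names below are my own, not from the talk):

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str
    proves_core_functionality: bool  # P
    understood_by_all: bool          # U
    mandatory: bool                  # M
    automated: bool                  # A

def is_puma(t: TestCase) -> bool:
    """A test earns its place only if it meets all four P.U.M.A criteria."""
    return (t.proves_core_functionality and t.understood_by_all
            and t.mandatory and t.automated)

def trim_smoke_suite(suite: list[TestCase]) -> list[TestCase]:
    """Keep only the tests that satisfy every criterion; the rest move elsewhere."""
    return [t for t in suite if is_puma(t)]
```

Tests that fail the filter are not necessarily deleted; they just don’t belong in the must-pass suite.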

Contract Testing — bye bye Testing Monolith? — This talk tackled testing applications in a microservices architecture. Maarten Groeneweg argued that the interactions between microservices are generally the hotspot for bugs and should therefore be tested extensively. He described how writing down a contract of an API’s expected behaviour, from the consumer’s perspective, allows the creation of a mock: consumers can then test their own application against that mock of the provider. Seven anti-patterns were also mentioned, such as treating contract testing as a replacement for human interaction, and the importance of covering negative paths was highlighted. Groeneweg mentioned some useful tools for contract testing, one of which is Pact.
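Stripped of real tooling, the consumer-driven idea can be sketched as follows (this is a conceptual illustration, not Pact’s actual API; the contract shape and names are made up). The consumer writes the contract and tests against a mock derived from it; the provider later replays the same contract against the real service.

```python
# A contract: the request the consumer will send and the response it expects.
CONTRACT = {
    "request": {"method": "GET", "path": "/users/42"},
    "response": {"status": 200, "body": {"id": 42, "name": "Ada"}},
}

def mock_provider(request: dict) -> dict:
    """Consumer side: a mock built from the contract, used in consumer tests."""
    if (request["method"], request["path"]) == (
        CONTRACT["request"]["method"],
        CONTRACT["request"]["path"],
    ):
        return CONTRACT["response"]
    return {"status": 404, "body": None}

def verify_provider(real_handler) -> bool:
    """Provider side: replay the contract's request against the real service
    and check that the response matches what the consumer expects."""
    actual = real_handler(CONTRACT["request"])
    expected = CONTRACT["response"]
    return actual["status"] == expected["status"] and actual["body"] == expected["body"]
```

The key property is that both sides verify against the same contract, so the consumer’s mock can’t silently drift away from the provider’s real behaviour.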

Playing Port Authority: TDD for Containers, presented by Moritz Heiber, asked why we test production code but not our infrastructure. With today’s shift towards containerised solutions, the infrastructure also relies on the correctness of the containers. The basic idea is that there are libraries which can check that specific packages are installed within the containers once deployed. These checks give assurance that required packages have been successfully installed and that the production code can execute. Other properties can be tested too, such as open ports, installed certificates and defined users. The tool mentioned during this talk is Serverspec, available at https://serverspec.org/.
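Serverspec itself is Ruby, but the flavour of checks it runs can be sketched with the Python standard library (the helpers below are illustrative, not Serverspec’s API): assert that a port is accepting connections and that a required binary is on the PATH.

```python
import shutil
import socket

def port_listening(host: str, port: int, timeout: float = 1.0) -> bool:
    """Check whether something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def command_installed(name: str) -> bool:
    """Check that an executable (e.g. a required package's binary) is on PATH."""
    return shutil.which(name) is not None
```

Run against a freshly deployed container, a handful of such assertions catch a missing package or closed port before any application test ever executes.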

Keynotes

Angie Jones’ Keynote: A Tale of Testing the Untestable

In A Tale of Testing the Untestable, Angie Jones presented the first keynote of the conference, in which she introduced the 10 Ps of Testability, which describe who and what can make a system change testable or not. Defined by Robert Meaney, the 10 Ps of Testability are as follows:

People, Philosophy, Product, Process, Problem, Project, Pipeline, Productivity, Production, Proactively

With a bit of thought, it is easy to see how all of these factors affect testability. The main takeaway from this keynote was that testability should no longer be an afterthought when developing a feature. A good story provides implicit promises about what the feature should achieve, and understanding those promises allows us to write tests for it. In short, testability should be part of the requirements.

Conclusion

Conferences are a wonderful opportunity to meet like-minded professionals and to discover new ideas that improve your skills. At Catena Media this is of utmost importance, as it allows employees to find new ways of working and improve their day-to-day at the office.

For more from Catena Media, have a look at the Product Team and Design articles.
http://product.catenamedia.com
http://design.catenamedia.com

If you’d like to work with us and our awesome colleagues, have a look at our open positions at CatenaMedia.com — We’re always looking for great talent!
