Jetpack Compose with Robot Testing Pattern

Conio Team
Conio Engineering
Published in
4 min read · Nov 9, 2021

Written by Marco Cattaneo

In recent months Marco Cattaneo, Android Architect at Conio, has been working on a new project based entirely on Jetpack Compose: a new world with new rules, and also a new way to write UI tests.

The purpose of this article is not to explain how to write tests (there is a very interesting article about that in the official guide), but to show how to integrate the Robot Testing Pattern with Compose semantic testing.

A bit of context about Semantic Testing

Jetpack Compose uses semantics to interact with the UI hierarchy. We can picture it as a tree where, basically, every node is a Composable, and on each element we can perform a set of assertions.

(Diagram of the semantics tree. Source: https://developer.android.com/jetpack/compose/testing)

Unlike the old View world, where all our views were tracked by IDs, Compose has no IDs, so we need other strategies to interact with the Composables that make up our screens.

There are many ways to do that: we can use a contentDescription, a testTag, or search for a node with a specific text.

Starting from that, we can interact with all these nodes by searching in these ways:
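The three lookups might look like this. This is a minimal sketch, assuming an instrumented Compose test with a `createComposeRule()`; the screen, tags, and strings (`login_button`, `Company logo`, `Sign in`) are hypothetical examples, not from the original article:

```kotlin
import androidx.compose.ui.test.assertIsDisplayed
import androidx.compose.ui.test.junit4.createComposeRule
import androidx.compose.ui.test.onNodeWithContentDescription
import androidx.compose.ui.test.onNodeWithTag
import androidx.compose.ui.test.onNodeWithText
import org.junit.Rule
import org.junit.Test

class NodeFinderSampleTest {

    @get:Rule
    val composeTestRule = createComposeRule()

    @Test
    fun findNodesInThreeWays() {
        // composeTestRule.setContent { LoginScreen() } // hypothetical screen under test

        // 1. By test tag, set on the Composable with Modifier.testTag("login_button")
        composeTestRule.onNodeWithTag("login_button").assertIsDisplayed()

        // 2. By content description, the same attribute used for accessibility
        composeTestRule.onNodeWithContentDescription("Company logo").assertIsDisplayed()

        // 3. By the text currently displayed inside the node
        composeTestRule.onNodeWithText("Sign in").assertIsDisplayed()
    }
}
```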

How to implement the Robot Testing Pattern in Jetpack Compose

Now that we have some basic notions about semantic testing in Compose and an idea of how to interact with nodes, let's see how to use them with Robot Testing.

If you are not familiar with the Robot Testing Pattern, we invite you to watch Jake Wharton’s talk here: https://jakewharton.com/testing-robots/, as we are not going to explain the pattern in depth here.

Basically, the idea is to have a structure (a Robot) for each screen in our project; this class contains the set of operations that an end user could normally perform on the interface, such as pressing a button, filling in a field, and so on.

As in many tutorials, the best way to explain it is to start from an example:

The robot interface on the left, the screen that we want to test on the right
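As the original image is not reproduced here, a robot interface of this kind might look like the following sketch, assuming a hypothetical login screen (all names are illustrative):

```kotlin
// A Robot describes WHAT the user can do on a screen,
// without saying anything about HOW it is done.
interface LoginRobot {
    fun insertEmail(email: String)
    fun insertPassword(password: String)
    fun tapOnSignInButton()
    fun assertErrorMessageIsDisplayed()
}
```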

As you can see in this sample, we have a set of operations that the user can perform on the screen. Our goal is to implement each of these operations as Compose test code, based on searching for nodes inside the composable UI hierarchy.

And this is the final result: a class that contains a specific implementation for each user interaction.

Isolate test implementations from use cases

The power of this pattern is that we create an abstraction layer to interact with the UI in a declarative way. After that, we can easily write many tests to verify our use cases without boilerplate code, and without the maintenance problems caused by refactoring: if a testTag or a contentDescription changes, we only need to edit the robot class, not all the test code.

With this pattern we can cleanly separate the test implementation from the test use cases. Last but not least, our tests become more readable and elegant.
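A use-case test then reads almost like a user story. This sketch assumes a hypothetical robot class named `ComposeLoginRobot` exposing the operations used below, and a hypothetical `LoginScreen` composable; none of these names come from the original article:

```kotlin
import androidx.compose.ui.test.junit4.createComposeRule
import org.junit.Rule
import org.junit.Test

class LoginUseCaseTest {

    @get:Rule
    val composeRule = createComposeRule()

    @Test
    fun wrongCredentialsShowAnErrorMessage() {
        // composeRule.setContent { LoginScreen() } // hypothetical screen under test
        val robot = ComposeLoginRobot(composeRule)

        // The test describes only the use case: no tags, finders,
        // or other semantics boilerplate appear at this level.
        robot.insertEmail("user@example.com")
        robot.insertPassword("wrong-password")
        robot.tapOnSignInButton()
        robot.assertErrorMessageIsDisplayed()
    }
}
```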

Conclusions

This pattern can be a good way to build an efficient set of UI tests, and it is something you can adopt incrementally, screen by screen, composable by composable.

Marco wrote a sample on GitHub; it also includes some mocking using MockK:

https://github.com/mcatta/compose-robot-pattern-sample
