User Testing in the East Timor hinterland

Mautinoa Technologies
14 min read · Jan 29, 2018


TL;DR: To build our product, we had to take user research to a remote place. This is the story of what we did, what went well, and what didn’t.

Julian Finn leads the Development and Product Team at Mautinoa Technologies, a startup aiming to bring banking-like digital payment solutions to the world’s most vulnerable communities, which have no access to the financial system. The team brings together experts with backgrounds in IT security, software development, finance, humanitarian work, and UX. Together, they devised a digital payment system that does not rely on a stable internet connection or expensive smart-card terminals, but instead builds upon smartphones, smartcards, and NFC technology.

Our Product

Right now our product is a testable prototype, and we have created a partnership with World Vision Timor-Leste and local businesses in order to roll out our platform throughout East Timor, especially in rural areas where subsistence farming is the main source of income.

East Timor nowadays has good mobile network coverage, and most of the country has access to electricity. However, as the country’s only links to the internet are a satellite connection and point-to-point microwave, connectivity is flaky, especially during the rainy season. Any technology that requires a “constant connection” is not an option. Hence, we developed technology that can work offline for extended periods while resisting double spending via some almost-but-not-quite blockchain technology (the subject of a future post).
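As a purely illustrative sketch (this is not Mautinoa’s actual design, which remains the subject of that future post), one generic way to resist double spending offline is to give each card a secret key and a monotonically increasing counter: every payment is signed, and a replayed card/counter pair is rejected whenever devices later reconcile. All names below are hypothetical:

```python
import hashlib
import hmac
import json

# Hypothetical card keys, provisioned at onboarding (illustration only).
CARD_KEYS = {"card-001": b"demo-secret"}


def sign_payment(card_id, counter, amount, key):
    """Create an offline payment: a canonical message plus an HMAC signature."""
    msg = json.dumps({"card": card_id, "ctr": counter, "amt": amount},
                     sort_keys=True).encode()
    sig = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return msg, sig


def reconcile(payments, seen=None):
    """Accept each signed payment once; reject forgeries and replayed counters."""
    seen = set() if seen is None else seen
    accepted = []
    for msg, sig in payments:
        data = json.loads(msg)
        key = CARD_KEYS[data["card"]]
        expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, sig):
            continue  # bad signature: forged or corrupted payment
        ident = (data["card"], data["ctr"])
        if ident in seen:
            continue  # same counter presented twice: double-spend attempt
        seen.add(ident)
        accepted.append(data)
    return accepted


p1 = sign_payment("card-001", 1, 5, CARD_KEYS["card-001"])
p2 = sign_payment("card-001", 2, 3, CARD_KEYS["card-001"])
ok = reconcile([p1, p2, p1])  # the third entry replays counter 1
print(len(ok))  # 2
```

The point of the sketch is only that devices can verify payments without a live connection and detect replays at sync time; a real system also has to handle key storage on the card, lost devices, and partial syncs.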

East Timor

That’s why I travelled as far from home as I have ever been, to East Timor, the eastern part of an island in the far east of the Indonesian archipelago, north of Darwin, Australia.

East Timor is the first new country to be recognized in the 21st century. It gained independence in 2002, after 24 years of occupation by Indonesia, which moved in when the Portuguese left. After a referendum on independence and an outbreak of violence, the UN intervened under Australian leadership. In the surrounding conflict, the country’s infrastructure was badly damaged, which is still quite evident more than a decade later. Malnutrition persists to this day, with high levels of childhood stunting unseen in the rest of the region. (As a 6ft European man, you really tend to feel like a giant there, which is depressing.) Health care is still touch-and-go outside the capital, Dili, which makes travelling a bit more of an adventure than in other parts of Southeast Asia.

In the countryside, subsistence farmers make up the largest portion of the population. Due to climate change, in combination with weather phenomena such as El Niño (or La Niña this past year), the food security situation is quite fragile. On top of all this, the political system is in deadlock, and the oil that is East Timor’s main source of national income is rapidly depleting, with some estimates placing the end of production as early as 2026.

I was invited by World Vision to come here, and after a few days in the capital, Dili, we set out in a 4WD pickup truck for Aileu, the capital of the district of the same name. It’s the second-smallest district in East Timor and lies around two hours south of Dili.

Our goal was to come as close as possible to what we would call user research and user testing with people who live in the villages around the small town of Aileu.

Our “Field Office”

User research in the field

My job is leading the product and development team, so undertaking a user-testing and design-research trip is not the most obvious fit. However, quite a few factors made me the first of our team to travel to East Timor. In the end, I am responsible for making sure that the people in the places we have identified as our first test districts accept and understand our product.

We built our prototype with a certain understanding of societies in developing countries. This means, for example, subsistence farmers with a low literacy rate; communities in the countryside with a low level of technology adoption; and/or poor health care, which leads to many problems, such as people being unable to treat bad eyesight or afford glasses.

Why this situation is different

When we founded Mautinoa, we didn’t know whether our first deployment would be in East Timor, the Fiji islands, or in Syrian refugee camps in southern Turkey. As you can imagine, these groups don’t have too many things in common. They don’t share much in the way of culture or social customs, their access to money is entirely different, and we’re not even talking about visual design decisions that would specifically help with accessibility in the area in question.

Quick example: our prototype design features a guilloche (the geometric pattern often used on bank notes) in the background to roughly resemble money. This is something that almost everybody in Europe would recognize. Here in East Timor, people use the US Dollar, which looks completely different. The whole “look, this app is about money” design idea is completely out of place here.

Prototype Design

Best Practices? Improvisation!

There is a lot in the classical playbook for doing user research. For example, you figure out what you want to know from people and conduct a qualitative interview in which you ask open-ended questions. Or you show the user your prototype and record their reaction, maybe filming this with a camera. Or you discuss particular usage scenarios, figure out what problems people have in these scenarios, and identify how your product may be able to solve this.

This becomes quite a bit more complicated when your interviewees don’t understand the basic concept of “user/market/whatever research”, or even “interviews”, and their language is Tetum. You have to bring a local translator, whose English is far from perfect, so you know from the start that things will get lost in translation.

We knew most of this, and set ourselves the goal of finding out how people would understand the basic user flow of our system and how we could create trust, from the app’s design through onboarding, ultimately reaching a high acceptance rate within the population. To do this, we devised three blocks of roughly fifteen minutes each, giving us 45 minutes per interview with a little leeway and a break afterwards: one hour in total.

The interview questions were set up to capture the basics of people’s lives. We started with simple, trust-establishing questions like “What is your name?”, “Where do you live?”, “What is your occupation?”, and “How old are you?”. Then we went on to questions about their relationship with money, such as “How much money do you have at your disposal a month?”, “Where do you get money from?”, “Who gives you money?”, or “For whom are you financially responsible?” Finally, we wanted to ask about their technology usage: “What technology do you regularly use?” or “With whom do you communicate using technology?”

Interview setup

User Testing

Next, we planned to let our participants explore the app on their own. Beyond that, we had the very specific aim of finding out whether people would understand the concept of “tapping an NFC card to a phone”. Prior to our trip, we had the app translated into Tetum.

Translation, however, is a complex process that can’t fully be achieved with an Excel file. It needs iteration, checking in on every screen of the app, seeing if the context is right, and correcting anything that is difficult to understand. That, however, requires the translator to understand the app itself. It becomes a long process for which we didn’t have the resources, as it would have involved training the local translators.

The interview itself also turned out to be much more challenging than we had initially hoped, as there was no cultural understanding of the situation. Open-ended questions such as “What do you spend money on?” led to a long series of follow-up questions trying to dig deeper. A typical interview went something like this:

Q: “What do you spend money on?”
A: “On rice and oil.”
Q: “Any more?”
A: “No, that’s it.”
Q: “Do your kids go to school?”
A: “Ah, yes, and for tuition and stationery.”
Q: “And do you travel on the bus from time to time?”
A: “Ah yes, I do that.”
Q: “And do you give money to your family?”
A: “Ah, yes, I have a younger brother who studies. I help him out from time to time.”
Q: “And what about buying things for your house?”
A: “Ah, yes, we sometimes have construction work done.”

In this case, it seemed that the concept of “answering in generalities” was difficult, and people only thought about the past few days or weeks.

Asking “What technology do you use regularly?” was met with a blank stare. “Do you have a smartphone?”, a much narrower question, revealed that this concept was quite foreign. “Do you have a phone that can use Facebook?” at least gave us a yes-or-no answer and indicated whether the participant was broadly familiar with smartphone use, but it was obviously far less satisfying. (By the way, Facebook has 400,000 active users in a population of 1.2 million. Many people here are in some way familiar with the concept of using apps.)

This pattern of answering appeared everywhere: the broader and more general the questions got (the very ones meant to start an open-ended conversation), the more confused the participants became about how to answer them.

We started our conversation with something along the lines of “We are from World Vision and we trust you, so we want you to trust us and give us honest answers and help us to make our project better”. Still, as hard as we tried, we still had the impression that people were trying to please us. One interview ended with a man saying, “Thank you for coming all the way from Europe and bringing us the newest technology to help us.” That was cute and nice, but it was also far from the point.

We can only speculate as to why this is the case. Maybe it has to do with World Vision’s popularity within the communities (all interviewees were also participants in WV programmes), or it might be about power structures in place. Or, maybe, it is just the sheer unfamiliarity of the situation.

This, however, absolutely does not mean that the interviews weren’t helpful! We learned a lot about the financial structures of these tightly knit communities, where everybody helps out family members, sometimes loaning money to neighbours. We learned what people spend money on in general. We learned how people use bank accounts (they usually pick up government money once every six months, from veterans’ pensions to school subsidies, and have empty bank accounts for the rest of the year). We learned about the social pressure to help out with excess money, and how financial illiteracy means that people would rather get their pensions twice a year instead of every month, because then they can spend a little extra directly after payday. The interviews not only gave us ideas for countless extra features, they also helped us focus on the core product, which we will now iterate towards a user experience much closer to the realities of people’s lives. The same goes for the business model and questions about rollout in these communities, both of which are now much clearer and backed by actual research.

Showing users our app revealed the next challenge. Asking them to look at the app, play around with it, and show us what they thought mostly resulted in them reading the writing on the buttons out loud. We had brought a 3D-printed case with a gooseneck camera attached, to film users’ interaction with the app.

This was all great in theory. Unfortunately, people tended to stare at the screen, too scared to really play around with it, mostly reading it without understanding anything. So we resorted to asking them to tell us what they thought the app does. The basic concept of “design” is foreign here, so any additional request to “explain your thoughts about what you see” was lost. In general, though, this exercise was quite insightful, as it showed us people’s hopes and expectations of how their lives could change.

Explaining to our users what the app would do and recording their reactions also got some results we absolutely did not expect. We had envisioned that people would be keen on having a solution that didn’t require them to travel to Dili for certain transactions or that would make it easier to get government money (or remittances from family abroad) than queuing at the bank for two hours (plus another 2–3 hours to get to that bank). As it turned out, they were, but some users also gave answers that surprised us.

One benefit people saw is that a digital payment system would hide the amount of money people have. These communities are tightly knit, and “having money” immediately creates social pressure to give some of that money to family, whether as support in tough times, for their children’s tuition, or for family events such as weddings or funerals.

People also explained that they liked the idea of not losing their money anymore: over 50% of the participants complained of having holes in their pockets, and they liked that the card would keep their money safe. As financial literacy is not widespread, people tend to spend all the money they own without saving any, so “this card can help me save some money” was an answer we heard often.

Users also understood the concept of tapping a card to the phone. We had worried about how to communicate this, but we had much less trouble than we originally expected. Even the older man we interviewed had an easy time with it. His complaints were more about the general process of using the app and not being able to read the writing, which is obviously something that can be fixed.

By the way: yes, user testing in rural communities is about like you imagine. At one point, a participant had to chase out a chicken that entered through the door to see what we were doing there.

Participant chasing out a chicken

Local Economy

The next morning, we took time to go to the local market, where farmers sell their produce.

The items on sale are mostly fruit and vegetables, as well as clothes that are clearly from Western charity donations. Local farmers sometimes undertake a two-hour journey to sell their goods, sitting in the sun all day, shaded only by an umbrella.

The local economy is largely based on this market, and you can find small kiosks everywhere. Supermarkets are non-existent. One larger store, run by a Chinese man, sells whatever else people need, including hardware and building materials.

Market Day in Aileu

Further Interviews

After two days of interviews, we had the feeling that we were getting the same answers over and over again. People here are way less individualistic than in Western countries, and most lives are pretty much the same. Also, the answers and our expectations differed significantly, so we tried something new and got a group of four people into a discussion. This, we hoped, would get people to open up, as the original setup — three researchers (my World Vision contact, the interpreter, and me) to one user — was likely an intimidating scenario for our users. By making the interviewees a majority, we hoped to gain better insights.

This partially worked, as the participants engaged in a broader discussion. However, a lot was also lost in translation, as the interpreter had a difficult time catching up.

Interestingly, one participant asked about security a lot. East Timor is not a very violent or crime-ridden place, but he consistently asked about being robbed. He was older, so this probably came from his experiences during the struggle for independence.

Conclusion

To recapitulate: user testing in these environments is anything but easy. Locality, the language barrier, cultural differences, and overall literacy all pose challenges.

Translation

A great deal was complicated by the language barrier. We were prepared for this, but we still cannot say how well our questions were translated, whether they were posed as “open-ended” as we had requested, and what else was said between the translator and the interviewee. This could be fixed by hiring an external, professional interpreter, but greatly depends on the budget and that individual’s availability.

Locality

Distractions from noise and surroundings, suboptimal furniture, and inadequate lighting all make the interview itself, as well as recording it, more difficult. Not every place has electricity, and travelling halfway around the world limits the equipment you can bring. Be prepared for animals like chickens, rats, or dogs running into your setup, motorbike noise, sudden weather changes, or a simple lack of privacy disrupting your workflow, and keep these possibilities in mind when choosing equipment: they are just some of the things that can make recording sessions with audio or video difficult.

Cultural differences

Unless you have a design research expert, or at least a sociologist, who is familiar with the local culture and who can lead you towards asking the right questions in the right way, you will encounter difficulties. Many of these are pointed out throughout this post, but there are surely also questions of gender and power structures within the community, as well as equality issues between the interviewer and the participants. If you are white and have travelled across half the planet, or if you work with an aid organization that has been deeply involved in improving the community’s life for years, chances are that your interview subjects will approach you with hopes and expectations, and there is simply no way to meet as equals. This has to be taken into account for qualitative interviews as well as user testing.

Literacy

Literacy can mean many different things. Being able to read and write is the baseline, but on top of that there are other aspects of literacy that people in Western countries may take for granted. Financial literacy among your user base is a key factor when developing a payment app; we anticipated that we might encounter issues here, but the interviews showed us this much more clearly. Technological literacy was one of our key research targets, and had we not anticipated this, our testing would have been stuck at a very early stage.

We also struggled with our own cultural literacy — that is to say, how to communicate our questions to our users in a way that was natural for them. Especially when translation is bumpy, this can be a difficult challenge that we definitely haven’t completely mastered.

Overall

We never expected to learn everything we needed to know in order to build a perfect product with just three days in the field and no further iteration. However, the goal of getting a better understanding and laying the foundation of adapting our product to the needs of 1.2 million Timorese worked out fairly well, given the circumstances. Given the solid funding of the project, our next steps are incorporating our findings into our prototype, doing more user research, and then iterating from there.

Next Steps

Our partners at World Vision will continue to do more interviews with segments of the population that are likely to be less challenging to approach, such as young students in the capital. The research also gave us enough insight to work on the app’s design and then resume user testing with a new version. From there, we will start a test project with some ten or twenty thousand users, learn, iterate, and create a strategy for mass adoption. The next months are going to be interesting.

Acknowledgements: Thanks to Emerson Tan, Jürgen Geuter, Henning Grote, Michael Finn, and Meredith L. Patterson for feedback and proofreading.
