TALL App Global Settings

Gaby Cluff
Jul 27, 2017

Client: Missionary Training Center
Status: In Development

The TALL (Technology Assisted Language Learning) apps needed consistent global settings for selecting and managing language options, adjusting audio and activity-specific settings, giving feedback, and signing in and out of the app. While design work was mostly finished when I joined the project, I gained valuable insights from usability testing with junior and senior missionaries and from implementing simple solutions.

Potential Problems

This project was almost finished when I took over for a departing designer. As I looked through what had been done so far, I wondered if the experiences for language management and reporting feedback would be clunky or confusing to the users.

After asking lots of questions and reviewing designs with the product manager, we realized there had been a gap in communication with the previous designer, and little to no usability testing had been done to verify decisions. We decided to take a step back and take time to test before pushing Global Settings into production.

Testing Current Features

The Process

The first step was to check if the overall settings experience was confusing or irritating to users. I prepared a list of tasks for users to accomplish in an InVision prototype (included later on), with specific things for us to look for and questions to ask throughout the test.

Tasks

  1. Sign in
  2. Select Language
  3. Open Settings
  4. Download an additional language (Japanese)
  5. Switch between languages
  6. Delete or remove a language (Japanese)
  7. Open Feedback modal
  8. Give a suggestion
  9. Send feedback
  10. Sign out

Things To Look For

  1. How intuitive is it to delete a language, especially on iOS? For senior missionaries?
  2. Can users find, write, and send feedback through the modal?
  3. Do they read and comprehend dialogs?
  4. How easy is it to exit settings?

Our Findings

Throughout the testing process, we identified three main findings.

Swipe to delete
Tap Manage to enter delete mode
  1. All users familiar with iOS devices understood and expected the swipe-to-delete interaction for removing languages. Surprisingly to me, that included the senior users we tested.
  2. If a user wasn’t as familiar with iOS, they just as quickly knew to tap Manage and edit languages from there. There was definite value in having both methods of language management.
  3. Processes for sending feedback and signing out of the app were fairly straightforward and clear. Few if any users had difficulty completing the tasks we gave them, even if the younger users didn’t read dialogs very carefully. :)

From these findings it was clear that the previous designer had made accurate decisions when it came to UX. Although it could have been easy to dismiss this testing as a waste of time, its value lay in the fact that we now knew the experience worked well for users. We could pass these designs on to developers with full confidence.

There was also quite a bit of value in what I personally learned from this process. As we met with users, I realized that having specific tasks and objectives helped to create a more focused and productive testing experience, as well as one that was less confusing or uncomfortable for the user.

Preparation and practice helped me communicate more clearly and effectively with the users. Thinking questions through beforehand weeded out the bad ones from the good, and gave me a chance to catch and discard any potentially leading questions.

Although the tasks stayed largely the same throughout testing, it was interesting to see how my abilities and awareness improved. I got better at framing each task as part of a bigger story, giving the user context instead of direction only. As my questions improved, answers and feedback became more open-ended instead of simple yes/no. And I found that the more I prepared before jumping into a test, the more confident I was in listening to the user and going with the flow of the test.

Adding Language Pairs

Sketches for language selection layouts

As I finished testing current global settings features, user stories for selecting a language were also updated. Previously, it was only possible to access language content to or from English; soon the TALL apps would give users the ability to go from any language to any other among the almost 60 languages TALL offers.

Designs needed to be updated, guiding the user through selecting their native language as well as the language they wanted to study. And of course, we’d need to conduct another round of global setting user testing!

InVision prototype

For this stage of testing, we used the InVision prototype included below.

Tasks & Questions

  1. Sign in
  2. Pick your first language
  3. Pick your second language
  4. What if your native or mission language isn’t there? What would you do?
  5. Confirm your language choices — explain what is happening on this step
  6. What would you do if you accidentally picked the wrong language?
  7. Let’s say you need to learn Portuguese now. How would you add a new language pair?
  8. Sign out

One of the biggest questions we wanted to answer through testing was this: which language should a missionary select first, native or mission? Is there a right or wrong answer here? What do missionaries expect? Will one order be less confusing than the other?

Things To Look For

  1. Which language does a missionary pick first? Native or mission? Do they read the labels/headers on screen?
  2. Does the order of a language pair make sense? English to Spanish, or native to mission?
  3. Is it clear they’re adding, not removing or replacing, language pairs?
  4. Is it confusing or helpful to experience the same selection process for every language pair?
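The third question above, whether users understand that pairs are added rather than replaced, can be modeled as a small additive collection. This is a hypothetical TypeScript sketch; the type and class names are mine, not anything from the actual TALL codebase.

```typescript
// Illustrative model: language pairs live in an additive collection,
// so adding a new pair never replaces an existing one.
interface LanguagePair {
  native: string;  // e.g. "English"
  mission: string; // e.g. "Spanish"
}

class LanguagePairStore {
  private pairs: LanguagePair[] = [];

  // Add a pair only if it isn't already present; existing pairs are kept.
  add(pair: LanguagePair): void {
    const exists = this.pairs.some(
      (p) => p.native === pair.native && p.mission === pair.mission
    );
    if (!exists) this.pairs.push(pair);
  }

  // Both swipe-to-delete and the Manage screen remove exactly one pair.
  remove(pair: LanguagePair): void {
    this.pairs = this.pairs.filter(
      (p) => !(p.native === pair.native && p.mission === pair.mission)
    );
  }

  list(): LanguagePair[] {
    return [...this.pairs];
  }
}
```

Under this model, adding a new Portuguese pair (as in task 7) leaves an existing English-to-Spanish pair untouched, which is exactly the expectation the testing question checks against the user's mental model.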

To help myself think through the language selection process, I decided to create a physical wireframe using note cards and labels.

Language selection and how it fits within global settings

The biggest benefits of this approach were that I could iterate quickly with low costs, and better visualize how the task flow could function without getting distracted by UI details.

Solving for the Right Problem

As we began going through our prototype with missionaries, we soon realized that it didn’t matter whether we asked them to select their native or mission language first. They didn’t read the screen headers initially anyway, and most missionaries expected to only select their mission language. If we asked them to pick native first, they often automatically selected mission instead. If it was mission first, they seemed confused at needing to select an additional language.

With neither option producing the desired result, we realized we were solving for the wrong problem. Since the majority of our users are native English speakers, it was very likely that they assumed the app already knew their native language, since all instructions are currently given in English. How then could we ensure users’ choice of native language is a conscious one without creating a confusing or slower experience?

Condensed vs expanded view

The solution was surprisingly simple — give visual priority and quick access to the most likely option, but include a clear way to view additional choices.

We limited the view of available native languages to one or two suggested options, but hinted at the dozens of other available languages the user could choose from if the suggestion didn’t match their needs. We also decided it made sense to have the user select their native language first, as it’s the language paradigm they’ll be coming from.

Testing Results

As this updated language selection flow was tested with users, it was amazing to see how a small change could help the process go so much more smoothly and accurately! Every native English-speaking missionary we tested with no longer mixed up their native and mission languages, and had little to no confusion throughout the process. And this speed and accuracy wasn’t limited to English speakers.

One testing session that stood out was with a young missionary from Switzerland, learning English. When we asked him to go through and select his languages, he immediately expanded the native language options to find German, made his choice, and proceeded with the tasks we’d given him. It was then that I knew we were on the right track, and that our solution was useful not just for English speakers, but the thousands who may use these language apps around the world.

What I’ve Learned

  1. You’ll never know what you don’t test. In UX design, ignorance is not bliss — it’s stupidity. And though you can’t test for everything, you can’t afford to assume users will know how to navigate your product. Thankfully, however, in this instance we were able to verify that we weren’t too far off the mark.
  2. Testing well requires asking a ton of questions and being willing to iterate and test over and over and over again. But the time we spent in discovery and research saved us the time, effort, and money of going too far down the wrong road.
  3. There are too many occasions where we overthink and complicate what the user needs, getting caught in the eddy of hypothesizing and deliberation. Often a simple solution can be found within the components we already have. We only need to be willing to test it out.
