Applying research and analysis to new product design and testing

Our four-step process for moving from pen-and-paper sketches to a complete product design for our location-aware local news app.

Author’s note: Last month I wrote about how I kicked off my first design project in the Lenfest Local Lab with research and analysis for an app that lets local news find you in Philadelphia. Give it a read now, or skip ahead to read about the second phase of our UX design process.


After research, how we dove into design

A common user experience design process moves through three broad phases: (I) research and analysis, (II) design, development, and testing, and (III) launch and iteration. For our project, we had completed our research and analysis and were ready to start designing, developing, and testing our app.

It’s worth noting here that while most of the UX design process fits into those three categories, the actual process of moving through them isn’t always linear — especially if a team is building a new product and testing a lot of assumptions.

Product teams should be flexible and patient, moving through the phases in a way that ensures they actually get where they want to go. Teams might need to repeat a phase — like research — if their initial assumptions weren't validated. Or they might need to temporarily skip a phase to develop a feature ahead of having full designs. They might even end up working on two phases at once, simultaneously developing and testing small features with users.

The point: it’s important to be smart and flexible with the sequencing of your UX design process, and never to launch something new without going through each phase at least once.

For the Here app, we moved from research and analysis into a hybrid design, development, and testing phase. Below I’ll describe the four steps we took and show you how the designs evolved over time. Let’s dive in!


Step 1: Create low-fidelity wireframes with placeholder items to define the app’s flow

To move beyond paper sketches of the app, I used the design tool Sketch to create simple, low-fidelity wireframes. I started by wireframing two key features of the app: the screens people see after downloading the app (called onboarding screens) and the app's notifications. The onboarding screens are key because they set the tone for people's experience with the app, and they're also where we ask people to grant the three permissions the app needs. The notifications are key because, in a perfect world, people will never need to open the app to get value from it. They'll simply receive a notification when they walk by a place that has a local news story attached to it, and, if they have time and interest, they can read the story right there in front of the place where it happened.
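To make the core mechanic concrete, here's a minimal sketch of the "walk by a place, get a story" trigger. This is illustrative only, not the app's actual code; the function names, coordinates, and trigger radius are my assumptions.

```python
import math

# Hypothetical trigger radius: how close a user must be to a story's
# location before a notification fires. The real value is an app decision.
TRIGGER_RADIUS_METERS = 150

def haversine_meters(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def near_story(user_pos, story_pos):
    """True when the user is within the trigger radius of the story's place."""
    return haversine_meters(*user_pos, *story_pos) <= TRIGGER_RADIUS_METERS
```

In practice a mobile app would lean on the platform's geofencing APIs rather than computing distances by hand, but the idea is the same: a story is pinned to a place, and proximity to that place is what triggers the alert.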

In a low-fidelity wireframe it's typical to use placeholders for photos and icons and placeholder text (the standard lorem ipsum). During this phase you account for every "flow," or path, a user might take through your product (in our case, an app). For example, we included a "fallback screen" in our onboarding wireframes for cases where a user doesn't grant the app one of the permissions it needs to function. The fallback screen prompts users to go to their phone's Settings to enable the missing permissions.
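The onboarding flow described above can be modeled as a simple decision sequence: each permission request either advances the user or routes them to the fallback screen. This sketch is illustrative only; the permission names and screen identifiers are my assumptions, not the app's actual implementation.

```python
# Hypothetical permission names; the real app's list may differ.
REQUIRED_PERMISSIONS = ["location", "notifications", "motion"]

def onboarding_next_screen(responses):
    """Decide which screen to show after the onboarding permission prompts.

    `responses` maps a permission name to True (granted) or False (denied).
    A denied or unanswered permission routes the user to the fallback
    screen, which explains how to enable it in the phone's Settings.
    """
    for permission in REQUIRED_PERMISSIONS:
        if not responses.get(permission, False):
            return ("fallback", permission)
    # All permissions granted: drop the user into the main app interface.
    return ("map_view", None)
```

Wireframing every branch of a flow like this, including the unhappy paths, is exactly what the low-fidelity stage is for: it's much cheaper to discover a missing fallback screen in Sketch than in code.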

Here are our initial low-fidelity wireframes:

Onboarding screens

List view and Map view

List view shows stories in a list; Map view shows stories in a scrollable carousel.

Notifications

Wireframes include a screen showing an expanded notification and an article being displayed in the app.

Step 2: Create mid-fidelity wireframes with actual content to optimize for real scenarios

After putting together low-fidelity wireframes and accounting for all of the user flows, I moved on to making more detailed mid-fidelity versions by adding real text and images pulled directly from the Philadelphia Inquirer articles that would be in the app. We did this to test the designs with actual content and see if there were places where the design needed to adapt. I also started adding placeholder graphics and a few icons to the onboarding screens, for the Enable Location and Enable Notification buttons. After three iterations of the mid-fidelity wireframes, we ended up making a few changes to the app:

1. We removed list view, making map view the main app interface.

We felt this would help differentiate the app from many other news apps that present lists of news headlines. Since we want to encourage a new type of local news discovery, we removed the list view to reduce the likelihood of people using the app like they would a traditional news app.

2. We added a Settings page to control app permissions.

We added a Settings page to give users more control over what device permissions the app could access. We also thought the page would be a place where people could choose to add more types of local news stories in the future.

3. We decided to link directly to Philadelphia Inquirer articles.

Linking directly to Inquirer articles reduced our technical scope, since we didn't have to design and develop a custom article view. It also aligned the app with the Inquirer's existing meter strategy, increasing the likelihood that someone who hit the meter while trying to read a story about the place they were standing might become a subscriber.

The last version of the mid-fidelity wireframes is shown below:

Step 3: Create a visual design that can be used in high-fidelity wireframes

Now that our set of mid-fidelity wireframes was mostly complete, we were ready to create the visual design of the app. My first step was to create two mood boards with differing visual styles for the team to review. The first focused on the vivid and lively theme of arts, culture, and architecture journalism, the types of stories we were aiming to include in the app. The second zeroed in on a sleek, modern urban theme, since the app was going to focus on surfacing local news about Center City Philadelphia.

A mood board usually includes suggestions for app colors and fonts. To emphasize the vividness of everyday life in the city, I gave the first mood board warmer tones, including pale yellow and a light blue accent color, and chose two sans-serif fonts, Work Sans and Lato, to give the app a clean yet casual look.

For the second mood board, which had a more urban and modern feel, I chose different shades of blue, from light to dark, to reflect the colors you might see downtown, such as the deep blue glass of a skyscraper. I combined a sans-serif font, Roboto, with a serif font, Lora, to convey a tech-forward look.

Two mood board examples to showcase different themes for the app

The team decided to move forward with the first mood board because we wanted the look and feel of the app to truly reflect the first set of stories that would be included — local arts, architecture and real estate news. We also thought that since it had a more dynamic color palette, it would give the app room to grow visually as more types of stories were added. This decision gave me the direction I needed to move forward with high-fidelity — and the most detailed — wireframes. In addition to applying the color and fonts to the interface design, I also used the colors to create the onboarding graphics and the app icon.

High-fidelity wireframes of the onboarding screens

Step 4: Create high-fidelity wireframes and iterate

The app’s final design and feature set have gone through a few iterations — even since the first draft of this post.

We decided to add a motion detection feature to the app.

We added this feature in response to feedback during testing. A few people reported receiving a lot of notifications while driving or riding in an Uber. This wasn’t a great experience: people didn’t have time to read the articles, and they passed by the locations too quickly to read each story in the place it was written about, which is the main goal of the app. The app also sends each notification only once to avoid over-notifying, so people who received alerts while in a car wouldn’t receive them again if they later walked by the place a story was written about.

The motion detection feature, which is only activated when someone gives the app express permission during onboarding, detects people’s movement and tries to send alerts only when someone is walking. Users who still want to receive notifications while they’re in a car, on a train, or on a bus can turn motion detection off from the app’s Settings screen.
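The two rules at play here — each story alerts at most once, and alerts are suppressed unless the user appears to be walking — interact in a way worth spelling out. This is an illustrative sketch under my own assumptions (the class, activity labels, and story IDs are hypothetical), not the app's actual implementation:

```python
class NotificationGate:
    """Decides whether to send a story alert, combining two rules:
    once-per-story delivery and optional motion-based suppression."""

    def __init__(self, motion_detection_enabled=True):
        self.motion_detection_enabled = motion_detection_enabled
        self.already_sent = set()  # story IDs we've already alerted on

    def should_send(self, story_id, activity):
        """`activity` is a hypothetical label such as 'walking' or 'driving'."""
        if story_id in self.already_sent:
            return False  # each notification goes out only once
        if self.motion_detection_enabled and activity != "walking":
            # Suppress the alert, but crucially do NOT mark it as sent:
            # a driver who passes a story's location can still get the
            # alert later, on foot.
            return False
        self.already_sent.add(story_id)
        return True
```

The ordering matters: if a suppressed alert were counted as "sent," motion detection would silently burn a story's single notification while the user drove past, which is exactly the problem the feature was meant to solve.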

Before we implemented this feature, I made a quick paper prototype and tested it with six people in the office who had little or no experience using the app. The purpose was to see if they would find the motion detection feature useful and whether they would want to see this feature option during the onboarding process. The results indicated that:

  1. Most people don’t like to receive notifications while they’re driving.
  2. It’s nice to have the option to turn the motion detection on and off.
  3. The motion detection request should be included in the onboarding so people know this feature exists.

Based on the results of my quick paper prototype test, we made the updates needed to the high-fidelity wireframes.

The new onboarding screen requesting permission to detect motion

Other iterations we made

In addition to adding the motion detection feature we also made a few other iterations to the map view, app navigation and Settings.

Map view location pin design change. On the map view we changed the style for the story pins from “transparent fill” — which means you could see through them — to “solid fill” because the overlapping pin outlines looked messy when they were close together.

Added a navigation bar. We added the navigation bar later on because the map alone didn’t seem to convey the new app’s identity.

Moved the Settings icon into the navigation bar. Once we added the navigation bar, the Settings button, which had previously floated over the map view, moved into it, since that’s the standard placement when an app has a navigation bar.

Added a Motion Detection control to Settings. It wasn’t enough to simply add motion detection to the onboarding screens and call the feature done. We needed a toggle for it in the Settings screen so people could turn it on and off, just as they can the permissions to send notifications and access location information.

Added basic app information to Settings. We added basic information about us and our app to the Settings screen, including our new privacy policy and terms of service, as well as a button for people to press and share their feedback with us.

Wireframes for the new map view and Settings screen

Up Next: Launch!

The Here app should be available for download in the next few weeks, and you’ll be able to see all of the final designs and features after downloading it.

Philadelphians and neighbors, we hope you download it and give it a try. We’re looking forward to hearing your feedback.

Local news teams elsewhere in the country: if you’ve been thinking about building a similar app in your area, let us know. Part of our mission is to share what we learn — and what we do — with other local media organizations, to help new products take root in Philadelphia and beyond. We’re happy to collaborate, and we’ll be posting design assets as well as app code online in the next few months.


The Lenfest Local Lab is a small, multidisciplinary product and user experience innovation team located in Philadelphia, PA, and supported by The Lenfest Institute for Journalism.

The Lenfest Institute for Journalism is a non-profit organization whose mission is to develop and support sustainable business models for great local journalism. The Institute was founded in 2016 by entrepreneur H.F. (Gerry) Lenfest with the goal of helping transform the news industry in the digital age to ensure high-quality local journalism remains a cornerstone of democracy.