
Three Product Development Lessons from a Human-centered Startup

Dave Masom
Published in The Startup · May 23, 2018 · 8 min read

Developing a product that isn’t an app or a SaaS is tough. Most advice about product development, even if it can be applied broadly, starts from the assumption that you are building a software product. By adapting these tools and techniques, I have come to appreciate the advantages of having direct user interaction as a core part of our product. In this article, I’ll share three lessons I’ve learned from my experiences managing a technology-enabled, human-centered product at Pack Health.

Think of MVP as a toolbox, not a tool

Most product managers would say that getting feedback early from users is a good thing, and that the best method of obtaining that feedback is to get users interacting with real products. But what does that look like? A common approach is to build a Minimum Viable Product (MVP).

There has been a lot of debate about what exactly an MVP is, or whether it is even the right term to use. I prefer to think about MVPs as a toolbox, rather than a tool. To practically implement an MVP, you need to consider what type of MVP best fits your situation. At Pack Health, we typically use a couple of types of MVP that we have found work best for us.

Concierge MVP

Our go-to model is the “concierge” MVP. In a concierge MVP, you manually perform the steps of your proposed solution to your customer’s problem. As a health coaching platform, we have a large number of employees who interact with our users every day. Adding a well-designed concierge MVP to their workflow is therefore relatively low cost.

Some advantages and disadvantages of a Concierge MVP

Using concierge MVPs has allowed us to test numerous ideas and iterate on the best. We’ve used this concept to test hypotheses related to the behavior change models underpinning our programs, the frequency and content of messaging, and how and when to celebrate members’ progress, as well as to explore entirely new services to add to our membership.

As well as validating our ideas, we often get insights from our coaches into how to solve the problem at scale. They can tell us which parts of the MVP were most difficult to deliver, where there were points of friction with the member, and which elements were most repetitive. These issues can highlight potential candidates for automation. Finally, some of the ideas we’ve tested this way have been complex. Humans are great at managing this complexity and finding shortcuts through it.

Piecemeal MVP

The second most common MVP we use is the “piecemeal” MVP. This is where you cobble together existing tools to create your product. In fact, this was the first MVP Pack Health used, although we didn’t have the words for it at the time. When we first started developing the platform that our coaches use to manage their members, the idea was to hire some developers to build the software from scratch. We quickly realized this was going to be very expensive. At the same time, we identified that the core functionality we needed was very similar to a call center’s: the ability to track users and their issues, share solutions, and so on. So we pivoted and built our platform on a customized version of Salesforce’s Service Cloud.

Some advantages and disadvantages of a Piecemeal MVP

Leveraging existing tools when developing a new product or feature should be a no-brainer. Why reinvent the wheel? I think it often comes down to pride: “our product is unique” or “we can do it better.” Well, maybe. But why not get most of the way there with minimal effort, and then invest your resources in addressing the existing tools’ shortcomings? We no longer use Salesforce for every part of our product, but it’s still the backbone of our platform. Where we have developed our own features, it’s because we have hit a true limitation, and not just an internal preference. And we’ve done that with a small development team, because we’ve enabled them to focus on only a select set of features.

We’ve experimented with other forms of MVP as well, depending on what product or feature we are trying to test. Deciding what kind of MVP is best for your situation helps ensure you are building to learn, not building half-baked products.

Experiment creatively

At its core, our product is about the relationship and rapport that develops between our coaches and their members. Our coaches work with each member as an individual, with their own set of motivations, challenges, and circumstances. It is an intensive, months-long engagement process. As a result, we invest lots of time with a relatively small number of members.

This focus on individuals has the potential to conflict with a data-driven decision-making ethos. In the absence of large amounts of data, how should you make decisions? As with choosing a type of MVP, you need to pursue the right kind of evidence for the decision you are making. In the early days of Pack Health, we tried to run A/B tests on populations that were too diverse and too small to show a credible effect. This caused frustration, and rather than use a more appropriate method to gather evidence, we often made decisions on intuition.
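To make the small-sample problem concrete, here is a minimal sketch of a power calculation. The effect size and group sizes are hypothetical, chosen only to illustrate the point rather than taken from real Pack Health experiments:

```python
# Hypothetical power calculation (illustrative numbers, not real data):
# how many members per arm does an A/B test need to detect a small effect,
# and how little power does it have with a small group?
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Assume a change in messaging cadence shifts an engagement score by a
# "small" standardized effect (Cohen's d = 0.2), tested at alpha = 0.05.
needed_per_arm = analysis.solve_power(effect_size=0.2, alpha=0.05, power=0.8)
print(f"Members needed per arm for 80% power: {needed_per_arm:.0f}")  # ~394

# With only 40 members per arm, the same test detects the effect roughly
# one time in seven, so a "null" result tells you almost nothing.
power_at_40 = analysis.solve_power(effect_size=0.2, nobs1=40, alpha=0.05)
print(f"Power with 40 members per arm: {power_at_40:.0%}")  # ~14%
```

With only a few dozen members per arm, even a real effect will usually go undetected, which is why we had to reach for other kinds of evidence.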

Reframe your definition of data

The antidote to this challenge has been to expand our definition of data to one that everyone buys into. For example, I used to get frustrated with what I saw as knee-jerk responses to customer requests from our sales team. Refusing these requests outright was not helpful. Instead, we reached a consensus around the following principles:

1) These requests are an early form of data.
2) They are not sufficient validation in themselves.
3) We will commit to validating these ideas (and/or alternatives to them).

In fact, these requests are our sales team’s own form of experimentation, answering the question of which ideas hook our customers. By reframing them as experimental hypotheses, we have been able to avoid full-blown feature development in favor of MVPs. This is a win-win: the sales team gets something to offer customers faster, and we avoid sinking resources into a request that may be unique to one customer.

Assorted examples of evidence

We have found Design Sprints particularly helpful for developing this kind of consensus. There’s something for everyone. For the analytically minded, there is a clear structure and process. For the creatively minded, there are “crazy 8s” exercises and storyboarding. Getting buy-in for running Design Sprints has helped us promote the value of other learning-driven approaches, such as A/B testing.

Above all, I’ve learned that validation does not require statistical certainty. It requires sufficient evidence for you to be comfortable with a decision.

You are a data company

Data is critical to success in modern companies. It is inherent in software products, whether it is used or not. But the link between a health coaching service like ours and data collection is not so clear cut. Despite this, data quickly became central to our business model in three ways: impact, innovation and income.

The first use we found for data was showing the impact of our program. With the rise of value-based payments, clearly demonstrated outcomes can be a distinct competitive advantage, and proving those outcomes requires data.

Many of our competitors have evidence of their outcomes. Most have published studies in academic journals demonstrating those outcomes (as we have). However, early on we decided to go further. We embedded outcomes data collection into the engagement process itself by asking our members to tell us about their goals, habits, and challenges, and tracking these over time. This allows us to demonstrate our impact for specific populations, clients and geographies.
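As a sketch only (the field names and summary below are hypothetical, not our actual schema), the core idea is to treat each patient-reported answer as a timestamped record, which makes it straightforward to roll up change over time by population or client:

```python
# Hypothetical sketch of longitudinal patient-reported outcomes: each answer is
# a timestamped record, and change from baseline can be summarized per group.
from collections import defaultdict
from dataclasses import dataclass
from datetime import date

@dataclass
class OutcomeRecord:
    member_id: str
    population: str   # e.g. a client, condition, or geography
    metric: str       # e.g. "self_reported_active_days_per_week" (invented name)
    value: float
    recorded_on: date

def average_change_from_baseline(records: list[OutcomeRecord]) -> dict[str, float]:
    """Average (latest value - earliest value) per population, for one metric."""
    by_member: dict[tuple[str, str], list[OutcomeRecord]] = defaultdict(list)
    for record in records:
        by_member[(record.population, record.member_id)].append(record)

    deltas: dict[str, list[float]] = defaultdict(list)
    for (population, _), member_records in by_member.items():
        member_records.sort(key=lambda r: r.recorded_on)
        deltas[population].append(member_records[-1].value - member_records[0].value)

    return {pop: sum(values) / len(values) for pop, values in deltas.items()}
```

Even a simple structure like this is enough to answer client-specific impact questions long before any machine learning is involved.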

In addition, these patient-reported outcomes are useful to our coaches and to our product and program teams. Using this data, we are able to find innovative ways to help our members address the barriers they face, on an individual level or more broadly. We can also validate those approaches over time. And as we scale, this data is informing our first forays into augmenting our platform with artificial intelligence.

Finally, the data we have been collecting has proven to be valuable to our partners and customers. Many healthcare organizations want to understand their patients better, to better serve them. Our dataset provides new insights into the challenges patients face beyond the doors of the doctor’s office. This data is hard to collect without the relationship we have with our members.

Don’t wait for scale

How do you figure out what role data plays in your company? Take the time to answer this question, because embedding data collection into your product at the start is far easier than retrofitting it later. Don’t wait until you have “enough data.” Because data collection has been embedded in our product since the early days, and because that data is tied to the mission of our company, our employees take ownership of ensuring we collect it robustly and completely.

It took us many iterations to figure out what metrics we needed to track and the questions we needed to ask. Iterating with small numbers allowed us to hit the ground running as we gained traction. You do not need to have a technology product to collect data. When we started, our coaches asked our members questions over the phone (and still do when needed). In fact, as I mentioned above, figuring out how to collect this hard-to-gather, human-centered data has helped us find a path to profitability.

Conclusions

With this article, I have summarized a few lessons I’ve learned from applying product development methodologies to a technology-enabled, human-centered startup:

1) select the right kind of MVP for your business and the problems you are trying to solve;
2) design and run experiments with pragmatism; and
3) develop a data strategy early on.

The specific examples I’ve given are not meant to be prescriptive. They are examples of ways you can leverage common principles in the tech startup world to suit your own set of business challenges. That’s what those principles are all about.

What lessons have you learned in applying product development ideas to your project or venture? What models have you adopted and which have you discarded? I’m eager to hear your stories and hear your thoughts on mine!


Dave Masom

CPO @ Conserv, former VP, Product @ Pack Health. London, UK → Birmingham, AL. Writing about product development, social impact and psychology.