Mission Possible: Assisting Mission Collaborative in Service Design

Matthew Johnson
Oct 11, 2019


The Mission (Introduction)

Our mission — whether we chose to accept it — was to analyze and evaluate almost every aspect of Mission Collaborative (MC), a startup company that helps career changers find their next big thing. About two-thirds of Americans report job dissatisfaction, and many of them suffer through it rather than risk the reduction in salary or benefits that could come with a career change. The challenge is helping them discover what they want and how to get it.

We were assigned to teams of three or four and briefed on the five W’s of our client by its founder, Grant Schroll. Mr. Schroll informed us that MC has served 2,000 career changers so far and seeks to address their needs through education — specifically an eight-hour “bootcamp” and a 30-day self-guided online course. We were asked to focus our energies on the latter, which is the manifestation of Mr. Schroll’s unique “career design process.”

We were given two briefing documents outlining the main deliverables, and we will present our results as follows:

  1. Competitive Analysis
  2. User Personas
  3. Journey Maps (by Persona)
  4. Data Analysis and Visualization
  5. Suggestions on Data Collection
  6. Usability Tests and Accessibility Audit
  7. Final Recommendations

Each of these will be described in detail after we briefly discuss our process.

Our Process

Our professors (at General Assembly) introduced us to the “scrum” prior to the launch date of our project. A scrum is basically an operational meeting where team members share what tasks they’ve completed, what tasks still need to be completed, and any significant challenges or obstacles. We began in this way each day around 9:30 a.m. and usually moved — more or less organically — into a work session. Some tasks, such as interviewing and data visualization, were done individually, while others, such as affinity mapping or journey mapping, required more teamwork. We shared frequent updates and epiphanies through Slack, even over weekends and late at night. We worked simultaneously where possible, even remotely, using Miro and Google Docs. We also consulted outside authorities when necessary.

Competitive Analysis

Finding MC’s competitors was no straightforward task. Career-change services come in many shapes and sizes, and it’s not always clear what they do based on the name or even the mission statement, so comparing MC to Career Pivot and Career Shifters is not an apples-to-apples comparison. Nonetheless, the charts we created are useful for two reasons: the average user will likely not notice any major differences among these career-change services when visiting their websites for the first time, and none of the stand-out features of one service would be irrelevant to the others.

Overall Insights

● Career Pivot stands out by providing a variety of career-related information in the form of blogs and podcasts

● Career Shifters stands out by presenting colorful and detailed client testimonials

● Mission Collaborative stands out by presenting all the essential information on its homepage

● Both Career Shifters and Career Pivot have separate pages dedicated to the positive press they’ve received (and client testimonials)

User Personas

Given the timeline of the project, we had roughly two weeks to conduct user research. Our team designed a survey (with screening questions) targeting individuals dissatisfied with their current job — sharing it on LinkedIn, Instagram, Facebook, Reddit, Meetup, and elsewhere — and got 75 responses. We were looking primarily for mid-career professionals who had taken at least one step toward a career move but still considered themselves stuck and in need of additional support. Based on our initial briefing, it was clear that MC was not designed to be a one-size-fits-all solution for career changers. Some demographics, such as early-career professionals, blue-collar workers, and soon-to-be retirees, would be better served elsewhere — at least for the time being.

We contacted those who left their contact information (or whom we already knew) and followed up with six individuals for a phone or in-person interview. We interviewed 11 additional people, for a total of 17.

All 17 had either recently quit their jobs or were seriously considering it at the time of the interview. Most were in their 30s. One was too young to fit our target demographic but still provided useful insights into how MC could best serve younger professionals. There was diversity in race and gender. We collected this demographic data along with each person’s educational background, career background, and future career aspirations. We then asked them to elaborate on the difficulties of their previous career, their thoughts and feelings related to changing careers, and their learning preferences. To make the personas more lifelike, we took note of their personality traits and asked direct questions about their non-career-related interests and attributes.

We copied quotes from our interview notes to an electronic affinity map and color-coded them based on the individual user. We sorted our qualitative data to reveal patterns broken down into four categories that we determined were most helpful in understanding MC’s user base (displayed below).

Personas are (sometimes controversial) symbols of large groups of users. Given that our team is composed of career changers, it was easy to empathize but difficult to eliminate personal bias. It was also challenging to connect our own original data to MC’s actual clientele. Nonetheless, even before we received data on MC’s real-life users, a dominant persona emerged. Leya Bryan (“Persona 1”) is an educated, responsible, and conflicted young professional who was not necessarily miserable at her old job but is looking for more advancement, work-life balance, or flexibility.

Steven Ross (Persona 2) and Lucy Bridges (Persona 3) were the result of careful data analysis that revealed slight variations in demographics and attitudes among MC’s pool of career changers. Interviewing helped humanize these personas and reduce stereotyping. Steven is a more seasoned, less patient career changer looking to make a mark on the world while he’s still energized and in good health. Lucy is a youngish, indecisive woman at the early stages of her career. While each persona has the same overarching goal of career discovery (and success), the journey through MC’s 30-day program looks different for each. Given the personal and professional responsibilities of a man in his 50s, Steven requires tools to help him manage his time during the fellowship, along with high-level networking and extended learning opportunities. Lucy, on the other hand, needs more one-on-one support from MC staff and opportunities to connect with other career changers at MC, many of whom might be older and further along in their careers.

Journey Maps

Service design is often less concrete and more complicated than product design because it focuses on the various phases and touchpoints involved in the experience. The purpose of the journey map is to provide a detailed walkthrough of the major stages of a service, including actions the user takes both prior to signing up for the service and following its conclusion. It is not intended to be a comprehensive “service blueprint” that covers all touchpoints and opportunities for data collection, but rather to place each persona in his or her appropriate context.

We created a separate journey map for each of our three personas representing MC’s program as it exists now and another set of maps representing an aspirational state that would result from our suggested improvements (based on analysis of MC’s user feedback data).

While each persona goes through the same basic program phases, feelings, thoughts, and even actions change depending on the persona and the phase. We showed these variations and also listed several “opportunities” for each persona/phase. These opportunities, if manifested, will result in the aspirational (or future) journeys we present in the right-hand column.

Data Analysis and Visualization

All aspects of the personas and journey maps came from careful analysis of our own (original) research and the data provided to us by MC. The data was primarily in the form of user responses to feedback surveys designed by MC for the purpose of evaluating the program. A significant amount of demographic data and preliminary information was captured along with quantitative and qualitative feedback on weeks one through three of the program (only quantitative data was provided for week four, specifically) plus overall feedback upon completion.

Using Excel and Tableau, we were able to display user-satisfaction data for each week of the program broken down by persona. When we compared this to the level of effort that users were reporting, we noticed an interesting trend: The more effort put in, the greater the level of satisfaction.

One of our suggestions for reinforcing this trend is to provide either a lesson on “Growth Mindset” or a free book on the concept to fellowship participants, helping them understand that effort leads not only to success but also to satisfaction. To put this another way, our data visualizations suggest that the “hero” (of the fellowship journey or story) is the diligent Leya, Steven, or Lucy who most effectively utilizes the advice, education, and resources provided by MC, which serves as the guide or the “sage.”

Suggestions on Data Collection

In order for MC to tell this story even more clearly, we have provided a list of suggestions:

● Collect data at the end of each module instead of each chapter

● Narrow the satisfaction scale from 1–5 to 1–4 to get better insights

● Instead of “if you rate this class under 7, reach out to us,” set up an alert system that automatically follows up with unsatisfied students

● Track progress of user-defined goals before, during, and after the fellowship

● Add a satisfaction survey for week 4 specifically

Usability Tests and Accessibility Audit

The last major component of the project was to test MC’s website and Teachable platform (on both laptops and mobile devices). This was a parallel process that required many of the same aforementioned tasks, such as affinity mapping, and depended on three of our interviewees, who represented our personas. We recruited two others for our usability tests because five test participants typically uncover around 85 percent of usability problems. We began with heuristic evaluations that our subjects completed on their own time and at their own pace.

Usability tests were conducted, to the extent possible, in locations appropriate for career changers enrolled in MC’s fellowship program. One test was conducted on a public bus in order to simulate the experience of a user with limited time. We standardized the usability tasks, allowed time for subjects to “tinker” with the website and Teachable platform, and asked only open-ended questions. We recorded the tests when granted permission and played them back to look for themes that we initially missed in our notetaking. Observations, errors, and direct quotes (categorized as “positive” or “negative”) were examined through a second affinity map, color-coded based on the individual test subject.

Overall, we noticed relatively few errors, especially on Teachable, but an abundance of critical comments on navigation and visual elements. We will list the most frequent ones in our “final recommendations” section.

One particular page — the Blogs page — is a microcosm of the entire project, from the research phase to the accessibility audit. Many real and potential MC users expressed the need for industry-specific information or advice (which can easily be provided in blogs), and we subsequently determined that access to this information is essential for each persona’s successful journey through MC’s fellowship. We also identified both usability and accessibility issues (based on user feedback) related to the Blogs page in particular.

Finally, we determined the site’s level of accessibility using Web Content Accessibility Guidelines (WCAG) 2.0. MC’s website received a failing score on seven items on the AAA-level checklist, two items on the AA, and one on the single A. Concerns included using text-based college logos without alt text, the lack of a full transcript of the introductory video on the homepage, and no means of providing help to those unsure of the site’s purpose or how to navigate it.

The only major flaw we found on MC’s Teachable platform is its contrast ratio, which is also a major concern on its homepage.
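For readers unfamiliar with how contrast is scored: WCAG 2.0 defines contrast as a ratio of the relative luminances of the lighter and darker colors, with 4.5:1 the AA minimum for normal text and 7:1 the AAA target. The sketch below implements that formula; the colors are illustrative, not MC’s actual palette.

```python
# WCAG 2.0 contrast-ratio formula, as used in our accessibility audit.
# Example colors are illustrative, not MC's actual palette.

def relative_luminance(rgb):
    """WCAG 2.0 relative luminance of an sRGB color given as 0-255 ints."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """(L1 + 0.05) / (L2 + 0.05), where L1 is the lighter color's luminance."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible ratio, 21:1
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))
```

Running a site’s foreground/background pairs through a checker like this is how a page fails the 4.5:1 (AA) or 7:1 (AAA) thresholds we reference in our final recommendations.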

Final Recommendations

We categorized our recommendations as follows:

  1. Website changes
  2. Teachable platform changes
  3. Missing content and features
  4. Accessibility concerns
  5. Data collection opportunities
  6. Workflow changes

We already discussed data collection, so we will conclude with our top six recommendations, one for each area. Additional recommendations are listed in a separate document.

  1. Improve website flow and overhaul Blogs page (website changes)
  2. Create student calendar with notification system (Teachable platform changes)
  3. Provide team-leader and technology training prior to start of course based on results from onboarding process (missing content and features)
  4. Sharpen website contrast to a ratio of 4.5 to 1 minimum — ideally 7 to 1 (accessibility)
  5. Create an alert system targeting dissatisfied clients for follow-ups (data collection opportunities)
  6. Reduce the fellowship groups from five clients to four (workflow changes)

It is our view that these changes will produce the greatest impact at the lowest cost for MC’s users.

