Embracing convergence with confidence

Wendy Ju
CMU x MHCI’23 inQ Capstone Team
Jul 9, 2023

Embarking on the transformative phase from research to design, we face the formidable challenge of convergence. This is no ordinary journey, as we must weigh business objectives, user mental models, and project goals all at once…

Testing mental models, not products

At the end of Sprint 6, we started making lo-fi screens to test ideas for implementing new sensors in a space more efficiently and documenting work orders in detail. However, during our on-site testing with CMU facility managers, the relatively finished-looking visuals raised several technical questions about the feasibility of implementation, which was not the purpose of testing at such an early stage. We soon realized that we had approached the problems in a product-centric way (e.g. “how to create a more detailed work order documentation system”) rather than from a human-centered perspective (“how to enhance communication between facility managers so they can troubleshoot efficiently”).

Testing the riskiest assumptions

At this design stage, gathering user feedback is crucial to understanding the motivations and mental models that shape users’ work. This understanding allows us to generate ideas that effectively address their job-related needs. It is essentially a numbers game: developing a human-centric product requires testing with more people.

To effectively test user behaviors and mental models with our prototypes, we devised success metrics to serve as our guiding stars. We also identified the riskiest assumptions associated with the three primary problem areas we are determined to tackle: effective knowledge sharing, efficient troubleshooting, and predictive maintenance. To expand the scope of our tests, we reached out to our cohort as well as to analogous domains such as administration and IT, which share similar mental models around troubleshooting and knowledge sharing. We asked them to carry out a series of tasks and obtained some interesting findings:

1. Despite expressing distrust of informal sources such as forum posts, people often underestimate how effective crowdsourced information is for problem-solving. Individuals also tend to place greater trust in those with higher seniority when seeking shared knowledge.

2. Consolidating all information into a single platform does not necessarily relieve people’s feeling of being overwhelmed, especially when they have to navigate between multiple software applications. We should explore ways to lower the barrier to entry for using inQntrol, such as a plugin that integrates with users’ existing workflows.

3. The most preferred troubleshooting method is anticipating problems before they occur. By providing a list of potential causes when a problem arises, we can help users troubleshoot more quickly and efficiently.

[Image: user testing]

It’s encouraging to see the team’s shift away from “analysis paralysis,” where they were previously overly focused on technical feasibility. Moving forward, we should embrace the mindset of “just make something, even if it means it could be wrong!”

Harnessing the power of decision making

Now that we have a handful of learnings from previous rounds of testing, where can we lead the product to successfully satisfy business goals while addressing users’ needs? With just two months left, it becomes crucial for the team to strategically allocate our resources and time. We made the decision to narrow down the scope of the problem we’re tackling. Currently, we find ourselves grappling with various challenges, including alert fatigue, lack of knowledge sharing, and juggling multiple software systems. However, amidst this complexity, we recognize the importance of a guiding question that can direct our design efforts.

[Image: voting on a problem scope to focus on]

To determine this question, we placed two highly impactful and intriguing options on the board and voted across six categories on which problem, if targeted, would best help us reach a minimum viable product given our knowledge of the users and the domain. After careful consideration, we decided to focus on the problem that “alerts are not actionable.” This broad landscape allows for ongoing generation of feature ideas while providing a clear direction moving forward.

Aligning business interests

Based on previous rounds of prototyping and a crazy 8s session, we generated 16 initial feature ideas focused on actionable alerts in four areas: alert grouping, system-generated suggestions, geographically based features, and data visualization. To further evaluate these features’ value propositions and select the most valuable ones, we plan to gather our clients’ business insights to determine which features can contribute to our Objective and Key Result of increasing user adoption rates. With this objective in mind, a trip to DC seems like a fitting course of action.

[Image: feature brainstorm session]

Investing the first $100!

During our client visit, we organized two activities to gather business insights. The first was a pre-mortem exercise in which everyone envisioned reasons for a hypothetical scenario: “In 2024, inQ fails to persuade any university or healthcare facility management department to adopt the inQntrol platform.” Realizing that clients see switching costs as the top obstacle confirmed that our future design strategy needs to surface the differentiating value inQntrol brings to users, setting it apart from other vendors.

[Image: client trip: $100 activity]

Next, we conducted a “Shark Tank”-style activity in which we presented the 16 ideas to our clients. We asked each client to allocate a hypothetical $100 budget across the ideas within each of the four themes, considering both perceived business potential and ease of adoption for facility managers. This exercise proved highly beneficial, helping the team prioritize the features that would bring the most value moving forward. Several criteria went into the “investment” process, including ease of user adoption, how clearly the design surfaces valuable insights from complex data visualizations, and the scalability of the solution.

[Image: $100 activity idea pitching]

By the end of the meeting, both the team and the clients had a clear product vision. This experience taught us that when clients are uncertain of what they want, we need to assist them in envisioning the product by providing visual representations.

[Image: $100 activity feedback collection]

Looking back and moving forward

Sprint 7 proved to be our most challenging sprint yet. There was a lot to unpack: we tested with 34 participants, generated 40 ideas, created 16 prototypes, and went on one client trip in a fortnight! Despite the initial feelings of uncertainty and overwhelm, we tackled the challenge head-on with determination and proactiveness. As we delve into inQ’s value proposition, we also want to remain mindful of users’ social and emotional Jobs-To-Be-Done. We are excited to discover intriguing insights from further contextual inquiries and testing in Sprint 8!

The work and knowledge gained from this project are only intended to be applicable to the company and context involved and there is no suggestion or indication that it may be useful or applicable to others. This project was conducted for educational purposes and is not intended to contribute to generalizable knowledge.
