The Design/Business Challenge
Simulate a startup within existing markets, applying interaction design skills to create not only a product but also a business.
It is a 12-week group project in the Creative Founder course taught by Kate Rutter.
Team: Nathalia Kasman (CEO), Claire Zhou (CMO), Leo Zhang (CFO)
Methodology: Steve Blank’s Customer Development process, Lean Start-up, Quantitative research, Market research, Prototyping, Usability testing
When students learn new things, solving problems alone isn’t easy. However, it’s also hard to ask for help because:
- The person who can help you might be outside of your social circle
- You might hesitate to ask for help so as not to appear burdensome
Hatch, a company that provides a peer-to-peer skill sharing app, helps students to:
- Expand students’ connections
- Create mutual learning experiences where peers can help one another
Students will create a profile that provides information about what they want to learn and share, as well as specific learning goals. We will provide prompts to guide them throughout the process.
Based on their profiles, we suggest co-learning peers who have complementary interests and goals and attend the same school. Students tap “Connect“ if they think a suggested peer is a good fit.
Once both sides think they are suitable peers to one another, they can start to discuss their meeting details in the chat room. We will provide conversation guides to help them break the ice.
After they schedule a time and meet, we follow up by asking for their feedback, which helps us match students with more suitable co-learning peers next time.
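The matching step described above can be expressed as a simple heuristic. Here is a minimal sketch in Python, assuming a complementarity score over “want to learn” and “can share” skill sets, restricted to students at the same school — the field names, schools, and scoring rule are all hypothetical illustrations, not Hatch’s actual algorithm:

```python
from dataclasses import dataclass, field

@dataclass
class Student:
    name: str
    school: str
    wants_to_learn: set = field(default_factory=set)
    can_share: set = field(default_factory=set)

def match_score(a, b):
    """Complementarity: count skills one student wants that the other can share."""
    if a.school != b.school:  # only pair students at the same school
        return 0
    return len(a.wants_to_learn & b.can_share) + len(b.wants_to_learn & a.can_share)

def suggest_peers(student, pool, k=3):
    """Rank other students by complementarity and return the top k matches."""
    scored = [(match_score(student, p), p) for p in pool if p is not student]
    scored = [(s, p) for s, p in scored if s > 0]  # drop non-matches
    scored.sort(key=lambda sp: -sp[0])
    return [p for _, p in scored[:k]]

alice = Student("Alice", "CCA", wants_to_learn={"figma"}, can_share={"sketching"})
bob = Student("Bob", "CCA", wants_to_learn={"sketching"}, can_share={"figma"})
carol = Student("Carol", "SFSU", wants_to_learn={"figma"}, can_share={"figma"})

print([p.name for p in suggest_peers(alice, [bob, carol])])  # ['Bob']
```

Carol is filtered out despite matching interests because she attends a different school, reflecting the same-school constraint in the product.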
The Endless Pivots Journey
Unlike regular design courses, the Creative Founder course is a start-up class, which means everything needed to be tested by the market. Hence, we tried to validate the problem by talking to many of our target users and going back to rethink our ideas whenever the problem wasn’t validated.
In the first part of the class, we focused on developing products for illustrators, but the problems we targeted either weren’t real (not validated by users) or were caused by a big systemic issue that we weren’t familiar with.
It was painful, but after endless pivots, we decided to start everything all over again.
Research of students’ learning behaviors
Since we needed to catch up with the class, we decided to focus on a problem we had all experienced ourselves as students — we had few effective resources when learning new things. We then ran several activities to understand students’ current learning behaviors.
First, we put up a huge poster on campus and encouraged people to respond by writing down their most efficient way to learn a new skill. From the responses, we learned that asking someone who knows the skill is the most efficient way to learn.
Then, we interviewed 8 students to further understand their thoughts and behaviors around helping their peers or asking them for help.
The key insights we learned are:
Helping others is also a form of learning.
A few students said that helping their friends gives them a recap of what they learned and helps them understand the material better.
There is hesitation in asking for help.
People don’t ask for help because they don’t want to appear burdensome, and they will only ask when friends who can help them are physically nearby.
Since our insights showed that students usually learn from peers to solve problems more quickly, we thought we could create a digital site/app for students to share their skills with each other.
Once we decided to make a skill sharing app, we researched other learning platforms in the marketplace, ranging from tutoring to learning exchange apps. We wanted to foster face-to-face interactions and mutual learning experiences because, based on our interviews, people are more open to seeking help (and to providing it) when they are physically close to each other.
To validate our concept, we wanted to know how many students would be interested in joining our peer sharing community, because more sign-ups would indicate a need we could fulfill.
We recruited students through:
- On-site recruitment: we set up a table to encourage students to fill out what they wanted to learn and share on our sign-up forms (with free food as an incentive, of course).
- Sign-up boxes: we put sign-up boxes around campus so students could fill out sign-up forms and leave them in the boxes as they passed by.
- Offline marketing: we posted posters, flyers, and stickers around campus. Students could scan the QR code and fill out our online sign-up form.
- Online marketing: we posted our posters and latest activities on Instagram and Facebook, and set up our landing page.
We got 117 QR code scans and 81 sign-ups in only 3 weeks!
Once we had recruited participants, we prototyped the learning exchange experience using the Wizard of Oz method — pairing students up manually by their complementary interests to pretend we were our app’s matching algorithm. In two weeks, we ran 12 experiences (6 pairs of students) in total.
In week 1, before we ran the experience, we listed the assumptions that we wanted to test.
Then we contacted students via email and set up a 30-minute meeting session based on their schedules. Once both students had arrived, we would introduce them and leave room for them to chat. After they were done, we interviewed them separately to learn about their experiences.
We noticed that each group had a different type of learning exchange:
Students who already have specific questions in mind and just want answers.
Students who used the session to share their learning experiences and best practices.
Students who shared about their own individual practices and were inspired by one another’s processes and methods.
No matter which type of learning exchange a group had, all participants thought it was inspiring to learn about other people’s ways of working, which supported our assumptions. Moreover, we uncovered more insights that we hadn’t thought of.
Based on the insights we got from week 1, we formed more assumptions and designed multiple versions of experiences to test them in week 2.
Experience 1.0: Specific info
The first insight from week 1 was that students wanted to know more specific information about their peers before meetings. Hence, we sent participants surveys to fill out with more specific information (learning motivation, goals, background, etc.), then forwarded the responses to their assigned peers so they could get to know each other prior to the meetings.
It turned out our assumption was true — participants thought that knowing more about their peers beforehand made meetings more efficient.
Experience 2.0: Self Browse
The second insight was that it was hard for students to find peers with similar learning goals. Hence, we not only did the survey exchange (same as Experience 1.0) but also sent them an Excel sheet asking them to pick 3 ideal peers from the 10 possible peers we listed.
It turned out our assumption was only partially true — participants thought self-browsing was “interesting, but not necessary,” and that “picking suitable peers by myself might cost too much time and mental effort.”
Experience 3.0: Customization + conversation guide
The last insight was that we needed to help students feel more comfortable during initial interactions. Hence, rather than scheduling locations and times for them, we put them into a group chat to discuss their meeting details on their own, while we pretended to be a bot that provided conversation guides as ice breakers.
It turned out our assumption was true — participants thought it helped them find more common topics with their peers.
To develop our digital MVP, I translated our successful experiences into 3 main features:
- Profile
- Suggested peers
- Chatroom
Moreover, I added a feedback section for user retention, so the data could help us match users with more suitable peers next time.
At first, the feedback system I used was star ratings and endorsements. But from user testing, I learned that this wasn’t the best fit for the scenario, both logically and morally — “When someone is voluntarily helping me, I don’t want to rate or endorse them.”
As a result, rather than translating the experience into any kind of objective scoring system, I kept it simpler and more personal — users write down their feedback and choose whether they want to send it to their peers.
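The opt-in rule above — the team always sees feedback to improve future matches, while the peer only sees it if the author chooses to share — can be modeled as a tiny visibility check. A minimal sketch in Python with hypothetical names (the case study doesn’t show the app’s actual data model):

```python
from dataclasses import dataclass

@dataclass
class Feedback:
    author: str
    peer: str
    note: str              # free-text reflection, no rating or endorsement
    share_with_peer: bool  # the author decides whether the peer sees it

def visible_to(feedback, viewer):
    """The team always sees feedback (it feeds future matching);
    the peer sees it only if the author opted in."""
    if viewer == "team":
        return True
    return viewer == feedback.peer and feedback.share_with_peer

fb = Feedback("Alice", "Bob", "Great session, learned a lot!", share_with_peer=False)
print(visible_to(fb, "team"))  # True
print(visible_to(fb, "Bob"))   # False
```

The design choice here mirrors the insight from testing: feedback stays useful to the matching system without forcing users to publicly judge a peer who helped them voluntarily.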
I mapped out the user flow, which includes the 3 key features we developed from the experiences — profile, suggested peers, and chatroom — as well as a feedback section to retain users.
The Pitch Day
At the end of the course, we were asked to pitch our idea as if we were pitching for funding. We asked for $520,000 to launch our product and reach our next milestones, and pitched to a group of 11 venture capitalists. In the end, 9 of the 11 venture capitalists funded us, bringing the total investment to $1,681,500!
Click my next story to see our pitch: