Finding the Startup Within (Part 1)

Christopher Savio
Published in Prompt ai
6 min read · Oct 17, 2018

When you work for a large company, an opportunity to get back into start-up mode rarely comes along. That’s what I had a chance to do over the last year. Our team at LogMeIn had a vision for a new product — one that would solve a lot of IT support problems. It was a great idea, and we were given the resources and latitude to see if it could live up to its potential. This week, we are bringing that product to market — Prompt ai. Prompt ai is an employee support tool that uses AI technology to make self-service easier for employees and support teams. But this post isn’t a product push. It’s about the journey we took to get there. We learned so much bringing this idea to market that I thought 14 months of key learnings were worth sharing.

Getting started — Research, Research, Research

I’m coming in hot with my first key takeaway. While this may seem trite, getting insights from your target audience before concept work even begins is crucial. We conducted quantitative research and customer interviews, consulted market research, and even commissioned our own third-party study to understand where our hypothesis of a common pain — repetitive, simple-to-solve employee requests — ranked in the list of employee support issues. Not only did this work help validate our theory, but we also identified unexpected needs that will be crucial in future product development and go-to-market stages.

Side note: If you’re not comfortable building a survey instrument and fielding a study, there are numerous resources out there. Whether you go the DIY route or get support from a full-service research shop, you won’t be at a loss for templates and respondent samples to tackle almost any scenario. The same holds true for qualitative research, though I would encourage you to read up on some basic interviewing methods to learn how to get the most out of a customer discussion — like using active listening techniques, staying in control of the interview, and not inserting interviewer bias into the discussion. (I’d be happy to point to tools or resources I’ve used, so just comment below if you want to know more.)

In addition, this research gave the team the necessary ammunition to build a sound business case to move forward. The business case and target audience we rallied around was AI-powered employee self-service for small and mid-market audiences. The research clearly showed that these companies felt the pain acutely and wanted a solution, yet had rarely adopted one because few such tools were available for their market.

Time to Start Building — Sprinting and Co-Design

The need was confirmed, so then it was on to the fun stuff — the solution! With a lean, mean, pirate machine of product, product marketing, product design, and engineering assembled, we kicked off a design sprint week. We followed Jake Knapp of Google Ventures’ approach for a five-day design sprint, and with just a day or two of prep, it went off very well. This approach was extremely valuable, not only for getting a validated prototype created but for building chemistry within the team. We engaged in the typical diverging and converging discussions and came out the other side with short-term and long-term solutions we all believed in.

During a design sprint, you typically want to build a prototype focused on a key use case and user story. Specifically, as it related to Prompt ai, we knew there were two impacted users — employees and support agents — so we had to decide whose journey to focus on. This was clearly a significant fork in the road and a big decision that helped shape our long-term positioning. We landed on the employee. Sure, employees aren’t the buyer, but we knew that if the experience was great for the agent and not for the end user, employees wouldn’t adopt the product and it would fail.

After the successful sprint, and armed with a validated end-user flow, we turned to the support agent interface. We took two approaches in this phase. The first was creating clickable wireframes and letting users walk through use cases. We took the typical “think out loud” approach and also asked for feedback on specific areas if comments were not offered unsolicited. A big takeaway here: when getting feedback at this early stage, keep it simple and as linear as possible. While we received some valuable insights, feedback was somewhat scattered. Out-of-the-box thinking and curiosity from our participants about other aspects of the product led us down a few rabbit holes. It also didn’t help that we added some functionality placeholders to the wireframes, which, while “sexy” and aspirational, probably weren’t going to make it into a practical implementation — at least not in the first release. This resulted in considerable feedback on features that didn’t need immediate attention.

The second turn we took on the support agent interface was a co-design approach. To tackle this task, we enlisted the support of our CX partners. We applied learnings from the wireframe tests, specifically around which interface sections and problems to focus on, and around the product introduction. Another key takeaway — don’t underestimate how important background and introduction are in wireframe work. Making sure all participants are on the same page about what this new product is and what it should do, without biasing them, is not easy, but if you get it right, it will be a tremendous help later on.

The co-design work came out better than we expected. We kicked each session off with some creativity exercises to get participants in the right mindset, set the stage with some vetted messaging and visuals, and then went to work on a blank piece of paper. After a few small nudges and with the right probing questions (e.g., “you mentioned tracking feedback is a common problem; how could that get addressed in this screen?”), the participants basically built out a whole UX from scratch. While the designs were not the same across sessions, the prioritization of information and features, and the general themes behind them, were. That’s really what you should hope to walk away with from this type of exercise, because you can then hand that information over to your expert design team, and they can synthesize it into a workable tool. (Want to learn more about co-design? Check out this post by one of LogMeIn’s CX all-stars, Hilary Dwyer.)

This is just the beginning of the learnings from our journey. Stay tuned for more on how we validated the technology and messaging, as well as how we kicked off a beta program. Spoiler: our biggest beta customer was a 3,000+ employee organization — our own. While that may feel a bit like cheating, LogMeIn was actually one of our hardest clients. If you want to jump ahead a bit and hear how the launch of Prompt ai went at LogMeIn, and its impact on the product now available, check out this blog.


Christopher Savio

Strategy Manager for LogMeIn’s Support Solutions business. Passionate about development and implementation of new technology — for work and for play.