Speak to people: How a Nimble Feedback Process Helped Us Understand Our Users
I spoke at Design Systems London #6 about how understanding user concerns is hugely valuable in helping to shape the direction of a design system in a positive way for its consumers.
Here is a video recording from the meetup:
If you’d prefer to read instead, here’s the transcript:
I’m going to talk about the feedback-gathering journey we’ve been on at NewsKit, and how feedback is used to inform decisions. How speaking to people has helped us to understand our users, and improve the design system for everyone that uses it.
design system therapy
👋 Hi, I’m Mike.
I live in North Somerset (just south of Bristol), and I’ve been in the NewsKit design system team for coming up on 3 years now.
My role includes producing specs for components, writing documentation, customer support, and also speaking to people about their experiences using the design system.
But first — a bit of background on NewsKit
NewsKit is an open-source multi-brand design system, created for unique media brands across NewsCorp. Active users include teams from News UK and Dow Jones in the US. There are approximately 60 themeable and accessible components available, with more being added all the time.
The team here aims for high quality in everything we produce, offering the best possible experience for the consumers of NewsKit. To support that goal, we’ve adopted several methods to gather feedback from active users, so we can understand their specific needs, listen to their experiences, and improve both the products we offer and the support we provide for everyone.
Methods used to gather feedback from users
A user survey using the Qualtrics platform is sent quarterly to individuals via our support Slack channels, to track user satisfaction, gauge confidence using NewsKit products, and share thoughts and ideas with the NewsKit team. The survey is kept anonymous so that people feel more comfortable sharing their honest feedback with us.
We try to keep the survey relatively short, with the average time for completion around 2 minutes. We highlight this in our messages so people know how much time is expected to complete it. We ask a range of questions designed to give us insight into how our users view the design system, pitched to help improve NewsKit for everyone that uses it.
A couple of findings from the survey that we found interesting: some designers find the handoff of designs to engineers a difficult process, and in general, the support available to consumers is well received.
After sending the survey out a couple of times, we moved the questions that gauge the Net Promoter Score of the design system, and that ask which department and job role the respondent belongs to, to the top of the survey. That way, even if someone abandons the survey partway through, we are at least capturing the key information we want to know upfront.
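As an aside for anyone unfamiliar with it, the Net Promoter Score is calculated from responses to a single 0–10 "how likely are you to recommend…" question: scores of 9–10 count as promoters, 0–6 as detractors, and NPS is the percentage of promoters minus the percentage of detractors. A minimal sketch (the response values below are made up for illustration, not real NewsKit data):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("need at least one response")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Example: 12 illustrative survey responses
responses = [10, 9, 9, 8, 8, 7, 7, 10, 6, 5, 9, 10]
print(nps(responses))  # 6 promoters, 2 detractors -> prints 33
```

Scores of 7–8 ("passives") count toward the total but neither add nor subtract, which is why the result can swing noticeably with only 10–15 responses per quarter.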
We also tried reducing the number of questions asked to keep the time to complete shorter, and made some questions conditional, which has yielded more completed surveys and, as a result, more data.
This image shows users’ responses when asked to choose 5 words to describe NewsKit, which is a more descriptive way for us to track user sentiment than a number scale.
Tracking sentiment over time allows the team to identify problem areas that can be resolved. The design system constantly improves and evolves based on this feedback. On average, we get roughly 10–15 completed surveys per quarter, from a range of NewsKit users.
All of that sounds great. Job done, right?
Obviously, the more people that complete it in full the better, as the data we get back is genuinely valuable for understanding user needs and perception. However, surveys can be a low priority for people, whether because they don’t have the time to complete them, or simply don’t have anything to share in that particular format.
Over time, we found that people would on occasion start filling a survey out, and then abandon it at different points — which is nothing new when it comes to asking people to fill out forms generally — even with the best intentions to keep it as succinct as possible (I know I let out an internal groan at times when being confronted with yet another form I’ve been asked to fill out).
We do try to encourage people to fill out the survey as much as possible, but with the best will in the world, this doesn’t always work to capture more responses.
There is also more ad hoc feedback that different team members receive directly, in meetings, or over Slack, and this of course can also be very useful for informing decisions. If it’s shared in a forum like a Slack channel, other people can benefit too, seeing the responses and tracking updates or resolutions.
However, ad hoc feedback can get lost quite easily in a continuous Slack feed, and if something is said in person that’s shared in good faith, the original meaning may be taken out of context — which can make the source of truth hard to ascertain, so it is wise to judge each case accordingly.
Opportunities for improvements identified
We reached a point where we didn’t feel we were getting enough information from the survey and ad hoc feedback alone.
What we wanted was to gather information faster, and at a greater frequency as needed, so we could better keep up with the rapidly changing pace of the business and the needs of consumers of NewsKit.
We wanted to find out more about our users’ concerns and ideas by reaching out to them in a more direct and proactive way.
Not relying on consumers to reach out to us, but instead going to them directly with questions, and listening for anything ongoing that can be fed back to the team, so more people are in the loop and, if needed, action is taken quickly to address an issue.
This is something I wholeheartedly agree with. This sort of approach can help reveal unknowns that might only come up in a more informal conversation with someone.
Consumer interviews = chats with a purpose
Feedback on the NewsKit product is not just encouraged but actively sought. We run regular consumer interview sessions targeting consumers across all disciplines. Any concerns or criticisms can be raised, and feedback and ideas are encouraged.
Consumer interviews are short, and relatively low effort (for both parties), meaning we can quickly schedule as many interviews as we want, with anyone from across the business.
The format is a video call (mostly 1-to-1, or sometimes group sessions with an accompanying FigJam board) lasting around 10–15 mins each.
There is an effort to keep conversations casual, light, and disarming. At the start of each session I introduce myself and anyone from the NewsKit team that joins, and explain what we are looking for from a participant.
They are conducted online over a video call intentionally so that there is a bit of distance for participants, and a more level playing field — so if someone wants to keep their camera off for any reason, they can and there’s no pressure to attend if they change their mind.
We make a note of anything key in an interview, slowing the conversation down when there is an important detail, such as a technical issue or an acronym, to make sure it is captured accurately, and clarifying specifics to mitigate the need to follow up later.
Everyone is different: they might be in a rush, on a deadline, or, like me, have a dog wanting to get in on the call. These are the best meetings by far, and they make me feel better about my own dog sometimes popping up on camera for a hug, as he thinks I’m talking to him.
We make it clear to participants at the outset that it’s a safe place to share honest feedback with the NewsKit team, and that we are simply looking for any thoughts they have when using the design system, so we can improve what we offer to consumers, from components to documentation to the support provided.
Feedback is triaged, and issues can be tracked on an open roadmap so consumers can see when they will be resolved.
This process helps to build strong relationships between teams and ensures consumers know any concerns are being heard.
All feedback can be useful; the challenge lies in spotting something worth expanding on, and pulling on a thread to get a clear picture.
So, have consumer interviews resulted in valuable feedback?
The short answer is yes, but it’s an ongoing process that continues to be refined and adapted as business and user needs change.
I’m not a user research specialist, I just enjoy talking to people and hearing what they have to say. These are some great resources that I recommend checking out, and have been useful guides on the subject:
🔗 Links from this talk:
- Interviewing Users: How to Uncover Compelling Insights by Steve Portigal | book
- Gathering feedback on your design system by Kristen Singh | zeroheight blog
- Build What Your Customers Want by Kareem Mayan | Medium blog
- Design Systems Handbook by Diana Mounter | designbetter.co
From this experience, I have some things that may help point you in the right direction if you are considering starting to conduct your own consumer interviews.
Invite other people from your team to get involved
Relying on a single point person for feedback is a risk, as they may leave the business. It shouldn’t be left to just one person to gather and manage feedback; it makes sense to share the load around the team.
Try out different methods and see what works for your team
Change up your methods — as much as you need to, and continue with what works.
Something is better than nothing
Don't worry if time doesn’t allow you to do as much as you planned.
Focus on active users — and particularly those who share
Hopefully, the people you choose to speak to are open and come prepared with lots of useful feedback, but it's worth acknowledging that might not always be the case. Know when to cut an interview short if you feel it isn’t going anywhere — the person might be grateful if they don’t have much to say.
Keep track of who said what, categorising as you go
Record all notes in a tool like Airtable so everything is in one place, and can always be referred back to.
Share key insights with your team
Share insights from surveys or interviews with your team, and with the people who make the decisions. At the end of each quarter, we hold a meeting where the team goes through the feedback gathered so context can be provided.
Demonstrate you are listening to your users
Follow up with actions: tickets in the backlog, tangible changes in a new release, etc.
Be sure to thank them for their time
Show gratitude for people’s time, and that it is valuable — because it is!
Do whatever works for you and, more importantly, the people that consume your DS
Surveys and other methods for gathering feedback certainly have their place, and can be just as valuable, or more so; it depends on what you find out and from whom.
At the end of the day it doesn’t matter what you try — speaking to consumers and giving people a platform to do so is the important thing.
In my experience, focusing on the people actively using your design system is what will surface the most useful insights for your team.
💬 Get in touch
📖 You can find the slides from my talk here ➡️
Feel free to reuse and repurpose.