Bringing lean and agile UX to PPC Bee
We’re doubling down on making PPC Bee as enjoyable to work with as possible and talking to you to make that happen. We’re getting to know your needs, your workflows, and your understanding of PPC concepts, but also fixing the problems you’re running into and adding in core functionality to make your life easier. And we’re using lean UX methods to do that.
Lean UX, in a nutshell, is a way to do user experience design at a fast, efficient, iterative pace, much like the lean processes you may know from the developer world. Besides speeding things up, it also smooths collaboration with developers, who are used to working in quick iterations.
There’s no one lean UX process; everyone has their own take. Here’s what it looks like for us:
Our agile workflow
Our development team works using an “agile” methodology, organized around two-week cycles called “sprints”. We’ve adapted the same workflow for our UX team. Every two weeks, we hold a meeting to plan the sprint: we review the last sprint, decide which UX tasks to include in the next one, and set their priorities.
Once tasks are assigned, we pick them up one at a time based on priority. Each task goes through four stages: research, prototyping, acceptance, and testing.
It’s important to note that this process isn’t linear: if a prototype fails in the acceptance or testing phase, it goes back to the research or prototype stage. It’s also not exhaustive; there are separate processes for visual design and for development, and we sometimes do user testing at the end of those as well.
Each of the components of this workflow is complex enough to warrant its own blog post. Expect follow-ups to come.
Where does the user come in?
The only way we can make good decisions is by understanding our actual target audience. However, since finding participants and scheduling meetings takes time, user interviews and user tests don’t fit neatly within our fast-paced sprints. So we’ve taken them out of that process and organize them separately.
We generally use user interviews to understand people and their needs better, and user testing to discover flaws in our designs or in the app itself. So far we haven’t been conducting them at regular intervals, but we’re planning to introduce a monthly cycle.
You might be wondering how this type of testing differs from the testing that’s part of our sprint workflow. In general, we do in-house testing as part of the sprint. Since we share offices with a friendly marketing team, we can always find willing PPC experts to test our product. And while the results of this testing can be skewed, they still help us uncover important problems. Anything that isn’t caught should be covered later by the testing we do with external PPC professionals.
There are also other ways we gather data from users — we continually collect usage data, we receive feedback, we take note of the support questions that we get asked, and we’re planning a few A/B tests down the road.
I want to help!
Great, let us know! Whether it’s constructive criticism in the form of a quick message or an hour-long interview, we’d love to talk to you, get to know you better, and use what we learn to make your life easier.
Get in touch with me at firstname.lastname@example.org.
(If you’re reading this not as a PPC expert, but as a UX expert, we can still chat and compare notes.)