Launching A Successful Redesign For 20 Million Students
When more than 20 million people use your service each month, how and when you roll out a redesign matters. In considering a big update, we knew we had to make Quizlet better for our users not just aesthetically but also functionally — with no disruption to studying. Over the past year, we’ve been redesigning and updating the entire Quizlet website. What was the inspiration? With a website that’s over ten years old, we’d accumulated both aesthetic and technical debt over time. With Quizlet growing 50% year over year, millions of people are using it for the first time every day. First impressions matter a lot — while we’re proud of our organic growth over the past decade, we believed we could make users fall in love with Quizlet faster. To do this, we launched a new brand reflecting the platform that 20 million students and teachers already knew and loved.
Redesigns are common for consumer products, and they’re notorious for causing pain. We’ve all heard the horror story of the redesign that took forever, or the one that provoked a backlash from its users. Furthermore, the bigger your scale, the bigger the risk you take on when making drastic changes to your website. With millions of students and teachers depending on us, we had to be strategic about our redesign. While our approach wasn’t perfect, we followed three key principles that guided us through an overall successful launch:
- Invest upfront to make future development easier
- Ship regularly to get feedback early and often
- Monitor performance and key engagement metrics
In this post, I’ll dive into each of these principles and how we applied them. While we implemented the redesign across all platforms — web, iOS, and Android — for the purposes of this post I’ll focus on web, since it’s the most feature-rich and thus was the biggest undertaking.
Invest upfront to make future development easier
As Quizlet’s features grew over the years, CSS was continuously layered upon other CSS. We often found ourselves entangled in stylesheets — whether it was digging through nasty nested selectors to uncover why that button wasn’t aligned, or re-writing the same CSS for a one-off banner that was almost the same as the last one.
12 years after the website’s inception, we knew all the features we wanted upfront. Thus we could design them in conjunction, rather than one at a time, independent of what would come next (a recipe for inconsistency!). Working with Odopod, an external design agency, our designers came up with a consistent design language and expressed it through a standardized library of design-sanctioned elements, or “parts kit.” The engineering team then worked with the designers and product managers to determine all the use cases we needed. Finally, we translated the designs into reusable components.
A month or so of upfront investment led to a treasure trove of React components like the one below. The APIs were largely inspired by Cloudflare UI, but we built our own for maximum flexibility. Although we didn’t build all of the parts we would need upfront, we had a strong foundation in place when we started implementing the new design across features. Along the way, we made tweaks and added new parts as needed.
Having a component library made implementing simple features like the modal below almost effortless from a styling perspective. Creating this modal involved writing almost no CSS.
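To make the “almost no CSS” claim concrete, here’s a minimal sketch of the pattern behind a parts-kit component. The names and API are illustrative assumptions, not Quizlet’s actual code: the component maps a small set of design-approved props onto pre-built CSS classes, so feature code never writes raw styles.

```typescript
// Hypothetical parts-kit component API (names are illustrative, not
// Quizlet's actual code). Props are constrained to values the design
// team has sanctioned, and each combination maps to classes that
// already exist in the shared stylesheet.
type ButtonSize = "small" | "medium" | "large";
type ButtonVariant = "primary" | "secondary" | "danger";

interface ButtonProps {
  size?: ButtonSize;
  variant?: ButtonVariant;
  disabled?: boolean;
}

// Build the class list from props; a React component would spread
// this onto its rendered <button> element.
function buttonClassNames({
  size = "medium",
  variant = "primary",
  disabled = false,
}: ButtonProps): string {
  const classes = ["UIButton", `UIButton--${size}`, `UIButton--${variant}`];
  if (disabled) classes.push("is-disabled");
  return classes.join(" ");
}
```

Because the type system only admits sanctioned values, an engineer building a new feature can’t accidentally invent a one-off button style — the inconsistency problem is solved at the API boundary rather than in code review.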
The new parts kit was a big win for both technical development and user experience. Our site is now easier to navigate because related features look similar. For example: previously, all of our study modes had progress and configuration options located in different places on the page. We took the redesign as an opportunity to unify these designs and make all the modes share the same sidebar, progress, and options components. Now, users don’t have to reorient themselves every time they start another way of studying.
Having reusable parts lowers the barrier to building any new page or feature aligned with the Quizlet brand. Plus, following the DRY principle means sending fewer bytes over the wire to the user, which means faster page loads! It turns out that better UX and better code often go hand-in-hand.
Ship regularly to get feedback early and often
There are different schools of thought on how to ship a redesign. One of the biggest decisions is whether to ship everything at once, in one grand unveiling, or piece by piece, as different parts get converted. The clear advantage of the former is that you maintain consistency across the site — either it’s all old or all new. In the latter case, shipping as parts get finished means there will be periods when your site has a Frankenstein appearance — clicking around can bring you back and forth between old and new.
Shipping piecemeal has its advantages, though. When you ship the entire redesign in one go, you’re plopping users into an all-new experience and praying that they’ll like it. By exposing parts of the new design incrementally, you can gauge what people think early on, and make changes as necessary before going all-in.
Also, shipping regularly keeps team morale high. Working on something hidden from the world for months on end can be draining and demotivating. Not to mention, it can be a lot of work to maintain one big branch of production code. However, being able to say “look at what I built” every month or two gives engineers a consistent sense of momentum.
With these benefits in mind, we took the strategy of shipping redesign changes incrementally at Quizlet. We followed this process:
- Split our website into logical feature groupings, ordered by most highly-trafficked first. That way we can make sure the design works for the most important pages before investing in it everywhere.
- For each feature, create a new feature flag, set to admin-only at first. Develop behind this feature flag and deploy to production daily, as we normally do. As the feature is fleshed out, Quizlet staff who are on the website as part of their day-to-day (e.g. the User Ops team) can try it out and flag any bugs that pop up.
- Grab a list of highly active users and invite them to beta-test. Whitelist those who agreed into the new version and provide them with a link to give feedback. Catch more bugs and identify other issues through this process.
- Run an A/B test with a small percentage of users to ensure that key metrics are maintained (more on that in the next section). Make adjustments as necessary.
- Gradually roll out the new feature, catching errors along the way.
- Gather user feedback and iterate on the new design as necessary, incorporating feedback into subsequent features. Go back to step 2 and repeat.
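The flag states in the steps above (admin-only, whitelisted beta testers, percentage rollout) can be sketched as one check. This is an illustrative sketch, not Quizlet’s actual rollout system — the names and hashing scheme are assumptions. The key property is determinism: each user hashes into a fixed bucket per flag, so raising the rollout percentage only ever adds users, never shuffles them.

```typescript
// Illustrative percentage-based feature flag (not Quizlet's actual
// system). Each user hashes deterministically into one of 100 buckets
// per flag, so the same user always sees the same variant.
function bucketFor(flagName: string, userId: number): number {
  // Simple deterministic string hash (FNV-1a). A real system might use
  // something stronger; determinism is the property that matters here.
  const key = `${flagName}:${userId}`;
  let hash = 2166136261;
  for (let i = 0; i < key.length; i++) {
    hash ^= key.charCodeAt(i);
    hash = Math.imul(hash, 16777619);
  }
  return (hash >>> 0) % 100;
}

interface FlagConfig {
  rolloutPercent: number; // 0 = admin-only, 100 = fully launched
  whitelist: Set<number>; // beta testers who opted in
}

function isEnabled(
  flag: string,
  config: FlagConfig,
  userId: number,
  isAdmin: boolean
): boolean {
  if (isAdmin) return true; // staff always see the new version
  if (config.whitelist.has(userId)) return true; // opted-in beta testers
  return bucketFor(flag, userId) < config.rolloutPercent;
}
```

Because bucketing is deterministic, moving `rolloutPercent` from 5 to 25 keeps the original 5% of users in the new experience and adds 20% more, which is exactly what a gradual rollout needs.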
One thing we learned early on was that our original design had fonts that were too big and there was too much whitespace — while it looked beautiful in Sketch, it was clunky in the browser. Especially with the increasing use of Chromebooks in the classroom, users have limited screen real estate compared to the giant Thunderbolt monitors we develop on. Making everything bigger and more spread out meant users spent more time scrolling rather than studying — not what we want!
One thing we did release upfront, all at once, was our new logo, fonts and colors. We did this right before back-to-school season to make each subsequent change less drastic, thereby diminishing the jarring effect in the middle of the school year. Understandably, some users were still frustrated with having to adjust to updates every couple of months. With any redesign, there will be a vocal minority of people who are averse to change.
By seeking feedback early and often, we did our best to make the transition to the new design as painless as possible. As an important side benefit, we were able to keep engineers’ spirits high throughout the process.
Monitor performance and key engagement metrics
When changing something that a lot of people love, it’s important not to make it empirically worse. Throughout the redesign implementation, we monitored performance and engagement metrics, aiming to improve, or at the very least maintain, our baseline.
For performance, we used a tool called SpeedCurve to measure page load statistics. An important metric for us is Time to Interactive (TTI), which indicates how long it takes before the user can start using the site. As long as we made the redesigned version available via a public URL, we could use SpeedCurve to compare old and new results side by side. By examining the breakdown of page load information, we could identify when changes made TTI and other metrics better or worse, and based on what we had deployed at the time, determine which factors likely contributed.
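SpeedCurve’s dashboards did this comparison for us, but the underlying check is easy to picture. Here is a minimal sketch, with hypothetical helper names rather than SpeedCurve’s API, of flagging a regression by comparing median TTI samples from the old and new designs:

```typescript
// Illustrative regression check (hypothetical helpers, not SpeedCurve's
// API). Medians are preferred over means for page-load data because a
// few very slow outlier loads would otherwise dominate the comparison.
function median(samples: number[]): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

// Flag the new design if its median TTI (in ms) is more than
// `threshold` (default 10%) slower than the old design's.
function isRegression(oldTti: number[], newTti: number[], threshold = 0.1): boolean {
  return median(newTti) > median(oldTti) * (1 + threshold);
}
```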
For example, we discovered that the fonts in the redesign were adding a significant amount to page load, causing pages to take a lot longer to render. We also were able to identify that we were loading bigger versions of profile images than necessary. Immediate awareness of these problems led to speedy fixes so that our users wouldn’t have to deal with a slow study experience for long, if at all. Once again, our incremental rollout allowed us to catch problems gradually rather than ship a bunch of regressions at once.
To measure engagement, we ran A/B tests to verify that the redesign wasn’t making it more confusing to use Quizlet. As a company whose mission is to help people learn, one of our main metrics is the number of people who use study modes on Quizlet. When we redesigned the set page, we made the buttons for study modes bigger and more clickable, hoping they would draw more users into studying.
It worked! The new design converted 21% more logged-out users to studying.
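As an aside, a lift like that is typically vetted for statistical significance before anyone celebrates. Here is a minimal sketch of a two-proportion z-test, the standard check for a conversion-rate difference; this is illustrative only, not our internal experimentation framework:

```typescript
// Two-proportion z-test (illustrative sketch): is the conversion-rate
// difference between control (A) and variant (B) bigger than chance
// alone would explain?
function twoProportionZ(
  convA: number, totalA: number, // control: conversions / users
  convB: number, totalB: number  // variant: conversions / users
): number {
  const pA = convA / totalA;
  const pB = convB / totalB;
  // Pool both groups to estimate the shared baseline rate.
  const pPooled = (convA + convB) / (totalA + totalB);
  const se = Math.sqrt(pPooled * (1 - pPooled) * (1 / totalA + 1 / totalB));
  return (pB - pA) / se;
}
```

A |z| above roughly 1.96 corresponds to significance at the 95% level; small samples with a modest lift often fall short, which is one reason A/B tests need to run long enough to accumulate traffic.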
Not every A/B test was successful right away. On the set creation page, our main goal is to get the user to successfully publish a set. When we first launched the A/B test, we saw a 4% decrease in set publish rate. Having removed the borders on the input fields, we hypothesized that the new version looked less like a place for inputting content. We made some tweaks to guide the user, such as adding informative placeholders and autofocusing the first term to invite the user to start typing.
Through feedback from our beta testers, we also discovered that the new version introduced performance problems that slowed down the experience. By addressing these problems, we were able to get our set publish rate back on par with the old version.
In the process of rewriting more than half of our front-end code, monitoring both performance and engagement was critical to preserving the high-quality experience that makes millions of students and teachers love Quizlet.
Following these principles — invest upfront to make future development easier, ship regularly to get feedback early and often, and monitor performance and key engagement metrics — we were able to ship a beautiful and consistent new Quizlet in under a year, with little to no sacrifice in performance or user experience.
Although we didn’t convert every page of the site, we covered the entire core user experience, which accounts for 95% of traffic. Because of our decision to ship the redesign piecemeal, we were able to end the project at a satisfactory milestone and shift gears into higher-value projects.
Now, not only are we proud of the new look but we’re proud of our code, which has already sped up development and will continue to do so down the road. Plus, we learned a thing or two about our users and improved our monitoring and rollout processes along the way.
Want to ship beautiful and performant experiences that millions of people love? We’re hiring! quizlet.com/jobs
 Our last redesign was in 2013. It served us well, but it only covered certain parts of the site.
 A large amount of feedback we received was rooted in change aversion. We took measures to filter signal from noise, asking frustrated users for specifics on their use cases: “How can we make this design work for you?” We also monitored aggregate feedback over a period of time before making any drastic changes.
 To be frank, this process wasn’t in place when we started working on the redesign. It evolved with each feature we implemented, and each mistake we made along the way. For instance, we didn’t seek feedback on our study modes until we launched them fully. As a result, it wasn’t until we heard from a large number of passionate students that we discovered we’d made the font too big on Match.