Making Adobe XD — Redefining Beta

Vincent Hardy · Published in Thinking Design · 8 min read · Jun 7, 2016

Creating and crafting software can be a very satisfying endeavor. I love it and I am passionate about it. I am lucky to lead a talented engineering team for a new initiative at Adobe, called Adobe Experience Design CC (Adobe XD for short). One of our main goals as we were working out the engineering process for Adobe XD was to rethink the way we build and deliver new software — to ensure that we’re engaging in a conversation with the design community and creating a product that fits their wants and needs. In my experience, building this kind of process hasn’t always been easy.

At one time in my career, I really wanted to move on to do something else. Before I confirmed my love for creating software, I was wrestling with two big issues.

The first was that I had worked on too many software projects where it became really hard to do anything: bug fixing was painful, feature addition near impossible. Touching any line of code seemed akin to trying to fix or add steam and water pipes in a massive industrial complex, without a map of the whole system, and no way to validate the operation. Only the original developers (the heroes or the gurus) were brave enough to attempt anything of significance in the code base, because they had more context than others. Quality kept deteriorating and maintenance became prohibitively expensive. Not fun, not fulfilling, not interesting — and most importantly, very bad for the end users, who got a suboptimal solution to their problem.

Which brings me to my second issue. Many projects I worked on, despite stated intentions to meet the needs of users, rushed features — both on the engineering side (implemented too quickly) and on the product and design side (shipped without proper user validation). So while we might have shipped on schedule, what we shipped was a buggy implementation that our users did not really want in the first place.

Why does something so undesirable happen?

In software, you work with three key parameters: Quality, Time (of the release) and Scope (for features). In our team, we call this the ‘iron triangle’ (our version of the project management triangle). You can only constrain two of the three parameters in a given project. If you try to constrain the three, one will give. For example, if you constrain the date and the scope, quality will suffer. If you constrain quality and scope, then the delivery date will move out.
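The "constrain two, the third gives" relationship can be sketched as a toy model (purely illustrative — the numbers and formulas below are my own simplification, not anything from a real project plan):

```python
# Toy model of the "iron triangle": assume delivering one unit of scope at a
# given quality level costs proportional time. The formulas and numbers are
# illustrative only.

def required_time(scope: float, quality: float, velocity: float = 1.0) -> float:
    """Time needed to ship `scope` units of work at a given quality (0-1].

    Higher quality means more testing and review per unit of scope,
    so the same scope takes longer.
    """
    return scope * quality / velocity

def resulting_quality(scope: float, time: float, velocity: float = 1.0) -> float:
    """If both scope and the date are fixed, quality is what gives."""
    return min(1.0, time * velocity / scope)

# Constrain quality (1.0) and scope (30 units): the delivery date moves out.
print(required_time(scope=30, quality=1.0))   # 30.0 time units

# Constrain the date (20 units) and the same scope: quality suffers.
print(resulting_quality(scope=30, time=20))   # 0.666... — quality gives
```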

Too often, the desire to pile in features (increase scope) in a fixed amount of time makes quality (and performance) suffer badly.

Software’s “Iron Triangle”

With all that, why am I still working in software development then?

Well, of course, other people had run into these issues, realized there was a better way, and started developing methods and solutions. Really good ones.

Better engineering practices emerged in the industry, such as agile development or extreme programming, which focused on building quality into the engineering effort. The goal was to not compromise on the ‘Quality’ parameter of the iron triangle, deliver at a regular cadence and add features over time. So this gave us flexibility on scope: we could now develop more maintainable software because we took the time to build not only the features we needed, but also a comprehensive set of automated tests, guaranteeing that over time we would always know if a particular feature or piece of code kept working or not.
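A minimal sketch of what such an automated safety net looks like — the feature (grid snapping) and its tests are hypothetical examples of mine, not code from any real product:

```python
import unittest

# Hypothetical feature code: snapping a coordinate to a layout grid.
def snap_to_grid(value: float, grid: int = 8) -> int:
    """Snap a coordinate to the nearest multiple of the grid size."""
    if grid <= 0:
        raise ValueError("grid size must be positive")
    return round(value / grid) * grid

class SnapToGridTest(unittest.TestCase):
    """Regression tests: if a later change breaks snapping, these fail loudly,
    so anyone (not just the original 'gurus') can touch the code safely."""

    def test_snaps_to_nearest_gridline(self):
        self.assertEqual(snap_to_grid(11.0), 8)
        self.assertEqual(snap_to_grid(13.0), 16)

    def test_exact_gridline_is_unchanged(self):
        self.assertEqual(snap_to_grid(24.0), 24)

    def test_rejects_invalid_grid(self):
        with self.assertRaises(ValueError):
            snap_to_grid(10.0, grid=0)

# Run the suite programmatically (unittest.main() also works from the CLI).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SnapToGridTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The value of a suite like this is exactly the guarantee described above: at any point in time, a single run tells you whether every covered feature still works.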

I first implemented these practices in an open source project at Apache, and then in commercial projects. Adopting them changed my software engineering life: developing software became much, much better, and I started enjoying creating it again!

The second big change was a major evolution in the way we shepherd the development of our products from idea to implementation. With methodologies such as Lean Product Development, there is a much higher emphasis on discovering real user problems, making specific hypotheses we can validate (or prove wrong) through experiments, and using data analytics instead of individual opinions to drive decisions. This has led to better-crafted solutions and a better fit between solutions and the problems they are meant to address.

Methodologies such as agile software development and lean product development are not new, but they did change my professional life as I have learned and adopted aspects of them into my work, seeing their power to deliver more robust software systems that better meet users’ needs and expectations.

And now, I am happy to apply these methods to an ambitious project: Adobe XD.

A few months ago, our team released the first Beta (called Preview) of Adobe XD, a solution for designing, prototyping and sharing experiences for mobile applications, websites, and other screen-based experiences.

The first element of this solution is the Mac OS X desktop version of Adobe XD. You can see a screenshot of this application below, with its simple and focused user interface.

Adobe XD’s Design Workspace — kept simple to minimize user distraction

From its inception, we have engineered Adobe XD for speed and quality. By that, we mean that it had to enable designers to design at the speed of thought, without any friction along the way. We heard from designers that poor performance or quality was not only frustrating, but also interrupted their creative expression and ability to produce and iterate on a lot of work quickly. This is especially important as they are creating experiences for many screens (such as applications, web sites and wearables).

Going into the project, we knew it would take time to get it right. We also knew we wanted to get our solution in front of users quickly in order to learn and iterate. And we knew we wanted to deliver a high quality experience.

How did we go about achieving that?

Start on a small but robust base. Then iterate.

We have all experienced “Beta” software that had a lot of features, but where the quality, the performance or both were “not there”. The first beta tries to show off all the features that are going to be in the final product, but at the expense of ‘feature depth’ (because the craft and details that go into really completing a feature takes time) and at the expense of quality (because ‘there isn’t time for that’). This is followed by releases that try to improve feature depth and quality. But in the rush to get so many features in for the first beta, corners are often cut that make it hard to do this “catch-up” work later. Trying to stabilize the entire product after the fact without a strong automated test suite and quality checkpoints all along the way is very difficult and time-consuming. This approach is illustrated below.

Non-ideal scenario: Building shallow features early and growing quality debt

There are many reasons why this is less than ideal, and the main ones follow from why we release a “Beta” version in the first place: to share the vision for the tool we are building, and to share the specific features we are building. We want feedback on both to inform the product direction and the implementation, and thereby increase the product’s chances of success.

Getting too many features early at the expense of quality introduces a very high risk: the vision and product ambition can be tainted with a reputation of poor quality or performance, and the feedback can be centered on bugs and performance concerns rather than on the vision and features. So we lose an opportunity to learn the most important lessons our users can teach us.

To increase our chances to learn and be successful, we decided to start with a minimal feature set that:

  • Had depth, with the details that matter to users worked out
  • Had a solid design and implementation blueprint
  • Was built to our quality standards

And then, we add value over time as we get feedback from users to solidify the existing features and prioritize new ones, one step at a time.

This is illustrated in the diagram below.

Adobe XD — Approach to releasing Beta versions and working up to a 1.0

What this diagram shows is that from the very first release, we aimed to deliver an experience that, in terms of quality and performance, meets the same expectations a user would have for a production release. Going back to the ‘iron triangle’, what we do is ship on time and at production release quality by reducing scope as necessary. With this approach, every beta release has the high performance and quality of a full release — just with fewer features. We believe this allows users to more truly engage with the product and gives us more opportunities to learn.

To achieve this level of quality, we started with a small, solid code base and set a high bar for any feature we add to it. That high bar takes the form of a strict definition of done (DoD). Our DoD includes items such as code reviews, unit tests, and code coverage, to ensure these practices are applied systematically. We are committed to delivering updates to our users on a monthly cadence (which is fast for most desktop software). In the medium and long run, we believe we are delivering features at a higher speed, cadence and quality than if we spread our releases out more. And we get a new learning opportunity with each intermediate release, as illustrated above.
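A definition of done is most effective when it is checked mechanically rather than by memory. As a sketch only — the checklist items and the 80% coverage threshold below are hypothetical, not Adobe XD’s actual tooling — a gate like this could run in a CI pipeline before a change is accepted:

```python
# Hypothetical "definition of done" gate. The checklist and the 80% coverage
# threshold are illustrative assumptions, not a real team's actual DoD.

COVERAGE_THRESHOLD = 0.80

def change_meets_dod(code_reviewed: bool, tests_pass: bool, coverage: float) -> bool:
    """A change counts as 'done' only if every checklist item holds."""
    checklist = [
        code_reviewed,                   # at least one peer review completed
        tests_pass,                      # full automated test suite is green
        coverage >= COVERAGE_THRESHOLD,  # new code is adequately covered
    ]
    return all(checklist)

print(change_meets_dod(code_reviewed=True, tests_pass=True, coverage=0.92))  # True
print(change_meets_dod(code_reviewed=True, tests_pass=True, coverage=0.55))  # False
```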

So in summary, the approach gives us two major benefits:

  1. Rapid user validation. We are not assuming that the solution we build is spot-on. By releasing often, we get constant feedback and we can correct course when and where needed. For example, we have added tweaks to our tool’s color picker, pen tool and sharing features based on user validation.
  2. User engagement and iteration. After we release a new version, users tell us what they are still missing most in the tool. This allows us to adjust our prioritization for later releases to ensure we’re working on the most important things first. It can take a couple of releases before user feedback starts showing up in the product, because by the time we release a new version, the next one is already locked and loaded. This pipelined approach allows us to continuously deliver value to users while constantly improving the product.

Conclusion

We have received a great response from the community so far. We have been able to release three monthly updates of Adobe XD and the fourth one is on track. The feedback from users has been focused on features as opposed to quality or performance issues, and I am thrilled about this way of bringing beta software to our users.

I am looking forward to the learnings ahead of us as we continually improve our methods. We have new and exciting features in the making (see ‘what’s still to come’ in this article from Andrew, Adobe XD’s director of product management), and we’d love to hear your feedback. Test drive the XD preview for Mac OS X (Win10 is in the making too!) and please join other designers on UserVoice to report bugs, comment on features, or ask for new ones!


Vincent Hardy is the senior director of engineering for Adobe XD. He loves building teams focused on high-quality software, graphic design, and trail runs!