Building Project Parfait

with an eye on data and customer feedback

Joan Lafferty
6 min read · Jun 26, 2014

In the past, product teams would often only listen to the customers who spoke the loudest. Sometimes this worked; other times, it didn’t. In building our new service, Project Parfait, our team decided to try a different approach: we would still listen to customers, but we would also back up their claims and needs with the large amounts of data we collected.

We developed Project Parfait for Adobe using this new methodology. Instead of spending at least a year developing a product in a silo, we shared the service with customers for feedback throughout the process.

During the early discovery period, we found that most web comps were being designed in Photoshop. Given this reality, our goal in building the service was to keep designers working in Photoshop while easing the friction of getting their designs into production code. Project Parfait was built to let front-end developers view a PSD in the browser and gather the information they needed to turn that comp into working code: measurements and positions, colors, gradients, font information and assets.

Once a team was staffed, we faced our first challenge: how could we render a PSD in the browser with the fidelity of Photoshop? The team’s goal was to render the PSD identically to how it would display in Photoshop. To test this, we collected thousands of PSD files from customer contributions and the Photoshop team. These files were run through an image comparison tool to see how close our browser rendition came to the Photoshop view of the PSD. The initial results gave us a baseline of around 66% fidelity, meaning 34% of customers uploading their PSDs would likely be disappointed with the view they saw. This was unacceptable. In studying the failures, we realized that a lot of work was needed to support blend modes, so that is where we focused much of the team’s efforts. After that work was completed, we were up over 95% fidelity!
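The post doesn’t describe the comparison tool itself; as a rough illustration of the idea, here is a minimal Python sketch (all names hypothetical, not the team’s actual code) that scores how closely a rendered image matches a reference, assuming both have been decoded to flat lists of RGB tuples:

```python
def fidelity_score(reference, rendered, tolerance=2):
    """Fraction of pixels in `rendered` that match `reference`
    within a small per-channel tolerance. Both images are flat
    lists of (R, G, B) tuples of equal length."""
    if len(reference) != len(rendered):
        raise ValueError("images must have the same dimensions")
    matches = sum(
        1 for ref, out in zip(reference, rendered)
        if all(abs(r - o) <= tolerance for r, o in zip(ref, out))
    )
    return matches / len(reference)


def below_threshold(scores, threshold=0.95):
    """Corpus-level check: names of PSDs whose browser rendition
    scores under the target fidelity threshold."""
    return [name for name, s in scores.items() if s < threshold]
```

Running a check like this over thousands of files is what yields a corpus-wide number such as “66% fidelity” and a list of failing PSDs to study.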

While building, we needed to keep validating our assumptions about how people would work and the type of PSDs they would upload to the service. To get more feedback on the tool, we ran several usability tests on UserTesting.com, which allowed us to find our target customers and test workflows like taking measurements and extracting CSS gradients. These tests answered many of our questions about whether customers understood our value proposition and whether the tool was easy to use.

Eventually, we knew we had to get more users. We wanted this service to be measured on data and customer sentiment. During development, we added anonymous instrumentation that helped the team understand how people were using our tool and where they were failing. We also included in-app chat (powered by Olark) so that people could ask the team questions directly.
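The post doesn’t say how the instrumentation was implemented; one common pattern for keeping it anonymous (a sketch with hypothetical names, not our actual code) is to log events keyed by a one-way hash of the user identifier, so usage can be grouped per user without storing who the user is:

```python
import hashlib
import json
import time


def anonymize(user_id, salt="rotate-this-salt"):
    # One-way hash: events from the same user can be grouped,
    # but the original identifier is never stored.
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:12]


def make_event(user_id, action, **props):
    # One anonymous instrumentation record, e.g.
    # make_event("u123", "extract_asset", format="png")
    return json.dumps({
        "who": anonymize(user_id),
        "action": action,
        "props": props,
        "ts": int(time.time()),
    })
```

Aggregating records like these is what lets a team ask questions such as “how many people who extracted an asset ever opened the Assets tab?” without tracking individuals.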

In April of this year, we were still experimenting with workflows and knew features were missing, but we were ready for an early launch.

In the first week, users uploaded thousands of PSDs to our service and initiated hundreds of chat sessions. The chat has proven invaluable: chatting directly with the team gave customers immediate answers to their questions and an easy way to report issues and make feature requests. Conversations that could have taken days of back-and-forth email were now resolved in minutes. While chatting, several users expected us to be robots or tech-support people far removed from the team, only to discover they were talking with core team members. I laughed out loud rereading some chats like this:

We were not robots listening to customer feedback.

Thousands of data points from the uploaded PSDs, along with qualitative feedback, continue to help us prioritize which features to work on next. Some questions we’ve answered include: “What layer effects are used most often?”, “What types of blend modes are people using?”, “What is the layer count of the files being uploaded?”

Layer effects breakdown from the last 7 days of PSDs uploaded to Project Parfait
Blend modes used in PSDs uploaded to Project Parfait in the last 7 days
Layer count for PSDs uploaded to Project Parfait over a set period of time

This last chart helped us establish the limits we set for uploads to our service. At launch, we chose an arbitrary layer limit of 750; users who attempted to upload files over that limit received an upload error. After studying a week’s worth of data, we discovered that we were shutting out as many as 6% of our customers. We put some effort into improving the experience for PSDs with higher layer counts and, as a result, raised the limit to 1,500 layers, ensuring that 99% of users could successfully upload a file to Project Parfait.
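The post doesn’t show the calculation behind these numbers; a sketch of how a limit might be derived from observed layer counts (hypothetical helper names), assuming the goal is to admit 99% of uploads:

```python
import math


def layer_limit_for_coverage(layer_counts, coverage=0.99):
    """Smallest observed layer count such that at least `coverage`
    of uploads fall at or below it."""
    counts = sorted(layer_counts)
    idx = max(0, math.ceil(len(counts) * coverage) - 1)
    return counts[idx]


def share_rejected(layer_counts, limit):
    """Fraction of uploads a given limit would reject."""
    return sum(1 for c in layer_counts if c > limit) / len(layer_counts)
```

With a week of real upload data, a check like `share_rejected` is what would surface that an arbitrary limit was rejecting around 6% of customers.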

Besides the data we get from processing thousands of PSD files, we have also been studying anonymous data about what people are doing in our application. From this, we found that only a small percentage of people were actually downloading assets locally.

Where did we go wrong with asset extraction?

Our theory was that assets were not discoverable after the initial extraction. In Project Parfait, when a user chooses a format to download, the extracted asset appears in the Assets tab. Our data showed that people were not going to that tab, so in a recent update we added a notification telling them where to find the asset they downloaded.

After extracting an asset, people now see a notification, “Look for it in the Assets tab,” so they can find and download their assets.

A week after releasing this change, the percentage of users who extracted and then downloaded assets jumped from 46% to 87%. The change clearly improved the discoverability of asset downloads.

(Update) As of late August 2014, Project Parfait has grown up and become an official service in Adobe’s Creative Cloud called “Extract,” which you can still use for free. This lets people take advantage of Creative Cloud collaboration features such as file sharing, commenting and versioning.

We always hoped that the final home for our service would be in the Creative Cloud; however, we found it easier to experiment and collect data and customer feedback in the standalone service known today as “Project Parfait.” Using data combined with customer feedback has made decision-making on our team a lot easier. We have identified customer problems sooner and responded with fixes and new features within weeks, rather than months (or years!).

We hope Adobe will continue with this model so that we, as a company, can move at the same rapid pace as our rich and growing community.

Keep giving us feedback, and we’ll keep building.

Project Parfait: https://projectparfait.adobe.com/ (experimental)

Extract: http://creative.adobe.com/files (official service in Adobe’s Creative Cloud)

Twitter: @projectparfait (old), @adobewebcc (current)

Forums for questions and suggestions: https://forums.adobe.com/community/project_parfait (old)

https://forums.adobe.com/community/extract-for-psd (current)

Adobe’s Creative Cloud: http://creative.adobe.com/

Special thanks to Divya Manian, Bruce Bowman, Justin Rainwater, Andrew Shorten and Elaine Finnell for contributing to this article.


Joan Lafferty

Product Leader. Optimist. Engineer. Group product manager at Adobe.