Case study: Building AI tools for UX designers
Five lessons learned from an industry case study
As researchers in HCI and AI, we collaborated with an IT consulting and software development company in Munich, Germany, to investigate how to support their UI/UX design process with an AI tool, focusing on the early stages of that process. In particular, we targeted participatory design workshops, in which clients get involved in generating UI ideas and sketches.
This article gives a short overview of the project and shares five practical lessons learned.
The Research Approach
We followed a user-centred design approach for our AI tool: We observed the participatory design workshops at our industry partner, interviewed the involved designers and consultants, and mapped out pain points to see what we might be able to address with a new (AI) tool.
Equipped with these insights, and through iteration and user testing as shown in the figure, we arrived at the following concept and prototype.
The Concept and Prototype
Our concept is called “paper2wire” and leverages computer vision and machine learning to detect GUI elements in hand-drawn UI sketches on paper and then automatically create the corresponding digital wireframes.
Concretely, our prototype used the Microsoft Custom Vision API for detection and was implemented as a Sketch plugin to allow designers to create and edit such wireframes easily within their main tool.
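To make the detection step more concrete, here is a minimal sketch (in TypeScript) of how a prototype like ours might query a Custom Vision object detection endpoint. The URL layout follows the v3.0 prediction API, but the region, project ID, iteration name, and key are placeholders, and this is an illustration rather than our exact implementation.

```typescript
// Minimal sketch: send a photo of a paper UI sketch to a Custom Vision object
// detection endpoint and collect the detected GUI elements.
// The endpoint URL, project ID, iteration name, and key below are placeholders.
import { readFile } from "node:fs/promises";

interface Detection {
  tagName: string;      // e.g. "button", "text_field", "image" (illustrative labels)
  probability: number;  // confidence in [0, 1]
  boundingBox: { left: number; top: number; width: number; height: number }; // normalised [0, 1]
}

const PREDICTION_URL =
  "https://<region>.api.cognitive.microsoft.com/customvision/v3.0/Prediction/" +
  "<projectId>/detect/iterations/<iterationName>/image"; // placeholder values

async function detectElements(photoPath: string, minConfidence = 0.5): Promise<Detection[]> {
  const imageBytes = await readFile(photoPath);
  const response = await fetch(PREDICTION_URL, {
    method: "POST",
    headers: {
      "Prediction-Key": "<your-prediction-key>", // placeholder
      "Content-Type": "application/octet-stream",
    },
    body: imageBytes,
  });
  const result = (await response.json()) as { predictions: Detection[] };
  // Keep only reasonably confident detections for wireframe creation.
  return result.predictions.filter((p) => p.probability >= minConfidence);
}
```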
The workflow looks like this:
- Take a photo of the paper UI sketch.
- Load it into the wireframing tool Sketch.
- Our plugin parses the photo and creates the wireframe in Sketch (illustrated in the snippet after this list).
- Modify the resulting wireframe, use it in further design steps, etc.
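For the third step, the plugin side could look roughly like the following sketch, assuming the Sketch JavaScript plugin API (the `sketch` module) and the `Detection` shape from the detection sketch above. A plain named rectangle stands in for the matching wireframe element here; this is not our exact plugin code.

```typescript
// Minimal sketch of the plugin side: create one editable layer per detected
// GUI element on the current Sketch page.
const sketch = require("sketch");

interface Detection {
  tagName: string;
  boundingBox: { left: number; top: number; width: number; height: number }; // normalised [0, 1]
}

function createWireframe(detections: Detection[], photoWidth: number, photoHeight: number): void {
  const page = sketch.getSelectedDocument().selectedPage;

  for (const d of detections) {
    // Convert normalised detection coordinates to positions on the Sketch canvas.
    const frame = new sketch.Rectangle(
      d.boundingBox.left * photoWidth,
      d.boundingBox.top * photoHeight,
      d.boundingBox.width * photoWidth,
      d.boundingBox.height * photoHeight
    );
    // A plain rectangle as a placeholder for the detected element type; designers
    // can then freely edit or replace it within Sketch.
    new sketch.ShapePath({
      parent: page,
      frame,
      shapeType: sketch.ShapePath.ShapeType.Rectangle,
      name: d.tagName,
    });
  }
}
```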
A short video (24 s) shows the prototype in action.
The 5 Lessons Learned
We deployed our prototype in a user study with the designers and consultants. Here are the five key findings.
1. Use AI as “glue” between process steps
The designers valued the targeted activity (participatory design workshops) for communication and for building relationships with clients. This made the workshops a poor initial target for an AI tool, since automation might take that value away.
We learned that we could leverage AI to facilitate integrating the workshops into the whole design process. In this way, AI supports the design process not by automating a key activity directly, but by reducing repetitive or “bookkeeping” follow-up work that arises from the activity (in this case: translating the workshop outcomes into a digital format for further use).
2. Respect designers’ knowledge and tools
Our study revealed that AI tools should not aim to skip steps of the design process entirely: The designers especially liked that they could still fully edit the generated wireframe directly within software they already knew. This would have been impossible, for example, had we instead aimed to output UI code directly.
Of course, introducing AI tools might also change the design process itself, for example by making it easier to switch to digital prototypes earlier. Investigating such potential influences requires longer-term studies than the one presented here.
3. Support “thinking by doing”
We learned that seemingly repetitive work can play unexpected roles: For example, one designer valued manually converting the sketches to digital wireframes as a stimulus for reflecting on the design.
Therefore, we should be careful not to impose AI tools without offering a choice, even if they reach the output faster: Designers might still prefer to carry out some steps manually, as doing so can be part of a creative process that yields more than just the end result.
4. Invest in process understanding before AI accuracy
Surprisingly, we learned that AI tools can be “too accurate”: Our first prototype placed GUI elements at the exact pixel locations detected in the hand-drawn sketch, yet this was neither what designers expected nor what they needed, because it led to imperfectly aligned elements.
Here, our first prototype lacked an understanding of the design process and violated an implicit assumption, namely that the step from paper to digital is exactly where element alignment starts to matter. Instead of optimising for pure detection accuracy, our tool needed to respect these assumptions and expectations. Concretely, we revised our prototype to snap the sketched element locations to an assumed layout grid.
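A minimal sketch of that snapping logic is shown below; the 8 pt grid is an example value, not necessarily the grid our prototype used.

```typescript
// Snap a detected element's position and size to an assumed layout grid,
// instead of using the raw pixel locations from the detection.
const GRID_SIZE = 8; // assumed 8 pt grid; the actual grid size is a design decision

function snapToGrid(value: number, gridSize: number = GRID_SIZE): number {
  return Math.round(value / gridSize) * gridSize;
}

function snapFrame(frame: { x: number; y: number; width: number; height: number }) {
  return {
    x: snapToGrid(frame.x),
    y: snapToGrid(frame.y),
    // Keep elements at least one grid unit large after snapping.
    width: Math.max(GRID_SIZE, snapToGrid(frame.width)),
    height: Math.max(GRID_SIZE, snapToGrid(frame.height)),
  };
}
```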
5. Consider impact on consistency
Based on the feedback of several designers in our study, we learned that our AI tool might help them adhere to standards, because it consistently chooses digital elements from a fixed set.
Interestingly, this was not something we had in mind when creating our prototype, and neither did it appear in the initial user research. It only surfaced when designers tried out the prototype hands-on.
However, AI tools might also have a negative impact on consistency, for example through errors such as incorrectly detected element types. This needs further study, in particular in downstream parts of the design process.
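To illustrate the “fixed set” mechanism, here is a small hypothetical mapping from detected element types to a fixed library of wireframe components; the labels and component names are assumptions for illustration, not part of our prototype.

```typescript
// Illustrative mapping from detection labels to a fixed set of wireframe components.
// Because every "button" detection resolves to the same component, the generated
// wireframes stay consistent with the team's standards.
const COMPONENT_LIBRARY: Record<string, string> = {
  button: "Wireframe/Button",
  text_field: "Wireframe/Text Field",
  image: "Wireframe/Image Placeholder",
  label: "Wireframe/Label",
};

function componentFor(tagName: string): string {
  // Fall back to a generic placeholder for unknown or misdetected element types.
  return COMPONENT_LIBRARY[tagName] ?? "Wireframe/Generic Box";
}
```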
Conclusion
Despite the current AI hype and its many successful applications, AI seems to remain somewhat underexplored for supporting the design of digital applications and their UI/UX design processes. While the experiences here might not directly transfer to your own cases and contexts, we hope to provide an illustrative point of reference for the vision that future AI tools for UI/UX design are developed in close cooperation with practitioners.
For more details, see our paper on this project (which received an award at MuC’20, the largest German HCI conference).