UIzard’s AI Designer — review and perspective

Asbjørn Mejlvang
Published in Bootcamp
3 min read · Jun 8, 2023

Reflections on the recently released AI Designer by uizard.io

Last month, I wrote about ‘5 ways AI could transform the role of UI designers’. One of the five ways I highlighted was AI-generated designs. Almost immediately after pressing publish, I received a beta invite to UIzard.io’s new AI Designer, so the future arrived within a day. But is it any good? I have had two weeks to play around with it, and below I share my thoughts. In short: I won’t be adding AI Designer to my daily toolkit as a designer, at least not yet.

What is this AI designer?

Here is how UIzard puts it:

“Like ChatGPT for UX/UI design… Generate multi-screen mockups with simple text.” — UIzard

The user writes a few lines describing the concept and the desired style, and the AI returns a number of screens.

Review of the beta

Fun, but not yet useful. It is magical to write something and see it come to life in 30 seconds. But consider the definitions of resolution and fidelity given by Houde & Hill (1997):

We interpret resolution to mean “amount of detail”, and fidelity to mean “closeness to the eventual design”. — Houde & Hill (1997)

The output of the AI Designer is of high visual resolution, but its fidelity to the intended concept is often very low and quite random.

It may look impressive at first glance, but upon closer inspection, it appears to be random components put together. The time it takes to generate something somewhat useful would be better spent sketching, doing desk research, or styling a template.

3 Potential updates

Reusing existing design

Instead of describing a style with text, designers should be able to use an existing component library as a style input. This would allow designers to use AI Designer to generate new flows that match an existing look and feel.

Transparency & References

It would be super useful if the designs were annotated. The AI Designer could explain why it chose radio buttons over checkboxes, citing a post from NNGroup or a similar source.

More Control & Iterations

Currently, you prompt once and get one result back. Examples of more control can be seen in Midjourney and Adobe. Midjourney returns four images, which lets us pick a direction. Adobe’s new Generative Fill shows how AI can be applied to specific sections of a design.

The AI Designer is not quite ready to be an everyday tool, but keep on innovating, UIzard. I would like a future where I can do digital design with my favourite pen and paper while chatting to an AI.

Asbjørn Mejlvang
Bootcamp

Designer interested in how tech will change the creative process.