Machine Learning and Design Tools

Rune Madsen
Published in intelligentdesign
Oct 14, 2016 · 3 min read

This semester, we started a new research group at NYU’s Interactive Telecommunications Program. The Intelligent Design group — led by Patrick Hebron and Rune Madsen with the assistance of Eve Weinberg — is devoted to exploring how machine learning will transform the field of design. We believe that machine learning will change how people interact with design tools, programming languages, and operating systems. Conversely, we believe that design work is needed to transform machine learning — to take it past its computer science roots and into the more fertile ground of creative computing.

The goal of this group is to develop design frameworks, open-source software tools, writing, and other educational materials in order to jumpstart the conversation around the emerging possibilities at the intersection of design and machine learning. Over the course of the last three weeks, we have been working to define both the focus areas of this research and the methodology for our experiments. This post serves as a short summary of some of the early findings and an outline of our immediate next steps.

Research Topics

In discussing the overarching theme and its implications for current design tools, we have arrived at two broad categories.

Topic 1: Design by Description

Current design tools place a heavy burden on the user. Users must not only learn the specifics of their craft (graphic designers must master form, color, typography, etc.), but also know how these ideas are implemented in the digital design tool they’re trying to use. This makes it even harder to create something from a blank canvas, especially for a novice user. What would it look like if the design process were instead an interactive dialogue between designer and machine? What if designers could produce designs through verbal descriptions or textual declarations, and iterate their way to a final design by responding to what the software creates? How can we think of the designer as an art director who sets up the basic parameters and intent of the design, letting the computer suggest designs based on these wishes? How can machine learning help solve both objective engineering tasks and highly subjective design tasks?
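To make these questions a little more concrete, here is a minimal sketch of what such a designer-machine dialogue could look like in code. The `DesignModel` class and its `propose` method are entirely hypothetical, invented to illustrate the interaction pattern rather than to describe any existing system:

```python
# Hypothetical sketch of "design by description". DesignModel and
# propose() are invented for illustration; they do not refer to
# any existing library or tool.

class DesignModel:
    """Stand-in for a learned text-to-design model."""

    def propose(self, description, n=3):
        # A real model would return n candidate layouts conditioned
        # on the description; here we return placeholder strings.
        return [f"<layout {i}: {description}>" for i in range(n)]


model = DesignModel()

# The designer acts as art director, declaring intent in prose...
for layout in model.propose("a poster with a bold serif headline and muted colors"):
    print(layout)

# ...then iterates by refining the description in response to
# what the software produced.
for layout in model.propose("the same poster, but with more whitespace"):
    print(layout)
```

The point of the sketch is the loop, not the model: the designer never manipulates a canvas directly, only the description of what the design should be.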

Topic 2: Adaptive User Interfaces

Current design tools have buttons and menus that stay in the same place regardless of how much they’re used. They are static. What if our tools could adapt their user interfaces based on how they’re being used? What if the software could automatically create new smart functionality when it detects recurring use of certain patterns? How can designers instrument and personalize tools by recording physical or digital gestures, and map these to custom functionality? How can we create these design tools without alienating users, and without just inventing a smarter, more annoying Microsoft Clippy?
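As a deliberately simplified illustration of what “detecting recurring use” could mean, consider counting repeated command sequences in a tool’s action log. Everything here — the log format, the window size, the threshold — is an assumption made for the sake of the sketch:

```python
# Minimal sketch: find recurring command sequences in a design
# tool's action log via n-gram counting. The log format and the
# thresholds are assumptions for illustration only.

from collections import Counter

actions = [
    "select", "copy", "paste", "nudge_right",
    "select", "copy", "paste", "nudge_right",
    "select", "copy", "paste", "nudge_right",
    "undo",
]

def recurring_patterns(log, n=4, min_count=3):
    """Return command n-grams that repeat at least min_count times."""
    ngrams = Counter(tuple(log[i:i + n]) for i in range(len(log) - n + 1))
    return [seq for seq, count in ngrams.items() if count >= min_count]

for pattern in recurring_patterns(actions):
    # A real tool might offer to bind this sequence to a single
    # gesture or button, rather than interrupting the user.
    print("Candidate macro:", " > ".join(pattern))
```

A production system would need far more nuance — when to surface a suggestion, and how to let the user dismiss it — which is exactly where the Clippy question above becomes a design problem rather than an engineering one.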

Methodology

The research projects will be prototyped in five steps.

1. Project Ideation

First, students will work to define projects based on the research topics above. This will happen through in-group discussions, with smaller research groups forming around specific ideas. Although the research topics are deliberately broad, the project ideas will need to be domain-specific.

2. Interaction Design

Next, students will design paper prototypes to visualize how users would interact with the software. These paper prototypes will be discussed and reviewed in research group meetings.

3. Mechanical Turk Testing

Next, students will test these ideas on real users without implementing the machine learning algorithms. The basic assumptions of the interaction method will be tested either by having users interact with paper prototypes, or through screen-based test software secretly controlled by group members — a Wizard-of-Oz setup, in the spirit of the original Mechanical Turk. This will reveal common pitfalls and user needs before any algorithm is written.

4. Software Prototyping

Next, students will work to produce open-source software prototypes to be released on GitHub.

5. Documentation

During the project development phase, students will write about their progress and findings on this blog. The final software prototypes will include usage instructions on GitHub.

Rune Madsen

Computational artist. Former @nytimes and @oreillymedia. Currently writing https://programmingdesignsystems.com while teaching @ITP_NYU