Imagining new ways of living

Together with IKEA and SPACE10, we explored how tomorrow’s technologies can redefine how we live at home.

Bakken & Bæck
8 min read · Sep 11, 2020


IKEA and SPACE10 invited us to imagine new ways of living through a collection of digital experiments. Together with Random Studio, FIELD, ManvsMachine, Philip Pries Henningsen, Set Snail, Norgram, Timi Oyedeji, Alonso Holmes & Strømlin, and CIRG, we set out to discover how we can work with advancements in technology to improve our homes.

Besides this, we collaborated with SPACE10 to define the overall creative communication and curation strategy for all the Everyday Experiments, and to tie them together across various IKEA platforms in a playful manner.

Introducing extraordinary possibilities

With our five experiments, we wanted to explore the role new technologies could play in our everyday living spaces. The experiments shift constantly between the everydayness of ordinary life and the extraordinary possibilities of technological development, like blinds that adapt to the rhythm of the outside world, or a perfect seat designed in collaboration with an algorithm.

We worked at the intersection of front-end development, machine learning, and cross-reality: our developers jumped straight into prototyping while our designers firmed up the storytelling. Even though our five experiments were meant to remain conceptual, never to be turned into physical products, we prototyped and built some of them.

Experiment 001: Shelve it

What if you could design the perfect shelf simply by looking at a wall?

Finding a new piece of furniture that integrates organically with your home is a burdensome task. That organic fit tends to get lost in choosing and combining standard pieces, and the potential flexibility of modular products often goes unused.

Shelve It is a technical prototype for an augmented reality (AR) app that would scan the topology of your home to reveal new ways for shelves to fit your space. The LiDAR depth sensor available in the latest generation of iPads allows us to understand where a wall is, how it connects to the floor, and whether objects are standing in front of it.

An array of rays detects available areas, which are automatically filled with the essential modules of IKEA’s IVAR shelf system to give you an idea of how it would look.

The app creates custom-fitted variations of the modular IVAR system that are unique to your place, for instance smoothly fitting the shelf around existing furniture or obstacles.
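
As a rough illustration of the fill step, here is a minimal Python sketch, assuming the AR layer has already turned the ray scan into a list of free wall spans in metres. The module widths and the greedy packing strategy are our illustrative assumptions, not the app's actual logic.

```python
# Greedy packing of (assumed) IVAR section widths into free wall spans.
IVAR_WIDTHS = [0.89, 0.49]  # common section widths in metres (assumed)

def fill_span(span_width: float) -> list[float]:
    """Pack sections into one free wall span, widest first."""
    sections, remaining = [], span_width
    for width in sorted(IVAR_WIDTHS, reverse=True):
        while remaining >= width:
            sections.append(width)
            remaining -= width
    return sections

# Example: two free spans found between a doorway, a sofa, and a corner.
for span in [2.4, 1.1]:
    print(f"{span} m -> {fill_span(span)}")
```

In the real prototype, the spans would come straight from the LiDAR scan, and the layout would also respect obstacles in the third dimension.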

Experiment 002: Techno Carpenter

What if algorithms could help design your next favorite chair?

With Techno Carpenter, we experimented with a novel way of interacting with and experiencing highly complex technology. By feeding an algorithm more than 6,000 3D models of existing chairs, we trained it to recognise, read, and analyse a chair based on its features. The result was a technical prototype for a virtual reality (VR) app.

Hand and finger movements shape the geometry of the chair in real time.

We started with a bold vision of a seamless interplay and co-creation between humans and a machine learning algorithm, a vision we constantly challenged and altered throughout the development of the experiment.

When you move your hand in the VR environment to shape your dream chair, the computer generates a new model by matching your gesture to coordinates in the latent chair-space. Every slight movement of your palm and fingers is interpreted by the machine learning model and translated into a new iteration of the chair.
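
A conceptual sketch of that loop might look like the following, assuming a generative model (for instance an autoencoder trained on the 6,000 chairs) whose decoder turns a latent vector into chair geometry. The hand-pose mapping, the latent dimensionality, and the stand-in decoder are all hypothetical.

```python
import numpy as np

LATENT_DIM = 64  # size of the latent chair-space (assumed)

def hand_pose_to_latent(palm_position, finger_spread):
    """Map a simplified hand pose onto coordinates in latent chair-space."""
    rng = np.random.default_rng(0)
    projection = rng.normal(size=(4, LATENT_DIM))  # fixed random projection (assumed)
    features = np.array([*palm_position, finger_spread])  # x, y, z, spread
    return features @ projection

def decode_chair(latent):
    """Stand-in for the trained decoder: here we just reshape the latent
    into a toy 8x8 height field so the loop runs end to end."""
    return latent.reshape(8, 8)

# Each frame: read the tracked hand, map it to the latent space, re-decode.
latent = hand_pose_to_latent(palm_position=(0.1, 1.2, 0.4), finger_spread=0.8)
chair_geometry = decode_chair(latent)
```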

This is where the co-creation between humans and algorithms begins to emerge. The experiment invites us to collaborate with machines: a virtual reality environment where you can interact with and shape your very own machine learning-generated chair. In your quest for the perfect seat, this app would save you time by letting you design your own.

Techno Carpenter was designed with simplicity and playfulness in mind. We made the interface as plain as possible, allowing anybody to use it without running into major limitations.

Experiment 003: Room Shuffle

What if you could see the true potential of your space?

While some people find it easy to decide whether an environment fits their needs, it is generally more difficult and time-consuming to change an environment for the better. Suggesting significant improvements takes proper interior design training and up-to-date knowledge. Most people don't have access to that expertise, so we wanted to create an assistant that could help them furnish their space, see new possibilities, and unlock their creativity.

Responsive furniture placement

To see the potential in every corner and every room, we trained Room Shuffle on a collection of expert-created floor plans. It learned about different room types and familiar patterns of furniture, for instance that a sofa usually goes with a coffee table and that a dining table usually belongs with some chairs. We also trained it on rules for the space around furniture and for how furniture is positioned in a room.

The algorithm tries to create a meaningful layout by respecting factors like the space between furniture and its position relative to walls, windows, and doors.
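
To give a feel for the kind of rules at play, here is a toy Python sketch in which candidate placements are scored against pairing and clearance rules and the best layout wins. The pairings, distances, and penalty weights are illustrative assumptions, not the trained model's parameters.

```python
import itertools
import math

PAIRINGS = {("sofa", "coffee table"): 0.8, ("dining table", "chair"): 1.2}  # ideal distances (assumed)
MIN_CLEARANCE = 0.6  # metres of walking space between pieces (assumed)

def score_layout(layout: dict[str, tuple[float, float]]) -> float:
    score = 0.0
    for (a, b), ideal in PAIRINGS.items():
        if a in layout and b in layout:
            # Reward pairs that sit near their ideal distance apart.
            score -= abs(math.dist(layout[a], layout[b]) - ideal)
    for pos_a, pos_b in itertools.combinations(layout.values(), 2):
        if math.dist(pos_a, pos_b) < MIN_CLEARANCE:
            score -= 5.0  # heavy penalty for blocked walkways
    return score

candidates = [
    {"sofa": (1.0, 1.0), "coffee table": (1.0, 1.8), "chair": (3.0, 2.0)},
    {"sofa": (1.0, 1.0), "coffee table": (1.1, 1.1), "chair": (3.0, 2.0)},
]
print(max(candidates, key=score_layout))  # the first layout wins
```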

You can use Room Shuffle on an empty room or on one with existing furniture, giving it input on doors, windows, and lights. You can choose the furniture to be placed, or leave it all up to the algorithm.

And if you want to see the machine-generated furnishing, you can view it live in 3D directly in your space on your phone or tablet — allowing you to see how the suggestions mesh with your existing space.

Experiment 004: Spatial Embodiment

What if virtual reality allowed us to be more playful with our furniture?

Unlike the two-dimensional interfaces of our phones, tablets, and personal computers, the interface of immersive environments allows us to be physically “present”.

Here, we can walk, turn, and look around, using our whole bodies while doing so. Human gestures, however, are free forms of expression, which makes them fairly subjective. Gestures are an essential part of human communication: from wild gesticulations to subtle movements that give important clues to a message, gestures are powerful tools we use to make ourselves understood.

Traditionally, computers are machines that don't work with much context; they don't understand our rich communication as human beings. That makes training a computer to track gestures, recognise the intention they express, and perform related tasks a pretty messy affair.

Testing the hand tracking function in the Unity environment

To explore how gestures and voices work together in immersive environments, we organised an interdisciplinary workshop with a product designer, a linguist, a front-end developer, and a cross reality (XR) expert. It quickly became clear that the intuitive and straightforward way of interacting felt restricted by the existing hardware and its current implementation.

That's when we decided to create Spatial Embodiment, a technical prototype for an interactive system that would invite you to communicate with the space around you as naturally as you would with another person.

Workshop to identify pain points of existing VR experiences and opportunities for hand tracking.

In this virtual reality (VR) experience, you can use gestures in combination with your voice to interact and communicate with your home and its objects naturally and intuitively. In the background, our system tracks the direction of your attention and tries to create a meaningful context and action based on what you say and what you do with your hand — making the virtual environment feel more familiar, simple, responsive and accessible.
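
A much-simplified sketch of that fusion step: combine where you are looking, what you say, and what your hands are doing into a single resolved command. The intents, gestures, and object names below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    attention_target: str  # object hit by the (assumed) attention ray
    speech_intent: str     # output of an (assumed) speech intent classifier
    gesture: str           # output of the hand-tracking classifier

def resolve_action(frame: Frame) -> str:
    # Speech supplies the verb, attention supplies the object,
    # and the gesture disambiguates or modifies the command.
    if frame.speech_intent == "move" and frame.gesture == "point":
        return f"move {frame.attention_target} to the pointed location"
    if frame.speech_intent == "dim" and frame.attention_target == "lamp":
        return "dim the lamp"
    return "no action"

print(resolve_action(Frame("armchair", "move", "point")))
```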

Experiment 005: Home Applications

What if you could teach an old home a few new tricks?

Home Applications comprises three different experiments that use data from publicly available APIs, such as weather services, NASA image databases, and air quality indexes, to enhance the features of IKEA smart devices like the Trådfri smart lamp and blinds.

Each proof of concept takes the form of a web-based user interface that serves as the controller: it queries the data, then sends commands to the hardware, such as setting the light hue or the position of the blinds. One such loop is sketched after the examples below.

NASA's image of the day aligns the mood in your room with the colours of outer space.
The air pollution index is translated to the smart light, making it tangible.
The automated smart blinds open gradually over time, supporting your natural rhythm.
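
As a rough sketch of that loop, the snippet below reads a public data point and translates it into a command for the smart light. fetch_aqi and set_light_hue are hypothetical stand-ins for the public API call and the smart-device integration.

```python
def fetch_aqi() -> int:
    """Stand-in for a call to a public air quality API."""
    return 72  # assumed current air quality index

def aqi_to_hue(aqi: int) -> float:
    """Map AQI 0-300 onto a green-to-red hue (120 degrees down to 0)."""
    clamped = max(0, min(aqi, 300))
    return 120.0 * (1 - clamped / 300)

def set_light_hue(hue: float) -> None:
    """Stand-in for the command sent to the smart lamp."""
    print(f"setting lamp hue to {hue:.0f} degrees")

set_light_hue(aqi_to_hue(fetch_aqi()))
```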

We decided that all three experiments within Home Applications should use conventional, "standard" tooling, so that as many people as possible could hop onto something immediately familiar. That's why we built the web user interfaces with the static site generator Gatsby (React), with each experiment pulling in open-source dependencies and libraries as needed.

Curating for all possibilities

In addition to making the experience, we were responsible for all of the creative communication, which included copy and visual treatment, social media packaging, and an explainer video. When we outlined the overall communication strategy, we kept the barrier for experimentation low, ensuring that the outcomes of the experiments were easy for the many to understand, share, and identify with.

We also made sure to present the experiments as a coherent whole without polishing away their unique qualities, defining a communication and curation strategy that can be used by all participating studios, now and in the future.

To explore all the experiments and imagine new ways of living, visit everydayexperiments.com


Bakken & Bæck

We’re Bakken & Bæck, a digital studio based in Oslo, Bonn, Amsterdam and London. We define, design and develop all things digital.