At Augmented World Expo (AWE) this year, I wanted to issue a challenge to inspire all of us. I believe now is the time to look beyond the technology of AR/AI and invest heavily in the underlying design framework and best practices that will shape humanity as a whole.
AR with AI support is early but growing fast. We’re at that tipping point where the software, hardware, and infrastructure are poised for mainstream acceptance. But then what happens? In this talk, I start to answer this question, but it’s going to take a village, as they say…
I want to start by asking you to imagine a happy future, a place where technology is as precious as reality.
Today, I don’t feel so good about how much time I spend in front of my screens. And fortunately, most tech companies feel the same way. They have invested a lot into speech, vision, gestures, and other sensing technologies.
We all want better solutions for a happier future. So now is the time to invest heavily in designing the relationship we want to have with conscious machines: machines that reflect our chosen values and support a healthy society.
So for the next 15 minutes, I’m going to share how Adobe has been building this design foundation for reality.
I’m Silka from Adobe. Eight years ago, Matt Miesnieks and I founded Dekko, an AR startup. We built vision-sensing technology (the ARCore and ARKit of their day) and design solutions we felt would make for a better future.
Today we have vision-sensing technology (AR) that is robust enough to build valuable, responsive tools and interactions.
1. AR Technology Foundation
Here is how the technology and design foundations stack up…
- Tracking: Lets you place digital objects in the world.
- Multiplayer: The AR Cloud acts as the Game Center for local and remote collaboration.
- ML Platforms: Machine Learning platforms recognize objects, people, speech, and much more.
- Hardware: AR-enabled mobiles, and soon we’ll see consumer-grade AR glasses.
On this tech stack we can invest in the design foundation for reality.
We see a future where, as HoloLens’s Alex Kipman puts it, “our children’s children will grow up in a world devoid of two-dimensional technology.” To get there, user interface design will be flipped on its head. Designs in AR must adapt to any uncontrolled real-world environment, not the controlled rectangular environment of a screen.
So last year Adobe released Dimension CC, a tool that lets our 2D designers start learning about 3D design.
Last month we announced Project Aero, a simple new augmented reality (AR) authoring tool. It aims to bridge the creative gap between the physical and digital worlds. Sign up for early access to Project Aero.
2. Reality Principles
Eight years ago, designers only thought about screen interactions. Today, we are inventing completely new interactions with unlimited space. At Adobe, we are excited but also confused about what is possible. So we want to continue sharing what we learn.
3. Ethical Design
Eight years ago designers spoke up for human-centered design and earned a seat at the c-level table.
Today, designers, together with engineers, scientists, and many others, are speaking up for all of humanity. We want to proactively build ethical systems that behave well now and in the future.
We have all experienced the ‘Unintended Consequences’ of AI, like fake news and privacy violations.
Can we design for Intended Benefits using sensory data?
For example, cars, buildings, and numerous objects are filled with sensors, from cameras and speakers to bio-sensors, IMUs, and location sensors. Connect these sensors, and we start to see a functioning, responsive ecosystem, a bit like the human nervous system. When one part of the body is hurt, the whole body immediately knows and responds in support of it.
What if sensors in our car could send a signal to the local council telling them about the pothole I just drove through? Or a building could ‘sense’ how happy or uncomfortable its occupants are and feed this data back to the architects. And how incredibly valuable would it be for industrial designers to learn how their products were used or thrown in the trash? Designers could use this data to make products better, reduce the trillions spent on ads, and reduce landfill.
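To make the pothole idea concrete, here is a minimal sketch of how such a sensor-to-council feedback loop might look. Everything here is a hypothetical illustration: the threshold, the field names, and the `detect_pothole` function are assumptions, not a real automotive API.

```python
from dataclasses import dataclass, asdict

# Assumed threshold: a vertical-acceleration spike above this (in g)
# is treated as a pothole hit. A real system would use a trained model.
POTHOLE_SHOCK_G = 2.5

@dataclass
class PotholeReport:
    lat: float
    lon: float
    shock_g: float

def detect_pothole(vertical_accel_g: float, lat: float, lon: float):
    """Return a report payload if the jolt looks like a pothole, else None."""
    if vertical_accel_g >= POTHOLE_SHOCK_G:
        # In a real car this payload would be sent to the council's endpoint.
        return asdict(PotholeReport(lat=lat, lon=lon, shock_g=vertical_accel_g))
    return None

# A smooth road produces no report; a hard jolt does.
print(detect_pothole(0.3, 37.77, -122.42))
print(detect_pothole(3.1, 37.77, -122.42))
```

The point isn’t the code itself but the pattern: each sensor event becomes structured data that another part of the “nervous system” can act on.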
4. Sensory Design
At Adobe we got frustrated that our current design methods didn’t work for AR. So we’re forming a new design system that lets us combine senses, the way an artist combines colors, to create magical new experiences.
To get started, we looked to Google’s Material Design, the most successful design language for screen interactions. We needed to extend this design language to digital and real-world interactions. Material Design centers on the sense of touch; we needed to extend it to all the other human and machine senses.
After all, if our human brains are wired to believe what we see and feel, then sensing is as crucial as cognition in design for reality. So, we must invest in knowing how our senses work together.
It has been shown that proprioception, how we sense the space around us, is a very effective tool for learning, and that agency, the sense of choice, is useful in self-directed tasks. But little else is known about how our senses work together. So I’m going to share a few examples of what we’ve learned about designing with sensors.
First, we broke up the senses to build a sensory framework. We took the human senses and identified how each is reflected in technology. For example, proprioception, our sense of space, is mirrored by computer vision’s understanding of what our phone’s camera sees.
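A framework like this can be sketched as a simple lookup from human sense to machine analog. The pairings below are my illustrative reading of the talk, not Adobe’s published framework:

```python
# Illustrative sense-to-technology pairings (assumptions, not a spec).
SENSORY_FRAMEWORK = {
    "proprioception": "computer vision / world tracking",  # sense of space
    "sight": "camera + object recognition",
    "hearing": "microphone + speech recognition",
    "touch": "touchscreen + haptics",
    "agency": "user-driven input and choice",
}

def machine_analog(human_sense: str) -> str:
    """Look up the machine counterpart of a human sense."""
    return SENSORY_FRAMEWORK.get(human_sense, "no analog identified yet")

print(machine_analog("proprioception"))
print(machine_analog("smell"))
```

Laying the senses out this way makes the gaps visible: senses with no machine analog yet are exactly where new sensing technology could extend the design language.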
When we combined and emphasized different senses, we saw vastly different outcomes. We believe animation is a core skill needed in AR. So we built an app that lets you capture natural motion from anything you can see through your camera. Then people could transfer that motion to their digital creations. It was fascinating to see the powerful magic of sensing technology.
While I can’t show you our work, I can show you how FX artist Randy Cano might use it.
In this example, we focused on the sense of agency (the desire to make creative choices that speed up a design process).
At Adobe, we are looking at how to speed up the creative process using AR for spatial layout and collaboration with our Dimension CC and Aero tools.
Next, we wanted to understand how AR will be used in ways no one has thought about yet. So we reached out to Zach Lieberman and Molmol Kuo, AR artists and coders passionate about making simple creative tools for everyone.
They were interested in using face tracking as a creative tool, rather than for beauty filters as Snapchat and Facebook do. So they mapped face movements to events: blinking an eye would play a sound, trigger an animation, and so on. They made the face an instrument, something that would have been too hard to develop until today.
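The “face as instrument” idea boils down to an event-dispatch table. Here is a minimal sketch, assuming hypothetical gesture names; a real app would receive gestures from a face-tracking SDK such as ARKit’s face anchors:

```python
# Records what the "instrument" did, so we can see the result.
events = []

# Hypothetical bindings from tracked face gestures to creative actions.
FACE_BINDINGS = {
    "blink": lambda: events.append("play sound"),
    "smile": lambda: events.append("trigger animation"),
    "mouth_open": lambda: events.append("sustain note"),
}

def on_face_gesture(gesture: str) -> None:
    """Dispatch a tracked face gesture to its bound action, if any."""
    handler = FACE_BINDINGS.get(gesture)
    if handler:
        handler()

# Unbound gestures (like "eyebrow_raise") are simply ignored.
for gesture in ["blink", "smile", "eyebrow_raise"]:
    on_face_gesture(gesture)

print(events)  # ['play sound', 'trigger animation']
```

The dispatch-table shape is what makes the instrument composable: artists can rebind gestures to new actions without touching the tracking code.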
Blending actions to engage multiple senses increased the emotional and cognitive richness of the experience.
Next, they discovered that movement is what makes AR compelling, because you get to see and play with its three-dimensionality.
Here is the Weird Type iOS app’s first ‘Hello World’ experience.
Now we could learn how others used words in space, expressing feelings through depth in a new way.
These experiments with type in space led me to discover Dong Yoon Park’s functional experiments with type in space.
We’ve seen how multi-sensory designs enhance the joy of an experience. Maybe that is why we love to hang out with friends in person rather than online: it engages more senses and builds human connection.
In the 1970s, industrial designer Dieter Rams famously wrote ten principles for good design. Today we live in a world where design can push back, respond, or sense anything.
And although you may feel uncomfortable in this new age of augmented reality powered by machine intelligence, it’s a remarkable period of human history.
We are the people building the foundations for this period: the designers, engineers, cognitive scientists, entrepreneurs, and many others.
If we challenge ourselves to look beyond technology and focus some energy towards building a good design foundation, we can build a future that is a little more empathic by nature.