Four Highlights From IxD 16

By Greg Cavanagh · Published in ELSE · Mar 22, 2016 · 8 min read

This year Natalie (Principal UX at ELSE) and I attended the IxDA Interaction 16 conference held in Helsinki, Finland. Running from the 28th of February to the 4th of March, the conference spanned much of the current and future thinking within the field of UX.

With some time to reflect, I’ve gathered my notes and pictures with the aim of sharing some of the standout talks and giving you a sense of what it was like.

1) Why haven’t we “broken the grid” in interaction design yet- Nour Tabet, Emma Sherwood-Forbes, Campfire Stage.

Nour Tabet and Emma Sherwood-Forbes discussed what they referred to as ‘sameness’: how it has come about and what we can do as designers to combat it.

“Everyone using the same resources, same tools, moving at the same pace, leads to sameness”

They explained how copying the look and feel of other sites ‘sells our industry short’ and limits our ability to communicate through design. Part of the reason for this, they argued, is that we are all supplying and feeding from the same resources, such as Pinterest. Another part of the problem is that our clients are looking at the same sources of inspiration: they might look at a direct competitor and say “I want one of them”.

On the flip side, the abundance of high-quality, best-practice patterns now influencing our thinking has made it increasingly hard to design a car-crash website. “The bar has been raised.” It also allows you to make a judgement call on when to spend time wrestling with a problem and when to look at how others have solved it. Nour and Emma described the point we have arrived at as a springboard to design from.

“These advances should be encouraging us to do more, not the same”.

Here are 4 ways they suggested we, as designers, could break away and put our own stamp on a project:

1) Be aware of the context of what you’re designing (research, fully explore the parameters of the brand you’re working with);
2) Micro-interaction — after thinking big, focus on the smallest detail and make it beautiful and relevant;
3) Be inspired by the random world around us — when we’re inspired by different domains we can bring new things to the table;
4) Motion — a great way to tell a story & inject tone and personality, without being distracting.

2) The dawn of Agentive Technology: The UX of ‘soft’ AI — Chris Noessel, Finlandia Hall Main Stage

Another highlight for me was Chris Noessel’s talk on ‘Agentive Technology’, which took place in the main auditorium.

Chris Noessel began his talk by asking the question ‘who’s afraid of AI?’, to which the majority of the audience put their hands up. It was an interesting start, acknowledging the possibility of the things we create overtaking us (or worse).

He then gave some interesting examples of existing and emerging ‘Agentive’ tech, starting with ‘Get Narrative’, a tiny camera which clips to your clothes and automatically takes a picture every 30 seconds if it senses light. Once connected to wifi, it automatically uploads the photos to the cloud, then sorts them for you using two algorithms: the first looks at your photos and works out what you were doing; the second works out the best photo from that scene. Lastly, it sends the image to your phone and asks you to rate it.
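To make that loop concrete, here is a minimal sketch of the capture → upload → cluster → select → rate flow. Every name and heuristic in it is my own invention for illustration; it is not Narrative’s actual API or its real algorithms.

```typescript
// Hypothetical sketch of an agentive photo pipeline in the spirit of the
// Narrative camera. All names and heuristics are invented for illustration.

interface Photo {
  id: string;
  takenAt: Date;
  lightLevel: number; // 0..1, the clip only fires when it senses light
}

interface Scene {
  activity: string; // algorithm 1 would label this, e.g. "cycling"
  photos: Photo[];
}

// Algorithm 1 (stand-in): group photos into scenes. The real product
// presumably uses ML; here we simply bucket by hour of capture.
function clusterIntoScenes(photos: Photo[]): Scene[] {
  const buckets = new Map<string, Photo[]>();
  for (const p of photos) {
    const key = p.takenAt.toISOString().slice(0, 13); // YYYY-MM-DDTHH
    const group = buckets.get(key) ?? [];
    group.push(p);
    buckets.set(key, group);
  }
  return [...buckets.values()].map(group => ({ activity: "unknown", photos: group }));
}

// Algorithm 2 (stand-in): pick the "best" photo from each scene.
function pickBestPhoto(scene: Scene): Photo {
  return scene.photos.reduce((best, p) => (p.lightLevel > best.lightLevel ? p : best));
}

// The agentive loop: the camera does the work; the user only rates the result.
function dailyDigest(photos: Photo[]): Photo[] {
  return clusterIntoScenes(photos).map(pickBestPhoto);
}
```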

Chris referred to this product as ‘a camera without a photographer’. Seeing it made me question the intrusive nature of the product, and the general idea of being watched without knowing it.

Another example which you might already be familiar with is the ‘Roomba’ vacuum, which heads out from its charging dock at a set time, detecting dirt as it goes, and returns once the task is complete. Chris mentioned that he calls his ‘Dusty’. This shows an emotional connection to the object — the humanising of a machine (a topic also covered in Kate Darling’s keynote on the Thursday morning).

Agentive tech that no longer requires human operation also provides an opportunity for an aesthetic rethink — for example, there is no longer a need for a handle, a pivotal feature of a standard vacuum.

After touching on a whole host of examples, including the Google driverless car, Chris went on to explain why ‘Agentive tech’ is so groundbreaking. Although you might argue that ‘Agentive tech’ already exists under the guise of automation (e.g. the autopilot, first invented in 1914), he said, ‘Agentive’ is what happens when UX is thrown into the mix. As he put it, “Agentive tech gives the user a promotion”, letting them skip laborious tasks such as vacuuming. The technology does the task, and the user manages it or steps in only when required, e.g. when the Roomba gets stuck under the sofa or is bullied by the dog.

Users still want to play — he gave the example of a robo-advisor platform he recently worked on, saying: “you’re not going to just let them run off with your money, you’re going to need a trial run, to see what they can do”. I think this is an interesting point, one which was also raised in John Rousseau’s talk on autonomous driving. It’s important to consider which tasks we actually want AI to take over for us, and which ones are fun, e.g. the joy of driving or the art of taking a good photo.

3) Passengerhood: on the road to autonomy — John Rousseau, Finlandia Hall Main Stage

John Rousseau is Executive Director at Artefact — a digital innovation and design firm in Seattle. His current work focuses on bringing together Augmented Reality, automotive and entertainment. I found his talk fascinating, particularly as it provided some interesting insights into how the role of UX might evolve in the near future.

During his talk, he gave 4 examples of potential opportunities in the automotive industry, and of how our role as designers may change with the rise of autonomous vehicles:

  1. Engaging the co-pilot
  2. Transparency systems
  3. Adaptive interfaces
  4. New affordances

Engaging the co-pilot — John mentioned that in an autonomous vehicle, the driver’s role alters from driver to co-pilot or passenger. Engaging the co-pilot could be as simple as providing them with the opportunity to direct or dictate the route. The car could play back the user’s preferences, e.g. “playing their favourite Netflix show”, or surface a suggested route based on their routine. There is also the opportunity for the user to tune in to the data being produced by the car if they want to.

Transparency systems — making the inner workings of the car transparent. The car needs to convey what it sees and forecast what it can’t, e.g. miles to the next fuel pump, re-routing accordingly. It should prompt the user for feedback when required.

Adaptive interfaces — there is great opportunity for UX in the moment the car switches from autonomous to manual and vice versa, particularly from a safety point of view, given the complete change in attitude and focus required from the driver. John showed a prototype demonstrating how the digital interface on the dashboard changes when more engagement is required from the user: certain controls enlarged and road-monitoring aspects fell away.
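As a rough illustration of that idea (my own toy model, not John’s actual prototype), the dashboard can be thought of as a function of the current driving mode, with the essential controls scaling up and the monitoring widgets falling away as the handover approaches:

```typescript
// Illustrative toy model of an adaptive dashboard, not Artefact's prototype.
type DrivingMode = "autonomous" | "handover" | "manual";

interface DashboardState {
  primaryControlScale: number; // how prominent speed/steering feedback is
  showRoadMonitoring: boolean; // raw sensing/monitoring widgets
  showEntertainment: boolean;  // media, route suggestions, etc.
}

// The interface adapts to how much engagement is required from the driver.
const layouts: Record<DrivingMode, DashboardState> = {
  autonomous: { primaryControlScale: 0.5, showRoadMonitoring: true,  showEntertainment: true },
  // The handover moment: enlarge the essential controls, strip the distractions.
  handover:   { primaryControlScale: 1.5, showRoadMonitoring: false, showEntertainment: false },
  manual:     { primaryControlScale: 1.0, showRoadMonitoring: false, showEntertainment: false },
};

function dashboardFor(mode: DrivingMode): DashboardState {
  return layouts[mode];
}
```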

New affordances — John mentioned that the whole car could become the interface, bridging the physical and the digital. The driving experience could become a multi-sensory one, with haptic feedback, a change in the orientation or form of the driver’s seat as the driving experience alters, or the steering wheel extending to the driver’s reach to emphasise a change in function.

John finished by explaining: “Tech will augment rather than replace human drivers — it may change the way we approach driving.”

4) Nature’s Notifications: What technology can learn from a buzzing bee or a thunderclap — Emmi Laakso & Philip Tiongson, Helsinki-Sali Stage

Emmi Laakso & Phil Tiongson from the agency ‘Potion’ based in NYC took to the Helsinki-Sali stage on the Friday afternoon.

They spoke specifically about notifications: how their purpose has evolved over time and what part they play in our day-to-day lives. They then highlighted how we could draw parallels with, and influence from, nature and the world around us when designing them.

The talk began with an example of a phone going off during a philharmonic orchestra performance, the conductor having to pause until the phone rang out — this emphasises how notifications are often unwelcome and intrusive. By their nature, they are hard to ignore. As a result, tolerance to them drops and they get turned off, defeating their purpose.

Emmi Laakso talked about a week-long solo hike she did as part of a process of disconnecting herself from technology. Although her phone wasn’t in her pocket, she said she was “still receiving information”. She used the example of weather updates: the sound of thunder was her notification, and she knew she needed to pay attention when it increased in volume.

Here are 4 things she learnt from sounds within nature:

  1. Distinct vs Ambient
  2. Volume & Repetitiveness
  3. Complexity
  4. High & low pitch

Emmi explained that we humans are designed to take in and process vast amounts of data — our ears are tuned to recognise a familiar voice, or our name being called from across a noisy room. These attributes are not exploited in the way notifications are currently formatted, which makes them flat, impersonal and impossible to distinguish from one another.

“Notifications are very blunt instruments right now”

They finished their presentation by suggesting that, as well as sound, context and hierarchy should be factored in when designing notifications. When you receive a message in your pocket you don’t know who it’s from or whether it’s worth your attention: “you have a bias, you may assume it’s really important”.

“We don’t know the sender’s or receiver’s context: are they en route to A&E or sat in a pub?”

They mentioned how notifications form a natural hierarchy in how we approach them. Phil explained that messages from his wife always have a ‘personal importance’, whereas other messages might be important at times and not at others.

If we could improve this, it would make notifications more valuable and reduce the need to reach into your pocket for unimportant or irrelevant messages.
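To picture how sender hierarchy and receiver context might combine, here is a minimal sketch of notification triage. It assumes a simple score and three delivery channels; the names and weights are my own invention, not anything Potion presented.

```typescript
// Toy sketch of context-aware notification triage. The senders, contexts and
// scoring are invented for illustration; this is not from the talk.

interface IncomingMessage {
  sender: string;
  flaggedUrgent: boolean; // e.g. the sender marked it as urgent
}

interface ReceiverContext {
  activity: "driving" | "in a meeting" | "at the pub" | "idle";
}

// Senders with standing 'personal importance', as in Phil's example of his wife.
const importantSenders = new Set(["partner", "childcare", "on-call rota"]);

// Echoing the talk's sound vocabulary: distinct, ambient, or silent.
type Delivery = "distinct alert" | "ambient cue" | "silent badge";

function triage(msg: IncomingMessage, ctx: ReceiverContext): Delivery {
  let score = 0;
  if (importantSenders.has(msg.sender)) score += 2;
  if (msg.flaggedUrgent) score += 1;
  // Busy contexts raise the bar for interrupting the receiver.
  if (ctx.activity === "driving" || ctx.activity === "in a meeting") score -= 1;

  if (score >= 2) return "distinct alert"; // hard to ignore, like a thunderclap
  if (score >= 1) return "ambient cue";    // background signal you can tune into
  return "silent badge";                   // wait until the user checks
}

// Example: a routine message while in a meeting stays silent.
triage({ sender: "newsletter", flaggedUrgent: false }, { activity: "in a meeting" });
```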

Summary

These were just some of the talks I went to; naturally there were a few clashes and I missed out on some others. The good news is that all of the talks are due to go live on the Interaction 16 site in the next couple of months!

Originally published at ELSE.
