Design perspective: Apple WWDC 2023
Apple organizes the Worldwide Developers Conference (WWDC) every year to highlight software improvements across all of its operating systems. When a new product, such as the Vision Pro, brings big changes to the platforms, those changes are also presented here to give designers and developers time to work with the novelties.
This year, if I had to summarize the entire week with a single word, that word would be integration. Let me explain why.
Widgets: integrating apps with context
Widgets were born on the Mac back in 2005; they later appeared on Windows and finally came to Android with great impact. It took a long time for widgets to reach iOS, but they have been evolving quickly.
This year, Apple's operating systems expand the usefulness and presence of widgets. They are now interactive on all platforms, can be integrated more visibly into the experience, and can surface relevant activity thanks to improvements in Live Activities and the published design guidelines (a concept Android introduced this year under the name of Foreground Activities). You can display widgets on the Mac from apps installed on the iPhone, and even turn your charging iPhone into an information center where widgets abound.
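As a rough illustration of what this interactivity means in practice, the sketch below wires a widget button to an App Intent so a tap can trigger an action without opening the app. It is a minimal, hypothetical example, assuming iOS 17's interactive widgets; the intent, identifiers and persistence are invented, not production code.

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical intent that marks a to-do item as done when the
// widget button is tapped, without opening the app.
struct CompleteTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Complete Task"

    @Parameter(title: "Task ID")
    var taskID: String

    init() {}
    init(taskID: String) { self.taskID = taskID }

    func perform() async throws -> some IntentResult {
        // App-specific persistence would go here.
        return .result()
    }
}

// Widget view: Button(intent:) is what makes the widget interactive on iOS 17.
struct TaskWidgetView: View {
    let taskID: String

    var body: some View {
        Button(intent: CompleteTaskIntent(taskID: taskID)) {
            Label("Mark done", systemImage: "checkmark.circle")
        }
    }
}
```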
Considering the number of apps we have, it is very difficult to stay relevant all the time. Widgets and the intelligence that surrounds them allow you to keep or increase points of contact with users without requiring them to enter the app. This "design for disengagement", which I should cover in detail one of these days, requires a good understanding of users' use cases and mental models in order to translate into an enriching experience that results in better retention.
Ecosystem: integrating devices with each other
Apple's products and services are probably the most robust example of an ecosystem that we can find. This year they have reinforced this integration in subtle ways that, when combined, have the potential for a larger impact: using one device's cameras in another device's video call, handing the same app over from one device to another, and so on.
Features such as Continuity and Handoff continue to evolve and are worth considering if we aspire to cover more use cases when designing. There is a specific session that, although quite technical, outlines very well the basics of designing an experience where an iPhone and an Apple Watch participate in a coordinated way; you can watch it here.
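For context on the mechanics behind Handoff, here is a minimal SwiftUI sketch of advertising and continuing an NSUserActivity between devices. The activity type, view and payload are hypothetical, and a real app would also declare the activity type in its Info.plist on every device involved.

```swift
import Foundation
import SwiftUI

// Hypothetical activity type shared by the iPhone and Mac versions of the app.
let readingActivityType = "com.example.app.reading-article"

struct ArticleView: View {
    let articleID: String

    var body: some View {
        Text("Article \(articleID)")
            // Advertise the current state so another device can pick it up.
            .userActivity(readingActivityType) { activity in
                activity.title = "Reading an article"
                activity.isEligibleForHandoff = true
                activity.userInfo = ["articleID": articleID]
            }
            // Restore the state when the activity is handed over to this device.
            .onContinueUserActivity(readingActivityType) { activity in
                if let id = activity.userInfo?["articleID"] as? String {
                    print("Continue reading article \(id)")
                }
            }
    }
}
```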
Furthermore, the improvements in SharePlay also allow us to integrate our devices with other people's. Apps with video-conferencing capabilities can now make better use of cameras and microphones, and integrate with other apps to create richer experiences.
The example of several students collaborating on a project at the same time, both locally and remotely, is a great case study for designers learning how to shape a novel value proposition for a product that takes advantage of these improvements.
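To make that collaboration scenario slightly more concrete, below is a minimal sketch of a shared session using the GroupActivities framework that powers SharePlay. The activity name, identifier and metadata are invented for illustration only.

```swift
import GroupActivities

// Hypothetical shared activity: a study group working on the same project.
struct StudySessionActivity: GroupActivity {
    static let activityIdentifier = "com.example.app.study-session"

    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Study session"
        meta.type = .generic
        return meta
    }
}

// Offer the activity to the current FaceTime call or nearby participants.
func startStudySession() async {
    let activity = StudySessionActivity()
    switch await activity.prepareForActivation() {
    case .activationPreferred:
        _ = try? await activity.activate()
    default:
        break
    }
}

// Every participant joins the incoming group session to receive shared state.
func observeSessions() async {
    for await session in StudySessionActivity.sessions() {
        session.join()
    }
}
```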
Social network: integrating people in the ecosystem
This is a subtle movement that I find tremendously interesting. This year I think I have seen Apple executing a strategy similar to Facebook's, but in reverse. Previous attempts by the Cupertino company to build a social network have been disastrous, but I think it has finally clicked for them: they realized the social network they need already exists. Every person has their own circles of influence, other people they interact with. We use different apps and services, but always through our devices.
Last year they already improved how the OS surfaces what friends recommend to us through Shared with You for links, photos, music, etc. This year they have improved SharePlay, turned our contact list into our friends list, and through Posters (a sort of avatar) we can define how other people see us when we call or message them. Add a layer of safety, such as the automatic Check In feature or the detection of nudes and sensitive images that lets receivers decide whether to view them or keep them blocked; a super easy way to connect by simply exchanging contact details; and simpler sharing, with improvements in AirDrop for files and stickers for photos, and we have connected all the elements that make up a social network. Only this time, instead of being a specific app, it is the aggregation of features that facilitate social dynamics in real life.
It is also not surprising that privacy has once again been a key message throughout the week, with new guidelines. Probably because all these new features can be difficult to design with data protection built in.
This concept of an intrinsic social network is not part of Apple's narrative; it is a personal interpretation of mine, and I think it leads us to consider how social interaction is mediated by devices, and how to make that mediation more natural and less intrusive. Again, excellent research and hard work on the value proposition are decisive to achieve that perception.
Apple Vision: integrating the virtual into the real world
The metaverse has been defined as "a perpetual and persistent multi-user environment that fuses physical reality with digital virtuality", which Meta has interpreted as an opportunity to create a next generation of social network in a controlled and fully monetizable medium. Apple's bet is diametrically opposed: it presents XR (extended reality) as a space in which to expand your capabilities, anchored in reality by default, with the option to escape it through deeper immersion.
Without being perfect and with many doubts still to be resolved, the Vision Pro shows that Apple has at least made the effort to first understand the possibilities of this new medium and the barriers that could block its adoption, and has explored ways to balance them. The price tag has suffered as a result, but that always happens with first-generation devices. I try to take comfort in the Pro label; we can always hope for a Vision Air or SE that is less offensive to the pocket.
In any case, as a new product category it opens up many more opportunities than the current XR market. And as designers, the ball is in our court. For now, the value proposition focuses on four use cases: productivity, entertainment, communication and memories (photos and videos). These four scenarios are well supported by this product iteration, with a multitude of technical and design resources that can help us develop new ideas in spatial computing and immersive experiences.
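As a taste of what building for this medium looks like, here is a minimal visionOS sketch that pairs a flat window with a volumetric one using SwiftUI and RealityKit. The app name, window identifiers and placeholder content are hypothetical, intended only to show the shape of a spatial app.

```swift
import SwiftUI
import RealityKit

// Hypothetical visionOS app: a flat window for browsing plus a volumetric
// window that places 3D content directly in the user's space.
@main
struct MemoriesApp: App {
    var body: some Scene {
        WindowGroup(id: "library") {
            Text("Photo library")
        }

        WindowGroup(id: "memory-volume") {
            RealityView { content in
                // Simple placeholder: a small sphere floating in the volume.
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial(color: .white, isMetallic: false)]
                )
                content.add(sphere)
            }
        }
        .windowStyle(.volumetric)
    }
}
```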
YouTuber Jaylen D. Bledsoe analyses the narrative Apple used to unveil the Vision Pro, drawing a few lessons on how to communicate and present design proposals. Some of them are not just narrative decisions, but also evidence of design decisions made during the product creation process.
Demonstrating the role of design in business strategy
When you look at Apple's feature releases over the last several years, the roadmap towards this headset becomes evident. Hundreds of iterations and functional and design decisions have been paving the way for visionOS to fit perfectly into the ecosystem, and for its interactions to seem natural and feel intuitive enough that the learning curve is as flat as possible.
This degree of commitment to strategy is unparalleled. Zuckerberg announced the metaverse as Meta's stronghold. A year and a half later, the company has changed its mind without seeing any return on the 13 billion dollars invested, losing 600 billion in stock market valuation along the way. And then came the announcement that Meta will focus mainly on AI. Obviously there is neither strategy nor thought here, just a pressing hurry to do something, anything, that keeps the company relevant.
We designers rarely carry much weight in business strategy; hopefully this will serve as a reminder that design is strategy. When design participates at every level of an organization that commits to a shared vision, the outcomes can be revolutionary.
Accessibility: integrating tech with people’s needs
Apple has always stood out for its focus on accessibility. This year, in addition to the accessibility-focused sessions, they have also reorganised the accessibility section of their documentation. It is full of guidelines and best-practice examples on how to incorporate assistive technologies, but also on how to approach the design of experiences considering functional diversity of all kinds. It is a great reminder that we must go beyond WCAG, and the quantity and quality of the reference material is remarkable.
In addition, the transcription features in Messages and voicemail are positive for us all. So is asking Siri to read aloud the content of a webpage, or even the ability to create a synthetic version of our own voice that can be used to speak aloud the messages we write.
And what about Artificial Intelligence?
Google mentioned the term "AI" up to 140 times during its presentation, and Microsoft did not fall short either. For these companies it is critical that we do not perceive them as being left behind in the technological race.
However, Apple did not utter this expression even once. They talked about "device intelligence", "machine learning" and other references, but they stayed away from the buzzword. And each AI-related feature introduced was paired with an explanation of how privacy by design was a priority. Yet almost everything announced relies on some kind of artificial intelligence.
In the same way that Google's discourse is worth studying, Apple's is too, for different reasons. Their narrative always revolves around the value proposition, and only afterwards do they tell us how it is made. For Apple, technology is always just a means.
Applying that to the design domain, it is much more important to understand what can be done with technology in order to build solid value propositions that align with users' needs and expectations. Trendy words may help with publicity and media mentions, but they have no value on their own.
Conclusions
Technology is an enabler to articulate value propositions. Knowing how something works and what possibilities it unlocks is helpful, but we always have to pivot around users' needs.
Experience is not confined to an app or a device. Both Android and Apple are offering increasing capabilities to extend the experience to other spaces of the OS, or even across devices within the ecosystem. With each new feature introduced, the possibilities multiply, and designing services benefits from more options to support the customer journey end to end.
The "spatial computing" concept that Apple has championed with the Vision Pro, along with all the documentation it has published, will reshape the entire industry, raising expectations about design quality and standardising best practices. Extended reality is no longer just about the metaverse; it is fertile ground to explore other concepts.
And as a bonus, Apple debuts in the Figma community, sharing official iOS and iPadOS 17 resources for the first time.
Don't forget to watch all the WWDC 2023 design sessions and bookmark the new and improved Apple Human Interface Guidelines, which are now easier to use than ever.