What Does WWDC 23 Mean for Developers and How Will It Change the Industry?

Why Widgets Are the Key Announcement for iOS, macOS, and watchOS Developers

inDrive.Tech
Jun 16, 2023

Published by Aziz Ismailov.

As a developer, I’d call widgets my favorite update from WWDC 2023. They’re interesting for plenty of reasons:

  1. Widgets have been updated across a whole bunch of Apple platforms and can now be more useful than ever before.
  2. Interacting with them can improve app retention.
  3. Live Activities now give you the chance to display the most significant information on the Lock Screen of iPads and Macs.

As an inDrive developer, I worked with WidgetKit during one of our internal hackathons, which is why I consider this a key announcement. With the latest updates, widgets have become a crucial interface element, available throughout the Apple ecosystem, from Apple Watch to Mac.

One of the most important metrics in mobile development is user retention. Widgets are built for quick interaction, and users typically place them on their Home Screens. That’s why widgets matter to any developer: they keep users engaged even outside the app.

And the latest widgets update lets developers add features that don’t require users to open the app at all. For example, ride-hailing users could order a car straight from a widget with a single tap, for instance by selecting “ride to home”.
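
A minimal sketch of what such an interactive widget action could look like, built on the App Intents integration that widgets gained this year. RideToHomeIntent and the commented-out ordering call are hypothetical placeholders, not inDrive’s actual code.

```swift
import AppIntents
import SwiftUI

// Hypothetical intent behind the "ride to home" button. The real ordering
// logic would live in shared app code; here it is just a stub.
struct RideToHomeIntent: AppIntent {
    static var title: LocalizedStringResource = "Ride to Home"

    func perform() async throws -> some IntentResult {
        // e.g. try await RideService.shared.requestRide(to: .home)
        return .result()
    }
}

// In an iOS 17 widget, Button(intent:) runs the intent in the background
// instead of deep-linking into the app.
struct RideWidgetView: View {
    var body: some View {
        Button(intent: RideToHomeIntent()) {
            Label("Ride to home", systemImage: "car.fill")
        }
    }
}
```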

The game-changer for developers is the growing number of places where users can interact with applications. Now this can be done on iPhone in landscape mode, on the Lock Screen of the iPad, on the Mac desktop, and even on the Apple Watch.

However, it is essential to use widgets correctly and to include the most relevant features. Offering several widget sizes and placements lets users interact with an app in different ways. For example, a calendar widget can now sit horizontally alongside the clock on the Lock Screen.
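
To illustrate, here is a rough sketch of how a single widget opts into both Home Screen and Lock Screen placements via supportedFamilies. TripEntry, TripProvider, and the hard-coded timeline are placeholders for illustration only.

```swift
import SwiftUI
import WidgetKit

// Placeholder entry and provider; a real widget would fetch live trip data.
struct TripEntry: TimelineEntry {
    let date: Date
    let minutesToArrival: Int
}

struct TripProvider: TimelineProvider {
    func placeholder(in context: Context) -> TripEntry {
        TripEntry(date: .now, minutesToArrival: 5)
    }
    func getSnapshot(in context: Context, completion: @escaping (TripEntry) -> Void) {
        completion(placeholder(in: context))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<TripEntry>) -> Void) {
        completion(Timeline(entries: [placeholder(in: context)], policy: .atEnd))
    }
}

struct TripWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "TripWidget", provider: TripProvider()) { entry in
            Text("\(entry.minutesToArrival) min to pickup")
        }
        .configurationDisplayName("Trip status")
        .supportedFamilies([
            .systemSmall,           // Home Screen and macOS desktop
            .accessoryRectangular,  // iPhone and iPad Lock Screen
            .accessoryCircular      // Lock Screen and watch complications
        ])
    }
}
```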

At inDrive, we use Live Activities so that passengers can track the progress of their trip. This makes it easier for them to use the service: they can see at a glance how much time is left before their car arrives and before the trip ends, without even having to unlock their screen.
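
The kind of Live Activity described above could be sketched roughly like this; TripAttributes and its fields are hypothetical stand-ins for the real trip model, and the activity is started from the app (the iOS 16.2+ shape of the ActivityKit API).

```swift
import ActivityKit

// Hypothetical attributes for a ride-tracking Live Activity.
struct TripAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var minutesToPickup: Int
        var minutesToDestination: Int
    }
    var driverName: String
}

// Started from the app once the ride is confirmed. Requires the
// NSSupportsLiveActivities key in Info.plist.
func startTripActivity() throws {
    let attributes = TripAttributes(driverName: "Alex")
    let initialState = TripAttributes.ContentState(minutesToPickup: 4,
                                                   minutesToDestination: 18)
    _ = try Activity<TripAttributes>.request(
        attributes: attributes,
        content: .init(state: initialState, staleDate: nil)
    )
}
```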

But we haven’t stopped there; we’ve also created a widget for drivers. They can use this to check their earnings statistics and top up their account if needed. The most important information is right on the screen, and the widget saves users their most valuable resource: time.

One of the more significant changes is the interactivity of widgets. Previously, any tap on a widget simply opened the main app via a deep link. Now you can perform quick actions without opening the application at all.

We are entering a new era of MWP — Minimum Widget Product. In theory, it becomes possible to build an entire application around a widget, eliminating the need for users to open anything.

Now it’s easier than ever for developers to distribute their products across the Apple ecosystem. You build one widget and make minor adjustments to release it on iPhone, iPad, Mac, and Apple Watch. Combined with Live Activities and the main application, this gives users a complete, seamless, and convenient product experience.

Apple Vision Pro: Not Just a Device, but a New Platform

Apple delivered its “one more thing”, and this one is truly special: the Vision Pro headset is the company’s first new product category in a decade. Apple is calling it a “spatial computer”.

How Apple pitched the Vision Pro, and the cases in which they suggest using it, surprised me. Apple’s headset has a “transparency” mode, allowing users to immerse themselves in virtual reality or overlay interface elements on top of the real world, anywhere and at any size.

Surprisingly, the Vision Pro has no controllers. Everything works through gestures, eye movements, and voice commands. From a development perspective, this is a significant step into the future: until now, headsets have relied on dedicated controllers.

One of the most interesting features is how Apple solved the real-time event processing problem. Since the Vision Pro is worn very close to the eyes, any delay would be noticeable. Rendering tasks need to be handled in real-time, so Apple has rethought how to prioritize threads. Priority will now be given to the task that requires the most resources, such as a game or an image renderer for the glasses.

However, as a developer, I was hooked by the ability to open an unlimited number of screens and arrange them as I like; this is what sold me on the headset. For example, you can have Xcode on one screen, Slack on another, and a browser on a third. Everything still revolves around the MacBook itself; you just have to put on the glasses. What’s great is that you can adjust the transparency and, if you need to focus solely on your work, remove the background completely and replace it with a view of a lake.

Apple has shown many more examples of how to use the headset. The most useful to me were:

  1. Working with multiple monitors;
  2. VR gaming and viewing content;
  3. Disconnecting from reality on airplane flights, so that you can’t see or hear anyone else, but can watch a movie as if you were in a cinema.

Developers will now have the opportunity to expand the reach of their applications to another market. The basic idea is that applications already available on the App Store for iOS, iPadOS, and macOS can be extended to visionOS.

Augmented reality opens up a whole new world of scenarios for interacting with apps and UI elements. You can create three-dimensional interfaces, add 3D models directly to the scene, and explore new ways of using ARKit.

Developers can reimagine how objects and gestures are manipulated through RealityKit and become innovators in the mobile development community. Apple promotes the use of SwiftUI across the board, further standardizing apps across different platforms.
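
As a taste of what that looks like in practice, here is a minimal visionOS sketch that places a RealityKit entity inside a SwiftUI view. The sphere is just a stand-in for a real 3D asset.

```swift
import SwiftUI
import RealityKit

// visionOS: RealityView embeds RealityKit content directly in SwiftUI.
struct SpatialPreview: View {
    var body: some View {
        RealityView { content in
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .cyan, isMetallic: true)]
            )
            content.add(sphere)
        }
    }
}
```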

However, as a developer, I have my concerns about the Vision Pro. It stands out from the competition with an impressive price tag of $3,499. Perhaps this was intentional, so that at first only enthusiasts would buy it.

There is also a risk that the resources spent on developing and supporting apps for the new visionOS will not pay off. But as an enthusiast, I’m willing to try something new from day one. Even if it doesn’t take off in the first few years, I see it as a promising prospect. For now, my advice is to explore how it works in your pet projects.

The Vision Pro sets the bar for the entire VR and AR industry. However, there are already plenty of cheaper gadgets available from competitors, so it’s worth keeping an eye on the industry’s progress in the coming years.

One thing is clear: to give the new visionOS a boost, Apple will actively promote apps that support the headset, so adding support early is a good way to get your app noticed.

How Development Will Change After This Keynote

Many developers had been eagerly awaiting Apple’s AI announcements amid the rise of ChatGPT. In fact, all of the updates involve machine learning (ML) in one way or another, although Apple didn’t emphasize AI as loudly as Google did.

Familiar apps like Photos have become even smarter; Notes has learned to work with PDFs and fill in their fields; and inline predictive text now suggests completions right where you are typing. Almost every update involves neural networks.

There were other significant updates specifically for developers, including:

  1. Macros in Swift, which generate code at compile time; Xcode can expand a macro to show exactly what it produces, and you can even set breakpoints inside the expanded code;
  2. An updated Xcode, which will offer improved stability;
  3. Previews for UIKit elements: this can save countless hours, or even weeks on large projects, that would otherwise be spent checking layout (see the sketch after this list).
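
For illustration, the new #Preview macro (itself built on the Swift macro system) accepts UIKit view controllers as well as SwiftUI views; ProfileViewController below is just a placeholder.

```swift
import UIKit

// A plain UIKit view controller used only to demonstrate the preview.
final class ProfileViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground

        let label = UILabel()
        label.text = "Hello from a UIKit preview"
        label.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(label)
        NSLayoutConstraint.activate([
            label.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            label.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
    }
}

// Xcode 15 renders this in the canvas without launching the app.
#Preview {
    ProfileViewController()
}
```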

As an iOS developer, I was particularly looking forward to the developer software update. While all the updates are great, what’s most important is the improved stability of the IDE and the simplification and acceleration of the development process.

For example, Xcode Cloud workflows now run about twice as fast, and it’s great that Xcode itself has become a smaller download.

At WWDC, Apple showed many new SDKs, which developers should start studying now. For example, the simplified bridging with C++ makes it much easier to bring libraries written in that language into Swift projects.
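
A rough sketch of the idea, assuming a Swift package with a hypothetical C++ target called RouteEngine that exposes an estimateFare function; with interoperability enabled, the C++ function is imported into Swift directly, with no Objective-C wrapper in between.

```swift
// Package.swift: enable C++ interoperability for the Swift target (Swift 5.9+)
// swiftSettings: [.interoperabilityMode(.Cxx)]

// RouteEngine is a hypothetical C++ target exposing:
//   double estimateFare(double distanceKm);
import RouteEngine

let fare = estimateFare(12.5)   // calling the C++ function as if it were Swift
print("Estimated fare: \(fare)")
```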

All the new tools make it possible to do complex things faster and better. But it will probably take at least a year before what has been presented here makes its way into production.

So I think we will see an influx of AR/VR projects very soon. This will inevitably create challenges for developers, but that won’t stop the enthusiasts.
