External Display Support on iOS
Apple has provided support for connecting external displays to iOS devices for many years with their Lightning Digital AV Adapter. The Screen Mirroring feature in Control Centre effectively simulates connection to an external display using AirPlay to stream compressed video to an Apple TV. By default, an external display simply mirrors the contents of the iOS device. This will be letter-boxed or pillar-boxed to fill the display without changing the aspect ratio. For example, a portrait iPhone X is tall and thin with huge black borders to the left and right when viewed on a landscape 16:9 external display.
Apps have been able to detect the presence of an external display and provide a custom, screen-filling user interface since iOS 3.2. Session 233 at WWDC 2018, Adding Delight to your iOS App, discusses this feature first. Why the sudden push from Apple to support external screens?
Steve Troughton-Smith found a huge clue in Xcode 10.1 beta. The iOS 12.1 simulator had a new 3840×2160 (4K) option which, he noted, couldn’t be supported over an existing Lightning connection. This strongly suggested a future device with USB-C, which does have the bandwidth for such a large resolution. Sure enough, Apple’s October 2018 event introduced the 11" and 3rd generation 12.9" iPad Pros with a USB-C connector instead of Lightning.
With the new devices (and a new USB-C Digital AV Multiport Adapter), there seems to be a renewed interest in external displays. Should you add support to your own apps? What might you want to display? What code is required?
Supporting External Displays
Apple’s article Display Content on a Connected Screen and the UIScreen documentation describe the process. The basics are surprisingly simple. The WWDC video linked to earlier is also a useful reference. Jordan Morgan wrote a nice tutorial on Supporting External Displays. A few days later John Sundell also provided some sample code in his article Building iPad Pro features in Swift and posted a video of a prototype using an external display to show a preview for a Markdown editor.
Enthused by how simple it seemed and Apple potentially being on the lookout for apps that make good use of an external display, I have added external display support to a few of my own apps. This article was originally envisaged as a tutorial based on my experience.
But then Andrew Fitzpatrick from Big Nerd Ranch released a blog post: Adding External Display Support To Your iOS App Is Ridiculously Easy. From a tutorial perspective, it’s almost perfect, and I felt it would be pointless for me to write my own! However, there are some subtleties and, hopefully, interesting discoveries that I came across while supporting external displays, and so this article changed to focus on them rather than the actual code.
On the subject of code, Andrew’s is clear, well-structured, and more thorough than the other samples I’ve seen. Firstly, it doesn’t assume there can only ever be one external display. I don’t think there’s an iOS device (yet?) that can drive multiple external displays, but Apple perhaps hint at the possibility in their article (my emphasis) “Each additional screen represents new space on which to display your app’s content”. Their sample code uses an array, not a single variable, to keep track of the windows on external screens. 🤔
I said Andrew’s code is almost perfect. In my opinion, the missing piece is covered by Jordan’s in his post: listening for the UIScreenModeDidChange notification to trigger a layout pass in the external view.
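Putting those pieces together, the connect/disconnect handling plus the mode-change observer might look something like this minimal sketch. ExternalDisplayManager and ExternalViewController are illustrative names, not code from any of the linked samples, and the array of windows follows the multiple-display hint from Apple’s sample code:

```swift
import UIKit

final class ExternalDisplayManager: NSObject {
    // An array, not a single variable, in case iOS ever drives
    // more than one external display at once.
    private var externalWindows: [UIWindow] = []

    override init() {
        super.init()
        let center = NotificationCenter.default
        center.addObserver(self, selector: #selector(screenDidConnect(_:)),
                           name: UIScreen.didConnectNotification, object: nil)
        center.addObserver(self, selector: #selector(screenDidDisconnect(_:)),
                           name: UIScreen.didDisconnectNotification, object: nil)
        // The piece covered in Jordan's post: re-layout on a mode change.
        center.addObserver(self, selector: #selector(screenModeDidChange(_:)),
                           name: UIScreen.modeDidChangeNotification, object: nil)
    }

    @objc private func screenDidConnect(_ note: Notification) {
        guard let screen = note.object as? UIScreen else { return }
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        window.rootViewController = ExternalViewController() // placeholder content
        window.isHidden = false
        externalWindows.append(window)
    }

    @objc private func screenDidDisconnect(_ note: Notification) {
        guard let screen = note.object as? UIScreen else { return }
        // Dropping the window is enough; iOS tears down its content.
        externalWindows.removeAll { $0.screen == screen }
    }

    @objc private func screenModeDidChange(_ note: Notification) {
        guard let screen = note.object as? UIScreen else { return }
        // The screen's bounds may have changed, so trigger a layout pass.
        for window in externalWindows where window.screen == screen {
            window.frame = screen.bounds
            window.setNeedsLayout()
        }
    }
}
```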
What Makes a Good External Display Experience?
If you want to know how to implement external display support, watch the video and read Andrew’s post. But what kind of information could your app show on an external screen? Is it even worth supporting it at all if you can’t add any value over the default screen mirroring?
In their WWDC video, Apple discussed some of the things you should consider when deciding whether, and how, to add external screen support:
- Size An external display could vary wildly in size from a small TV to a projector displaying your content onto a wall. It is likely positioned at a larger distance from viewers than your iPhone or iPad is to your eyes. The content your app shows on the external display needs to be readable at different sizes and distances.
- Public vs Private Your iOS device is personal whilst an external display is potentially very public. Be careful what kind of information your app shows on the external display. Is your app likely to be used in an office? At the office of another company that your user is visiting?
- Non-Interactive An external display is just a display; there is no user interaction (at least, not yet!). The user will need to continue interacting with your app’s interface. You should probably avoid placing standard iOS controls on the external display that could give the misleading impression of interactivity.
- Custom Interface Is the external display showing the same content as your app (or some subset, such as a photo), or does your app completely customise its behaviour when an external display is attached? It’s arguably a lot simpler to augment your existing app behaviour with another view controller than it is to change the appearance on the iOS device too. That’s what I’ve been doing with my own apps 😀
- Display Lifecycle An external display could be connected or removed at any time. You need to handle this gracefully. The more complex and custom your external display support, the more difficult this will be to get right. A simple augmented view should be relatively easy to deal with because it doesn’t affect the appearance of the main app. In their video, Apple used an example of a photo browser app which would show the currently-selected photo on the external display instead of pushing a new photo view controller on the navigation stack. If the user disconnects the display, the app needs to push a photo view controller to ensure it switches to the state it would have been in had the external display not been connected when they first selected the photo.
- App Lifecycle If the user switches to another app, the external display will no longer show your app’s content. Unless that app also has explicit support for an external display, the default screen mirroring behaviour will return. Mirroring also occurs if the user returns to the home screen. In my testing, turning off the iOS device screen whilst your app is in the foreground kept the external display showing the app’s content. However, the app will quickly be suspended so it cannot keep updating the content. It might be better to hide the external content when your app goes into the background rather than allowing the display to freeze and show potentially misleading or out-of-date information. In my own apps I made the external display view controller hide/show the root view in response to notifications for entering the background/foreground.
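The hide/show approach mentioned in the last point can be sketched like this, assuming the external display’s root view controller is the one responsible for the content (ExternalViewController is an illustrative name):

```swift
import UIKit

final class ExternalViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let center = NotificationCenter.default
        // Hide the content when suspended so the frozen external display
        // can't show out-of-date information; show it again on return.
        center.addObserver(self, selector: #selector(hideContent),
                           name: UIApplication.didEnterBackgroundNotification,
                           object: nil)
        center.addObserver(self, selector: #selector(showContent),
                           name: UIApplication.willEnterForegroundNotification,
                           object: nil)
    }

    @objc private func hideContent() { view.isHidden = true }
    @objc private func showContent() { view.isHidden = false }
}
```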
I’ve added external display support to a few of my own apps. I’d like to share a few screenshots and some of the reasoning I used to decide how to support external displays. There are also some corner case behaviours to consider.
Meeting Cost Timers
This app is designed to track the time and cost spent in meetings. The main display is dominated by a giant timer label. This shows either the actual elapsed meeting time, the cost of the meeting so far, or the elapsed person time. The more people there are in the meeting, the proportionally faster the person time accumulates. By configuring the hourly rate for different roles or individual attendees the cost of the meeting can be calculated.
Since the app’s interface was already dominated by the timer label, it made sense for the external display to be almost identical. I removed the buttons for controlling the timer and accessing the Settings and Attendees screens. A giant increasing cost on an external display or projected onto a wall can be quite thought-provoking during a lengthy meeting with lots of attendees!
I debated whether I should try to show which timer mode is active on the external display, but a segmented control would imply interactivity and a greyed-out (disabled) segment would be hard to read. Sizing it to be readable at a distance would also have been problematic. So I kept it simple: just the timer label is visible on the external display. The timer mode selected on the device (and an inverted colours setting) also affects the external display. The different appearance and the rate at which the timer accumulates in each mode mean there is very little chance of misinterpreting the content.
Whilst the meeting timer is running, a refresh timer runs in the main view controller to update the timer label. As a performance optimisation, the refresh timer is stopped if another view controller is presented full screen over the main view controller. If an external display is connected, the refresh timer cannot be stopped, otherwise the external display would stop updating!
An even trickier corner case: the refresh timer may already have been stopped by this optimisation because the main view controller was hidden whilst no external display was connected. Connecting an external display then requires the refresh timer to be restarted (but only if the meeting timer is running, of course), even though the main view controller is not visible to the user on their device.
Having a separate refresh timer for the external display view controller might have simplified some of this, but would lead to a pair of timers running, which is unnecessarily inefficient. It could also risk them occasionally firing at slightly different times and very briefly displaying inconsistent data between the main app and external display. I found using a single refresh timer, owned by the main view controller (which is how the app worked before adding external display support), was easier for me to manage and extend to support an external display. I just needed to handle the awkward corner cases to ensure the refresh timer was running when needed.
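The single-timer logic described above can be sketched as a small controller. This is not the app’s actual code; the three state properties are stand-ins for Meeting Cost Timers’ real state, and the key point is that one expression decides whether the timer should run:

```swift
import Foundation

final class RefreshTimerController {
    // Stand-ins for the app's real state.
    var isMeetingTimerRunning = false
    var mainViewControllerIsVisible = true
    var externalDisplayIsConnected = false

    private(set) var refreshTimer: Timer?

    // Call whenever any of the above state changes, including when a
    // display connects or disconnects.
    func updateRefreshTimer() {
        // The label needs refreshing if it is visible on *either* screen,
        // even when the main view controller is hidden on the device.
        let needsRefresh = isMeetingTimerRunning &&
            (mainViewControllerIsVisible || externalDisplayIsConnected)

        if needsRefresh && refreshTimer == nil {
            refreshTimer = Timer.scheduledTimer(withTimeInterval: 0.1,
                                                repeats: true) { _ in
                // Update the main label and any external display labels
                // together, so both screens always show consistent data.
            }
        } else if !needsRefresh {
            refreshTimer?.invalidate()
            refreshTimer = nil
        }
    }
}
```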
Pommie
The main interface for my Pomodoro timer app, Pommie, shows the current timer state and period (work or break and which number in the cycle). There are buttons for controlling the timer and to open a Settings screen.
As with Meeting Cost Timers, I wanted the external display to focus on the bare essentials: the timer state. The timer view was already adaptive and scaled to different sizes to (nearly) fill its superview. The font size for the label is proportional to the size of the view. It looks clear and readable at any size.
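A proportional font size can be achieved with something like the following sketch. This is illustrative rather than Pommie’s actual code; TimerView and timerLabel are hypothetical names, and the 0.4 scale factor is an arbitrary example:

```swift
import UIKit

// A view whose label's font scales with the view's own size.
final class TimerView: UIView {
    let timerLabel = UILabel()

    override init(frame: CGRect) {
        super.init(frame: frame)
        timerLabel.textAlignment = .center
        addSubview(timerLabel)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        timerLabel.frame = bounds
        // Scale the font with the view's height so the text remains
        // readable whether it (nearly) fills a device screen or a 4K display.
        timerLabel.font = UIFont.monospacedDigitSystemFont(
            ofSize: bounds.height * 0.4, weight: .medium)
    }
}
```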
This app also has a single refresh timer that is used to update the main on-device view and any external displays. Again, the refresh timer cannot be stopped when the main screen is hidden whilst an external screen is connected and needs to be restarted if an external display is connected whilst the timer is running.
There is also a state change timer which periodically checks the underlying data model that is shared with the Today Widget and Siri Intent extensions. The user can affect the timer state using the Widget or by invoking a Siri shortcut. That could happen whilst the app is in the background or front-most. If the Pomodoro timer is started whilst an external screen is connected, the refresh timer needs to be restarted (even if the user is not viewing the main screen on device).
Having app extensions which can affect state increases the complexity of keeping everything consistent. For Pommie, it would have been simpler to always keep the refresh timer running, but stopping it is an important performance optimisation that I didn’t want to discard on the off chance that an external display might be connected (which will be relatively rare).
Adaptivity
The main interface for Adaptivity shows the screen size, bar heights, layout guides and readable content guides. The app presents similar information for popovers, form/page sheets, tab bar controllers and split view controllers, and uses view controller containment to share most of the functionality between these contexts. Adding support for an external display was reasonably simple. This app, in particular, led to discovering some interesting behaviour of external displays.
Using an external display with the same screen resolution as a 9.7" iPad (1024×768) shows identical layout. The missing navigation bar title seems to be an iOS bug. It appears on iOS 10 (before Apple rewrote UINavigationBar layout to support large titles).
In all my testing, 1 point equals 1 pixel on external displays (even at 4K). Despite this, I chose to keep the Points/Pixels segmented control on the external display for consistency with all the other view controllers in the app. It is greyed out to indicate it is disabled and cannot be used to switch modes.
I’m unsure where the crossover point is, but using the simulator a 640×480 external display has a compact width and regular height. At 720×480 and higher resolutions, an external display has regular width and height.
The readable content guide may not be very helpful on an external screen because it defines a narrow region down the middle of the screen. A larger screen just gets bigger leading/trailing margins. No changes are made for the physical size of the display or pixels-per-inch (which is possibly unknowable).
The height of a navigation bar on the external display varies according to the top Safe Area margin of the iOS device to which it is connected. Depending on iOS version, iPads have taller bars than iPhones, and the newest iPad Pros taller still due to the increased height of the status bar.
The extended top Safe Area to avoid the notch on the iPhone X/XS/XS Max affects the height of a navigation bar on an external display. The extended bottom Safe Area to avoid the Home Indicator, however, does not affect the height of a toolbar. 🤷‍♂️
The external display doesn’t show a status bar, but leaves space for one in the top layout guide and Safe Area top margin if the main view controller is showing a status bar. Using prefersStatusBarHidden in the external display’s view controller seems to have no effect.
Adding support for external displays is reasonably straightforward, but the devil is in the details. If your app supports iOS 10 or earlier, you need to account for a display already being connected at startup. Keeping the external content up to date might require changes to some assumptions in the main app (as in my examples of needing to have a refresh timer running even when the main view controller was not visible).
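Checking for an already-connected display at launch could look like this sketch, where setUpWindow(for:) is a hypothetical helper that does whatever your didConnect notification handler does:

```swift
import UIKit

// The didConnect notification never fires for a screen that was already
// attached when the app launched, so check UIScreen.screens explicitly.
func configureExternalDisplaysAtLaunch() {
    for screen in UIScreen.screens where screen != UIScreen.main {
        setUpWindow(for: screen) // hypothetical helper: create and show a window
    }
}
```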
With such inconsistent behaviour in sizes determined by the iOS device type and orientation at the time the external display is first connected, it’s probably best to avoid showing navigation bars or aligning content with the Safe Area top and bottom margins. One could argue that an external display’s Safe Area and bar heights should not be affected by the model of iOS device to which it is connected.
For my own apps, I decided that simple screen-filling content worked best on external displays. Depending on the nature of your apps and how/where they are used, something more complex might be more appropriate. Altering the appearance of the main app itself when an external display is connected will increase the complexity but also, perhaps, the quality of user experience.