iPhone XX Futurology

Thinking about a 20th Anniversary Apple iPhone

Mike Rundle
20 min read · Sep 28, 2017

(Note to the reader: this article, if printed, would run about 30 pages long. Sorry, just warning you now.)

Steve Jobs showed the world the iPhone on January 9, 2007 at the Macworld Expo. It had a 3.5" display and a 2 megapixel camera. Since then, Apple has become the world’s most valuable company and the iPhone started the smartphone revolution which has transformed the lives of billions of people.

“Every once in a while a revolutionary product comes along that changes everything” –Steve Jobs

Earlier this month, Tim Cook stood in the new Steve Jobs Theater and unveiled the brand new iPhone X, the first major rethinking of the iPhone since its debut ten years earlier. Until the iPhone X, every previous iteration had a home button under a sharp-cornered rectangular screen. The new iPhone X ditches the home button and has an OLED screen with distinctly rounded corners and a brand new way of navigating in and out of running apps. At the top of the phone is an array of new sensors and components no iPhone has ever had, allowing for a broad new range of interactions including Face ID unlocking and facial gesture recognition.

Ten years from now, in 2027, an Apple executive will once again grace the stage to show eager fans the new iPhone. This iPhone will be the 20th anniversary model with 20 years of improvements, refinements and technological achievements under its belt.

In this piece I’ll lay out what I believe this iPhone XX will look like and how it will fit into an accelerating future of technology.

But First, On Flying Cars And Holograms

This is not a sci-fi article. The iPhone of 2007 and the iPhone of 2017 look generally similar to one another, so before getting to my predictions, let’s talk about some off-the-wall futuristic iPhone ideas and why I don’t believe we’ll be seeing them in 2027.

Why still a rectangle and not a square? Or a circle? First, books are not squares or circles, they’re rectangles. Humans read text on a page in a particular way and I believe a portrait screen ratio will be sticking around for at least the next ten years.

Why still think there will even be an iPhone? What about AR glasses and VR goggles and flying cars? Technology doesn’t move as fast as people think. 100 years ago people were convinced we’d be living in colonies on Mars and food would materialize from pills. Instead, we haven’t put a person on another celestial body since 1972, and Soylent not only tastes bad but has made people sick. The iPhone will still be around in 2027 and will still mostly look like the smartphones we use today.

That’s not to say that VR goggles and AR glasses won’t exist in 10 years, they just won’t be replacing the devices in our pockets.

What about holographic displays and hand-wavey Minority Report stuff? In 2027, we’ll still be using computers on desks with monitors and phones we hold in our hands. Truly futuristic hand-wavey interfaces with floating digital representations of data that can be manipulated in any direction and flung to other devices won’t be here by then.

What about foldable, rollable or completely flexible displays? I think this is the most likely wildcard out of all the futuristic scenarios. Samsung has been working on flexible displays for almost a decade, and their head of mobile products said the Galaxy X with a flexible display is coming in 2018. No one knows if the whole phone will be flexible (needing a flexible battery and internal components) or maybe just the screen area (with a thick chin holding the inflexible components) but either way, if Samsung introduces a groundbreaking flexible smartphone it could start an arms race in the industry, driving Apple in that direction too. Time will tell.

With that out of the way, let’s get into what the iPhone XX could look like.

The Shape and Display

The iPhone X is clearly not the ideal, finalized state of a handheld computing device. The large and protruding cut-out at the top of the screen is showcased by Apple as a marvel of sensors and face-tracking mechanics only because it’s what Apple chose to do in 2017. If they could have slipped all those sensors and cameras below the screen in some magical way they probably would have. The notch was somewhere between Plan B and Plan E.

By 2027, I believe the notch will be completely gone, and the realization of a dozen Apple patents will provide futuristic technologies that place nearly everything behind the screen. The screen will be the phone, the phone will be all-screen in the human and non-marketing sense of the word, and software will take over the entire experience of using a phone. It’ll finally be like the tablets in Westworld or the phone in Her where the bezel has faded away and the device is just a bright, glowing rectangle.

This is clearly the ideal state for an interactive device we hold in our hands.

To explain why I’ve come to this conclusion, we have to venture back to a patent that Apple Product Marketing Manager Michael Uy submitted 13 years ago in June 2004.

Patent #7535468 describes a fantastical technology where, in between the thousands of pixels that comprise the display, there are thousands of tiny camera lenses that can take photos and stitch together the result. Here is Apple’s summary description of the patent:

“The invention pertains to an integrated sensing display. The integrated sensing display includes both display elements and image sensing elements. As a result, the integrated sensing device can not only output images (e.g., as a display) but also input images (e.g., as a camera).”

At the time, the only Apple news site to cover this patent was AppleInsider, and here’s how they described it:

“The idea behind the invention is to wedge thousands of microscopic image sensors between the LCD cells that make up the display. Each sensor would be responsible for capturing a piece of the overall photo. These pieces would then almost instantly be pieced together by software to form a complete image.”
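
To make Uy’s idea a bit more concrete, here’s a deliberately simplified sketch (my own illustration, not anything from the patent) of that stitching step: each micro-sensor scattered across the display contributes one tiny patch, and software assembles the patches into a single frame. A real implementation would have to align and blend overlapping patches, correct for parallax, and much more.

```swift
// Toy illustration of the "thousands of tiny sensors, stitched in software" idea.
// Each micro-sensor between the display pixels returns one small patch of the scene.
struct SensorPatch {
    let row: Int          // position of this sensor in the sensor grid
    let col: Int
    let pixels: [[UInt8]] // a tiny grayscale patch, patchSize x patchSize
}

func stitch(patches: [SensorPatch], gridRows: Int, gridCols: Int, patchSize: Int) -> [[UInt8]] {
    // Allocate the full-resolution frame.
    var frame = Array(repeating: Array(repeating: UInt8(0), count: gridCols * patchSize),
                      count: gridRows * patchSize)
    // Copy every patch into its place in the final image.
    for patch in patches {
        for y in 0..<patchSize {
            for x in 0..<patchSize {
                frame[patch.row * patchSize + y][patch.col * patchSize + x] = patch.pixels[y][x]
            }
        }
    }
    return frame
}
```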

In 2004, this technology didn’t exist; this was Apple planting a stake in the ground. Michael Uy’s tenure at Apple lasted 4 years, and during that time he was a leading product manager on a variety of video-related Apple software products including iMovie, Aperture and Final Cut Express. He was only the 2nd software product manager at Apple, so he obviously had some serious geek chops. Since he’s the sole inventor listed on the patent, I can only imagine that this idea just struck him one day in the shower.

Since then, many additional Apple patents on sophisticated under-display cameras and sensors have been filed, all referencing and building on this initial patent that started it all.

In 2007, Apple filed a patent that expanded on the idea of a camera system behind the display, and went into greater detail about how it could work, specifically on a laptop display for video conferencing:

“The display elements are configured to cycle between an active state, in which the display elements are illuminated to display a display image on the display screen, and an inactive state, in which the display elements are darkened and at least partially transparent. While the display elements are in the inactive state, the image-capturing mechanism is configured to capture a photographic image of objects in front of the display screen through the display screen and the display elements.”
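
In other words, the panel time-shares: its pixels flicker off and go momentarily transparent so the camera behind them can grab a frame, then light back up before your eye notices. The sketch below is purely illustrative of that cycle; the hardware hooks it calls are hypothetical, not real APIs.

```swift
// Hypothetical hardware hooks for a camera mounted behind the display,
// illustrating the active/inactive cycle the patent describes.
protocol InDisplayCamera {
    func setDisplayElementsActive(_ active: Bool) // darken (and let light through) or restore the pixels
    func captureFrameThroughDisplay() -> [UInt8]  // expose the sensor through the panel
}

func captureWhileDisplaying(camera: InDisplayCamera) -> [UInt8] {
    camera.setDisplayElementsActive(false)   // inactive state: pixels dark, partially transparent
    let frame = camera.captureFrameThroughDisplay()
    camera.setDisplayElementsActive(true)    // active state: resume showing the interface
    return frame                             // repeat fast enough and the user never sees it
}
```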

Then, in October 2011, Apple filed the mother of all “behind the display” patents, one that outlined specifically how a variety of components (cameras, sensors, speakers, etc.) could work if they were placed behind the display.

From AppleInsider’s reporting on the patent:

“Awarded by the U.S. Patent and Trademark Office, Apple’s U.S. Patent №9,543,364 for “Electronic devices having displays with openings” describes a method by which various components can be mounted behind perforations in a device screen that are so small as to be imperceptible to the human eye. This arrangement would allow engineers to design a smartphone or tablet with a true edge-to-edge, or “full face,” display.”

This patent greatly expanded on Michael Uy’s initial vision for how microscopic perforations between pixels in the display could allow sensor data to flow through the screen, even while the display pixels remain active.

The following diagrams are from the patent, and the image on the right shows (via the circle in the center) where these sensors would live within the display: in microscopic openings between the pixels. Pretty wild stuff.

Apple wasn’t granted the patent for this until January 2017, and after it was granted, pundits thought this technology would make it into the 10th anniversary iPhone which we now know did not happen. We got the notch.

I’m sure these technologies are being prototyped deep in a lab inside a nondescript building in Cupertino right now, and they’ve probably been working on this for many years. In 2027, it will be 23 years after Michael Uy’s initial vision for this technology, and 16 years after Apple’s incredible follow-up filing that went into great detail about how these technologies would actually work.

The notch is clearly a stop-gap for what Apple truly wants to build and it’s only a matter of time until their vision is achieved.

Now that we’ve discussed the notch, let’s talk about the screen itself.

The recently-unveiled iPhone X has an incredible screen. I don’t believe that Apple needs to innovate much further on the screen’s fidelity, even for a hypothetical iPhone XX to be debuted in 2027. Here are the iPhone X display’s specifications:

  • 5.8" OLED HDR display
  • 2436-by-1125-pixel resolution at 458 ppi
  • 1,000,000:1 contrast ratio
  • True Tone display with P3 Wide color
  • 625 cd/m2 max brightness

OLED HDR screens (with pure blacks and incredibly vibrant colors) are clearly the future, so unless a fundamentally better screen technology comes along in the next few years, I think it’s a fair assumption that the iPhone XX will have an OLED display similar to what Apple just unveiled in the iPhone X.

As for the resolution, 458ppi is the highest pixel density Apple’s ever shipped in a device. Even though it’s lower than the 571ppi in the new Samsung Galaxy S8, it’s still well beyond Apple’s definition of “retina” and pixels are impossible to discern at normal viewing distances. Jumping to a dramatically higher resolution display (running at 4x or higher) wouldn’t be a smart decision, even ten years from now, since the increase in pixel density would be imperceptible to users.

What will make a big difference on the iPhone XX display is ProMotion.

The Super Retina HD display in the iPhone X is incredible in every way, but it’s missing Apple’s new 120Hz refresh technology called ProMotion. Currently only available in the new iPad Pro, ProMotion doubles the refresh rate of the display from 60Hz to 120Hz, and although it’s hard to describe in words (or capture on video!), it’s really a treat to use in person. Interface elements stick right to your finger when scrolling, animations are wicked smooth, and it almost feels like a CGI recreation of an iPhone interface rather than live software you’re interacting with. It feels otherworldly.

I don’t know what the holdup is for bringing ProMotion to the iPhone, but I suspect it’ll be arriving a lot sooner than 2027, possibly even in next year’s iPhone X upgrade.
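
For what it’s worth, the developer side of this is already in place: on a ProMotion iPad Pro, asking CADisplayLink for more frames gets you animation callbacks at up to 120Hz, and the same code simply tops out at 60Hz on today’s iPhones. A minimal sketch:

```swift
import UIKit

final class SmoothAnimator {
    private var displayLink: CADisplayLink?

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(step))
        // Ask for 120 fps; the system falls back to the panel's maximum
        // (60 on today's iPhones, 120 on ProMotion iPads).
        link.preferredFramesPerSecond = 120
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func step(_ link: CADisplayLink) {
        // Advance animations using the frame timestamps so motion stays
        // smooth regardless of the actual refresh rate.
        let dt = link.targetTimestamp - link.timestamp
        _ = dt // drive your animation with dt here
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }
}
```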

Lastly, the brightness level of the iPhone X is 625 cd/m2, the highest brightness value ever in an iPhone display. It’s incredibly bright, but display brightness is certainly an area that can and should be improved on in the next ten years, considering the Samsung Galaxy Note8 screen is nearly double the brightness with a peak of 1,200 nits. I expect the display on the iPhone XX to be dramatically brighter than the iPhone X, but since Apple is always conscious to conserve battery life, I don’t expect the iPhone XX display to be so bright that the phone can’t last through a full day of regular use.

iPhone XX Display Prediction

Notch-free ~5.5–6.5" OLED HDR with rounded corners, 450–550ppi at 3x, ProMotion, 800–1200 cd/m2 brightness

The Cameras and Sensors

Since 2007, smartphone cameras have primarily been used for taking photographs and then sharing those photos with family and friends. Hell, since the 1800s when film photography was invented, this is what cameras were used for.

Just take a look at Flickr’s top cameras chart by percentage of photos shot on each camera. Notice I didn’t say “smartphone camera” because this is a chart showing all camera types, not just smartphones.

The iPhone takes 1st, 2nd, 3rd, 4th and 5th place. The iPhone is absolutely the world’s most popular camera.

With the introduction of the iPhone X and the arrival of ARKit, Apple is signaling that “taking photographs” is clearly just the first act of the camera as we know it, and perhaps the least interesting thing that future iPhone cameras will be capable of.

If we dig back through the archives of camera-related patents and company acquisitions by Apple, we’ll find a number of interesting ideas that, when read together, paint a clear picture of where the puck is headed for the iPhone camera.

We’ll also discover why 2015 may be remembered as the year when Apple cemented their pole position in augmented reality and imaging technology for decades to come.

An example of the camera module offerings from LinX

To start off, let’s take a look at Apple’s acquisition of LinX, an Israeli camera module company, in the early spring of 2015.

“The Israeli startup’s hardware was targeted at tablets and smartphones specifically, and could not only offer the kinds of background defocus that’s popular on low aperture lenses paired with DSLRs, but could also help achieve better low-light performance, ideal for taking pictures indoors or at night without using flash. […] One of LinX’s big stated hardware features, according to the company’s own communications, is achieving selective focus post-capture.”

LinX specialized in multi-lens cameras of varying apertures and focal lengths and the software needed to combine the data from each camera into a special, interactive photo that could be manipulated in ways regular photos cannot, like changing the depth of field dynamically after a shot has been taken.
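
The core trick behind “change the focus after the shot” is to keep a per-pixel depth map alongside the photo, then blur each pixel based on how far it sits from whatever plane the user taps. Here’s a toy version of that idea; it ignores real lens models and proper bokeh, but it shows the mechanics:

```swift
// Toy synthetic depth-of-field: blur each pixel in proportion to how far its
// depth is from the chosen focal plane. Real pipelines use lens models and
// much better blurs; this only illustrates the idea.
struct DepthImage {
    var luma: [[Double]]   // grayscale image, height x width
    var depth: [[Double]]  // per-pixel depth in meters, same dimensions
}

func refocus(_ image: DepthImage, focusDepth: Double, maxRadius: Int = 6) -> [[Double]] {
    let h = image.luma.count, w = image.luma[0].count
    var out = image.luma
    for y in 0..<h {
        for x in 0..<w {
            // Pixels far from the focal plane get a larger blur radius.
            let defocus = abs(image.depth[y][x] - focusDepth)
            let radius = min(maxRadius, Int(defocus * 3))
            guard radius > 0 else { continue }
            var sum = 0.0, count = 0.0
            for dy in -radius...radius {
                for dx in -radius...radius {
                    let ny = y + dy, nx = x + dx
                    if ny >= 0 && ny < h && nx >= 0 && nx < w {
                        sum += image.luma[ny][nx]
                        count += 1
                    }
                }
            }
            out[y][x] = sum / count
        }
    }
    return out
}
```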

Some other startups have tried to achieve infinite depth-of-field manipulation via multi-lens cameras, most notably Lytro, which has since pivoted to become a VR platform. The latest camera company to try to pull off this trick is Light, with their unique L16 camera.

The L16 camera has 16 lenses: six 150mm f/2.4 telephoto lenses, five 70mm f/2 midrange lenses, and five 28mm f/2 wide-angle lenses. All these lenses are folded up horizontally, snap a photo at the same time, and then combine to create a monster 50MP+ image that lets users change the depth-of-field after a shot has been taken in a way that’s never been done before.

We’ll get back to the L16 camera and folded lenses in just a bit.

Less than a month after Apple acquired LinX, Apple filed an incredible image sensor patent titled “Time-of-flight depth mapping with flexible scan pattern” with the following description:

Imaging apparatus includes an image sensor, which acquires an image of a scene, and a scanner, which includes an optical transmitter […] a processor identifies an object in the image of the scene, defines the non-rectangular area so as to contain the identified object, and processes the output of the optical receiver so as to extract a three-dimensional (3D) map of the object.

This clearly describes the technologies on the front of the iPhone X.

Cameras and depth-mapping sensors that combine to create a 3D representation of an object in the photograph (namely, your face) that can be manipulated in software. In iOS 11, this is used as a toy to manipulate emoji with your facial muscles, but ten years from now this technology will be foundational to the future of the iPhone.
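
The “time-of-flight” part of that patent is conceptually simple, even if the hardware that makes it work is not: emit a pulse of infrared light, time how long it takes to bounce back, and the depth is half the round trip at the speed of light. The arithmetic is just this:

```swift
// Time-of-flight depth: light travels out to the object and back,
// so the distance is half the round-trip time multiplied by the speed of light.
let speedOfLight = 299_792_458.0 // meters per second

func depthInMeters(roundTripSeconds: Double) -> Double {
    return speedOfLight * roundTripSeconds / 2.0
}

// Example: a pulse returning after ~3.3 nanoseconds means the surface
// (your face, at arm's length) is roughly half a meter away.
let depth = depthInMeters(roundTripSeconds: 3.3e-9) // ≈ 0.49 m
```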

LinX wasn’t the only sophisticated imaging technology company that Apple acquired; in fact, it wasn’t even the most incredible acquisition that Apple made in that same calendar year.

Less than a week after filing their massive image sensor patent, Apple acquired Metaio, an augmented reality software company that had been building advanced AR systems for over a decade and had impressive clients like Ferrari, which commissioned Metaio to produce virtual car walkaround iPad apps for showrooms.

Back in 2015, augmented reality wasn’t a thing that Apple was publicly focused on. ARKit didn’t exist and there were no AR demos on stage, so this was a surprising acquisition at the time.

Until later in the fall when Apple surprised the entire movie industry.

From a Faceshift technology demo showing their advanced motion capture solution.

Just a few months after Apple’s surprising Metaio acquisition, rumors were circulating around the motion capture and CGI communities that Faceshift, a real-time motion capture and image processing software company from Switzerland, had been acquired by Apple. This was confirmed later in the fall, and here’s what TechCrunch had to say about the acquisition:

“Its main focus, so to speak, was on visual effects in areas like gaming and film. In a world where animation technology can be costly and time-consuming to implement, the startup’s main product was marketed as a game changer: “Faceshift studio is a facial motion capture software solution which revolutionizes facial animation, making it possible at every desk,” according to the company. […] The technology is also making an appearance at the highest level of wow: it’s used in the latest Star Wars film to make non-human characters more human-like in their expressions”

Instead of wearing a head-mounted rig with dots all over your face to achieve accurate motion capture, Faceshift developed an advanced markerless software technique that could analyze faces in real-time and drive an animated character to match your facial expressions. It was some really futuristic stuff, and they had been working with major movie studios at the time of the acquisition.

Here’s a video showing off their technology from 2015 right before Apple snatched them up.

The reason I’ve been walking through Apple’s acquisitions and imaging patents of 2015 is because we’ve only just begun to see them bear fruit with the iPhone X, its TrueDepth camera and ARKit. Apple has always played the long game, and their products don’t typically utilize patented technologies until many years after those patents were filed.
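
You can already see Faceshift’s fingerprints in shipping Apple APIs: on the iPhone X, ARKit’s face tracking hands developers a set of per-expression “blend shape” weights every frame, which is exactly the kind of markerless capture data you’d use to drive an animated character. A stripped-down sketch (the actual character rig and error handling omitted):

```swift
import UIKit
import SceneKit
import ARKit

final class FaceCaptureViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
        // TrueDepth-based face tracking, available on iPhone X and later.
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }
        // Each blend shape is a 0...1 weight describing one facial movement.
        let jawOpen = face.blendShapes[.jawOpen]?.doubleValue ?? 0
        let smileLeft = face.blendShapes[.mouthSmileLeft]?.doubleValue ?? 0
        // Feed these weights into a character rig to mirror the user's expression.
        _ = (jawOpen, smileLeft)
    }
}
```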

If we play Apple’s camera story forward 10 years, here’s the imaging hardware I predict we’ll see on the iPhone XX:

  • 3+ rear cameras of varying focal lengths
  • 2+ front cameras of varying focal lengths, hidden behind the display
  • 3D object sensors on the rear of the iPhone
  • 3D object sensors on the front of the iPhone, hidden behind the display

Multiple lenses of different focal lengths on both sides so users can manipulate the depth-of-field of a photograph after it’s taken, and the addition of rear-facing 3D object sensors so iPhone software can know where objects sit in 3D space regardless of where those objects are in relation to the device.

Hey, remember the crazy multi-lens Light L16 camera from above? Well, Apple filed a patent in early 2014 describing horizontally-oriented camera lenses with folded optical components that can dynamically move to change the focal length, so Apple has been working in this area for a long time.

With the iPhone X and the introduction of Apple’s sophisticated new TrueDepth camera, Apple is making it clear that the iPhone camera will no longer be a simple image input device used for making photographs. Instead, we are going down the path of the iPhone being a sensing object, aware of all that is taking place in front of and behind the phone.

Now what will truly make the iPhone XX stand apart is not just the advanced new camera systems and 3D sensors, but the software that will take full advantage of the data coming off those sensors. This is the true magic that will make the iPhone of 2027 feel like the future.

Some examples of incredible software enhancements to look for in the years to come:

  • Making the iPhone aware of its surroundings with always-on cameras continuously scanning, mapping and tracking objects in 3D space that are near the iPhone
  • Eye-tracking that allows for software anticipation, letting parts of an interface be guided entirely by gaze (Apple acquired SensoMotoric Instruments earlier in 2017, a world leader in eye-tracking technologies.)
  • Biometric and health information derived from camera data of a user’s face (what’s my pulse, etc.)
  • Advanced image-manipulation algorithms that make sure FaceTime calls always show your eyes looking at the other person
  • Machine learning advances allowing for instant counting of objects near the iPhone (how many people are in this classroom, how many cars are between me and the stop light, how many pencils are on the table, how many shirts are folded in my closet, etc.)
  • Instant measuring of object and space dimensions without the need for gimmicky AR rulers (how long is that wall, how wide is that opening, how tall is that lamp, etc.), as sketched in the example after this list
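
On the measuring point in particular, the building blocks are already in ARKit today: hit-test two screen points against the world, take the two 3D positions that come back, and measure the distance between them. This is what the current crop of AR ruler apps do under the hood; a rough sketch:

```swift
import UIKit
import ARKit
import simd

// Estimate the real-world distance (in meters) between the surfaces under
// two screen points by hit-testing against ARKit's detected geometry.
func distanceBetween(_ pointA: CGPoint, _ pointB: CGPoint, in view: ARSCNView) -> Float? {
    guard
        let hitA = view.hitTest(pointA, types: [.featurePoint, .existingPlaneUsingExtent]).first,
        let hitB = view.hitTest(pointB, types: [.featurePoint, .existingPlaneUsingExtent]).first
    else { return nil }

    // Each hit result carries a world-space transform; its last column is the position.
    let a = hitA.worldTransform.columns.3
    let b = hitB.worldTransform.columns.3
    return simd_distance(simd_float3(a.x, a.y, a.z), simd_float3(b.x, b.y, b.z))
}
```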

But the biggest advance, I believe, is that with a true edge-to-edge screen and various 3D sensors, the iPhone can become a lens onto the world. Software can identify where your eyes are in relation to the screen and where the iPhone is in relation to objects in front of it, then manipulate the rear camera’s focal length in real time so that holding an iPhone in front of you makes the phone appear completely transparent, just a piece of glass you’re peeking through, with the field of view continuously shifting as you move it to maintain the illusion.

This is the holy grail of augmented reality.
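
The geometry behind that illusion is simple even though shipping it is not: given how wide the screen is and how far it sits from your eye, you can compute the slice of the world the display has to show for it to read as a clear pane of glass. A simplified, head-on version of the math (a real implementation would also have to handle off-axis viewing and reproject the rear camera’s image continuously):

```swift
import Foundation

// Horizontal field of view (in degrees) the rear camera feed must cover for
// the display to look like a transparent window, assuming the eye is centered
// directly in front of the screen.
func windowFieldOfView(screenWidthMeters: Double, eyeDistanceMeters: Double) -> Double {
    let halfAngle = atan((screenWidthMeters / 2) / eyeDistanceMeters)
    return 2 * halfAngle * 180 / Double.pi
}

// Example with rough numbers: a ~6 cm wide screen held ~35 cm from the eye
// only needs to show about a 10-degree slice of the scene, re-cropped
// continuously as the eye or the phone moves.
let fov = windowFieldOfView(screenWidthMeters: 0.062, eyeDistanceMeters: 0.35) // ≈ 10°
```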

From Magic Leap, showing what is possible with their technology

Once this is achieved, reality can truly be augmented, with information overlaid on top of objects on the screen with such precision that it appears as if it’s present in the real world. Magic Leap has been working to achieve this for many years, but they’re focused on bringing this technology to a headset. Apple can make it happen on a device hundreds of millions of people already own.

iPhone XX Camera and Sensors Prediction

3+ rear cameras of varying focal lengths, 2+ front cameras of varying focal lengths (behind the display), front and rear 3D object sensors, advanced eye-tracking capabilities, futuristic software to stitch data from cameras and sensors into an augmented reality world, turning the iPhone into a seamless digital lens.

The Battery and Internal Components

In many ways, guessing at the internal components of an iPhone from 2027 is easier than guessing at the larger software advances or reading the tea leaves of patent disclosures. Here’s the only statement in this piece that’s guaranteed to be true: the processing power will be dramatically greater and the battery capacity will be higher in the iPhone XX. Why? Because of the inevitable march of technological progress.

Starting with the battery, Apple’s goal has always been “all-day performance” in the iPhone, and they assume you’re going to charge it at night after using it moderately throughout the day. Not once in the history of the iPhone have we seen Apple sacrifice thinness for a larger battery or tout battery life that exceeded one day of normal usage. They just don’t think multi-day battery life is important in an iPhone, and they don’t think that phones should be thicker to support a larger battery. In general, even though battery capacities have steadily risen in the last 10 years, devices have gotten thinner and power management features in iOS have gotten better. The displays have more pixels and are brighter, the processors are way faster, but Apple always solves for around 1 day of battery life for an iPhone. It’s a multi-variable equation, and Apple packs in a battery just large enough to solve it for about 1 day of usage.
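
A back-of-the-envelope version of that equation (with illustrative numbers, not Apple’s actual figures) shows why the battery only ever has to be “just big enough”: capacity in watt-hours divided by average power draw gives hours of runtime.

```swift
// Rough battery runtime math with illustrative numbers.
// energy (Wh) = capacity (Ah) x nominal voltage (V); runtime = energy / average draw (W).
let capacityAmpHours = 2.7   // a ~2,700 mAh class smartphone battery (assumed, not a spec)
let nominalVoltage = 3.8     // typical lithium-ion cell voltage
let averageDrawWatts = 0.6   // mixed use averaged across a waking day (assumed)

let energyWattHours = capacityAmpHours * nominalVoltage   // ≈ 10.3 Wh
let runtimeHours = energyWattHours / averageDrawWatts     // ≈ 17 hours, i.e. "a day"
```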

If we chart all the battery capacities of every iPhone, the trend is upwards but the slope is moderate compared to Android’s battery evolution. There are Android phones on the market today with 5,000 mAh batteries, and a number of Android phones beat iPhones in battery life tests, so you could imagine Android phones of 2027 having utterly massive batteries relative to the smaller capacity of a svelte iPhone.

Apple is essentially telling its customers, “Want more battery life? Go buy an iPhone battery case.”

Now that’s not to say that Apple hasn’t innovated in battery technology in the past; in fact, one of Apple’s patents that most directly translated into a shipped product was their non-rectangular battery patent filed in 2010:

This stacked and terraced way of laying out battery cells made its way into the razor thin MacBook that debuted in 2015. Here’s what Wired had to say:

A typical lithium ion battery “pouch” type cell comprises layers of a thin sheet of aluminum or copper, coatings of a specialized material that can absorb lithium ions, and layers of plastic. Each of these layers is mere microns thick.

What Apple has figured out […] is how to fit these stacked electrode sheets into any size cell they choose. These different-sized cells can then be stacked on top of one another, allowing its engineers to pack as much battery as possible into any given space.

The current line of MacBooks would simply not be possible without the innovation from this foundational patent, so there’s precedent for Apple to publish battery patents and turn around and use that technology in a product just a few years later.

In 2013, another interesting Apple battery patent was filed, this time for a new laminar battery system that provides for dramatically faster charging of batteries, an improved lifespan, and a cooler average temperature. Now this isn’t very sexy on its own, but clearly Apple is trying to squeeze every ounce of performance out of every millimeter inside their portable devices, and that means layering battery cells in new configurations and adopting new technologies that make the batteries dramatically more efficient.

As far as the processor is concerned, it’s clear that Apple is utterly dominating every other phone manufacturer in the world and is not slowing down.

Apple has been steadily making incredible progress with their A-series of processors which now include an Apple-designed GPU as well. Over the next 10 years, we can expect this trend to continue, and unless Intel miraculously gets its shit together, it really seems like Apple is running away with the crown.

Just comparing the new iPhone 8 with the A11 Bionic chip to high-end Android flagship phones is a joke. That damn thing is even faster in a heads-up test against a brand new 13-inch MacBook Pro.

The iPhone XX of 2027 is guaranteed to have a dramatic increase in processing power compared to the A11 Bionic chip in the newest iPhones. Apple loves to show this chart at keynotes, so in 2027 you can imagine just how steep that curve will be.
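
To see why a “hundreds of times faster” figure isn’t crazy, just compound the kind of yearly gains Apple has been posting. If CPU performance keeps improving at roughly 40% per year (an assumed rate, in the ballpark of recent A-series generational jumps), twenty years of that works out to nearly three orders of magnitude:

```swift
import Foundation

// Compound annual performance growth: total gain = (1 + rate)^years.
let assumedAnnualGain = 0.40          // illustrative rate, not an Apple figure
let yearsSinceOriginalIPhone = 20.0   // 2007 -> 2027

let totalMultiple = pow(1 + assumedAnnualGain, yearsSinceOriginalIPhone)
print(totalMultiple) // ≈ 836, i.e. on the order of 800–1000x the original iPhone
```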

Battery and Internal Components Prediction

5,000–6,000 mAh laminar battery with an extraordinarily compact and terraced layout, ~A20 CPU with 8–10 mixed performance cores, ~800–1000x more powerful than the original iPhone, same chip may also be in a variety of Macs once macOS can run on ARM

Final Thoughts on iPhone XX

The purest state of a handheld mobile computing device is simply a screen that fits nicely into your hand that you can flick comfortably with your thumb.

As foreheads and chins on smartphones have gotten smaller across the industry, it’s clear that one day there will simply be nothing but screen in your hand and the physical device will melt away. The software you run on this glowing rectangle will be the entirety of what a user experiences.

Back in February of this year, designer Georgy Pashkov created an “iPhone 8” concept that I think is almost exactly what we’ll see from Apple by 2027. A full-screen device where advanced imaging sensors can understand and adapt to the environment to make it truly feel like a lens you’re holding up in front of the world.

Once the iPhone is simply a screen with a bunch of sensors and cameras, then software designers and developers can truly build fully immersive experiences. It’s coming, and it’s only a few years out.

Thanks for reading,
Mike
