11 Apple Vision Pro Ideas to Change the World

Life as we know it will never be the same.

Noah Miller
Predict

--

Is it humanly possible that I can find a way to try one?

If you missed my first two posts on futuristic features and questions for the Apple Vision Pro, they’re both great places to start; both are linked at the end of this article.

Introduction

The Vision Pro will be the most transformative piece of technology the world has seen since the inception of the personal computer.

Don’t be alarmed by its $3,500 price tag when it goes on sale in 2024. The cost will come down over time until both you and I own one for ourselves.

When Apple’s first Macintosh, the Macintosh 128K, debuted in 1984, it retailed for $2,495 (equivalent to about $7,000 in 2022).

Yet only six years later, when the Macintosh Classic was revealed in 1990, it sold for just $999 (equivalent to about $2,240 in 2022).
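Those inflation-adjusted figures are easy to sanity-check. Here’s a rough sketch using approximate annual-average U.S. CPI values (the CPI numbers below are my own approximations, not from official sources):

```python
# Approximate U.S. CPI annual averages (assumed values for illustration)
CPI = {1984: 103.9, 1990: 130.7, 2022: 292.7}

def adjust(price, from_year, to_year=2022):
    """Scale a historical price by the ratio of CPI values."""
    return price * CPI[to_year] / CPI[from_year]

print(round(adjust(2_495, 1984)))  # ≈ 7029, close to the ~$7,000 figure
print(round(adjust(999, 1990)))    # ≈ 2237, close to the ~$2,240 figure
```

The exact outputs depend on which CPI series you pick, but both land within a few percent of the figures above.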

I believe we all have a duty to educate ourselves on the future of Mixed/Virtual/Augmented Reality, not just computer science PhDs.

It will be a major step in human history that lies right around the corner and will affect all of us — whether you like it or not.

While I am incredibly excited about the unlimited possibilities and all the good this headset will bring the world, there will be consequences. The earlier we begin talking about this new existence and the repercussions, the more we can prepare accordingly.

The following ideas consist of both my imagination and tangible research.

It is a thought experiment building on my previous writing on what may and possibly should lie ahead for the Vision Pro. I hope you enjoy reading it as much as I did writing it.

1. World editor

There will be no shortage of options to augment your real-life space or start from scratch in the digital world.

Whether it’s new decorations, virtually fixing a hole in your ceiling, or placing a million-dollar view on your blank wall — there will be endless options.

Want to have a stereo across the room that streams music?

Add it — in a couple clicks.

And better yet… control the volume with subtle micro-gestures in its direction.

Think “the force” in Star Wars, but better.

The music can seem like it’s coming from the speaker itself or as surround sound thanks to spatial audio.

(If you have AirPods, they already have this functionality built in. Search YouTube for “8D song” and play any video while wearing them… your mind will be blown.)

Personally, I am excited to choose from an infinite catalog of plants to place around my room (~zen) and add pseudo walls to block out distractions while in public.

2. Haptic gloves

Imagine being in the Metaverse and you can feel piano keys under your fingertips, raindrops falling on your hand, or a mouse jumping into your palm.

This isn’t science fiction.

Companies like HaptX and SenseGlove have been working on this technology for years. Yet their products are cumbersome and cost prohibitive. Therefore, this surreal experience is unavailable to the ordinary consumer.

Source: “Introducing HaptX Gloves G1” by HaptX on YouTube

But not for long.

Apple is developing its own haptic gloves and has a few patents to prove it.

If this will not take mixed reality to the next level, I don’t know what will.

3. Immersion peripherals

The future of mixed reality will bring many innovations that will help give you a greater sense of presence — well beyond just sight and sound (which Apple seems to have figured out).

For example, let’s say you were virtually transplanted courtside at an NBA game. You would hear and see the game, but sound and sight alone may not make you feel like you were actually there.

While it’s impossible to perfectly replicate that “energy”, I theorize you can actually get fairly close.

And, perhaps surprisingly to many, the technology will be easy to both build and set up at home.

My theory is that the key lies in the sensation of a crowd reverberating through your body.

These vibrations could be recreated through a surround sound setup of subwoofers paired with the spatial audio of the headset.

Then, you will really be able to “feel” like you’re there, whether at a game, concert, in a movie, or anywhere else.

Also, I don’t think this technology will make live events obsolete, but if you can pay $50 for a front-row virtual seat instead of $200 for being in the 20th row in person, I know which I’d pick.

For many people, I theorize that once they try this experience for the first time, they won’t ever want to go back.

Not to mention, I can certainly see third parties developing ways to make these types of experiences and movies even more immersive through optional peripherals.

Some examples could be fans, mist sprays, heat rays (which could conflict with the Vision Pro’s rumored onboard A/C unit, according to one patent), scented air canisters, and moving seats.

The Vision Pro and all the accessories (bought separately, of course) will engage a lot more of your senses.

4. Self-projection

Have you ever wondered what it would be like to play the superhero in a movie (even just a cameo)?

Or how about a (slightly) adjusted replica of yourself dunking on LeBron James in a video game? (OK, maybe a massively adjusted replica, at least in my case.)

Or how about seeing yourself in a car commercial driving that latest Rivian or trying on the latest clothing drop from Apple? (Yes, in my futuristic world, they make clothes now too — and it’s my entire wardrobe)

The Vision Pro’s advanced sensors will bring this type of simulation to the masses, integrating it with daily life.

5. Guided tutorials

Given enough user training, the Vision Pro will be able to autonomously walk you through almost any task.

At first, it will handle simple tasks, like showing you how to twist a light bulb into a socket, but it will eventually evolve into something as complex as changing a car’s oil.

Remember the Ikea bed frame that took you 4 hours to set up (even though the instructions said 2 hours)? Those days are done.

Virtual walkthroughs will now direct you (with audio) on when to pick up what pieces and what to do with them.

Or how about Thanksgiving dinner? Your headset will calculate the perfect moment to take your linguini off the stove by factoring in all the relevant variables — temperature, volume of water, etc.

As for the oven, you’ll see a convenient countdown on when the apple pie is ready.

My favorite feature is how easy it will be to track your nutrition via the auto-measure functionality.

Time to throw out those measuring cups…

6. Accessibility mode

Not everyone is equipped to use their hands, but this won’t stop them.

Apple will allow voice commands and subtle head gestures to control the headset’s cursor. An audible *click* with your mouth may serve to make a selection, while a double-click sound could be tied to a ‘right-click’.

Worried about accidental twitches or movements? The headset recognizes the difference.

It gets better…

Before we know it, all actions will be conducted through the use of neural sensors monitoring your brain activity (do I hear a Neuralink partnership?). No physical movement is required.

While this may start as an accessibility feature, it won’t take long before it becomes a productivity feature — making users an order of magnitude more efficient.

Thus, it will become the primary means to interact with the device across all users.

Children will grow up laughing at the idea that people had to move their hands on a physical keyboard “remembering where all the letters were” to type.

Just like rotary dials nowadays.

7. Physical data

Eventually, companies will capture your “physical data” (if I just coined this term, I would like full credit, thank you).

We all know web data (i.e. how you behave online) through the keyboard/mouse and biometric data (i.e. fingerprint, eye, or palm scan for identity verification), but now there is a new format for collecting your activity.

(Note: there’s likely some overlap between these types)

Through your eye-tracking patterns, facial expressions, and sound (already commonplace), companies will know more about how you feel than you know how you feel.

Your behavior and intentions will become even more predictable (if not obvious enough already).

Does this not sit well with you? Worried that this data won’t be encrypted by Apple?

Well, this is likely to be the public’s concern in the beginning.

Although, as time goes on, even you are going to care less and less about safeguarding this data.

To interface with a superior experience, you will need to share your physical data with whatever company’s service you are accessing.

In fact, you will be begging to share your facial expression data when X.com’s news feed doesn’t serve you positive news when you’re feeling upset — shooting you with endorphins so you feel better.

Believe it or not, Apple has already been granted a patent (dated July 2023) for this exact concept. Here’s a real snippet from the filing showing how the system could display a literal cat animation to alter your mood.

Source: U.S. Patent No.: 11,703,944 B2. Date: July 18, 2023.

You will see that there are various visuals in the “CGR Content” column on what to show someone so they move to a “Target State” from their “Measured (current) One”.

The one point I’d disagree with is the idea that showing me a cat, no matter my state, will make me feel happy.

Even better, maybe a dancing cat.

Further to this concept of websites understanding us beyond what we consciously convey — I am interested in how this will affect UI/UX design.

As a YouTube fanatic, I believe this will lead to the total removal of interactions such as “Like” or “Subscribe”.

Eventually, they will become obsolete as there will be no shortage of physical/biometric data points to determine if you “Like” or “Dislike” something.

The hassle of clicking a button will become a nuisance.

As for the “Subscribe” button, YouTube pretty much solved for what content you like to watch years ago.

It’s now mostly there to give you a semblance of free will and a bit of control.

After incorporating more advanced AI/ML and data, you will be served such irresistibly addictive content tailored to every morsel of your being that it will be impossible to look away.

8. Telescopic mode

As kids, all of us wanted the superpower to ‘zoom in’ with our vision. Now we can. Squint your eyes and your sight will be magnified toward your focal point. It will be interesting to see how the Vision Pro’s capabilities compare to rumors of the yet-to-be-released iPhone 15 Pro Max’s periscope lens.

9. Sound screens

This is an odd one, but I think there could be applications in busy environments.

Hear me out.

The premise is to artificially flatten sound variance around a Vision Pro user to aid more effective noise cancellation.

If I lost you, it’s worth reading up on the physics of noise cancellation. Essentially, it entails neutralizing a repetitive sound by playing the inverse of that waveform at the same time.
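That inverse-wave idea can be shown in a minimal sketch (pure Python, no audio hardware): generate a steady tone, flip its sign, and observe that the two waveforms sum to silence.

```python
import math

def tone(freq_hz, sample_rate, n_samples):
    """Generate a pure sine tone as a list of samples."""
    return [math.sin(2 * math.pi * freq_hz * n / sample_rate)
            for n in range(n_samples)]

noise = tone(100, 44_100, 1_000)      # a steady 100 Hz hum
anti_noise = [-s for s in noise]      # the same wave, phase-inverted
residual = [a + b for a, b in zip(noise, anti_noise)]

print(max(abs(s) for s in residual))  # → 0.0: the waves cancel exactly
```

Real headphones can’t invert ambient sound this perfectly, which is why active noise cancellation works best on steady, repetitive noise like engine hum.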

These “sound screens” would emit white noise, even at a low decibel level. The form factor could be speakers built into the outside of the headset, or separate modules spanning doorways like air curtains.

The ideal use case I envision would be a room of people all using Vision Pros for different purposes, with the sound screens drowning out each person’s voice from the others.

10. Application for the blind

The visually impaired may stand to gain the most from the Vision Pro in the short-to-medium term, even though they won’t buy it at launch.

The headset’s ability to understand a dynamic environment and advise its user accordingly is a level of sophistication current solutions like the eSight could never accomplish.

Through both audio and haptic feedback, the possibilities of living an independent life, from traveling and reading to caring for oneself, will be transformed.

In the meantime, if you would like to volunteer virtually to help the blind whenever it is convenient for you, please check out “Be My Eyes”. It’s a free app for giving live video support to the visually impaired.

11. External sensors

While the state-of-the-art technology in the Vision Pro will do an effective job capturing depth across your field of view, there are still blind spots that create limitations.

Source: “Introducing Apple Vision Pro” by Apple on YouTube

Hence, Apple will introduce additional, external sensors (similar to the Wii’s Sensor Bar, but smaller) so you and your environment can be better understood.

The more perspectives you can feed into VisionOS, the more accurately it will respond.

The most beloved benefit will surely be for full body movement — aka working out.

In the Vision Pro’s current form, you’re probably (hopefully) not going to be doing any high-intensity exercises, but as the headset gets lighter over time, you probably (hopefully) will.

These sensors will also be usable without the Vision Pro present so that you can still capture an environment to revisit later in virtual reality.

Final thoughts

Are you convinced yet that we’re on the precipice of something special?

While it’s hard to fathom the staggering size of the estimated $20 billion Apple invested in building the Vision Pro, perhaps the 5,000+ patents filed will give you an idea of how significant this device will be.

To quote myself, “You don’t get to be a $3 trillion company without being able to see the future.”

The developed world will evolve more in the next 10 years than mankind has ever seen, in large part thanks to the Apple Vision Pro.

However, it is up to us, now, to shape that path. Will we be…

glad that we did or man, wish that we didn’t?

“The Web as I envisaged it, we have not seen it yet. The future is still so much bigger than the past.” — Tim Berners-Lee, Inventor of the World Wide Web

-

My previous Apple Vision Pro articles on Medium:

“11 Productivity Hacks for the Apple Vision Pro”

“5 Existential Questions for the Vision Pro (to save humanity)”

--
