“Visual Intelligence” Coming to iPhones — What It Means for Brands
And other noteworthy announcements from Apple’s latest product launch event
Like clockwork, Apple announced its latest product updates on Monday via a pre-recorded keynote presentation named “Glowtime” — a nod to the glowing ring that surrounds the edge of the screen when Apple Intelligence is invoked. Indeed, despite it being a hardware event, the real star of the show was undoubtedly the set of AI-powered features that Apple is set to unleash with the new iPhone models this fall.
Overall, the keynote ran for a breezy 98 minutes and unveiled the latest updates to iPhones, AirPods, and the Apple Watch, with Apple Intelligence being the underlying thread that ties together the user experiences across devices. As always, MacRumors has published a thorough list of everything that Apple announced, ranging from the sleep apnea detection feature added to the Apple Watch, to the hearing health features coming to AirPods Pro 2.
Here, let’s take a closer look at three aspects of Apple’s latest announcements that we believe carry strategic implications for brand marketers.
“Visual Intelligence” as a Star Feature
First announced at its Worldwide Developers Conference (WWDC) in June, Apple Intelligence leverages Apple’s proprietary large language models (LLMs) to handle practical tasks primarily on-device. It benefits from having access to the trove of personal data that iPhone users store in native iOS apps like Calendar and Mail, allowing it to deliver highly personalized results and, eventually, carry out tasks across apps. During the keynote, Apple highlighted four main groups of use cases for Apple Intelligence: self-expression (such as through the image generation tool), reliving memories (asking Siri to find a particular vacation photo), prioritizing attention through AI-generated notification summaries, and getting things done via the supercharged Siri 2.0.
Besides reiterating the AI features introduced at WWDC, Apple unveiled a brand-new “visual intelligence” feature for the iPhone 16 models. Essentially functioning as Apple’s own system-native visual search tool, the feature lets iPhone users search for information straight from the camera. Thanks to the new “camera control” button added to all iPhone 16 models, this visual search is accessible with a simple click-and-hold of the button.
Apple showcased three use case scenarios in the demo video: point the camera at a restaurant and receive a pop-up information card about the establishment; take a picture of a concert poster and automatically save the event to your calendar; or recognize a dog’s breed by snapping a picture of it. All of the computer vision tasks in those use cases are performed by Apple’s own AI model, either on-device or via Apple’s Private Cloud Compute system. Apple also emphasized that it would “never store any image” taken while conducting visual search, aligning with its general user privacy practices.
Moreover, Apple is teaming up with other AI services to broaden its use cases. For example, users can access Google Lens via “visual intelligence” to search for similar products to buy, or ask ChatGPT to solve a complex math problem written on a piece of paper. This bifurcated approach is in line with Apple’s overall strategy for Apple Intelligence: handling personal, context-specific tasks itself and outsourcing more complex tasks that require broader world knowledge to third-party partners.
For brands, the addition of a built-in visual search function to the most popular mobile device in the U.S. will no doubt usher in a new era for visual search as a brand touchpoint and potential marketing channel. The business info cards that “visual intelligence” pulls up when searching for local businesses appear to come from Apple Maps, which primarily draws its U.S. business information from a trio of sources: Yelp, TripAdvisor, and Foursquare. Given that Apple Intelligence also searches Google for product information, it wouldn’t hurt to make sure your products are properly indexed for Google Lens as well.
Beyond ensuring your stores are properly indexed on Apple Maps, brands can also consider leveraging visual search to enhance their discoverability and engagement through targeted content. For example, by optimizing product images, logos, and signage to ensure they are easily recognizable by visual search algorithms, brands can create a more seamless customer experience. This includes ensuring that their visual assets, from packaging to storefronts, are consistent and clear across all digital and physical touchpoints.
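One rough way to gauge what “easily recognizable by visual search algorithms” means in practice is to run your own packaging or signage photos through an on-device image classifier and see what labels come back. The sketch below is a minimal illustration using Apple’s Vision framework; the file path and confidence threshold are hypothetical, and on-device classification is only a proxy for how visual search services may read the same image.

```swift
import Foundation
import Vision

// Hypothetical local test photo of product packaging or storefront signage.
let imageURL = URL(fileURLWithPath: "packaging-shot.jpg")

// Ask the on-device Vision classifier what it sees in the image.
let handler = VNImageRequestHandler(url: imageURL)
let request = VNClassifyImageRequest()

do {
    try handler.perform([request])
    let observations = request.results ?? []
    // Print the top labels; if the product category never shows up here,
    // the asset may be hard for visual search systems to parse as well.
    for observation in observations.prefix(5) where observation.confidence > 0.1 {
        print(observation.identifier, String(format: "%.2f", observation.confidence))
    }
} catch {
    print("Vision request failed: \(error)")
}
```

A quick check like this can flag assets where the logo or product is obscured by busy backgrounds or low contrast, before those assets are rolled out across physical and digital touchpoints.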
Camera as Input & Capturing Spatial Content
In recent years, Apple has positioned new photography features as key selling points for new iPhones, especially what it calls “computational photography,” i.e., software-enhanced camera features for taking better pictures and videos. This year is no exception; in fact, Apple doubled down by adding the aforementioned “camera control” button to all iPhone 16 models. Unlike the Action Button introduced last year on the iPhone 15 Pro to replace the mute switch, the new camera button offers haptic feedback and a capacitive control surface, so it functions like a tiny touchpad: fingertip swipes and light presses handle additional camera controls, such as switching between the various photographic styles. Apple mentioned that third-party apps, such as Snapchat, will also be able to leverage this new camera button to build new controls.
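For developers wondering what “building new controls” for the camera button might look like, below is a minimal sketch based on the AVCaptureControl and AVCaptureSlider additions Apple previewed alongside iOS 18. The “Warmth” slider, SF Symbol name, and dispatch queue are illustrative assumptions rather than anything shown at the event, and the exact API surface should be taken as a rough outline of the pattern.

```swift
import AVFoundation

// A sketch of attaching a custom control to the Camera Control button,
// assuming the AVCaptureControl / AVCaptureSlider APIs previewed for iOS 18.
func configureCameraControls(for session: AVCaptureSession, on sessionQueue: DispatchQueue) {
    // Devices without the Camera Control button won't support controls.
    guard session.supportsControls else { return }

    // Hypothetical "Warmth" slider surfaced when the user swipes on the button.
    let warmthSlider = AVCaptureSlider("Warmth", symbolName: "thermometer", in: -1.0...1.0)
    warmthSlider.setActionQueue(sessionQueue) { value in
        // Feed the chosen value into the app's own rendering pipeline here.
        print("Warmth set to \(value)")
    }

    session.beginConfiguration()
    session.addControl(warmthSlider)
    session.commitConfiguration()
}
```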
This latest hardware addition underlines just how important the camera has become as an input device for mobile interactions. If a picture is worth a thousand words, then being able to show your phone what you’re looking at by simply snapping a picture will save us all a lot of typing. That camera-as-input paradigm is about to become even more prominent with the introduction of the camera control button and visual intelligence.
Of course, this also moves Apple one step closer to multimodal AI, which allows input and output across text, speech, and visuals. Siri is not quite there yet, but with the introduction of visual intelligence and more conversational skills, one could see that multimodal capability lurking on the horizon.
Another important camera upgrade is that the iPhone 16 models can now capture spatial content, an immersive media format optimized for Vision Pro, Apple’s mixed reality headset launched earlier this year. Previously, only the iPhone 15 Pro could capture spatial video; now the entire iPhone 16 lineup can capture spatial photos and audio as well. Notably, spatial videos can also be viewed on other headsets that support the MV-HEVC format, such as Meta Quest. This means that Apple users will be able to easily produce more immersive content that could drive interest in Vision Pro, especially given Apple’s aspirational positioning of the iPhone camera as a professional creator tool.
Overall, this continuing emphasis on the smartphone camera as not only an input device, but also a tool for capturing immersive content, should inspire more brands to explore camera-centric brand interactions and marketing campaigns. This might include exploring mobile AR features that overlay digital information on product packaging and promotional materials, such as OOH ads. Additionally, brands can capitalize on immersive content creation by building a library of user-generated spatial photos or videos of their experiences with the brand, perhaps tied to specific locations or events, and make the collection available for viewing on mixed reality headsets.
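For teams exploring the mobile AR side of this, one common building block is image-anchored overlays: the phone recognizes a piece of packaging or an OOH poster and places digital content on top of it. The sketch below uses Apple’s ARKit image tracking; the “BrandPackaging” asset group and the plain colored overlay are placeholder assumptions standing in for a brand’s real reference images and content.

```swift
import UIKit
import ARKit

// Minimal sketch: detect a known packaging/poster image and overlay a tinted plane on it.
final class PackagingARViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARImageTrackingConfiguration()
        // "BrandPackaging" is a hypothetical AR resource group of reference photos.
        if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "BrandPackaging",
                                                                  bundle: .main) {
            configuration.trackingImages = referenceImages
            configuration.maximumNumberOfTrackedImages = 2
        }
        sceneView.session.run(configuration)
    }

    // Attach a simple overlay whenever a known packaging image is detected.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard let imageAnchor = anchor as? ARImageAnchor else { return nil }
        let size = imageAnchor.referenceImage.physicalSize
        let plane = SCNPlane(width: size.width, height: size.height)
        plane.firstMaterial?.diffuse.contents = UIColor.systemBlue.withAlphaComponent(0.4)
        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2  // lay the plane flat over the detected image
        let node = SCNNode()
        node.addChildNode(planeNode)
        return node
    }
}
```

In a real campaign, the tinted plane would be replaced with richer content, such as a video, a 3D model, or a link out to the spatial photo and video library described above.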
Playing the Long Game on AI
Apple is playing the long game with AI, taking a measured approach rather than rushing headlong into the hype. While the new iPhones hit shelves next week, the much-anticipated Apple Intelligence features won’t be available right away. Instead, users will see a phased rollout, with the first set of features — like the AI writing assistant and the photo “clean-up” tool — arriving in beta in October via a software update for compatible devices. As for the rest of Apple’s AI features, you might have to wait until next spring to try them. Axios has compiled a helpful listicle on when each of the promised Apple Intelligence features is expected to arrive, if you care to know the details.
For the pro-Apple camp, this prolonged rollout cycle likely means that the initial features won’t overwhelm users, while also giving Apple more time to fine-tune its AI models for accuracy and usefulness. To Apple’s critics, however, it is clear evidence that Apple still lags behind its competitors in integrating generative AI into its products. Notably, after testing a prerelease version of the software, the Washington Post reported that Apple Intelligence makes too many things up.
Taking the staggered release of AI features into consideration, the new iPhones seem unlikely to spur the “super cycle” of upgrades that some industry analysts had predicted based on the arrival of Apple Intelligence, at least not this year. By this time next year, however, when the next generation of iPhones is unveiled, many more people will have had a chance to experience Apple Intelligence firsthand, or at least to observe and judge its usefulness through other iPhone users. That firsthand exposure could be key to motivating upgrades down the line, once the real-world value of Apple’s AI features becomes clear.
Apple’s measured, step-by-step integration of AI offers a valuable lesson for brands: take the time to get it right. Rushing to adopt new technology without fully understanding how it serves the user can lead to half-baked features and disappointed customers. Brands can adopt a similar strategy by prioritizing model quality and user experience over speed. It’s not just about being first to market — it’s about offering something that meaningfully improves the customer’s interaction with the brand. In a world where trust and consistency are key to gaining consumer loyalty, rushing into AI might prove more risky than rewarding, especially for incumbent brands.
At this point in the AI hype cycle, it would be unwise for most brands to jump on the AI bandwagon at full speed. Instead, brands can take a page out of Apple’s playbook and leverage a staggered rollout to build customer trust and anticipation. By communicating transparently about upcoming features and setting clear expectations, brands can sustain excitement while managing potential delays or setbacks.