Published in Geek Culture

Apple’s Next-Generation “Big Killer”: AR

Image Credits: 9to5mac

Apple’s Worldwide Developers Conference (WWDC) this year did not seem to bring many surprises to fans, focusing mainly on software and operating-system upgrades. But look carefully at the conference and it is not difficult to see that Apple is concentrating on one technical field: augmented reality (AR).

Image Credits: Apple

Apple did not spend much time introducing AR at the Monday keynote, but it announced several updates in the technical sessions and released new AR tools and technologies for software makers, indicating that AR remains one of Apple’s core long-term projects.

CEO Tim Cook has also said that AR is the “next big thing” in the technology field.

For AR devices to succeed, Apple will ultimately need a large base of software support, just as apps such as Maps, YouTube, and the Safari browser helped propel the first-generation iPhone. By launching new AR tools and technologies for developers and drawing them into software development, Apple is paving the way to create one or more “killer apps.”

So far, Apple has not announced any plans to release AR hardware, but market rumors have long been rampant: Apple may release an AR headset as early as next year. Noted Apple analyst Ming-Chi Kuo said in an earlier research report that Apple will launch its first AR headset in the second quarter of 2022.

An Unremarkable WWDC, or the Calm Before Apple’s Innovation Storm

Gene Munster, founder of Loup Ventures and a senior Apple analyst, pointed out: “From a high-level point of view, this year’s and next year’s WWDC are the calm before Apple’s innovation storm. Apple is hard at work on new product categories built around AR, including wearable devices and AR transportation technology.”

2021 Apple WWDC: AR Technology and Product Highlights at a Glance

News about Apple’s AR products at the keynote focused mainly on software and development-platform updates. The following points are worth noting:

  • Apple Maps is expected to offer AR navigation in several specific cities;
  • Apple updated the RealityKit 2 framework and launched a new developer tool, Object Capture;
  • ARKit 5 is updated, with an upgraded location anchor feature;
  • iOS 15 adds live text recognition.

Benchmarking Google: Apple Maps Will Support AR Navigation

Apple devices running iOS 15, iPadOS 15, and macOS Monterey will gain a function similar to Google Maps’ AR live navigation: in supported cities, the camera captures the nearby scenery, compares it against Apple’s database, and provides 3D-style walking directions.

Apple’s new city navigation feature in Apple Maps. [Image Credits: Apple]

Unlike traditional navigation maps, the new version of Apple Maps supports 3D navigation, displaying three-dimensional street scenes and buildings near the user, which makes it easier to convey the geography of complex road sections. Currently, this function only supports London, Los Angeles, New York, Philadelphia, San Diego, and Washington.

The new version of the map also adds details such as business districts, docks, and buildings, along with elevation data, new road colors, and label settings. Drivers using Apple Maps will see turn lanes, medians, and bus lanes more intuitively; pedestrians will see sidewalks and intersections.

These detailed city-street features make Apple Maps more competitive with recent Google Maps updates. Apple will not only render 3D nightscapes with shifting light and shadow, but its live street views are also approaching Google’s. Apple said it will add sidewalks and bicycle lanes to the display in a future update.

RealityKit 2 Framework Update Supports Rapid Generation of AR Previews

RealityKit is a rendering, animation, physics, and audio framework built from the ground up by Apple for augmented reality. With its native Swift API, ARKit integration, and high-definition rendering, developers can easily prototype and produce high-quality AR experiences. Combined with the LiDAR scanner on iPad Pro, developers can use video textures, scene understanding, location anchors, face tracking, and improved debugging tools for AR development.
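The workflow the paragraph describes can be sketched in a few lines of Swift. This is a minimal, hedged example, assuming an app that already has a RealityKit `ARView` named `arView`:

```swift
import RealityKit
import UIKit

// Minimal RealityKit sketch: anchor a small box on the first
// horizontal plane ARKit detects (e.g. a tabletop or floor).
let anchor = AnchorEntity(plane: .horizontal)
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),  // 10 cm cube
    materials: [SimpleMaterial(color: .systemBlue, isMetallic: false)]
)
anchor.addChild(box)
arView.scene.addAnchor(anchor)  // RealityKit renders and tracks it from here
```

RealityKit handles the rendering, lighting, and plane tracking itself; the developer only declares what content attaches to which anchor.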

Object Capture is a new development tool in RealityKit 2. Developers can take 2D photos with an Apple device and turn them into 3D objects within minutes; on macOS Monterey, these photos can also be imported into tools such as Cinema 4D to generate AR previews.
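Object Capture is exposed to developers through RealityKit’s `PhotogrammetrySession` API on macOS Monterey. A hedged sketch of the flow (the input and output paths are placeholders):

```swift
import RealityKit

// Sketch: build a USDZ 3D model from a folder of 2D photos with Object Capture.
// Requires macOS Monterey on a supported Mac; paths below are placeholders.
let photosFolder = URL(fileURLWithPath: "/path/to/photos", isDirectory: true)
let modelURL = URL(fileURLWithPath: "/path/to/model.usdz")

let session = try PhotogrammetrySession(input: photosFolder)
try session.process(requests: [
    .modelFile(url: modelURL, detail: .reduced)  // .preview ... .full trade speed for quality
])

// The session reports progress and completion as an async stream of outputs.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .processingComplete:
            print("Model written to \(modelURL.path)")
        default:
            break
        }
    }
}
```

The `detail` level controls the reconstruction quality, which is how “a 3D object within minutes” is achievable for preview-grade output.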

ARKit 5: Anchor Function Upgrade Adds “Immediate Recognition” Function

ARKit is Apple’s framework for building a new generation of AR applications, and Apple hopes it will change the way people connect with the world around them.

Location anchors let developers pin virtual objects to real-world latitude, longitude, and altitude, while the LiDAR scanner on iPad Pro enables depth-aware scene reconstruction.

With ARKit 5’s updated location anchors, location-based AR experiences become more tangible, and developers can build better ones. Apple says that with ARKit 5, users can easily discover virtual content, and developers can place it precisely.

The location anchor feature fixes a developer’s AR object at a set latitude, longitude, and altitude, so users can move around the object and observe it. The feature currently supports iPhone XS, iPhone XS Max, and iPhone XR, and is only available in some cities.
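In code, this corresponds to ARKit’s geo-tracking API. A minimal sketch, assuming an `ARView` named `arView` and a city where geo tracking is available (the coordinate and altitude below are placeholders):

```swift
import ARKit
import CoreLocation

// Sketch: pin virtual content to a real-world coordinate with a location anchor.
guard ARGeoTrackingConfiguration.isSupported else {
    fatalError("Geo tracking requires a supported device")
}
arView.session.run(ARGeoTrackingConfiguration())

// Placeholder coordinate; altitude is in meters.
let coordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
let geoAnchor = ARGeoAnchor(coordinate: coordinate, altitude: 20.0)
arView.session.add(anchor: geoAnchor)
// Virtual content is then attached to this anchor in the RealityKit scene,
// so it stays fixed at that latitude/longitude as the user walks around it.
```

ARKit resolves the anchor by matching camera imagery against Apple’s localization data, which is why the feature only works in certain cities.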

ARKit 5 also improves motion tracking and supports face tracking with the ultra-wide camera on iPad Pro (5th generation). With the new App Clip Code anchor, developers can pin virtual content in App Clip or ARKit applications to printed or digital App Clip Codes.

iOS 15’s New Text Recognition Is Expected to Be Embedded in AR Glasses

Apple’s camera can now extract text (handwritten or printed) from photos and digitize it, so you can search for it on the web, copy and paste it, or, if it is a phone number, call it directly. Beyond that, it has other smart features, including recognizing animal species, landmarks, and more.

This function works like image-recognition software, letting the system find information in photos, such as identifying flower varieties or landmarks, and it could provide technical groundwork for future Apple AR glasses. Currently, users can experience this feature on iPhone, iPad, and Mac.
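Live Text builds on the kind of on-device text recognition that Apple’s Vision framework already exposes to developers. A hedged sketch using Vision’s `VNRecognizeTextRequest` (the image path is a placeholder):

```swift
import Vision

// Sketch: recognize printed or handwritten text in an image with Vision.
let imageURL = URL(fileURLWithPath: "/path/to/photo.jpg")  // placeholder

let request = VNRecognizeTextRequest { request, error in
    guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
    for observation in observations {
        // Take the most confident transcription for each detected text region.
        if let best = observation.topCandidates(1).first {
            print(best.string)
        }
    }
}
request.recognitionLevel = .accurate  // favor accuracy over speed

let handler = VNImageRequestHandler(url: imageURL)
try handler.perform([request])
```

Once text is extracted this way, downstream actions like web search, copy-paste, or dialing a recognized phone number become straightforward string handling.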



Aizaz Alam

Engineer by profession, writer by passion……