ARKit for iOS
Shubham Dhingra
Published in AppleCommunity
3 min read · Sep 22, 2019


ARKit, introduced by Apple at WWDC 2017, integrates the iOS device camera and motion features to bring augmented reality experiences to your application.
Augmented reality (AR) lets you see and interact with virtual objects (2D or 3D elements) placed in the real environment through a device's camera.
ARKit is broken up into three distinct layers:

Tracking

Tracking is the core functionality of ARKit. It is the ability to track your device's position in real time.
* World tracking: World tracking determines the device's position and orientation relative to the physical environment.

* Visual-inertial odometry: ARKit combines camera images and motion data from your device to produce a precise estimate of where the device is located and how it is oriented.

No external setup is required, and no prior knowledge of the environment is needed.
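A minimal sketch of starting world tracking, assuming a view controller that owns an `ARSCNView` outlet named `sceneView` (the class and outlet names are illustrative):

```swift
import ARKit
import UIKit

class ARViewController: UIViewController {
    // Assumed to be connected in a storyboard or created in code.
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking fuses camera images with motion data
        // (visual-inertial odometry) to follow the device's position
        // and orientation; no external setup or markers are required.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view goes away to save power.
        sceneView.session.pause()
    }
}
```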

Scene Understanding

Built on top of tracking, scene understanding is the ability to determine the attributes and properties of the environment around your device.
* Plane detection: Detects planes and surfaces in the physical environment around the device, e.g. the floor or a table.

* Hit-testing: To place a virtual object, hit-testing determines where a ray from a screen point intersects real-world geometry, so you can position your virtual content in the real world.

* Light estimation: Estimates the lighting of the scene so your virtual geometry can be rendered and lit to match the real world.
Together, these features let you integrate virtual content into a real frame.
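The three scene-understanding features above map directly onto the API. A sketch, assuming an `ARSCNView` is passed in (this uses the 2019-era `hitTest(_:types:)` API, which was later superseded by raycasting):

```swift
import ARKit

// Plane detection and light estimation: configure the session to find
// horizontal surfaces and to estimate the scene's lighting.
func startSceneUnderstanding(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal
    configuration.isLightEstimationEnabled = true
    sceneView.session.run(configuration)
}

// Hit-testing: place a small box where a screen point (e.g. a tap
// location) intersects a detected plane.
func placeBox(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    guard let result = sceneView
        .hitTest(screenPoint, types: .existingPlaneUsingExtent)
        .first else { return }
    let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                       length: 0.1, chamferRadius: 0))
    // The result's transform is the intersection point on the plane.
    box.simdTransform = result.worldTransform
    sceneView.scene.rootNode.addChildNode(box)
}
```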

Rendering

Easy integration: Apple provides easy integration for rendering any kind of virtual object in the real frame.
AR views: Views such as ARSCNView and ARSKView combine the camera image, tracking, and scene understanding for you.
Custom rendering: You can also do custom rendering using the templates provided in Xcode.
Rendering is integrated through the SpriteKit and SceneKit libraries.
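With `ARSCNView`, for example, the camera feed, tracking, and lighting are handled for you, so adding virtual content is just ordinary SceneKit. A sketch (the geometry and sizes are illustrative):

```swift
import ARKit

// ARSCNView renders the camera image behind the SceneKit scene and keeps
// the virtual camera in sync with device tracking automatically.
func addSphere(to sceneView: ARSCNView) {
    let sphere = SCNSphere(radius: 0.05)             // 5 cm sphere
    sphere.firstMaterial?.diffuse.contents = UIColor.red
    let node = SCNNode(geometry: sphere)
    node.position = SCNVector3(0, 0, -0.3)           // 30 cm in front of the world origin
    sceneView.scene.rootNode.addChildNode(node)
}
```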

ARSession Process

Capturing: The AVFoundation and Core Motion frameworks are used for capturing camera images and tracking motion data.

ARSession: ARKit is session based; an ARSession is created that controls the complete process of the augmented reality application.

ARSessionConfiguration: ARSessionConfiguration and its subclasses determine what kind of tracking runs in your session. By enabling or disabling their properties you get different scene understanding and a different process for your session. To run the session, you simply call the run method, passing the configuration.
Once the session starts, an AVCaptureSession and a CMMotionManager are created to capture image data and motion data; the output of the ARSession is an ARFrame, which is a snapshot in time.

To get the current frame, use the currentFrame property of the ARSession; there is also a predefined delegate method that is called whenever a new frame becomes available.
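Both ways of getting frames look like this in code; a sketch assuming an observer object is set as the session's delegate:

```swift
import ARKit

class FrameObserver: NSObject, ARSessionDelegate {
    // Delegate method: called each time a new ARFrame is produced.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let pose = frame.camera.transform   // device position and orientation
        let image = frame.capturedImage     // camera pixel buffer
        _ = (pose, image)                   // use for custom rendering, etc.
    }
}

// Polling alternative: read the latest snapshot on demand.
func latestCameraPose(from session: ARSession) -> simd_float4x4? {
    return session.currentFrame?.camera.transform
}
```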

So that covers the basic building blocks of ARKit on iOS.

!!! HAPPY CODING !!!

Thank you for reading. Please hit the recommend icon if you liked this collection. Questions or doubts? Leave them in the comments.
