Exploring ARCore: Digging into the Fundamentals of AR
Augmented reality is picking up steam these days. All the major companies, such as Facebook, Apple, Microsoft, and Google, are pushing their platforms towards augmented and virtual reality.
In late August 2017, Google released a preview of the ARCore SDK. (Here is the announcement blog post.) It allows application and game developers to easily integrate augmented reality features into their Android and web applications and games.
This blog post covers the fundamental concepts of AR and of ARCore; in the next posts, we will start playing with ARCore itself.
But before we get to that, it will help to understand what augmented reality really is.
What is augmented reality?
Augmented reality (AR) is a live view of the physical, real-world environment surrounding the user, whose elements (such as tables, floors, and cars) are enhanced by computer-generated information. The computer can provide this information in visual, haptic, audio, or other forms.
Difference between AR and VR:
Don’t confuse AR with virtual reality (VR). Both are forms of immersive computing, but they take different approaches.
The main difference between AR (augmented reality) and VR (virtual reality) is that a VR application tries to replace the surrounding environment with digital/virtual objects, so the person has no direct connection with the surroundings. AR doesn’t replace the surrounding environment. Instead, it adds computer-generated digital information to the current environment and presents the user with an enhanced view of the surroundings.
Why might AR be the next big thing?
AR is a completely new way of presenting information to the user. It has the remarkable ability to change our very perception of the world around us.
- It allows the computer to present the right information at the right time and in the right place without distracting the user.
- AR allows the user to observe the scene or the object from any angle.
- It can enhance the surroundings by adding new objects, letting the user clearly visualize them in the 3D world and interact with them. This could completely revolutionize the consumer and advertising industries.
AR development is still in its early stages. Everyone in the industry is trying to figure out what they can do with AR. The possibilities are endless!
Fundamentals of AR:
For any AR application, understanding the surrounding environment is very important. Three fundamental technologies allow applications to understand the real world and provide a rich AR experience based on that information.
1. Motion tracking:
Motion tracking allows an AR application to understand and track its position relative to the world. It records a person’s or object’s movement through three-dimensional space.
Using this type of positional tracking, AR applications know the exact pose (position and orientation) of an object or person in the real world and can place digital content at a specific point in the real world.
As you might know, things that are closer to our eyes look bigger than things that are farther away. By combining motion tracking with the distance between the user and any point in space, the computer can decide how big a virtual object should appear at that point.
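To make this concrete, here is a minimal, self-contained sketch of that perspective rule under a simple pinhole-camera model. The class, method, and focal-length values are illustrative, not part of the ARCore API.

```java
// Sketch: under a pinhole-camera model, the projected size of an object
// shrinks in proportion to its distance from the camera.
public class PerspectiveDemo {
    // Projected (on-screen) size of an object of real size `size`,
    // seen from `distance` by a camera with focal length `focal`.
    static double apparentSize(double size, double focal, double distance) {
        return focal * size / distance;
    }

    public static void main(String[] args) {
        // The same 1 m object viewed from 2 m and from 4 m:
        System.out.println(apparentSize(1.0, 1.0, 2.0)); // 0.5
        System.out.println(apparentSize(1.0, 1.0, 4.0)); // 0.25 (twice as far, half as big)
    }
}
```

This is exactly why an AR framework that knows the distance to a point in space can render a virtual object there at a believable size.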
2. Environmental understanding:
It allows an AR application to understand the environment surrounding an object or person.
Environmental understanding includes detecting flat surfaces and detecting feature points.
Using this information, an AR application can place virtual objects on real-world surfaces (e.g. on top of a table) and tilt them to match the tilt of the surface.
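As a rough illustration of “tilting to match the surface,” here is a small, hypothetical sketch that computes how far a detected surface is tilted from horizontal, given the surface’s normal vector. None of these names come from the ARCore API.

```java
// Sketch: the tilt of a surface is the angle between its normal
// vector and the world "up" axis (0, 1, 0).
public class SurfaceTilt {
    // Tilt (in degrees) of a surface whose normal is (nx, ny, nz);
    // a perfectly flat tabletop gives 0 degrees.
    static double tiltDegrees(double nx, double ny, double nz) {
        double len = Math.sqrt(nx * nx + ny * ny + nz * nz);
        // cos(theta) = dot(normal, up) / |normal|, with up = (0, 1, 0)
        return Math.toDegrees(Math.acos(ny / len));
    }

    public static void main(String[] args) {
        System.out.println(tiltDegrees(0, 1, 0)); // flat surface: 0 degrees
        System.out.println(tiltDegrees(1, 1, 0)); // a ramp tilted ~45 degrees
    }
}
```

An AR application would apply this same rotation to the virtual object so it appears to rest naturally on the surface.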
3. Light estimation:
This is a very critical part of augmented reality applications. In the real world, every object shines differently and casts a shadow of a different length and shape, depending on the direction and intensity of the light sources near it.
Using light estimation, the computer can detect the intensity and direction of the light at any given point in space.
An AR application can use this light information to calculate the size and direction of the shadow for any object at any point in space.
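The shadow-length idea can be sketched with basic trigonometry. This is a simplified model with a single distant light source at a known elevation angle; the names and numbers are illustrative, not part of any AR framework.

```java
// Sketch: an object of height h, lit by a distant source at elevation
// angle e above the horizon, casts a shadow of length h / tan(e).
public class ShadowDemo {
    // Shadow length for an object of height `h` (metres) with the
    // light at `elevDeg` degrees above the horizon.
    static double shadowLength(double h, double elevDeg) {
        return h / Math.tan(Math.toRadians(elevDeg));
    }

    public static void main(String[] args) {
        System.out.println(shadowLength(1.0, 45.0)); // ~1.0: shadow as long as the object
        System.out.println(shadowLength(1.0, 30.0)); // ~1.73: lower light, longer shadow
    }
}
```

Knowing the light direction therefore tells the renderer both where the shadow should fall and how long it should be.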
What is ARCore?
ARCore is a software development kit (SDK) developed by Google for building augmented reality applications. It provides higher-level APIs for applications to interact with the real world.
ARCore integrates all three fundamental technologies needed for a rich AR experience. It does the heavy lifting of tracking the user’s position in three-dimensional space, estimating the light, and detecting flat surfaces. This lets application developers focus on their business logic and build beautiful AR applications.
Android is the world’s most widely used mobile operating system, and the goal of ARCore is to bring the AR experience to as many Android devices as possible. Unlike Project Tango, ARCore doesn’t require any specialized hardware. It uses the phone’s camera and a handful of sensors (such as the accelerometer) to track the user’s position and provides this information to the application. This is a huge opportunity for applications, as they can bring their AR functionality to the billions of Android devices already out in the market.
Currently, the only supported platform for ARCore is the Android operating system. Google has also launched custom prototype browsers to bring AR into the web development world, too. These browsers allow web developers to create AR-enhanced websites using ARCore and ARKit.
How does ARCore get information about the real world?
As we discussed above, any AR application uses three fundamental technologies to get information about the real world. Let’s see how ARCore implements each of them.
1. Motion tracking in ARCore:
As the user moves through the real world, ARCore uses concurrent odometry and mapping (COM) to track the user’s movement. ARCore does all of this without any external hardware or setup.
It uses the phone’s camera to detect distinct feature points in the real world. As the camera moves, ARCore tracks how the position of each feature point changes on the screen. A point near the camera moves more on the screen than a point far away from it. Based on this data, ARCore calculates the relative distance of each feature point from the camera.
ARCore also uses the phone’s built-in sensors to detect the orientation of the phone.
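The parallax idea above can be sketched with the classic depth-from-disparity formula. This is a simplified stereo model, not how COM is actually implemented inside ARCore; all names and numbers are illustrative.

```java
// Sketch: as the camera translates, nearby feature points shift more
// on screen (larger disparity) than distant ones, so depth can be
// recovered as depth = focal * baseline / disparity.
public class ParallaxDemo {
    // Depth (metres) of a feature point, given the camera's focal
    // length in pixels, how far the camera moved (`baseline`, metres),
    // and how far the point shifted on screen (`disparityPx`, pixels).
    static double depthFromParallax(double focalPx, double baseline, double disparityPx) {
        return focalPx * baseline / disparityPx;
    }

    public static void main(String[] args) {
        // Camera moves 10 cm; a near point shifts 50 px, a far one only 5 px.
        System.out.println(depthFromParallax(500, 0.10, 50)); // ~1 m  (near)
        System.out.println(depthFromParallax(500, 0.10, 5));  // ~10 m (far)
    }
}
```

The small on-screen motion of far points is exactly why they are estimated to be far away.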
2. Surface detection in ARCore:
ARCore constantly looks for distinct feature points in the real world using the phone’s camera. It looks for clusters of feature points that appear to lie on common horizontal surfaces, such as tabletops and desks.
It also provides the boundaries of each plane by detecting edges in the camera images.
Developers can use these surfaces/planes to place virtual objects.
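As a toy illustration of “clusters of feature points on a common horizontal surface,” the sketch below checks whether a set of feature-point heights plausibly lies on one flat, horizontal plane. This is a deliberately naive stand-in for ARCore’s actual plane fitting; all names are hypothetical.

```java
import java.util.Arrays;

// Sketch: feature points on a horizontal surface (e.g. a tabletop)
// should all sit at roughly the same height (y coordinate).
public class PlaneDetectDemo {
    // True if all heights (metres) are within `tol` of their mean,
    // i.e. the cluster plausibly lies on one horizontal surface.
    static boolean onCommonHorizontalSurface(double[] ys, double tol) {
        double mean = Arrays.stream(ys).average().orElse(0);
        return Arrays.stream(ys).allMatch(y -> Math.abs(y - mean) <= tol);
    }

    public static void main(String[] args) {
        double[] tabletop  = {0.74, 0.75, 0.76, 0.75}; // points on a table
        double[] scattered = {0.10, 0.75, 1.40};       // points at different heights
        System.out.println(onCommonHorizontalSurface(tabletop, 0.02));  // true
        System.out.println(onCommonHorizontalSurface(scattered, 0.02)); // false
    }
}
```

Once such a cluster is found, the surrounding edge information gives the plane its boundary.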
3. Light estimation in ARCore:
ARCore detects the average light conditions of the scene using the phone’s camera. It provides the average light intensity and color correction information for the given scene.
Developers can use this information to add shadows and shine to their objects and increase the realism of the scene.
At this point, ARCore doesn’t provide the light intensity at each point in 3D space; it only provides the average light intensity for the whole scene.
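To illustrate how an application might use such an average estimate, here is a crude sketch that darkens a virtual object’s base colour by the scene’s average light intensity. ARCore’s real per-frame light-estimate API is richer than this; the class and method here are only an illustrative stand-in.

```java
// Sketch: scale a virtual object's base colour by the scene's average
// light intensity (0 = dark, 1 = bright), clamping each channel to [0, 1].
public class LightEstimateDemo {
    static double[] shade(double[] rgb, double avgIntensity) {
        double[] out = new double[3];
        for (int i = 0; i < 3; i++) {
            out[i] = Math.min(1.0, rgb[i] * avgIntensity);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] white = {1.0, 1.0, 1.0};
        // In a dim room (intensity 0.4) the virtual object renders darker:
        System.out.println(java.util.Arrays.toString(shade(white, 0.4))); // [0.4, 0.4, 0.4]
    }
}
```

Matching the virtual object’s brightness to the real scene like this is a big part of what makes AR content look grounded rather than pasted on.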
In the next article, we will look at how to start building an AR application in the popular game engine Unity. Stay tuned!