
What is ARCore?


Many of you must have played Pokémon Go, an augmented reality based game that went viral a few years ago. If you did, or if you have ever used any other application involving augmented reality (e.g., Snapchat uses AR to virtually apply filters and objects to your face), you know what a wonderful experience it is to use and interact with such an app.

History

People find it hard to believe, but the fact is that augmented reality is fairly old technology. It originated in 1968 with Ivan Sutherland’s development of the first head-mounted display system, called “The Sword of Damocles”. However, the term ‘augmented reality’ wasn’t coined until 1990, by Boeing researcher Tom Caudell.

Since then, augmented reality’s use cases have grown steadily and various technologies have adopted it: from NASA’s hybrid synthetic vision system on the X-38 spacecraft, where AR was used to provide better navigation during test flights, to Google Glass, a pair of augmented reality glasses that users could wear for immersive experiences.

What Is Google ARCore?

Simulating augmented reality is not a simple task if carried out from scratch. That is why there are proper tools on the market that help developers build such apps with ease and efficiency. A good number of software development kits (SDKs) are available for creating augmented reality applications, and one of the most valued in the AR development community is Google’s ARCore.

ARCore is a platform developed by Google, released on the 1st of March 2018, for building augmented reality experiences. ARCore enables a phone to sense its environment, understand its surroundings, and interact with information. It relies on three key capabilities, namely Motion Tracking, Environmental Understanding, and Light Estimation (all described below), which help place virtual objects in the real world as seen through your phone’s camera.

Since fundamentals are the building blocks for learning and applying any concept, let us look at the internal workings of ARCore before diving into its implementation.

Basic Concepts Of ARCore

Here are some terms and concepts associated with ARCore.

  1. Motion Tracking: When you use an AR-based application, you are first asked to open your camera and sometimes to move your phone around. This is done to capture your surroundings and detect distinctive features in them, called feature points. ARCore uses SLAM (Simultaneous Localization And Mapping) to understand the position of your phone relative to its surroundings: once feature points are detected, SLAM uses them to compute changes in location. To track the phone’s position and orientation over time, the visual information from the camera is combined with measurements from the IMU (Inertial Measurement Unit: an electronic device that measures and reports a body’s specific force, angular rate, and sometimes orientation, using a combination of accelerometers, gyroscopes, and sometimes magnetometers). ARCore then aligns the virtual camera, in which the 3D objects are rendered, with the device’s physical camera, enabling developers to render virtual objects from the correct perspective. The rendered virtual image can thus be overlaid on top of the image obtained from the device’s camera, making it appear as if the virtual object is part of the real world.
  2. Environmental Understanding: You must have seen applications that place 3D objects onto specific real-world surfaces, like a cat dancing on a table or a couch on the floor (for example, Houzz is a well-known app that helps you design the interior of your room or house by placing 3D furniture on your floor). How does the application know about flat surfaces in such detail? Building on the feature points discussed above, ARCore looks for clusters of feature points that lie on common horizontal or vertical surfaces, like a table or a door, and makes this information available to your app. It is then used to place 3D objects on flat surfaces.
  3. Light Estimation: To make virtual content more realistic, lighting is always a significant area to work on. Just as real light reflects off objects in different directions, making a scene pleasing to a viewer’s eyes, light estimation tries to achieve the same effect for 3D objects. From the captured camera images, ARCore provides the average intensity and a color correction of the scene, which lets a developer light a virtual object under the same conditions as the surrounding environment.
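As a rough illustration of the light-estimation idea above, the sketch below scales a virtual object’s base color by an average scene intensity, similar in spirit to the per-frame estimate ARCore reports. Note this is a minimal, self-contained sketch: `applyLightEstimate` and the hard-coded values are hypothetical and not part of the ARCore API, which supplies the intensity from real camera frames.

```java
public class LightEstimateDemo {
    // Scale a virtual object's base RGB color (components in 0..1) by the
    // scene's estimated average intensity, clamping each channel to [0, 1].
    // In a real app the intensity would come from ARCore's light estimate
    // rather than being passed in by hand.
    static float[] applyLightEstimate(float[] baseRgb, float pixelIntensity) {
        float[] lit = new float[3];
        for (int i = 0; i < 3; i++) {
            lit[i] = Math.min(1.0f, Math.max(0.0f, baseRgb[i] * pixelIntensity));
        }
        return lit;
    }

    public static void main(String[] args) {
        float[] white = {1.0f, 1.0f, 1.0f};               // a white virtual object
        float[] dim = applyLightEstimate(white, 0.4f);     // a dimly lit room
        System.out.printf("%.2f %.2f %.2f%n", dim[0], dim[1], dim[2]);
        // prints 0.40 0.40 0.40
    }
}
```

In an actual ARCore app the color correction would be applied per channel in the same multiplicative way, so a white object in a dim room renders gray and blends into the scene instead of glowing unnaturally.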

Does Your Phone Support ARCore?

Not all devices support ARCore, because the camera quality and internal sensor specifications of many devices simply do not meet the requirements for ARCore certification. ARCore certification is a process in which a device is deemed ARCore-supported if it passes all of the specification tests. Such certification is necessary because Google wants users to have the best possible experience with AR applications. The three capabilities of ARCore require a good camera and a number of sensors; for example, motion tracking combines the camera image with motion sensor input to determine how the user’s device moves through the real world. Therefore, the quality of the camera and of the hardware, including the CPU, is thoroughly checked to ensure effective performance. You can view the supported device models on Google’s list of ARCore-supported devices.

Getting Started With ARCore For Android

Prerequisite: Basics of Android Development

  1. Install Android Studio version 3.1 or higher with Android SDK Platform version 7.0 (API Level 24) or higher. (Link for installation tutorial: https://developer.android.com/studio/install)
  2. Get the sample projects by cloning the repository with the following command: git clone https://github.com/google-ar/arcore-android-sdk.git
  3. In Android Studio, go to File > Open and select the HelloAR sample project located in the samples subdirectory within the arcore-android-sdk directory.
  4. Now you can run the app, either on a supported device or on the Android Emulator. (In the emulator, you must sign in to the Google Play Store or update Google Play Services for AR manually.)
  5. HelloAR is a very simple AR app that lets users place and manipulate Android figurines on detected AR plane surfaces.
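Steps 2 and 3 above can be sketched as a couple of shell commands (assuming git is installed and on your PATH; the exact sample directory name may differ between SDK versions):

```shell
# Clone Google's ARCore SDK samples (step 2 above).
git clone https://github.com/google-ar/arcore-android-sdk.git

# The HelloAR Java sample lives under the samples subdirectory;
# open this folder from Android Studio via File > Open (step 3 above).
ls arcore-android-sdk/samples
```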

Advantages Of ARCore

  1. ARCore works with Unity3D and Unreal Engine, as well as natively on Android devices using the Java programming language.
  2. Google’s ARCore is an integral part of the Android operating system and was projected to be available on approximately 100 million smart devices by the end of 2017.
  3. Google’s effort complements Apple’s push to make AR functions available on all smart devices. Native apps, as well as third-party apps, otherwise have no direct access to the AR functions of the operating system; ARCore provides that access. As a reliable, long-term provider of AR technologies, Google and its ARCore developers offer a stable basis for AR app development.

Disadvantages Of ARCore

  1. ARCore can only be used on devices running the Android operating system.
  2. If an ARCore application is also to run on an iOS device, making it cross-platform, Apple’s AR solution ARKit can be used as a counterpart to ARCore. Even though ARKit and ARCore strive toward the same goals, it has to be tested in each case whether every function of a new AR app can be realized on both platforms.


Last Updated : 02 Oct, 2022