ARCore, ARKit: Augmented Reality for everyone, everywhere!
Having followed the development of Tango devices over the past few years, we find the release of ARCore to be both a relief and a source of open questions: Can I now do everything a Tango device does on my own phone, simply by updating the software? Or, to put it another way: What can I do with Tango that I cannot do with ARCore?
ARCore is a relief because it removes the one big obstacle to shipping commercial augmented reality projects: availability. At least for B2C applications, being able to target the whole audience instead of just a small group of enthusiasts in the technology bubble is the whole point of creating an app. To be successful, everyone has to be able to use it.
Even for Google, growing an ecosystem of Tango-enabled devices is not an easy task if the big device manufacturers do not fully support it. This is Apple's strategic advantage: they can shape the hardware themselves, as happened with the iPhone X. So, from Google's perspective, creating ARCore is a logical step, even without the strategic pressure of Apple releasing ARKit first.
At the time of this writing, the ARCore Preview only supports the Google Pixel and the Samsung Galaxy S8. Google has announced that it intends to increase availability to up to 100 million devices by the end of the preview.
Cross-Platform Similarities
Another big change for the better for developers: with a common feature set, it is now possible to create applications for both the Apple and Android ecosystems without redesigning the application for each platform.
Our work over the last years on an indoor navigation system built on Tango puts us in a position to evaluate what Tango can do that ARCore (so far) cannot. As you may have guessed by now: yes, we are working on a direct comparison of both technologies as the indoor positioning component of our augmented reality indoor navigation interface. As a first step, let's dig deeper into the feature sets of the Tango Java API and ARCore's counterpart.
Both Tango and ARCore advertise three main features. In the next sections we give an overview of their differences and similarities, and sketch the benefits and limitations that we know of.
Tango and ARCore: Motion Tracking
The feature that the Tango and ARCore software libraries share is motion tracking. This means the library uses the device's sensors to calculate its translation and rotation with respect to some starting point in space. Although this sounds more or less trivial, things get tricky at the high frame rate of the sensors: small measurement errors accumulate into a significant error over time, an effect called sensor drift.
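To see why drift matters, consider this toy simulation (not any SDK's actual implementation, just an illustration under the assumption of a tiny constant bias per reading): integrating thousands of slightly biased sensor readings per minute turns an imperceptible per-sample error into a large position error.

```java
import java.util.Random;

// Toy illustration of sensor drift: a tiny systematic bias per reading,
// integrated at a high sample rate, grows into a large position error.
public class DriftDemo {

    // Integrate `samples` velocity readings; each reading carries a small
    // constant bias plus zero-mean Gaussian noise.
    public static double integrate(int samples, double bias, long seed) {
        Random rng = new Random(seed);
        double position = 0.0;
        for (int i = 0; i < samples; i++) {
            double noise = rng.nextGaussian() * 0.001; // zero-mean noise
            position += bias + noise;                  // one integration step
        }
        return position;
    }

    public static void main(String[] args) {
        // 100 Hz for 60 s = 6000 readings; a bias of only 0.0005 per
        // reading accumulates to roughly 3 units of drift.
        System.out.println(integrate(6000, 0.0005, 42));
    }
}
```

The zero-mean noise largely cancels out; it is the systematic bias that survives integration, which is why drift correction needs an external reference such as visual features.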
Tango: Area Learning
To correct the errors caused by sensor drift, Tango uses the device's additional depth sensor to measure the exact positions of visual features in the camera picture (a concept called visual odometry). By comparing constellations of these detected features, a location visited during a training run can be recognized again later.
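The matching idea can be sketched as follows. This is our own simplified illustration, not Tango's algorithm: real systems compare high-dimensional feature descriptors, while we use plain 2D points and declare a match when enough stored features find a close counterpart among the currently observed ones.

```java
import java.util.List;

// Toy sketch of location re-recognition by comparing feature
// constellations: a stored location matches when a sufficient fraction
// of its features lie within `tol` of some currently observed feature.
public class ConstellationMatch {

    public static boolean matches(List<double[]> stored,
                                  List<double[]> observed,
                                  double tol, double minRatio) {
        int hits = 0;
        for (double[] s : stored) {
            for (double[] o : observed) {
                double dx = s[0] - o[0], dy = s[1] - o[1];
                if (dx * dx + dy * dy <= tol * tol) { // close enough?
                    hits++;
                    break; // count each stored feature at most once
                }
            }
        }
        return hits >= minRatio * stored.size();
    }
}
```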
This technology might also work without additional sensors. Following its own indoor navigation project for Lowe's, Google has announced a Visual Positioning Service (VPS) that will, in the future, identify visited indoor locations by searching a database hosted by Google. At the time of this writing, VPS is in closed alpha.
ARCore: Environment Understanding
Although ARCore also calculates features from the camera picture, it lacks the additional hardware that ensures these features are positioned correctly. Nevertheless, the lessons learned from Tango as an enabling technology seem to be sufficient to at least correct the sensor drift using features calculated from the camera picture itself. ARCore uses these features to estimate the geometric planes (and their extent) that compose the environment.
Since this feature is mostly used to place augmented objects into the picture in a way that fits the real environment, developers can define so-called anchors that attach those objects to real-world positions (represented as feature constellations).
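The core idea behind an anchor can be sketched in a few lines. This is not the ARCore API, just a minimal 2D illustration: a virtual object stores only its offset relative to the anchor, so when tracking later refines the anchor's pose, every attached object moves with it automatically.

```java
// Hedged sketch (not the ARCore API): an anchor as a mutable world pose.
// Attached objects store only an offset in anchor space.
public class AnchorDemo {

    public static class Anchor {
        public double x, y, heading; // simplified 2D pose (heading in radians)
        public Anchor(double x, double y, double heading) {
            this.x = x; this.y = y; this.heading = heading;
        }
    }

    // World position of an object attached at (dx, dy) in anchor space.
    public static double[] worldPosition(Anchor a, double dx, double dy) {
        double cos = Math.cos(a.heading), sin = Math.sin(a.heading);
        return new double[] {
            a.x + cos * dx - sin * dy,
            a.y + sin * dx + cos * dy
        };
    }

    public static void main(String[] args) {
        Anchor table = new Anchor(2.0, 1.0, 0.0);
        double[] p = worldPosition(table, 0.5, 0.0); // p = {2.5, 1.0}
        // Tracking later corrects the anchor; the attached object follows.
        table.x = 2.1;
        double[] q = worldPosition(table, 0.5, 0.0); // q = {2.6, 1.0}
        System.out.println(p[0] + " -> " + q[0]);
    }
}
```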
Indoor Navigation: Initial Position
When the user of an indoor navigation application opens the app, the indoor positioning system needs to calculate the user's initial position. With Tango's Area Learning, this can be done by loading a so-called ADF file that contains all the visual features of an environment collected during a training run. ARCore does not (yet) allow developers to load previous sessions. An indoor navigation app created with ARCore therefore has to either start from a known position, use a service like VPS to obtain the initial position, or implement its own logic for this.
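The options above can be expressed behind a common interface. All names here are hypothetical, ours alone, and not part of any ARCore or VPS API; the external-service variant is a deliberate stub.

```java
// Hypothetical sketch: one interface, several strategies for obtaining
// the user's initial position in the building's map frame.
public class InitialPosition {

    public interface Provider {
        double[] resolve(); // returns {x, y} in the building's map frame
    }

    // Option 1: a known starting position, e.g. a marker at the entrance.
    public static class KnownStart implements Provider {
        private final double[] start;
        public KnownStart(double x, double y) {
            start = new double[] { x, y };
        }
        public double[] resolve() { return start.clone(); }
    }

    // Options 2 and 3: query an external service (such as VPS) or custom
    // matching logic; left as a stub in this sketch.
    public static class ExternalService implements Provider {
        public double[] resolve() {
            throw new UnsupportedOperationException(
                "not wired up in this sketch");
        }
    }
}
```

Keeping the strategies behind one interface would let the rest of the navigation stack stay unchanged if, say, VPS becomes generally available later.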
Tango: Depth Perception
While ARCore uses a mathematical model to estimate the device's surroundings, Tango devices can actually measure them using an additional sensor. This enables Tango to take real 3D scans of complex objects, a task ARCore cannot perform.
ARCore: Lighting Estimation
ARCore is primarily intended to help developers render augmented objects into real surroundings. These objects have to match the lighting of the scene in order to blend in correctly. Lighting estimation calculates the direction and intensity of light in the camera picture to aid with this task.
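How such an estimate is typically consumed can be shown with a minimal sketch. This is not ARCore code; the intensity value here is a stand-in for whatever the AR framework reports, and we simply scale a virtual object's base color by it before rendering.

```java
// Hedged sketch: applying an estimated scene light intensity to an
// augmented object's base color so it blends with the camera image.
public class LightingDemo {

    // Scale an RGB color (0..1 per channel) by the estimated intensity,
    // clamping each channel so it stays a valid color.
    public static double[] applyIntensity(double[] rgb, double intensity) {
        double[] lit = new double[3];
        for (int i = 0; i < 3; i++) {
            lit[i] = Math.min(1.0, rgb[i] * intensity);
        }
        return lit;
    }

    public static void main(String[] args) {
        // A dim scene (intensity 0.5) darkens a white object to mid-gray.
        double[] lit = applyIntensity(new double[] {1.0, 1.0, 1.0}, 0.5);
        System.out.println(lit[0] + ", " + lit[1] + ", " + lit[2]);
    }
}
```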
What about ARKit?
While we are focusing on ARCore for now, we plan to extend our efforts to include Apple's ARKit as well.
Finally, there is a real chance to make augmented reality on mobile devices available for everyone, everywhere.