ARKit 2.0 by Apple: a Breakthrough in Augmented Reality?

So, what does Apple have in store for us in ARKit 2? In this article, we will explore the functionality introduced by this eagerly awaited release.

Earlier this year, Apple presented the new version of its AR development framework – ARKit 2.0.

Initially introduced in 2017, ARKit let iPhone apps track the device's motion in space, estimate the intensity and color temperature of the ambient light, and detect horizontal planes. Needless to say, the framework became a real innovation on the iOS platform.

AR Map Maintenance & Recovery

Perhaps one of the most powerful features brought by ARKit 2 is the ability to save environment maps together with the augmented reality objects placed in them. Thanks to it, you can initialize a new AR session with those objects remaining in the same locations where you previously put them.

Moreover, you can send the saved map to a server and use it on other devices.
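In code, this revolves around the new ARWorldMap class: you request the map from a running ARSession, archive it, and later feed it into a new configuration. Below is a minimal sketch under these assumptions; the function names and file URL handling are ours, and error handling is simplified.

```swift
import ARKit

// Save the current session's world map (including placed anchors) to disk.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("World map unavailable: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        do {
            // ARWorldMap supports NSSecureCoding, so it can be archived to Data.
            let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true)
            try data.write(to: url, options: .atomic)
        } catch {
            print("Failed to save world map: \(error)")
        }
    }
}

// Start a new session from a previously saved map; anchors reappear in place.
func restoreSession(_ session: ARSession, from url: URL) throws {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```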

For this feature to work reliably, Apple recommends the following:

  • Scan the scene from multiple points of view, so that the map of feature points becomes larger and more accurate.
  • Keep the environment static and well textured.
  • Make sure the feature point map is reasonably dense.

Multiplayer Augmented Reality

The map saving feature also allows multiple devices to synchronize their coordinate systems. Knowing the location of each connected device, you can create multiplayer scenarios.
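In practice, synchronization boils down to sending the serialized ARWorldMap to the other players, for example over Apple's MultipeerConnectivity framework, and having each peer run a configuration with it as the initialWorldMap. A minimal sketch, assuming mcSession is an already-connected MCSession:

```swift
import ARKit
import MultipeerConnectivity

// Share the host's world map with all connected peers.
func shareWorldMap(from arSession: ARSession, via mcSession: MCSession) {
    arSession.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        // Each peer that runs an ARWorldTrackingConfiguration with this map
        // as its initialWorldMap ends up in the same coordinate system.
        try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
    }
}
```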

At the ARKit presentation, Apple showed the SwiftShot game, in which each player aims their slingshot at the opponents' slingshots. The app demonstrated the possibilities of multiplayer AR in action. You can find more about it on the official Apple website.

Environment Reflection

Using the footage from the camera, ARKit can now build a cube map with the environment texture. The parts of the environment that never make it into the shot are generated with the help of machine learning algorithms.
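Turning this on takes a single configuration flag; ARKit then places environment probes in the scene by itself. A minimal sketch:

```swift
import ARKit

// Enable ARKit 2's automatic environment texturing for realistic reflections.
func runWithEnvironmentTexturing(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // ARKit generates AREnvironmentProbeAnchor objects and completes the
    // unseen parts of each cube map using machine learning.
    configuration.environmentTexturing = .automatic
    session.run(configuration)
}
```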

Tracking of Moving 2D Images

ARKit 1.5 could only detect static images. The new version eliminates this restriction, and now you can get the coordinates of moving images. During the presentation, Apple showed how the feature works by replacing a photo with a video in real time.
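ARKit 2 introduces a dedicated ARImageTrackingConfiguration for this. A minimal sketch; the "AR Resources" group name is an assumption and should match the asset catalog group that holds your reference images:

```swift
import ARKit

// Track known 2D images even while they move in the frame.
func runImageTracking(on session: ARSession) {
    guard let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: nil) else { return }

    let configuration = ARImageTrackingConfiguration()
    configuration.trackingImages = referenceImages
    // Images beyond this limit are still detected but not continuously tracked.
    configuration.maximumNumberOfTrackedImages = 2
    session.run(configuration)
}
// Updated positions arrive as ARImageAnchor updates in the session delegate.
```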

For better movement tracking, use high-contrast, well-textured images. Otherwise, Xcode will warn you that the image does not meet this requirement.

Tracking of Static 3D Objects

Apple also added detection of 3D objects. Before an object can be detected, you need to scan it; use Apple's sample scanner app for this purpose. Make sure the object you are going to scan is opaque and well textured.
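Scanning produces .arobject files, which you load as ARReferenceObject instances and hand to the world tracking configuration. A minimal sketch; "AR Objects" is an assumed asset catalog group name:

```swift
import ARKit

// Detect previously scanned, static 3D objects in the scene.
func runObjectDetection(on session: ARSession) {
    guard let referenceObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "AR Objects", bundle: nil) else { return }

    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionObjects = referenceObjects
    session.run(configuration)
}
// When a scanned object is recognized, ARKit adds an ARObjectAnchor for it.
```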

Face Recognition

In previous versions, you could get the face's coordinates and angle, a polygonal face mesh, and a set of 51 facial expression coefficients. Apple enhanced this technology by adding the ability to track gaze and tongue movement and to detect directional light.
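The new data is exposed on ARFaceAnchor. A minimal sketch of reading it from a session delegate callback:

```swift
import ARKit

final class FaceTracker: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // New in ARKit 2: the point in face space the eyes converge on...
            let gaze = faceAnchor.lookAtPoint
            // ...and tongue movement as an extra blend shape coefficient.
            let tongueOut = faceAnchor.blendShapes[.tongueOut]?.floatValue ?? 0
            print("gaze:", gaze, "tongueOut:", tongueOut)
        }
    }
}
```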

General Improvements

  • Accelerated plane initialization and detection;
  • More accurate device motion tracking;
  • More accurate detection of plane borders as a plane expands;
  • Support for the 4:3 aspect ratio (now the default format).

All of the above improvements, except for the new aspect ratio, are applied automatically. To switch to the 4:3 format, you need to rebuild the app against the new SDK.

Wrapping Up

Will Apple's augmented reality eventually succeed? Will it become a key iOS feature?

We believe it is quite possible, provided that the manufacturer takes care of two main issues:

  • Battery consumption;
  • User convenience.

Today, the popularity of AR largely depends on developers. As soon as someone releases a genuinely helpful and easy-to-use ARKit-based app, the technology will instantly go mainstream. In any case, the progress Apple has already made is astounding.

