Our experiences with ARKit 2

Kenny Deriemaeker
XR Tech Lead

The past week saw the release of iOS 12, the latest update to the operating system for iPhone and iPad. One of the most exciting things for us at In The Pocket is a major update to ARKit, Apple's platform for augmented reality on mobile. Our developers have been experimenting with the new capabilities of ARKit 2 for several months and have learned a few interesting things along the way.

Augmenting things, people and places

ARKit 2 is better than ever at recognizing things. Now, a physical object - a magazine cover, a chair, a table… - can not only be recognized in an instant, but also tracked as it moves in 3D space and augmented with a virtual overlay.

Here's a great example of how this can bring a chair to life with data from the web, including actions that the user can perform right from the camera view.

A chair scanned by ARKit 2

Even more interesting is the ability to scan physical objects and places with an iPhone or iPad and to add AR content to them. If you've been following AR for a while, you'll be familiar with those ugly QR-code-like markers that are typically used to trigger an AR experience. It looks like those are now a thing of the past. With 3D object recognition, physical objects become their own markers.
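To give a sense of how little code this takes, here is a minimal sketch of wiring up 3D object detection in ARKit 2. It assumes you have already scanned an object with Apple's scanning workflow and bundled the resulting reference objects in an asset catalog group; the group name "ScannedObjects" is a placeholder of ours.

```swift
import ARKit

class ObjectDetectionViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        let configuration = ARWorldTrackingConfiguration()
        // Load reference objects scanned earlier with ARObjectScanningConfiguration.
        // "ScannedObjects" is a hypothetical asset catalog group name.
        guard let referenceObjects = ARReferenceObject.referenceObjects(
            inGroupNamed: "ScannedObjects", bundle: nil) else {
            fatalError("Missing reference objects in asset catalog")
        }
        configuration.detectionObjects = referenceObjects
        sceneView.session.run(configuration)
        sceneView.delegate = self
    }

    // Called when ARKit recognizes one of the scanned objects.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let objectAnchor = anchor as? ARObjectAnchor else { return }
        // Attach virtual content to `node` here; it will track the physical object.
        print("Detected \(objectAnchor.referenceObject.name ?? "object")")
    }
}
```

Once the session is running, the physical object itself acts as the marker: ARKit delivers an ARObjectAnchor at its position, and anything you parent to that anchor's node stays locked to the real thing.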

The use cases are wide-ranging, from toys that come to life to maintenance and support scenarios in industrial environments. Here is a proof-of-concept of ours where we kind of did both:

An AR miniature forklift being inspected on a table

Localization and sharing

Apple is also taking the first important steps to bring Augmented Reality into the real world, and to enable shared experiences. Both are enabled by the same underlying technology, which Apple simply calls Persistence.


Persistence means that ARKit's world map — the 3D model of your environment which your iPhone constructs on the fly using computer vision and smart sensor analysis — is no longer discarded when you close the app or put away your phone. With ARKit 2, that 3D model can be stored and retrieved later, including whatever AR content was in that space.
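In code, persistence boils down to serializing an ARWorldMap and feeding it back into a new session. The sketch below shows one way to do that, assuming you have somewhere sensible to write the file; error handling is kept to a minimum.

```swift
import ARKit

// Save the current world map to disk so the session can be restored later.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("Can't get world map: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        do {
            let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true)
            try data.write(to: url, options: .atomic)
        } catch {
            print("Saving failed: \(error)")
        }
    }
}

// Relaunch tracking from a previously saved map. Anchors — and the AR
// content attached to them — come back in the same physical locations.
func restoreSession(_ session: ARSession, from url: URL) throws {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```

Note that relocalization only succeeds once the camera sees enough of the environment that was captured in the saved map, so apps typically prompt the user to look around first.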

At the most basic level this means that the couch you put in your living room in IKEA Place will still be in the same place the next time you open the app, as you'd expect.

At a more profound level, persistence enables precise camera-based localization: if an app can align ARKit's 3D model to data about the building or venue you're in, it can visualize that data and create a highly contextual AR experience. Indoor wayfinding and infrastructure visualization are some of the obvious uses for this. We're using it to guide visitors and new employees around our offices.

Wayfinding markers on the floor being shown through AR

ARKit's maps can also be shared with other users: send your map of the room to another user's device in the same room, combine it with peer-to-peer networking, and you have a shared AR experience.
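Sharing uses the same serialized world map as persistence, just sent over the network instead of written to disk. A minimal sketch using Apple's MultipeerConnectivity framework might look like this; peer discovery and session setup are assumed to have happened already.

```swift
import ARKit
import MultipeerConnectivity

// Serialize the current world map and send it to peers in the same room.
// `arSession` is the running ARSession; `mcSession` is an established
// MultipeerConnectivity session (peer discovery is not shown here).
func shareWorldMap(arSession: ARSession, mcSession: MCSession) {
    arSession.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
    }
}
```

On the receiving side, the peer unarchives the ARWorldMap and runs its own session with it as the initial map, putting both devices in the same shared coordinate space.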

Two persons playing an AR game on an empty table

Games are a killer app for real-time multi-user AR, but shared experiences are not necessarily real-time: I could also leave something behind in AR for you to find later. Yes, we could soon be up to our knees in digital user-generated AR content.

AR x AI = Magic

If there is one thing we have learned from experimenting with all these new building blocks, it is that Augmented Reality mixes really well with other emerging technologies, especially AI.

In the old world, assembling a machine or piece of equipment is usually a stressful experience guided by a paper instruction manual. In the new world of AR and AI, you simply take out your smartphone and point the camera at the pieces in front of you. The Machine Learning model recognizes which component is which, and Augmented Reality shows you precisely where it goes and what the steps are to get the job done.
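The glue between the two technologies is surprisingly thin: every ARKit frame exposes its camera image, which can be handed straight to a Core ML model via the Vision framework. The sketch below illustrates the idea; "ComponentClassifier" is a hypothetical model of ours trained to tell machine parts apart.

```swift
import ARKit
import Vision

// Classify what the AR camera currently sees with a Core ML model.
// ComponentClassifier is a hypothetical model; swap in any image classifier.
func classifyCurrentFrame(of session: ARSession) {
    guard let frame = session.currentFrame,
          let model = try? VNCoreMLModel(for: ComponentClassifier().model)
    else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let best = (request.results as? [VNClassificationObservation])?.first
        else { return }
        // In a real app you would place an ARAnchor here and render the
        // assembly instructions for the recognized component on top of it.
        print("Saw \(best.identifier) (confidence \(best.confidence))")
    }

    // ARFrame.capturedImage is a CVPixelBuffer, which Vision accepts directly.
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
    try? handler.perform([request])
}
```

Because recognition tells you *what* a component is and ARKit tells you *where* it is in space, combining the two is what lets instructions appear anchored to the right piece.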

The next level

The new A12 processor for the iPhone XS and XR underscores this synergy with lots of graphics horsepower (great for AR) and a dedicated neural processor (great for ML).

Apple's iPhone keynote contained a demo of HomeCourt, an app that leverages all this power to help you improve at basketball. The app films you as you play, analyzes your movements and timing using Machine Learning, and gives you 3D feedback using Augmented Reality. As smartphone hardware gets more powerful and more optimized for tasks like AR and ML, we expect more smart vision-based apps like this to show up and improve various areas of our lives.

AR industry insiders tell us that 2021 is the safest bet for when Apple will reveal its own dedicated, HoloLens or Magic Leap-style AR glasses to the world. Until then, we have plenty of new tools and experiences to build for the device you've already got in your pocket.
