At its Worldwide Developers Conference, Apple announced a major update to RealityKit, its suite of technologies for building augmented reality (AR) experiences. Apple says that with RealityKit 2, developers gain more visual, audio, and animation control over their AR experiences. But the most notable part of the update is the new Object Capture API, which lets developers create 3D models in minutes using just an iPhone.
In its developer talk, Apple said that one of the most difficult parts of creating great AR apps has been producing 3D models, a process that can take hours and cost thousands of dollars.
Apple’s new tools let developers use just an iPhone (or an iPad, a DSLR, or even a drone if they prefer) to capture a series of 2D images of an object from all angles, including the bottom.
With the Object Capture API on macOS Monterey, Apple explained, only a few lines of code are then needed to generate the 3D model.
To begin, developers start a new photogrammetry session in RealityKit that points to the folder containing the images, then call the process function to generate the 3D model at the desired level of detail. Object Capture produces USDZ files optimized for AR Quick Look – the system that lets developers add 3D virtual objects to apps or websites on iPhone and iPad. The 3D models can also be added to AR scenes in Reality Composer in Xcode.
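As a rough sketch of that flow, the code below starts a PhotogrammetrySession pointed at a folder of photos, listens for progress messages, and requests a USDZ model at medium detail. The file paths are placeholders, and this is a minimal outline rather than a production-ready capture pipeline:

```swift
import Foundation
import RealityKit

// Placeholder paths: the folder of captured photos and the output model file.
let imagesFolder = URL(fileURLWithPath: "/Users/me/Captures/Chair", isDirectory: true)
let outputURL = URL(fileURLWithPath: "/Users/me/Models/chair.usdz")

// 1. Start a photogrammetry session pointed at the image folder.
let session = try PhotogrammetrySession(input: imagesFolder)

// 2. Observe the session's output stream for progress, completion, and errors.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .requestComplete(_, .modelFile(let url)):
            print("Saved model to \(url.path)")
        case .requestError(_, let error):
            print("Request failed: \(error)")
        default:
            break
        }
    }
}

// 3. Ask the session to produce a USDZ model at the chosen detail level
//    (.preview, .reduced, .medium, .full, or .raw).
try session.process(requests: [
    .modelFile(url: outputURL, detail: .medium)
])
```

Note that this API requires macOS Monterey and a Mac with sufficient GPU resources, so it cannot run on iOS itself; the iPhone's role is capturing the source photos.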
Apple said developers including Wayfair and Etsy are already using Object Capture to create 3D models of real-world objects – an indication that online shopping is on the verge of a major AR upgrade.
Wayfair, for example, is using Object Capture to build tools that let its manufacturers create virtual representations of their goods, allowing Wayfair customers to preview more products in AR than they can today.
In addition, developers such as Maxon and Unity are using Object Capture to bring real-world objects into 3D content-creation apps like Cinema 4D and Unity MARS.
Other updates in RealityKit 2 include custom shaders, which give developers more control over the rendering pipeline to fine-tune the appearance of AR objects; dynamic loading for assets; the ability to build a custom Entity Component System to organize the assets in an AR scene; and the ability to create player-controlled characters so users can jump, scale, and explore AR worlds in RealityKit-based games.
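To give a sense of the custom Entity Component System support, the sketch below defines a hypothetical SpinComponent and a SpinSystem that rotates any entity carrying that component each frame. The component and system names are illustrative, but the Component and System protocols and the registration calls are RealityKit 2 APIs:

```swift
import RealityKit
import simd

// Hypothetical component: stores a spin speed for an entity.
struct SpinComponent: Component {
    var radiansPerSecond: Float = .pi
}

// A custom System that RealityKit calls once per frame. It queries the scene
// for entities with a SpinComponent and rotates each one around the Y axis.
class SpinSystem: System {
    private static let query = EntityQuery(where: .has(SpinComponent.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            let angle = spin.radiansPerSecond * Float(context.deltaTime)
            entity.orientation *= simd_quatf(angle: angle, axis: [0, 1, 0])
        }
    }
}

// Register once at app startup so RealityKit schedules the system each frame.
SpinComponent.registerComponent()
SpinSystem.registerSystem()
```

Splitting behavior into components and systems this way keeps per-frame game logic out of individual entity subclasses, which is the main organizational benefit Apple is pointing to.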
Mikko Haapoja, a developer at Shopify, tried out the new technology and shared some real-world tests on Twitter, shooting objects with an iPhone 12 Max.
Developers who want to try Object Capture for themselves can install Monterey on their Mac and use Apple’s sample app. Apple says any picture-taking app from the App Store – such as the Qlone camera app – can be used to capture the photos of an object. In the fall, the Qlone Mac companion app will also adopt the Object Capture API.
According to Apple, there are over 14,000 ARKit apps on the App Store today, created by more than 9,000 different developers. With over 1 billion AR-enabled iPhones and iPads in use worldwide, Apple says it offers the world’s largest AR platform.