At this year's WWDC 2019 developer conference, Apple introduced its new ARKit 3 to developers, specifically designed to help creatives build augmented reality applications for iOS devices. A few of the new features rolling out with Apple ARKit 3 include real-time body tracking and People Occlusion. AR content can now realistically pass behind and in front of people in the real world, making AR experiences more immersive while also enabling green screen-style effects in almost any environment.
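People Occlusion is opted into through a session configuration's frame semantics. A minimal sketch, assuming an `ARView` named `arView` already exists in your view controller:

```swift
import ARKit
import RealityKit

// Minimal sketch: enabling People Occlusion (ARKit 3, iOS 13+).
// `arView` is assumed to be an existing RealityKit ARView.
let configuration = ARWorldTrackingConfiguration()

// Check device support before opting in; People Occlusion requires
// the A12 chip or later.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // Virtual content will now render behind people detected in the camera feed.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
arView.session.run(configuration)
```

This is a configuration fragment that only runs on a supported iOS device, so it is shown as a sketch rather than a standalone program.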
– Now you can simultaneously use face and world tracking on the front and back cameras, opening up new possibilities. For example, users can interact with AR content in the back camera view using just their face. Now ARKit Face Tracking tracks up to three faces at once, using the TrueDepth camera on iPhone X, iPhone XS, iPhone XS Max, iPhone XR, and iPad Pro to power front-facing camera experiences like Memoji and Snapchat.
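The two capabilities above map to separate configurations. A minimal sketch of each, assuming you run one configuration per session:

```swift
import ARKit

// Minimal sketch: world tracking on the back camera with face tracking
// from the front TrueDepth camera running simultaneously (iOS 13+).
let worldConfig = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    // Face anchors now arrive alongside world anchors in the same session.
    worldConfig.userFaceTrackingEnabled = true
}

// Minimal sketch: tracking multiple faces with the front camera.
let faceConfig = ARFaceTrackingConfiguration()
// Up to three on supported devices; query the maximum rather than hard-coding it.
faceConfig.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
```

As a configuration fragment tied to device hardware, this is illustrative rather than a runnable program.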
– Detect up to 100 images at a time, and get an automatic estimate of the physical size of the image. 3D-object detection is more robust, as objects are better recognized in complex environments. And now, machine learning is used to detect planes in the environment even faster.
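Image detection with automatic size estimation is configured on the world-tracking session. A minimal sketch, assuming an asset catalog group named "AR Resources" (the group name is an example):

```swift
import ARKit

// Minimal sketch: detecting reference images with automatic physical-size
// estimation (ARKit 3, iOS 13+).
let config = ARWorldTrackingConfiguration()
if let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: nil) {
    config.detectionImages = referenceImages
    // Number of images tracked live in each frame (detection can cover many more).
    config.maximumNumberOfTrackedImages = 4
    // Let ARKit estimate the printed image's real-world size.
    config.automaticImageScaleEstimationEnabled = true
}
```

Since this depends on a device camera and bundled resources, it is a configuration fragment rather than an executable example.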
– With live collaborative sessions between multiple people, you can build a collaborative world map, making it faster for you to develop AR experiences and for users to get into shared AR experiences like multiplayer games.
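Collaboration is a single opt-in flag, but transporting the resulting map data between devices is left to the app. A minimal sketch; the MultipeerConnectivity transport mentioned in the comments is an assumption, not part of ARKit:

```swift
import ARKit
import MultipeerConnectivity

// Minimal sketch: opting in to collaborative sessions (ARKit 3, iOS 13+).
let config = ARWorldTrackingConfiguration()
config.isCollaborationEnabled = true

// ARKit then emits collaboration data through the session delegate:
// func session(_ session: ARSession,
//              didOutputCollaborationData data: ARSession.CollaborationData) {
//     // Encode and broadcast `data` to peers (e.g. via MultipeerConnectivity);
//     // each peer feeds received data into its own session via
//     // session.update(with:).
// }
```

Because the networking layer and peer devices are app-specific, this is shown as a configuration fragment only.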
Other features rolling out with Apple ARKit 3 include:
– Simultaneous front and back camera
– Motion capture
– Faster reference image loading
– Auto-detect image size
– Visual coherence
– More robust 3D object detection
– People occlusion
– Video recording in AR Quick Look
– Apple Pay in AR Quick Look
– Multiple-face tracking
– Collaborative session
– Audio support in AR Quick Look
– Detect up to 100 images
– HDR environment textures
– Multiple-model support in AR Quick Look
– AR Coaching UI
For more details on the new Apple ARKit 3 and its features, jump over to the official Apple developer website.