During Apple's unveiling of iOS 14 and the new macOS Big Sur, the company also introduced ARKit 4, bringing with it a number of new features including Location Anchors, a new Depth API, and improved face tracking.
Location Anchoring allows developers to anchor their AR creations at specific latitude, longitude, and altitude coordinates. Users can move around virtual objects and see them from different perspectives, exactly as real objects are seen through a camera lens. Support for face tracking extends to the front-facing camera on any device with the A12 Bionic chip or later, including the new iPhone SE.
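As a rough, hedged sketch of how this might look in code (not taken from the article), ARKit 4 exposes geo-anchoring through ARGeoTrackingConfiguration and ARGeoAnchor; the coordinate and altitude values below are placeholders.

```swift
import ARKit
import CoreLocation

// Minimal sketch: anchor a virtual object at a fixed real-world coordinate.
// The latitude, longitude, and altitude below are placeholder values.
func addLocationAnchor(to session: ARSession) {
    // Geo tracking is only available on supported devices and in supported regions.
    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        guard available else { return }

        let configuration = ARGeoTrackingConfiguration()
        session.run(configuration)

        // Place an anchor at a specific latitude, longitude, and altitude.
        let coordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
        let geoAnchor = ARGeoAnchor(coordinate: coordinate, altitude: 30)
        session.add(anchor: geoAnchor)
    }
}
```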
“ARKit 4 on iPadOS introduces a brand-new Depth API, creating a new way to access the detailed depth information gathered by the LiDAR Scanner on iPad Pro. Location Anchoring leverages the higher resolution data in Apple Maps to place AR experiences at a specific point in the world in your iPhone and iPad apps. And support for face tracking extends to all devices with the Apple Neural Engine and a front-facing camera, so even more users can experience the joy of AR in photos and videos.”

“The advanced scene understanding capabilities built into the LiDAR Scanner allow this API to use per-pixel depth information about the surrounding environment. When combined with the 3D mesh data generated by Scene Geometry, this depth information makes virtual object occlusion even more realistic by enabling instant placement of virtual objects and blending them seamlessly with their physical surroundings. This can drive new capabilities within your apps, like taking more precise measurements and applying effects to a user’s environment.”
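For a sense of what the Depth API involves in practice, here is a minimal sketch (an illustration, not code from the article) assuming ARKit's sceneDepth frame semantic on a LiDAR-equipped device; each frame then carries a per-pixel depth map and confidence map.

```swift
import ARKit

// Minimal sketch: enable the LiDAR-based Depth API and read per-pixel depth per frame.
class DepthReceiver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // sceneDepth requires a LiDAR-equipped device such as the 2020 iPad Pro.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }

        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)

        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // sceneDepth provides a depth map plus a per-pixel confidence map.
        guard let depth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = depth.depthMap
        let confidenceMap: CVPixelBuffer? = depth.confidenceMap
        _ = (depthMap, confidenceMap) // feed into occlusion, measurement, or visual effects
    }
}
```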
For more details on the latest improvements to the Apple ARKit suite, jump over to the official Apple Developer website by following the link below.
Source: Apple