
Snap uses iPhone 12 Pro lidar sensor for AR images



Snap has launched a new version of its augmented reality development tool, Lens Studio, that lets creators build AR effects that take advantage of the lidar sensor on Apple's latest iPhone 12 Pro smartphones.

The social chat company said that lidar-driven Snap Lenses will herald a new generation of AR. Lidar, or light detection and ranging, uses lasers to illuminate objects and judge how far away they are based on how long the light takes to reflect back. The iPhone 12 Pro ships on October 23, while the iPhone 12 Pro Max ships on November 13. Both are equipped with a lidar scanner, which lets them detect the shape of nearby objects and map AR imagery onto those surfaces more accurately, adding a new level of realism to AR effects.
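The distance measurement itself is simple time-of-flight arithmetic: the emitted pulse travels out and back, so the one-way distance is half the round-trip time multiplied by the speed of light. The following is only an illustrative sketch of that arithmetic, not Apple's or Snap's implementation.

import Foundation

// Time-of-flight distance estimate: the sensor emits a light pulse and
// measures how long the reflection takes to return. The pulse covers the
// round trip, so the one-way distance is half of (speed of light * time).
let speedOfLight = 299_792_458.0  // metres per second

func distance(forRoundTripTime t: TimeInterval) -> Double {
    return speedOfLight * t / 2.0
}

// Example: a reflection returning after about 13.3 nanoseconds
// corresponds to an object roughly 2 metres away.
print(distance(forRoundTripTime: 13.3e-9))  // ≈ 2.0 metres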

Snap launches Lens Studio 3.2 today, allowing developers to take advantage of lidar and build lidar-powered Lenses for the iPhone 12 Pro and iPhone 12 Pro Max. Snap said that AR experiences can blend with the real world more seamlessly, because Snapchat's camera can see a metric mesh of the scene and better understand the geometry and meaning of surfaces and objects.

Snap said that this new level of scene understanding lets Lenses interact realistically with the world around the user. With the iPhone 12 Pro's A14 Bionic chip and ARKit software, developers can render thousands of AR objects in real time and create immersive environments.
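Outside of Lens Studio, the same underlying capability is exposed to native iOS developers through ARKit's scene reconstruction and scene depth APIs. The snippet below is a minimal sketch of enabling those features in a standard ARKit session on a lidar-equipped device; it is illustrative of the platform capability, not Snap's internal pipeline.

import ARKit

// Minimal sketch: enable lidar-based scene reconstruction in ARKit.
func makeLidarSession() -> ARSession? {
    // Scene reconstruction is only available on lidar-equipped devices
    // such as the iPhone 12 Pro and iPhone 12 Pro Max.
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return nil
    }

    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh   // build a mesh of nearby surfaces
    configuration.frameSemantics = .sceneDepth  // per-pixel depth from the lidar scanner

    let session = ARSession()
    session.run(configuration)
    return session
}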

A new Interactive Preview Mode in Lens Studio 3.2 lets developers build lidar-powered Lenses and preview how they will appear in the world before they get access to the new iPhone 12 Pro.

