Snapchat among first to leverage iPhone 12 Pro’s LiDAR Scanner for AR

OSTN Staff

Apple introduced its latest flagship iPhone models, the iPhone 12 Pro and 12 Pro Max, at its iPhone event on Tuesday. Among other things, the devices sport a new LiDAR Scanner designed to allow for more immersive augmented reality (AR) experiences. Snapchat today confirms it will be among the first to put the new technology to use in its iOS app for a lidar-powered Lens.

As Apple explained during the event, the LiDAR (Light Detection And Ranging) Scanner measures how long it takes for light to reach an object and reflect back.

Along with the iPhone’s machine learning capabilities and developer frameworks, lidar helps the device understand the world around you.
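The underlying time-of-flight math is simple: the sensor times a light pulse's round trip to a surface and back, and distance is half that time multiplied by the speed of light. A minimal sketch (not Apple's implementation, just the physics):

```swift
import Foundation

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

// A lidar sensor times a pulse's round trip to a surface and back;
// the one-way distance is half the round-trip time times the speed of light.
func lidarDistance(roundTripTime: Double) -> Double {
    speedOfLight * roundTripTime / 2.0
}

// A surface about 3 meters away returns the pulse in roughly 20 nanoseconds.
let meters = lidarDistance(roundTripTime: 20e-9)
print(meters)  // just under 3.0
```

The nanosecond scale of those round trips is why this kind of ranging needs dedicated hardware rather than the regular camera pipeline.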

Apple adapted this technology for its iPhone 12 Pro models, where it’s helping to improve low-light photography, thanks to its ability to “see in the dark.”

Image Credits: Apple presentation, screenshot via TechCrunch

The technology can also be used by app developers to build a precise depth map of the scene, speeding up AR so it feels instantaneous while enabling new kinds of app experiences that use AR.

In practice, what this means for app developers is the ability to use lidar for things like object and room scanning: think better AR shopping apps, home design tools or AR games.

It can also enable photo and video effects and more exact placement of AR objects, as the iPhone is actually able to “see” a depth map of the room.

Image Credits: Apple presentation, screenshot via TechCrunch

That can lead to new AR experiences like what Snapchat is prepared to introduce. Already known for best-in-class AR photo filters, the company says it will soon launch a lidar-powered Lens specifically for the iPhone 12 Pro models.

Apple gave a brief peek at Snapchat’s lidar-powered feature during the lidar portion of the iPhone event today.

Here, you can see an AR Lens in the Snapchat app where flowers and grasses cover the table and floor, and birds fly toward the user’s face. The grasses toward the back of the room appear further away than those closer to the user, and vegetation even climbs up and around the kitchen cabinets, an indication that the Lens knows where those objects sit in physical space.

The birds in the Snapchat Lens disappear as they move behind the person, out of view, and even land precisely in the person’s hand.
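That disappearing act is per-pixel occlusion: a virtual object is drawn only where it is closer to the camera than the real-world surface the depth map reports at that pixel. A minimal sketch of the idea (the array-based depth maps and function name here are illustrative, not Snapchat's or Apple's actual API):

```swift
import Foundation

// realDepth: per-pixel distances from the lidar depth map, in meters.
// virtualDepth: distance of the virtual object at each pixel, in meters.
// Returns a mask of pixels where the virtual object should be drawn,
// i.e. where it sits in front of the real scene.
func occlusionMask(realDepth: [Double], virtualDepth: [Double]) -> [Bool] {
    zip(virtualDepth, realDepth).map { $0 < $1 }
}

// A bird at 2 m flies behind a person standing 1.5 m away: it is hidden
// where the person occupies the pixel, and visible against the 4 m back wall.
let real = [4.0, 1.5, 1.5, 4.0]   // wall, person, person, wall
let bird = [2.0, 2.0, 2.0, 2.0]
let mask = occlusionMask(realDepth: real, virtualDepth: bird)
print(mask)  // [true, false, false, true]
```

In a real renderer this comparison happens in the depth buffer on the GPU; the lidar depth map is what lets the real scene participate in that test at all.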

We understand this is the exact Lens Snapchat has in the works, though the company is holding back further details for the time being. Either way, it shows what a lidar-enabled Snapchat experience will feel like.

You can see the Snapchat filter in action at 59:41 in the Apple iPhone Event video.


Updated, 10/13/20, 4:47 PM ET with confirmation that the Lens shown during the event is the one that will launch.

