
Apple iPhone 12 packs in LiDAR to bring physical and virtual worlds closer together

WHY THIS MATTERS IN BRIEF

LiDAR sensing systems used to be incredibly large and incredibly expensive, and now they’re in your phone.

 

Love the Exponential Future? Join our XPotential Community, enjoy exclusive content, future proof yourself with XPotential University, connect, watch a keynote, or browse my blog.

This week Apple introduced its latest line-up of smartphones, including the iPhone 12 Pro and iPhone 12 Pro Max, both of which are equipped with a LiDAR scanner. That scanner will dramatically improve the phones’ Augmented Reality (AR) capabilities, letting users experience true real-time AR, and will even help them scan their environments to create Virtual Reality (VR) copies of them, like the ones that Samsung recently showed off.

 

The new phones are also a clear demonstration of something I discuss in my futurist keynotes: over time every technology gets faster, more performant, smaller, and cheaper. Bearing in mind that less than a decade ago the LiDAR sensors Apple is using would have cost more than $75,000 per unit, it’s a staggering demonstration of just how fast the technology has been developed and commercialised. That progress was helped massively by automotive manufacturers’ demand for LiDAR, which is a key component in the self-driving cars that represent the next generation of vehicle mobility.

LiDAR is interesting from a futures perspective because it’s a so-called “time of flight” depth sensor, which measures how long it takes for light to bounce off objects in the scene and return to the sensor. With precise timing, that information is used to judge the depth of each point, and it’s this that will make future augmented reality experiences much faster and more accurate which, ultimately, will likely help boost their adoption.
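To make the arithmetic concrete, here’s a minimal Swift sketch of the time-of-flight calculation. The pulse travels out and back, so the measured round-trip time gets halved; the function name and example timing are purely illustrative, not anything from Apple’s implementation.

```swift
import Foundation

// Speed of light in metres per second.
let speedOfLight = 299_792_458.0

// Convert a measured round-trip time (in seconds) into a depth estimate
// (in metres). The light travels to the object and back, so the total
// distance covered is halved.
func depth(fromRoundTripTime seconds: Double) -> Double {
    return (speedOfLight * seconds) / 2.0
}

// A pulse returning after ~33.3 nanoseconds has travelled ~10 m in
// total, putting the object roughly 5 m away.
print(depth(fromRoundTripTime: 33.3e-9)) // ≈ 4.99
```

The tiny timescales involved are also why these sensors were so hard, and so expensive, to build: resolving a centimetre of depth means resolving tens of picoseconds of timing.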

 

While existing iPhones are already capable of pretty good AR tracking, the current approach derives depth from machine vision techniques like SLAM (Simultaneous Localisation and Mapping), which tracks points in the scene over time to infer depth. Typically, this means the system needs a few seconds, and some movement of the camera, before it can understand its frame of reference and begin to assess the depth of the scene.
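To see why movement matters, consider the parallax geometry that SLAM-style approaches ultimately rely on: two views separated by a known baseline let you triangulate depth. Here’s a toy Swift illustration, a big simplification of real SLAM with made-up example numbers:

```swift
// Toy stereo triangulation: with two views of the same point separated
// by a known baseline, depth falls out of similar triangles:
//   depth = focal length (pixels) * baseline (metres) / disparity (pixels)
// Real SLAM estimates the baseline and matches points automatically,
// but the core geometry is the same, which is why it needs the camera
// to move before it can judge depth.
func triangulatedDepth(focalLengthPixels: Double,
                       baselineMetres: Double,
                       disparityPixels: Double) -> Double {
    return focalLengthPixels * baselineMetres / disparityPixels
}

// Example: a 1,500-pixel focal length, 10 cm of camera movement and a
// 30-pixel disparity put the point about 5 m away.
print(triangulatedDepth(focalLengthPixels: 1_500,
                        baselineMetres: 0.1,
                        disparityPixels: 30)) // 5.0
```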

Apple says that LiDAR in the iPhone 12 Pro and 12 Pro Max means the phones will be capable of “instant AR.” That’s because LiDAR captures depth information in the equivalent of a ‘single photo’, without any phone movement or the need to compare images across time.
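In ARKit terms, and hedging that this is just a sketch against the public API rather than Apple’s internals, opting into that LiDAR depth stream is a small configuration change on supported devices:

```swift
import ARKit

// Sketch: ask ARKit for LiDAR-backed per-frame depth on a supported
// device (iPhone 12 Pro / Pro Max running ARKit 4 or later).
func startInstantAR(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // .sceneDepth requests a depth map sourced from the LiDAR scanner;
    // guarding on support keeps the app working on older phones, which
    // fall back to motion-based tracking.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    session.run(configuration)
}
```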

One way to understand it is to think about the pixels in a photograph. When you take a picture, every pixel captures colour and brightness information. In a ‘LiDAR snapshot’, by contrast, every pixel captures a distance value. So rather than needing to wave your phone around for a few seconds before an AR app can establish accurate tracking, tracking can start immediately.
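Sticking with that analogy, here’s a hedged Swift sketch of reading one of those ‘depth pixels’ out of an ARKit frame, assuming the .sceneDepth option above is enabled. The depth map arrives as a buffer of 32-bit floats, one distance in metres per pixel:

```swift
import ARKit

// Sketch: read the distance value for a single pixel of a LiDAR depth
// map. frame.sceneDepth is only populated when .sceneDepth frame
// semantics are enabled on a LiDAR-equipped device.
func depthInMetres(atColumn x: Int, row y: Int, in frame: ARFrame) -> Float? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    guard x >= 0, y >= 0,
          x < CVPixelBufferGetWidth(depthMap),
          y < CVPixelBufferGetHeight(depthMap),
          let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

    // Each row is bytesPerRow long; each pixel is one Float32 distance.
    let bytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)
    let rowPointer = base.advanced(by: y * bytesPerRow)
        .assumingMemoryBound(to: Float32.self)
    return rowPointer[x]
}
```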

 

Of course, you can also compare LiDAR depth data over time, so that instead of a simple snapshot you can build an entire depth map of the scene. It’s this feature that, when combined with an AI like Samsung’s, will one day let people render their entire environment straight into VR, and that will change how you capture your family’s memories forever. For example, rather than taking photos or videos of your kids’ birthday parties, imagine capturing the entire party in VR so you can relive it as though you’re back in the room, over and over again…
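Accumulating those snapshots into a full model of the room is, in effect, what ARKit’s scene reconstruction already does on LiDAR devices. Here’s a minimal sketch of where that geometry comes from; exporting it into a relivable VR scene, as imagined above, is the much harder next step:

```swift
import ARKit

// Sketch: accumulate a rough mesh of the environment over time. ARKit
// fuses LiDAR depth frames as you move and delivers the resulting
// surfaces as ARMeshAnchor objects.
final class RoomScanner: NSObject, ARSessionDelegate {
    func start(with session: ARSession) {
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh
        }
        session.delegate = self
        session.run(configuration)
    }

    // Each new anchor is a chunk of scanned geometry; a real capture
    // app would stitch and export these rather than just count them.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let mesh as ARMeshAnchor in anchors {
            print("Scanned surface with \(mesh.geometry.vertices.count) vertices")
        }
    }
}
```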

That kind of full-scene capture is likely the real end game here. But for now, LiDAR in a phone is good for AR too…
