
Future iPhone might add a time-of-flight camera

We’re still a few months away from Apple announcing its 2019 iPhones, but rumors have already started for next year’s models, with the ever-reliable Apple analyst Ming-Chi Kuo claiming in his latest report that two of the 2020 iPhones will feature a rear time-of-flight (ToF) 3D depth sensor for better augmented reality features and portrait shots, via MacRumors.

It’s not the first time we’ve heard of Apple considering a ToF camera for its 2020 phones, either. Bloomberg reported a similar rumor back in January, and reports of a 3D camera system for the iPhone have existed since 2017. Other companies have beaten Apple to the punch here, with several phones on the market already featuring ToF cameras. But given the prevalence of Apple’s hardware and the impact it tends to have on the industry, it’s worth taking a look at what this camera technology is and how it works.

What is a ToF sensor, and how does it work?

Time-of-flight is a catch-all term for a type of technology that measures the time it takes for something (be it a laser, light, liquid, or gas particle) to travel a certain distance.

In the case of camera sensors, specifically, an infrared laser array is used to send out a laser pulse, which bounces off the objects in front of it and reflects back to the sensor. By measuring how long it takes that pulse to travel to the object and back, you can calculate the object’s distance from the sensor (since the speed of light in a given medium is constant). And by knowing how far away all of the different objects in a room are, you can build a detailed 3D map of the room and all of the objects in it.
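To make that math concrete, here’s a minimal Python sketch of the round-trip calculation; the pulse timing is an invented example, not a reading from any real sensor.

```python
# A minimal sketch of the time-of-flight distance calculation described
# above. The timing value is illustrative, not from any real sensor.

SPEED_OF_LIGHT = 299_792_458  # meters per second, in a vacuum

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the object, given the pulse's round-trip time.

    The pulse travels to the object and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after ~6.7 nanoseconds bounced off something
# about one meter away.
print(distance_from_round_trip(6.67e-9))  # ~1.0 m
```

The tiny timescales involved are why these sensors need dedicated hardware: resolving centimeter-level depth means resolving sub-nanosecond timing differences.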

The technology is typically used in cameras for things like drones and self-driving cars (to prevent them from crashing into stuff), but recently, we’ve started seeing it pop up in phones as well.

How is it different from Face ID?

Face ID (like other similar systems) uses an IR projector to beam out a grid of thousands of dots; the phone then takes a 2D picture of that grid and uses the distortion of the dot pattern to calculate a depth map.
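For contrast with the ToF math above, here’s a toy Python illustration of that structured-light idea: depth is triangulated from how far each projected dot shifts in the 2D image, not from timing. The focal length and projector-camera baseline are made-up placeholder values.

```python
# Toy structured-light triangulation: nearer objects shift the
# projected dot farther in the captured image. Values are invented.

FOCAL_LENGTH_PX = 600.0   # hypothetical camera focal length, in pixels
BASELINE_M = 0.02         # hypothetical projector-camera baseline, meters

def depth_from_dot_shift(disparity_px: float) -> float:
    """Triangulated depth from a dot's shift (disparity) in pixels."""
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

print(depth_from_dot_shift(24.0))  # a 24 px shift -> 0.5 m away
```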

(Embedded video from The Verge: “Here’s how the iPhone X’s Face ID works.”)

Time-of-flight sensors work differently: by directly measuring how long the laser pulses take to reach the objects and return, the sensor captures real-time 3D depth data, rather than a 2D image that then has to be converted into a depth map.

That leads to several advantages. Because it’s laser-based, it works at longer ranges than Apple’s grid-based Face ID system, which only works from about 10 to 20 inches away from the phone. (If the subject is too far away, the dots of the grid are too spaced out to provide a useful resolution.) It also, in theory, allows for more accurate data than IR-grid systems. A good example is the LG G8, which uses a ToF sensor for its motion-sensing gestures: the ToF system can track and distinguish each individual finger in 3D in real time to enable those gestures.

Why does Apple want it?

Both Kuo’s and Bloomberg’s reports say that Apple is looking to add the ToF sensor to the rear camera of the 2020 iPhones, not to replace the existing IR system used for Face ID (which the new iPhones will reportedly still have).

Apple’s focus is said to be on enabling new augmented reality experiences: a ToF sensor could enable room tracking on a mobile scale, allowing a future iPhone to scan the room, create an accurate 3D rendering, and use that for far more immersive and accurate augmented reality implementations than current models allow for.
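As a simplified sketch of what that involves, the Python below back-projects a depth map (the kind of data a ToF sensor produces) into a 3D point cloud using the standard pinhole camera model; the camera intrinsics and depth values here are invented for illustration, not Apple’s actual pipeline.

```python
import numpy as np

def depth_map_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project an (H, W) depth map in meters into an (H*W, 3)
    array of 3D points, using the pinhole camera model.

    fx, fy: focal lengths in pixels; cx, cy: principal point.
    """
    h, w = depth.shape
    # u holds column indices, v holds row indices, per pixel.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Fake 4x4 depth map: everything measured at 2 meters away.
depth = np.full((4, 4), 2.0)
cloud = depth_map_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 3) -- one 3D point per pixel
```

A point cloud like this is the raw material for room tracking: an AR system can fuse clouds from successive frames into a persistent 3D model of the space.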

As an added bonus, a ToF sensor would also enable better depth maps for portrait mode pictures (which Huawei already does with the P30 Pro) by capturing full 3D maps to better separate the subject from the background, as well as better portrait mode-style videos.
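Here’s a toy Python sketch of why a depth map helps with portrait shots: with per-pixel distances, separating the subject from the background can be as simple as a depth threshold. The depths and threshold are invented for illustration; a real portrait pipeline is far more sophisticated.

```python
import numpy as np

def portrait_mask(depth, subject_max_depth=1.5):
    """Boolean mask: True where a pixel belongs to the foreground
    subject (closer than the threshold, in meters)."""
    return depth < subject_max_depth

# Toy 3x3 depth map: a subject ~1 m away, background ~3 m away.
depth = np.array([
    [0.9, 0.9, 3.0],
    [1.0, 1.1, 3.2],
    [1.0, 3.1, 3.3],
])
print(portrait_mask(depth))
# A real pipeline would keep the True (subject) pixels sharp and
# blur only the False (background) pixels.
```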

Who else is using it?

Several phone makers already include ToF sensors in their devices. As noted earlier, LG uses one in the front-facing camera of the G8 to enable motion gestures and better portrait photos. (It also uses the same IR laser system for the vein mapping behind the phone’s unique “palm recognition” feature.)

Huawei’s P30 Pro also features one as part of its rear camera array, where it’s used to build depth maps for portrait effects. That said, Huawei claimed at launch to have AR ambitions for the sensor, too, noting that the P30 Pro can measure the height, depth, volume, and area of real-world objects with greater than 98.5 percent accuracy.

Sony — which provides imaging sensors for a wide variety of smartphones, including the iPhone — announced earlier this year that it was planning to ramp up production of 3D laser-based ToF chips this summer, which would be perfect timing for inclusion in a 2020 iPhone.

Source: https://www.theverge.com/circuitbreaker/2019/7/29/20734550/apple-2020-iphone-time-of-flight-camera-ar-depth-map-lasers-portrait-photography
