
Apple's future iPhones may add a time-of-flight camera – here's what it could do



We're still a few months away from Apple announcing the 2019 iPhones, but rumors have already started about next year's models, with the increasingly reliable Apple analyst Ming-Chi Kuo claiming in his latest report that two of the 2020 iPhones will have a rear-facing time-of-flight (ToF) 3D depth sensor for improved augmented reality features and portrait photos, via MacRumors.

It's also not the first time we've heard about Apple considering a ToF camera for its 2020 phones. Bloomberg reported a similar rumor back in January, and reports of a 3D camera system for the iPhone date back to 2017. Other companies have beaten Apple to the punch here, with several phones featuring ToF cameras already on the market. But given the prevalence of Apple's hardware and the impact it tends to have on the industry, it's worth looking at what this camera technology is and how it works.

What is a ToF sensor and how does it work?

Time-of-flight is a catch-all term for a type of technology that measures the time it takes for something (be it a laser, light, a liquid, or a gas particle) to travel a certain distance.

For camera sensors, this specifically means using an infrared laser array to send out a laser pulse, which bounces off the objects in front of it and reflects back to the sensor. By calculating how long it takes the laser to travel to the object and back, you can work out how far away it is (since the speed of light in a given medium is constant). And by knowing how far away all the different objects in a room are, you can compute a detailed 3D map of the room and everything in it.
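To make that math concrete, here's a minimal Python sketch of the distance calculation (the timing value is a made-up example; real sensors measure these intervals in hardware):

```python
# Minimal sketch of the time-of-flight distance calculation described above.
# The sensor measures the round-trip time of a laser pulse; since light's
# speed in air is effectively constant, distance falls out directly.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second (in a vacuum; air is close enough)

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the object, given the pulse's out-and-back travel time."""
    # The pulse covers the distance twice (out and back), so divide by 2.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# Example: a pulse that returns after ~6.67 nanoseconds hit something ~1 m away.
print(distance_from_round_trip(6.67e-9))  # ≈ 1.0 meter
```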

The technology is usually found in cameras on things like drones and self-driving cars (to prevent them from crashing into things), but recently we've begun to see it appear in phones as well.

How is it different from Face ID?

Face ID (and other similar systems) uses an IR projector to flash a grid of thousands of dots, which the phone then captures in a 2D image and uses to calculate a depth map.

Time-of-flight sensors work differently: by timing how long the laser pulses take to reach an object and bounce back, they capture real-time 3D depth data directly, instead of a 2D image that has to be converted into three dimensions.
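One way to see the difference is in how each system turns raw measurements into depth. This is a simplified sketch, not Apple's or LG's actual pipeline; it assumes the standard triangulation model usually used to describe dot-grid (structured light) systems:

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_depth_map(round_trip_seconds: np.ndarray) -> np.ndarray:
    """ToF: every pixel's depth comes straight from its own pulse timing."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

def structured_light_depth(disparity_px: np.ndarray,
                           focal_px: float, baseline_m: float) -> np.ndarray:
    """Dot-grid systems infer depth from how far each projected dot shifted
    in the captured 2D image (triangulation): depth = focal * baseline / disparity.
    disparity_px must be nonzero."""
    return focal_px * baseline_m / disparity_px

# Hypothetical 4x4 ToF sensor: identical timings mean a flat wall ~1 m away,
# with no triangulation step needed at all.
times = np.full((4, 4), 6.67e-9)
print(tof_depth_map(times))
```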

That brings several benefits: because the system is laser-based, it works at a longer range than Apple's current Face ID system, which is only effective about 10 to 20 inches away from the phone. (If the subject is too far away, the dots of the grid are too widely spaced to provide a useful resolution.) In theory, it also provides more accurate data than IR grid systems. A good example is the LG G8, which uses a ToF sensor for motion-sensing gestures. The ToF system allows for things like tracking and separating each finger in 3D in real time to enable those gestures.
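That range limit is simple geometry: the dots fan out at fixed angles, so their spacing on the subject grows linearly with distance. A rough back-of-the-envelope sketch (the dot count is Apple's published figure for the TrueDepth system; the field of view is an assumption):

```python
import math

DOT_COUNT = 30_000   # Apple says TrueDepth projects over 30,000 IR dots
FOV_DEGREES = 80.0   # assumed square field of view; Apple doesn't publish this

def dot_spacing_m(distance_m: float) -> float:
    """Approximate gap between neighboring dots on a flat subject."""
    # Width of the illuminated square at this distance...
    width = 2 * distance_m * math.tan(math.radians(FOV_DEGREES / 2))
    # ...shared among sqrt(30,000) ≈ 173 dots per row.
    return width / math.sqrt(DOT_COUNT)

for d in (0.3, 0.5, 2.0):  # ~Face ID range vs. across a room
    print(f"{d} m -> dots ~{dot_spacing_m(d) * 1000:.1f} mm apart")
# Roughly 3 mm apart at 30 cm, but ~19 mm apart at 2 m: too sparse
# to resolve fine detail like individual fingers.
```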

Why does Apple want it?

Rumors from both Kuo and Bloomberg say that Apple is looking to add the ToF sensor to the rear camera on the 2020 iPhones, not as a replacement for the existing IR system it uses for Face ID (which the new iPhones will reportedly still have).

Apple's focus is said to be enabling new augmented reality experiences: a ToF sensor could enable room-scale tracking on a mobile device, allowing a future iPhone to scan a room, create an accurate 3D rendering of it, and use that for far more immersive and accurate augmented reality implementations than current models allow.
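Room-scale AR of that sort typically starts by back-projecting the depth map into a 3D point cloud using the camera's optics. Here's a hedged sketch of that standard step, assuming a simple pinhole camera model (the focal length and frame size are invented for illustration):

```python
import numpy as np

def depth_to_point_cloud(depth_m: np.ndarray, focal_px: float) -> np.ndarray:
    """Back-project a depth map into (x, y, z) points with a pinhole model."""
    h, w = depth_m.shape
    # Pixel coordinates relative to the image center (the principal point).
    v, u = np.indices((h, w))
    u = u - w / 2.0
    v = v - h / 2.0
    # Standard pinhole back-projection: x = u*z/f, y = v*z/f.
    x = u * depth_m / focal_px
    y = v * depth_m / focal_px
    return np.stack([x, y, depth_m], axis=-1)  # shape (h, w, 3)

# Hypothetical 240x180 ToF depth frame with everything 2 m away.
cloud = depth_to_point_cloud(np.full((180, 240), 2.0), focal_px=200.0)
print(cloud.shape)  # (180, 240, 3)
```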

As an added bonus, a ToF sensor would also allow for better depth maps for portrait mode photos (as Huawei already offers with the P30 Pro) by capturing full 3D maps to better distinguish the subject from the background, as well as enabling portrait mode-style video.
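In portrait mode terms, a dense depth map makes separating subject from background almost trivial. A toy sketch of the idea (a real pipeline is far more sophisticated; the 1.5 m cutoff and grayscale frame are arbitrary stand-ins):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_portrait(image: np.ndarray, depth_m: np.ndarray,
                  subject_cutoff_m: float = 1.5) -> np.ndarray:
    """Keep pixels nearer than the cutoff sharp; blur everything behind them."""
    blurred = gaussian_filter(image, sigma=5)   # simulated background bokeh
    mask = depth_m < subject_cutoff_m           # True where the subject is
    return np.where(mask, image, blurred)

img = np.random.rand(180, 240)                        # stand-in grayscale frame
depth = np.full((180, 240), 3.0)                      # background ~3 m away
depth[60:120, 80:160] = 1.0                           # "subject" patch at ~1 m
out = fake_portrait(img, depth)                       # subject sharp, rest blurred
```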

Who else is using it?

Several phone makers already have ToF scanners on their devices. As mentioned earlier, LG uses one in the front camera of the G8 to enable motion gestures and better portrait photos. (It also uses the same IR laser system for the vein mapping behind the phone's unique "palm recognition" unlock feature.)

Huawei's P30 Pro also has one as part of the rear camera array, where it's used to build depth maps for portrait effects. That said, at the time of launch, Huawei also claimed to have some AR ambitions for the sensor, noting that the P30 Pro can measure the height, depth, volume, and area of real-world objects with more than 98.5 percent accuracy.

Sony – which provides image sensors for a wide range of smartphones, including the iPhone – announced earlier this year that it planned to ramp up production of 3D laser-based ToF chips this summer, which would be just in time for inclusion in a 2020 iPhone.

