The race to improve and differentiate smartphones has increasingly focused on cameras and everything they offer. It is very likely that you have heard of 3D ToF cameras, also known as time-of-flight cameras, which manufacturers are increasingly building into mobile phones. But what exactly are they?
TIME OF FLIGHT (ToF)
Time-of-flight (ToF) cameras consist of a sensor that uses a small laser to emit infrared light. This light bounces off any object or person in front of the camera and returns to the sensor. The time it takes for the light to come back is measured and translated into distance information that can be used to create a depth map.
"Time-of-flight cameras actually measure the time it takes for the light to travel from the camera to the user or the environment and be reflected back to the sensor," explains Dr. Xavier Lafosse, commercial technology director at Corning Precision Glass Solutions.
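The principle Dr. Lafosse describes reduces to simple physics: distance is the speed of light multiplied by the measured round-trip time, halved because the light travels to the subject and back. A minimal illustrative sketch (not any vendor's API; the function name is ours):

```python
# Illustrative sketch of the ToF principle: convert a measured
# round-trip time into a distance. Not based on any real sensor API.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(t_seconds: float) -> float:
    """Light travels to the subject and back, so halve the path length."""
    return SPEED_OF_LIGHT * t_seconds / 2.0

# A subject about 2 m away returns light after roughly 13.3 nanoseconds,
# which shows why the sensor's clock must resolve such tiny intervals.
print(round(distance_from_round_trip(13.34e-9), 3))  # prints 2.0
```

The nanosecond-scale round-trip time is why the article stresses how sensitive the sensor's clock has to be.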
So far, most phones have relied on stereo vision, which uses two cameras to estimate depth, but this method does not work in low light or in the dark and is not very accurate.
A better method that also uses infrared is structured-light illumination, where a pattern of dots is projected onto a scene or face and the sensor measures the distance between the dots and the distortion of the pattern to calculate depth. This technology works well at short range, up to about arm's length, for tasks such as facial recognition, which is why Apple used it in its TrueDepth camera for Face ID.
Time of flight works similarly but does not use a dot pattern. Because these methods depend on infrared light, they work well in low-light and even dark environments. The time-of-flight camera illuminates the scene with a homogeneous flood of light, and the camera considers each individual pixel of the image. The sensor is synchronized with an incredibly sensitive clock capable of measuring the tiny variations revealed by the speed of the bouncing light. With depth information assigned to each pixel, you get a complete depth map.
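The per-pixel idea can be sketched in a few lines: given a grid of round-trip times, one per pixel, each time converts independently to a depth, and the grid of depths is the depth map. This is a hypothetical simplification for illustration, not how any real sensor firmware is written:

```python
# Hypothetical sketch: build a depth map from a grid of per-pixel
# round-trip times. Real ToF sensors do this in hardware, per pixel.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def depth_map(round_trip_times):
    """Convert a 2D grid of round-trip times (seconds) into depths (metres)."""
    return [[SPEED_OF_LIGHT * t / 2.0 for t in row]
            for row in round_trip_times]

# One row of three pixels at roughly 1 m, 2 m and 3 m:
times = [[6.67e-9, 13.34e-9, 20.01e-9]]
print([round(d, 2) for d in depth_map(times)[0]])  # prints [1.0, 2.0, 3.0]
```

Because every pixel carries its own measured time, nearby and distant objects in the same frame resolve to different depths, which is exactly what the depth map captures.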
"It is the only really accurate method of measuring distance," explains Dr. Lafosse. "It is the only one that is not an interpolation or a calculation, but an actual measurement of distance."
These depth maps can be applied in many different areas, which is why they are appearing in more and more mobile phones. You will find a time-of-flight camera in the LG G8 ThinQ and in the Honor View 20.
LG has paired the time-of-flight sensor with its 8-megapixel front camera to create what it calls the Z Camera: the Z axis denotes the depth of 3D images. This enables face unlocking and something called Hand ID, another biometric system that reads the vein patterns in your hand. It is also used for Air Motion gestures, which let you perform actions on the G8 ThinQ, such as playing and pausing music, without touching the device.
In the Honor View 20, the time-of-flight camera is paired with a 48-megapixel sensor as part of the main rear camera. This combination lets it analyze depth information in a way that improves portrait mode, creating a truly precise bokeh effect with the subject in sharp relief against a blurred background. But that is not all the rear camera can do.
"The value of time of flight is really in medium- to long-range distances: think of applications such as augmented reality," said Dr. Lafosse. "If the time-of-flight camera is on the back, then you know it is not about facial recognition, but about sensing the environment and what is in front of you."
You may have tried augmented-reality apps and games in the past, but time-of-flight cameras can dramatically improve their accuracy, merging your real environment with the game and its characters for a whole new level of experience.
There is also great potential for more advanced social interactions: instead of FaceTime with Animoji, you may enjoy a fuller 3D experience in the future.
"Time-of-flight sensors can map your surroundings accurately, so your friend's avatar is not floating in the air but sitting on the couch next to you," said Dr. Lafosse.
The underlying technology is not new: Microsoft's Kinect sensor offered something similar, and the military has been using time-of-flight technology to obtain depth information for many, many years. But Dr. Lafosse explained that improvements in the technology have allowed the required components to be integrated into smaller and smaller devices, and thanks to this, mobile adoption is growing.
This technology is also vital for augmented- and mixed-reality wearables, such as Microsoft's HoloLens or the Magic Leap, because these systems need a very accurate picture of their environment to work.
Another area where time-of-flight cameras can help is indoor navigation, where GPS falls short. If there is a 3D map of your building in the cloud, the sensor can recognize precisely where you are at any moment.
Why is Corning an expert in this area? The company manufactures the glass that protects most phones and is also working on the optical components of time-of-flight cameras, making them smaller and more transparent and ensuring they deliver the best possible performance, while manufacturers such as Sony continue to improve the sensors, making them smaller and more energy-efficient. We are sure time-of-flight cameras will appear in more and more mobile phones in the near future, and there are strong rumors that Apple will include one in its next iPhone.
Augmented reality was exciting when it first arrived, but in practice the technology did not live up to expectations. Time-of-flight cameras, together with improved processing power and higher-speed connectivity, may be about to bring that original vision to life.