Apple is not using lidar for some fancy face-detection algorithm or for a state-of-the-art focusing system; Apple is using lidar for augmented reality.
Lidar is a method for measuring distances by illuminating the target with laser light and measuring the reflection with a sensor. Differences in laser return times and wavelengths can then be used to make digital 3-D representations of the target.
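To make the ranging idea concrete, here is a minimal sketch of the core time-of-flight calculation (my own illustration, not Apple's implementation): distance is half the measured round-trip time multiplied by the speed of light.

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a target from a measured laser round-trip time."""
    # The pulse travels to the target and back, so halve the path.
    return C * round_trip_seconds / 2.0

# A pulse returning after ~6.67 nanoseconds implies a target ~1 m away.
print(round(tof_distance(6.67e-9), 3))
```

At these timescales, a centimetre of range corresponds to tens of picoseconds, which is why the detector electronics matter so much.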
An analysis by System Plus Consulting found that the iPad’s lidar sends out light using an array of vertical cavity surface-emitting lasers (VCSELs) made by Lumentum. It then detects the return flash using an array of sensors called single-photon avalanche diodes (SPADs) supplied by Sony. I’ll explain what these are in the next section.
Now the thing to note is that by putting lidar in its own devices, Apple has increased the likelihood that other smartphone vendors will follow its lead, which should give the technology a nice tailwind in the coming years.
If we go back in the history of lidar, the first three-dimensional lidar setup was introduced by Velodyne more than ten years ago. That setup cost about $75,000 and was far larger than a smartphone, so Apple had to come up with a plan of its own, and the plan Apple came up with is VCSELs.
What Are VCSELs in Lidar?
VCSELs are just one of a number of possible chip-based light sources that can be used in these devices. What makes VCSELs so attractive is that the laser is emitted perpendicular to the chip's surface. That has a number of benefits, from manufacturing scale to testing.
VCSELs do their ranging and time-of-flight calculations using pulses of light at frequencies in the tens of gigahertz, identifying movement by looking at changes between one image and the next. It's not clear whether that approach could be adapted to improve on traditional lidar's longer wavelengths and continuous scanning, noted David Hall, principal marketing manager at National Instruments.
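The frame-to-frame movement detection mentioned above can be sketched very simply: compare two consecutive depth frames and flag any pixel whose range changed beyond a threshold. This is my own toy framing of the idea, not a vendor's algorithm, and the 5 cm threshold is an arbitrary assumption.

```python
def moving_pixels(prev, curr, threshold_m=0.05):
    """Return (row, col) of pixels whose depth changed beyond the threshold."""
    return [
        (r, c)
        for r, row in enumerate(curr)
        for c, depth in enumerate(row)
        if abs(depth - prev[r][c]) > threshold_m
    ]

# Two tiny 2x2 depth frames (metres); one pixel moves ~20 cm between them.
frame_a = [[1.00, 1.00], [2.00, 2.00]]
frame_b = [[1.00, 1.20], [2.00, 2.00]]
print(moving_pixels(frame_a, frame_b))  # → [(0, 1)]
```

A real pipeline would also smooth noise and track regions rather than single pixels, but the per-pixel depth comparison is the core of the idea.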