Apple has leaned towards Augmented Reality (AR) for a long time. The company started emphasizing AR with the launch of ARKit back in 2017, and the Cupertino-based tech giant has kept experimenting with and developing the technology each year since. This year, however, the company launched its latest iPad Pro with a hardware component that specifically enhances the device's AR capabilities. Yes, you guessed it: the LiDAR sensor.
How LiDAR Works
Light Detection and Ranging (LiDAR) is a technology often used in robots and autonomous cars to map the real world. The sensor works with laser pulses: a projector emits them into the environment, and the sensor waits for their reflections to return. As each reflection arrives, the system measures how long that pulse took to make the round trip, converts the time into a distance, and from those distances builds a 3D map of the surroundings.
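The time-of-flight arithmetic behind this is simple enough to sketch. The snippet below is a minimal illustration, not code from any real LiDAR SDK; the function name and the 0.3 m example are my own:

```python
# Speed of light in a vacuum, in metres per second.
SPEED_OF_LIGHT = 299_792_458

def distance_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to a surface given the round-trip time of a laser pulse.

    The pulse travels to the surface and back, so the one-way distance
    is half of (speed of light * elapsed time).
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A surface 0.3 m away reflects a pulse in roughly 2 nanoseconds.
round_trip = 2 * 0.3 / SPEED_OF_LIGHT
print(f"{distance_from_time_of_flight(round_trip):.2f} m")  # prints "0.30 m"
```

The tiny round-trip time in the example hints at why these sensors are hard to build: resolving centimetres requires timing laser pulses to within fractions of a nanosecond.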
Now, decent traditional LiDAR sensors are as big as a fist and really expensive. Apple somehow managed to cram an operational LiDAR sensor into the iPhone 11-style camera module of the latest iPad Pro, a move that improves the device's AR capabilities many times over.
How Does AR Work Without a LiDAR Sensor?
AR that relies on computer vision (as on iPhones) depends on camera movement to analyse the environment. For it to work efficiently, you have to move the camera slowly and carefully so the system can recognise flat surfaces. A LiDAR sensor, by contrast, works instantly and requires no careful movement: it starts mapping the environment as soon as you open the camera and builds a 3D model of the scene almost immediately.
How Can LiDAR Improve AR in Mobile Devices?
LiDAR sensors in mobile devices open up a whole range of AR possibilities and remove some limitations of the traditional approach. Firstly, LiDAR is far faster than computer-vision AR and can map the world in real time. Secondly, it captures the details of the environment much more accurately. Thirdly, unlike computer-vision AR, a LiDAR sensor does not need ambient light to detect objects: its own laser pulses let the system "see" even in the dark.
These limitations still affect most recent AR-capable mobile devices, but not the iPad Pro. And since Apple managed to fit a LiDAR sensor into an iPad, it would be no surprise if, as leaks suggest, the company integrates one into its upcoming iPhones too.