How Apple’s LiDAR Technology Is a Game-changer for AR Apps
LiDAR technology has been around for quite some time now: its use isn’t exactly novel. And if you follow the talk on self-driving cars, you’re already familiar with it. For a long time, however, it was considered ‘too bulky’ and ‘too expensive’ for mobile devices. Apple decided to put an end to that notion, debuting LiDAR in its iPad Pro and later in its 5G iPhone 12 Pro and Pro Max last year.
And Apple’s LiDAR isn’t just an improvement to the iPhone 12 Pro and Pro Max camera: it adds more horsepower to AR, unlocking greater benefits and experiences and taking AR development to a whole new level.
With Apple having taken the bold step of harnessing the power of LiDAR, it’s only a matter of time before the rest of the world follows in its footsteps. In fact, Snapchat and TikTok have already hopped on the LiDAR bandwagon, creating filters that take full advantage of Apple’s LiDAR Scanner.
But what exactly does it mean for augmented reality app development? Before we get to that, let me first give you a brief intro to LiDAR technology.
A Word or Two on LiDAR and How it Works
LiDAR is an acronym for Light Detection and Ranging (sometimes expanded as laser imaging, detection, and ranging). Think of it as a blend of ‘light’ and ‘radar’: it works much like radar, or even sonar, but instead of radio or sound waves it uses light waves from a laser.
A LiDAR scanner measures the time it takes for a pulse of light to hit an object or surface and reflect back to the sensor. Because light travels at a known, constant speed, that round-trip time translates directly into distance. This is known as the time-of-flight principle.
And it’s carried out at enormous scale: a LiDAR unit can fire up to 150,000 laser pulses per second. Apple’s LiDAR Scanner can measure accurate distances to objects up to 5 meters away, and that too at nanosecond speeds. The distance comes from the formula d = (c × t) / 2, where c is the speed of light and t is the round-trip time, but let’s not go too deep into the mathematics here and stick to the essentials.
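To make the time-of-flight idea concrete, here is a minimal sketch in Swift (the function name and the example round-trip time are illustrative, not Apple API):

```swift
import Foundation

// Time-of-flight ranging: a laser pulse travels to the target and back,
// so the one-way distance is d = (c * t) / 2.
let speedOfLight = 299_792_458.0 // metres per second

func distance(roundTripTime t: Double) -> Double {
    // t is the round-trip time in seconds; halve the path length.
    return speedOfLight * t / 2.0
}

// A return from an object ~5 m away arrives in roughly 33 nanoseconds.
let d = distance(roundTripTime: 33.356e-9)
print(d) // approximately 5 metres
```

This is why the scanner needs nanosecond-level timing: at the speed of light, every nanosecond of delay corresponds to only about 15 cm of one-way distance.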
How is Apple Leveraging LiDAR in its Devices and What it Means for AR Apps
Apple’s iPhone LiDAR camera was one of the most phenomenal advancements in AR in 2020. Although LiDAR is currently only available on the iPad Pro, and the iPhone 12 Pro and Pro Max, it will eventually make its way to the majority of the iPhone lineup in the upcoming years, and will likely have a meaningful impact on AR’s future.
Because of its ability to “see in the dark,” Apple brought this technology to its iPhone 12 Pro models, where it has shown promising results, improving low-light photography and overall picture quality.
Non-LiDAR phones typically use a pulse of light in a similar way, albeit with far less accuracy; LiDAR’s chief improvement is precision. With Apple’s influence as a technology pioneer, this development will likely drive LiDAR’s broader adoption and act as a catalyst for exciting augmented reality apps.
AR App Development with LiDAR – As Apple Says, ‘AR at the Speed of Light’
Apple claims LiDAR in its 5G iPhone 12 Pro series allows the devices to create highly realistic AR applications. A more precise depth map, for example, helps the iPhone 12 Pro better grasp which objects are in front of others, ensuring that AR characters and content are positioned correctly in their surrounding environment.
And it goes without saying that the ability to deliver a highly accurate, detailed depth map of an area is absolutely crucial in AR applications. With LiDAR, app developers can build an accurate depth map of a scene, speed up AR so it feels more seamless, and enable entirely new experiences built on better object and room scanning: think improved AR shopping apps, home design tools, and better AR games.
Less sophisticated AR systems are often tied down by a single viewing angle. Features like depth estimation, which depend on good lighting for accuracy, are easily thrown off by poor lighting, an unavoidable factor in the real world.
LiDAR, on the other hand, can deliver precise depth mapping almost immediately, irrespective of lighting conditions. AR models can thus be built in any setting, allowing for more accurate depth perception and virtual object placement.
And LiDAR combined with the power of ARKit 4 could prove phenomenal for augmented reality app development.
Snapchat & TikTok are Already Ahead of the Others in the Race
Snapchat was one of the first AR app development companies to make use of LiDAR iPhones. In fact, Apple even featured Snapchat in its keynote to launch the new camera. Apple’s LiDAR sensors enhance Snap’s Lens Studio’s functionality, empowering AR developers to explore new creative possibilities.
LiDAR, as per Snap’s Qi Pan, boosts both the underlying functionality and the user-friendliness of AR filters. The former entails improved object tracking, leading to quality graphics that interact with physical-world objects more realistically. In low-light situations, it can significantly improve these capabilities.
This should lead to a slew of new Snapchat Lenses. First, more indoor activations, such as enhancing your workplace or home, become possible. Second, it brings additional Lenses to the rear-facing camera, augmenting the world rather than the selfie, a route Snap is already on.
In terms of usability, LiDAR enables spatial mapping that isn’t just accurate but also much faster than mapping with a traditional RGB camera. Users don’t have to wave their phones around to unlock AR interactions, making for a more forgiving UX that could cater to a broader audience.
Snap also launched Lens Studio 3.2, allowing AR developers to take advantage of LiDAR and create LiDAR-powered Lenses for the iPhone 12 Pro and the iPhone 12 Pro Max.
TikTok too Launched a LiDAR-powered AR Effect Earlier this Year
TikTok celebrated the new year ‘AR-style’ with the launch of a LiDAR-powered effect. With it, the app can create a 3D map of a real-world scene and build animations that interact with real-world objects.
With Snapchat and TikTok leveraging the iPhone 12 Pro series’ LiDAR sensors, many more AR app development companies will likely rush to adopt the newly hyped technology.
To ring in 2021 we released our first AR effect on the new iPhone 12 Pro, using LiDAR technology which allows us to create effects that interact with your environment – visually bridging the digital and physical worlds. We're excited to develop more innovative effects in 2021! pic.twitter.com/6yFD2FfHta
— TikTokComms (@TikTokComms) January 6, 2021
The Best is Yet to Come!
While the iPhone 12 Pro’s LiDAR-capable sensor nicely sets the stage for augmented reality apps, using LiDAR with AR is not without its design challenges. There are obstacles to overcome before LiDAR can be universally accepted into the AR world of magic and wonders. Still, the pros will continue to outweigh the cons as the tech-gods keep improving the technology.
In the augmented reality world, LiDAR will become more ubiquitous with the passage of time. As for companies wondering ‘to LiDAR or not to LiDAR’: this is only just the beginning of something amazing, and it’s reasonable to expect more LiDAR-powered wonders from key AR app developers.