Do Autonomous Vehicles Need LiDAR? - Part 2

By Sabbir Rangwala


In Part 1 (published 17/07/2021), I explored the parallels between the automotive LiDAR and telecom optics investment booms, and posited that, going forward, consolidation and pivots for automotive LiDAR companies are inevitable. I received great comments from many people; thank you. I will try to address some of them here, a critical one being whether LiDAR is needed at all for AV safety. This issue is an important one to address before delving into LiDAR consolidation and opportunities, which is the subject of Part 2 (in progress).

One reader commented that I had overlooked the fact that telecom relies on guided optics (over fiber) while LiDAR operates in free space, and therefore faces other significant technical challenges. I agree. Free-space laser energy brings challenges like weather, eye safety, interference, and vulnerability to spoofing. Additionally, it means that LiDAR deployments will occur in a highly public arena, whereas telecom optics were essentially invisible (literally and from a mindshare perspective). Autonomous cars and LiDAR are highly visible and have given optics a “cool” factor: maybe “geeky” and mysterious, but exciting. The recent high-stakes litigation over LiDAR IP contributed further to this.

If the answer to the question of whether AVs need LiDAR is no, then Part 2 of the earlier article, on LiDAR consolidation, is irrelevant. So I need to address this question one way or another.

Typically, the argument for needing LiDAR as an obstacle-avoidance and safety sensor goes as follows: computers today do not replicate human intelligence; they do not think like humans from a perception and decision-making perspective, and so they need other crutches, as stated eloquently by my fellow Forbes author Brad Templeton. Additionally, LiDARs on AVs are also used to develop 3D maps and provide vehicle localization (which could be achieved through other means, like cameras and GPS).

Tesla is pushing the idea of pseudo-LiDAR, based on innovations in cameras, AI, and computing, and on the fact that it has access to rich driving data from ~1M cars in varied traffic and weather conditions. Daimler has research activity aimed at exploiting gated imaging and machine learning to replace LiDAR. GM Cruise stated in a recent interview that they had replaced one of the sensors in the sensor pod of the Origin, their newly launched purpose-built ride-hailing AV, most likely implying that LiDAR will be replaced by thermal cameras.
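The core of the pseudo-LiDAR idea is to estimate a dense depth map with cameras and neural networks, then back-project it into a 3D point cloud that downstream perception can treat like LiDAR returns. A minimal sketch of the back-projection step, using the standard pinhole camera model (the function name and intrinsic values are illustrative, not Tesla's actual pipeline):

```python
import numpy as np

def depth_map_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a dense depth map (meters) into an (H*W, 3) XYZ
    point cloud via the pinhole camera model. The intrinsics here
    (fx, fy, cx, cy) are illustrative, not from any real camera."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    # Stack into rows of (X, Y, Z), the same shape a LiDAR scan would have
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Toy example: a flat wall 10 m away, seen by a tiny 4x4-pixel camera
depth = np.full((4, 4), 10.0)
cloud = depth_map_to_point_cloud(depth, fx=2.0, fy=2.0, cx=1.5, cy=1.5)
print(cloud.shape)  # (16, 3); every point sits at Z = 10.0
```

The geometry is trivial; the hard part, and the crux of the debate, is whether camera-estimated depth is accurate and reliable enough to stand in for a direct time-of-flight measurement.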

Other methods of extracting 3D information using stereo vision cameras have also been proposed. NODAR is one such player, proposing to solve the problem of classical stereo cameras and their stringent mechanical alignment requirements by using software to adjust for positional misalignments that develop over time and from vibration and shock. They use the recently released Ford data set to show that their algorithms outperform classical algorithms on this data, but provide no independent confirmation of the quality of the 3D point clouds. Foresight Automotive proposes to use automatic calibration and fusion with a LWIR camera to create 3D point clouds and robust obstacle detection without LiDAR.
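The stereo-alignment problem hinges on the classical relationship Z = fB/d: depth is inversely proportional to disparity, so small calibration errors translate into large depth errors at long range. A toy sketch under that standard model (the focal length and baseline are made-up values, not any vendor's specs):

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Classical stereo: Z = f * B / d. Pixels with zero disparity
    (no match, or infinitely far) map to infinite depth."""
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)
    valid = d > 0
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

# Made-up rig: 1000 px focal length, 0.2 m baseline.
# A 0.25 px disparity error on a distant object shifts depth by ~11 m,
# which is why stereo calibration drift matters so much at range.
z = disparity_to_depth(np.array([2.0, 2.25]), focal_px=1000.0, baseline_m=0.2)
print(z)  # [100.0, ~88.9]
```

This sensitivity is exactly what software recalibration schemes like NODAR's aim to control: the mechanical alignment can drift, provided the effective disparity error is corrected continuously.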

On the other hand, the leaders in the AV and ride-hailing businesses swear by LiDAR and have made significant investments in developing or using the technology: Waymo, Nuro, Uber, Aurora, Zoox, Argo, Lyft, Yandex, etc. Waymo is easily the leader on the AV front, with more than 10M driven miles to date and 1B simulated miles. And they have invested heavily in internal LiDAR development, even launching the Laser Bear Honeycomb, a short-range LiDAR (used in their current-generation AVs in conjunction with other internally developed long-range LiDARs), for sale into non-AV applications. It is not clear what their motive for this is; it cannot possibly be to make money!

Finally, V2X deployment is another argument against requiring LiDAR in AVs, although it would impact more than just LiDAR: it is an altogether different model for enabling AVs relative to the current thrust of relying on in-vehicle sensors and computing for perception, behavior prediction, localization, route planning, and control. V2X, once real, essentially offloads the burden of intelligence and decision making from individual cars to a connected network that can guide vehicles via 5G and other communication channels (with human or autonomous operation). This seems achievable, but it needs universal and deep deployment in the traffic network, low latency, high availability, standards, political will, and lots of public and private funding. I asked Richard Bishop, an authority on AV deployments (and one of the early pioneers leading a USDOT program for automated driving using smart road infrastructure), whether we could expect this kind of deployment anytime soon.

The proposition that AVs could deploy without LiDAR is a tantalizing one, and it could happen: not just because of advances in AI and computing, but also because of developments in mono and stereo vision cameras and high-resolution radars (an important area of research and disruption). Although difficult, the prospect of ubiquitous V2X also looms. If these developments materialize, the companies focused on automotive LiDAR will face challenges far beyond the question of consolidation and pivots.

The reality today is that LiDAR is here to stay, and AV companies need to factor it into their sensor and perception stacks, while keeping in mind that some of these disruptions could eliminate the need for it. LiDAR companies whose survival depends on LiDAR being needed in AVs would be wise to treat these possibilities as a true competitive threat to their business models, and should consider investing a part of their resources in disrupting their own business models and assumptions.

