Connect the Dots: AEye Combines 2D and 3D Dots with AI for Enhanced LiDAR

Created: 12/12/2017 - 18:39

The data generated to enable advanced driver assistance systems (ADAS) and autonomous vehicles (AVs) is growing exponentially, to the point that the bottleneck is no longer getting the data but making sense of it quickly, accurately, and efficiently. For this, it’s time to add intelligent detection and ranging (iDAR) to the IoT and AI lexicon.

iDAR is a nod to the light detection and ranging (LiDAR) technology used in Velodyne’s “coffee can” sensors that dotted the highways atop Google’s famous early AV research vehicles. They used time-of-flight (ToF) data from pulsed laser beams to determine range down to a few inches, depending on ambient conditions. The coffee-can structure allowed the laser to rotate to get a 360˚ view.
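The underlying ranging principle is straightforward: measure the round-trip time of a laser pulse and convert it to distance using the speed of light. Below is a minimal sketch of that textbook time-of-flight relationship, for illustration only and not any vendor's implementation:

```python
# Illustrative only: the basic time-of-flight relationship behind pulsed LiDAR ranging.
# This is the textbook formula, not AEye's or Velodyne's implementation.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second

def tof_range_m(round_trip_time_s: float) -> float:
    """Range to the target: the pulse travels out and back, so divide by two."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A 1-microsecond round trip corresponds to roughly 150 m.
print(f"{tof_range_m(1e-6):.1f} m")  # ~149.9 m
```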

While the basic premise of LiDAR is still used by Velodyne, increasingly fierce competition has seen it shrink its system to the size of a hockey puck, with a price target of $1,200 for a fully kitted vehicle. The competition has come from the likes of LeddarTech, Luminar Technologies, Innoviz Technologies, Quanergy, and of course Google’s Waymo.

While it was initially surmised that LiDAR might replace cameras, radar, and other sensing technologies, it has become clear that each has a role to play. The goalposts have moved from picking and choosing sensing technologies to integrating and analyzing data from multiple sensors, using a combination of sensor fusion and AI to get the best result.

This gives a snapshot of what AEye is doing. While the company is focused on LiDAR, it also adds a low-light camera and embedded AI. It uses light from a 1550-nanometer laser, steered by micro-opto-electro-mechanical system (MOEMS) mirrors, to replace rotating mechanical assemblies with a solid-state optical system. Around this core, it adds more intelligent sensing and software-defined LiDAR techniques to provide a “visual cortex” for autonomous vehicles.

This emulates how the human visual cortex pre-processes and customizes information sent to the brain. In the case of AEye’s approach, it “pre-fuses” computer vision and LiDAR information for intelligent data collection and rapid perception and motion planning.

AEye has already demonstrated it can perform ranging up to 300 m while generating a high-density point cloud to develop a 3D image of the environment. While AEye hasn’t revealed details on range accuracy or the number of points, it needs to be able to compete with Quanergy’s early figures: accuracy of +/- 5 cm at 100 m, a field of view of 120˚, and the ability to generate up to 0.5 million data points per second.
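To put those figures in perspective, a back-of-the-envelope calculation shows what 0.5 million points per second implies per frame. The frame rate and number of scan lines below are assumptions for illustration, not published specifications:

```python
# Back-of-the-envelope sketch: only the points-per-second and field-of-view figures
# come from the article; the frame rate and scan-line count are assumptions.

POINTS_PER_SECOND = 500_000      # figure quoted for Quanergy's early sensor
FRAME_RATE_HZ = 10               # assumption for illustration
HORIZONTAL_FOV_DEG = 120.0       # figure quoted above
VERTICAL_LINES = 8               # assumption for illustration

points_per_frame = POINTS_PER_SECOND / FRAME_RATE_HZ
points_per_line = points_per_frame / VERTICAL_LINES
horizontal_resolution_deg = HORIZONTAL_FOV_DEG / points_per_line

print(f"{points_per_frame:.0f} points per frame")           # 50000
print(f"{horizontal_resolution_deg:.3f} deg between points")  # ~0.019
```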

Camera Plus LiDAR on Scalable Processing Architecture

AEye may well be able to compete on the LiDAR-specific elements, but the company’s emphasis is on overlaying 2D real-world color information provided by the low-light camera with the 3D data. This capability allows it to provide an accurate interpretation of signage, emergency warning lights, brake vs. reverse lights and other visual cues that aren’t depth dependent.

On top of this, it embeds AI such that each collocated pixel (2D) and voxel (3D) can be dynamically analyzed and controlled in each frame. This allows path-planning software to address regions and objects of interest, or to apply differentiated focus on select objects or obstacles.

AEye combines LiDAR with a camera and AI to enable higher levels of dynamic feature extraction. (Image source: AEye)
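As a minimal sketch of the idea, collocating a camera pixel with a LiDAR voxel can be thought of as one fused sample that a planner weights per frame. The field names and the priority filter below are assumptions for illustration, not AEye’s actual data model or API:

```python
# Hypothetical sketch of a fused 2D pixel + 3D voxel sample; structure is assumed,
# not AEye's data model.

from dataclasses import dataclass

@dataclass
class FusedSample:
    x: float          # 3D position from LiDAR (metres)
    y: float
    z: float
    r: int            # color from the collocated camera pixel
    g: int
    b: int
    priority: float   # weight a path planner might assign to this region of interest

def flag_regions_of_interest(samples, min_priority=0.5):
    """Return only the samples a planner should revisit with more laser shots."""
    return [s for s in samples if s.priority >= min_priority]
```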

The final touch is the ability to customize data collection in real time such that resources can be dynamically assigned based on the environment, application and customer requirements, such as highway or city driving. This addresses the efficiency angle, while also improving perception.
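As a rough illustration of what customizing data collection per scenario could mean in practice, the configuration sketch below re-weights range, field of view, and revisit rate for highway versus city driving. All parameter names and values are hypothetical, not AEye’s interface:

```python
# Hypothetical configuration sketch: re-weighting scan resources per driving scenario.
# Parameter names and values are illustrative assumptions, not AEye's API.

SCAN_PROFILES = {
    "highway": {"max_range_m": 300, "horizontal_fov_deg": 60, "revisit_rate_hz": 20},
    "city":    {"max_range_m": 100, "horizontal_fov_deg": 120, "revisit_rate_hz": 30},
}

def select_profile(speed_kph: float) -> dict:
    """Pick a scan profile based on vehicle speed (a crude stand-in for 'environment')."""
    return SCAN_PROFILES["highway"] if speed_kph > 70 else SCAN_PROFILES["city"]
```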

AEye is still in start-up mode, with $16 million in funding from Kleiner Perkins Caufield & Byers, Airbus Ventures, and Intel Capital. However, it sits in the middle of frenzied activity around AI for AVs, ADAS, and the IoT. The end game is to provide the most useful and actionable information in the most efficient manner possible, which means IoT solution providers will have more to offer end customers in automotive, as well as in transportation, industrial, and safety applications more broadly.

About the Author

Patrick Mannion
Patrick Mannion is an independent writer and content consultant who has been working in, studying, and writing about engineering and technology for over 25 years.
