AEye Inc. announced iDAR, a new form of intelligent data collection that reportedly enables rapid, dynamic perception and path planning. iDAR (intelligent detection and ranging) combines MOEMS (micro-opto-electro-mechanical system) LiDAR (light detection and ranging), pre-fused with a low-light camera, and embedded artificial intelligence, creating software-definable, extensible hardware that can adapt dynamically to real-time demands. AEye says iDAR will deliver higher accuracy, longer range, and more intelligent information to path planning software, enabling radically improved autonomous vehicle safety and performance at reduced cost.
AEye’s iDAR is designed to intelligently prioritize and interrogate co-located pixels (2D) and voxels (3D) within a frame, enabling the system to target and identify objects in a scene 10 to 20 times more effectively than LiDAR-only products. Additionally, iDAR can overlay 2D images on 3D point clouds to create true color LiDAR. Its embedded AI capabilities let iDAR use thousands of existing and custom computer vision algorithms, adding intelligence that path planning software can leverage.
“AEye’s unique architecture has allowed us to address many of the fundamental limitations of first-generation spinning or raster-scanning LiDAR technologies,” said Luis Dussan, AEye Founder and CEO. “These first-generation systems silo sensors and use rigid, asymmetrical data collection that either oversamples or undersamples information. This exposes an inherent tradeoff between density and latency in legacy sensors, which restricts or eliminates the ability to do intelligent sensing. For example, while a traditional 64-line system can hit an object once per frame (every 100 ms or so), we can, with intelligent sensing, selectively revisit any chosen object twice within 30 microseconds, an improvement of roughly 3,000x. This embedded intelligence optimizes data collection, so we can transfer less data while delivering better quality, more relevant content.”
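The revisit-rate comparison in the quote can be sanity-checked with a line of arithmetic; the 100 ms frame time and 30 µs revisit interval below are simply the figures quoted above:

```python
# Back-of-the-envelope check of the revisit-rate comparison quoted above.
frame_time_s = 100e-3    # legacy 64-line scanner: one hit per object per ~100 ms frame
revisit_time_s = 30e-6   # quoted iDAR revisit interval: 30 microseconds

speedup = frame_time_s / revisit_time_s
print(f"Revisit-rate improvement: ~{speedup:.0f}x")  # ~3333x, consistent with the quoted ~3000X
```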
AEye’s iDAR technology mimics how the human visual cortex focuses on and evaluates potential driving hazards: it uses a distributed architecture and at-the-edge processing to dynamically track targets and objects of interest, while continually assessing the general surroundings.
AEye says its iDAR system uses proprietary low-cost, solid-state, beam-steering 1550 nm MOEMS-based LiDAR, computer vision, and embedded artificial intelligence to enable dynamic control of every co-located pixel and voxel in each frame of rapidly changing scenes. This lets path planning software address regions and objects of interest, or apply differentiated focus to selected objects or obstacles. By adding intelligence at the sensor layer, objects of interest can be identified and tracked with minimal computational latency. For example, the system can identify objects and then revisit them within the same frame, giving the perception and path planning layers the data to compute quantities such as multi-directional velocity and acceleration vectors simultaneously. This enables faster, more reliable prediction of behavior and intent.
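The intra-frame revisit idea can be illustrated with a generic finite-difference sketch. This is textbook kinematics, not AEye's actual perception code; the sample positions and the 30 µs spacing are illustrative assumptions:

```python
def velocity_acceleration(p0, p1, p2, dt):
    """Estimate 3D velocity and acceleration vectors from three successive
    position samples (x, y, z) of the same target, taken dt seconds apart.

    Generic central/second finite differences -- an illustration of why
    revisiting an object several times within one frame is enough to
    recover motion vectors, not AEye's actual algorithm.
    """
    v = tuple((c2 - c0) / (2 * dt) for c0, c2 in zip(p0, p2))
    a = tuple((c2 - 2 * c1 + c0) / dt ** 2 for c0, c1, c2 in zip(p0, p1, p2))
    return v, a

# A target advancing 1 mm per sample along x, sampled 30 microseconds apart,
# resolves to roughly 33 m/s (about 120 km/h) with zero acceleration.
v, a = velocity_acceleration((0, 0, 0), (0.001, 0, 0), (0.002, 0, 0), dt=30e-6)
```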
iDAR’s true color LiDAR overlays 2D real-world color on 3D data, adding computer vision intelligence to 3D point clouds. This fusion enables absolute color and distance segmentation, and co-location without registration processing, at almost no computational cost. The result is much greater accuracy and speed in interpreting signage, emergency warning lights, brake versus reverse lights, and other scenarios that legacy LiDAR-based systems have historically struggled to handle. In addition, this approach expands the ability to perform “enhanced training” for autonomous vehicles.
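The general idea of overlaying 2D color on a 3D point can be sketched with a standard pinhole-camera projection. The function name and the intrinsics (fx, fy, cx, cy) are textbook assumptions, not AEye's implementation; in fact, the press release's claim is that iDAR's pre-fused, co-located design makes exactly this kind of registration step unnecessary:

```python
def colorize_point(point_xyz, image, fx, fy, cx, cy):
    """Assign a camera pixel's RGB color to a 3D point.

    A minimal pinhole-projection sketch of generic 2D/3D fusion: the
    point is assumed to be expressed in the camera's coordinate frame,
    and (fx, fy, cx, cy) are assumed camera intrinsics.
    """
    x, y, z = point_xyz
    if z <= 0:
        return None                      # point is behind the camera
    u = int(round(fx * x / z + cx))      # project onto the image plane
    v = int(round(fy * y / z + cy))
    h, w = len(image), len(image[0])
    if 0 <= u < w and 0 <= v < h:
        return image[v][u]               # RGB triple at the projected pixel
    return None
```

For example, with a 2x2 image and unit intrinsics, a point straight ahead of the camera picks up the color of the central pixel it projects onto; points behind the camera or outside the field of view return no color.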
A key feature of AEye’s iDAR platform is its software-definable hardware. iDAR adds three feedback loops: one at the sensor layer, one at the perception layer, and another with path planning software. By enabling customizable data collection in real time, the system can adapt to the environment and dynamically change performance to suit the host application’s needs. It can also emulate legacy systems, define regions of interest, focus on threat detection, or be programmed for specific environments such as highway or city driving. This configurability yields optimized data collection, reduced bandwidth, improved vision perception and intelligence, and faster motion planning for autonomous vehicles.
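One way to picture software-definable scan behavior is as named acquisition profiles built from prioritized regions of interest. The profile names, fields of view, and revisit rates below are invented for illustration and are not published iDAR parameters:

```python
from dataclasses import dataclass, field

@dataclass
class ScanRegion:
    """A rectangular region of interest within the sensor's field of view."""
    azimuth_deg: tuple    # (min, max) horizontal extent in degrees
    elevation_deg: tuple  # (min, max) vertical extent in degrees
    revisit_hz: float     # how often this region is rescanned

@dataclass
class ScanProfile:
    """One named acquisition mode; swapping profiles at runtime is the kind
    of reconfiguration the feedback loops described above would enable."""
    name: str
    regions: list = field(default_factory=list)

# Hypothetical profiles -- all numbers are illustrative assumptions.
highway = ScanProfile("highway", [
    ScanRegion((-10, 10), (-2, 5), revisit_hz=200.0),   # narrow, fast forward cone
    ScanRegion((-60, 60), (-10, 10), revisit_hz=10.0),  # wide situational sweep
])
city = ScanProfile("city", [
    ScanRegion((-45, 45), (-15, 15), revisit_hz=60.0),  # broad near-field coverage
])
```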
iDAR’s system architecture allows for remote updates of the firmware and software, which enables rapid prototyping without requiring hardware modifications.
AEye is also announcing the iDAR Development Partner Program for OEM customers, Tier 1 partners, and universities interested in integrating iDAR into their vehicles. The company will demo iDAR and announce its automotive product suite at CES 2018 in Las Vegas January 9-12.