On-Road Object Detection and Tracking Based on Radar and Vision Fusion: A Review

Environment perception, one of the most fundamental and challenging problems for autonomous vehicles (AVs), has been widely studied in recent decades. Because any single sensor (e.g., radar, lidar, or camera) provides limited fault tolerance and incomplete information, multisensor fusion plays a significant role in environment perception systems, and its performance directly determines the safety of AVs. Owing to its good performance and low cost, radar-vision (RV) fusion has become popular and is widely applied in mass-produced AVs. However, few surveys have systematically summarized RV fusion; to fill this gap, this article presents a comprehensive review of RV fusion for both object detection and object tracking. Based on the input data and fusion framework, the existing approaches are organized into two categories, object detection by RV fusion and object tracking by RV fusion, and a detailed overview of each is provided. In addition, state-of-the-art deep-learning-based detectors and trackers are introduced, together with an analysis of their advantages and limitations. Finally, open challenges and possible improvements are summarized to facilitate future research in the RV fusion field.
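
As an illustrative aside (not drawn from the reviewed article), the minimal sketch below shows one common object-level (late) RV fusion step: radar and camera detections, assumed to be already projected into a shared bird's-eye-view frame, are associated by nearest-neighbor matching within a distance gate, with the camera supplying class labels and the radar supplying kinematics. All class names, fields, and thresholds are assumptions made for illustration only.

```python
# Illustrative sketch of object-level (late) radar-vision fusion.
# Assumes radar and camera detections are already in a common
# bird's-eye-view (BEV) frame; names and thresholds are hypothetical.
from dataclasses import dataclass
from math import hypot


@dataclass
class RadarDet:
    x: float           # lateral position in BEV (m)
    y: float           # longitudinal position in BEV (m)
    range_rate: float  # radial velocity (m/s)


@dataclass
class CameraDet:
    x: float
    y: float
    label: str
    score: float


def fuse(radar, camera, gate=2.0):
    """Greedy nearest-neighbor association within a distance gate (m)."""
    fused, used = [], set()
    for c in camera:
        best, best_d = None, gate
        for i, r in enumerate(radar):
            if i in used:
                continue
            d = hypot(c.x - r.x, c.y - r.y)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            r = radar[best]
            # Fused object: semantics from camera, kinematics from radar.
            fused.append({"label": c.label, "x": r.x, "y": r.y,
                          "speed": r.range_rate, "score": c.score})
    return fused


if __name__ == "__main__":
    radar = [RadarDet(1.2, 20.5, -3.1), RadarDet(-4.0, 35.0, 0.2)]
    camera = [CameraDet(1.0, 21.0, "car", 0.93)]
    print(fuse(radar, camera))
```

This kind of late fusion is only one of the framework families the review surveys; data-level and feature-level schemes combine the sensors earlier in the pipeline and trade simplicity for richer cross-modal information.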

Language

  • English

Filing Info

  • Accession Number: 01861440
  • Record Type: Publication
  • Files: TRIS
  • Created Date: Oct 18 2022 5:11PM