Robust Perception and Visual Understanding of Traffic Signs in the Wild

As autonomous vehicles (AVs) become increasingly prevalent on the roads, their ability to accurately interpret and understand traffic signs is crucial for reliable navigation. While most previous research has focused on specific aspects of the problem, such as sign detection and text extraction, the development of a comprehensive visual processing method for traffic sign understanding remains largely unexplored. In this work, the authors propose a robust and scalable traffic sign perception system that seamlessly integrates the essential sensor signal processing components, including sign detection, text extraction, and text recognition. Furthermore, they propose a novel method to estimate a sign's relevance with respect to the ego vehicle by computing the 3D orientation of the sign from the 2D image. This critical step enables AVs to prioritize detected signs based on their relevance. They evaluate the effectiveness of their perception solution through extensive validation across various real and simulated datasets, including a novel dataset they created for sign relevance that features sign orientation. The authors' findings highlight the robustness of their approach and its potential to enhance the performance and reliability of AVs navigating complex road environments.
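The orientation-from-a-single-image step described above can be illustrated generically: for a planar sign of known physical size, a homography fitted to its four detected corners factors (through the camera intrinsics) into the sign's rotation, whose third column is the plane normal; the angle between that normal and the camera's optical axis indicates whether the sign faces the ego vehicle. The NumPy sketch below is a minimal, self-contained illustration of this standard planar-pose technique, not the authors' implementation; the camera intrinsics, sign size, and the 30° relevance threshold are all assumptions.

```python
import numpy as np

# Generic planar-pose sketch (not the paper's method). All numbers --
# intrinsics, sign size, pose, threshold -- are illustrative assumptions.

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])      # assumed pinhole intrinsics

# Sign corners in the sign's own plane (z = 0): a 0.6 m square
plane_pts = np.array([[-0.3, -0.3], [0.3, -0.3], [0.3, 0.3], [-0.3, 0.3]])

def project(R, t, pts2d):
    """Project planar points through a pinhole camera with pose (R, t)."""
    P = (R @ np.c_[pts2d, np.zeros(len(pts2d))].T).T + t
    uv = (K @ P.T).T
    return uv[:, :2] / uv[:, 2:3]

def homography_dlt(src, dst):
    """Direct linear transform: homography mapping src (plane) to dst (image)."""
    rows = []
    for (X, Y), (u, v) in zip(src, dst):
        rows.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        rows.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.array(rows))
    return Vt[-1].reshape(3, 3)   # null-space vector = homography, up to scale

def facing_angle_deg(img_pts):
    """Angle between the sign's plane normal and the camera's optical axis."""
    M = np.linalg.inv(K) @ homography_dlt(plane_pts, img_pts)  # ~ s*[r1 r2 t]
    lam = 1.0 / np.linalg.norm(M[:, 0])
    if (lam * M[:, 2])[2] < 0:        # resolve scale sign: sign is in front
        lam = -lam
    r1, r2 = lam * M[:, 0], lam * M[:, 1]
    normal = np.cross(r1, r2)         # third rotation column = plane normal
    cosang = abs(normal[2]) / np.linalg.norm(normal)
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Synthetic check: a sign 8 m ahead, yawed 20 degrees away from the camera
th = np.radians(20.0)
R = np.array([[np.cos(th), 0, np.sin(th)],
              [0,          1, 0         ],
              [-np.sin(th), 0, np.cos(th)]])
t = np.array([0.5, 0.0, 8.0])
angle = facing_angle_deg(project(R, t, plane_pts))
print(f"facing angle: {angle:.1f} deg -> relevant: {angle < 30.0}")
```

On the synthetic input the recovered facing angle matches the 20° yaw used to generate it; a real pipeline would obtain the corner points from the sign detector rather than from a known pose.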


Language

  • English

Media Info

  • Media Type: Web
  • Features: Figures; References; Tables
  • Pagination: pp 611-625
  • Serial:
  • Publication flags:

    Open Access (libre)

Subject/Index Terms

Filing Info

  • Accession Number: 01891340
  • Record Type: Publication
  • Files: TRIS
  • Created Date: Aug 28 2023 9:19AM