Detection of Lane-Changing Behavior Using Collaborative Representation Classifier-Based Sensor Fusion

Sideswipe accidents occur primarily when a driver attempts an improper lane change, drifts out of the lane, or the vehicle loses lateral traction. This article introduces a fusion approach that uses data from two sensors of differing modalities, a front-view camera and an on-board diagnostics (OBD) sensor, to detect a driver's lane-changing behavior. Both feature-level fusion and decision-level fusion are examined using a collaborative representation classifier (CRC). Computationally efficient detection features are extracted from the distances to the detected lane boundaries and from vehicle-dynamics signals. In feature-level fusion, the features generated by the two sensors are merged before classification; in decision-level fusion, Dempster-Shafer (D-S) theory is used to combine the outcomes of two classifiers, one per sensor. The results indicate that feature-level fusion outperforms decision-level fusion, and that the proposed CRC-based fusion approach achieves significantly higher detection accuracy than other state-of-the-art classifiers.


Filing Info

  • Accession Number: 01696639
  • Record Type: Publication
  • Source Agency: SAE International
  • Report/Paper Numbers: 09-06-02-0010
  • Files: TRIS, SAE
  • Created Date: Dec 17 2018 2:44PM