Recognizing Safety-Critical Events from Naturalistic Driving Data

New trends in traffic-accident research include Naturalistic Driving Studies (NDS). NDS are based on large-scale collection of driver, vehicle, and environment data in real traffic. NDS datasets have proven extremely valuable for the analysis of safety-critical events such as crashes and near-crashes. However, finding safety-critical events in NDS data can be difficult and time-consuming. Safety-critical events are currently identified using kinematic triggers (e.g., searching for decelerations below a certain threshold, signifying harsh braking). Because of the low sensitivity and specificity of this filtering procedure, manual review of video data is, to date, necessary to decide whether the events identified by the triggers are actually safety-critical. This review procedure is based on subjective decisions, is time-consuming, and is often tedious for the analysts. This study tested the hypothesis that automatic processing of driver video information could improve the correct classification of safety-critical events flagged by kinematic triggers in naturalistic driving data. A review of about 400 videos from the triggered events collected by 100 Volvo cars in the euroFOT project suggested that the driver's individual reaction may be the key to discriminating safety-critical events; in fact, whether an event is safety-critical often depends on the individual driver. Several algorithms able to automatically classify driver reaction from video data were compared. The results presented in this paper show that the state-of-the-art subjective review procedures for identifying safety-critical events from NDS can benefit from automated objective video analysis. In addition, the paper discusses the major challenges in making such video analysis viable for future NDS.
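The kinematic trigger mentioned in the abstract can be illustrated with a minimal sketch. The threshold value (-0.4 g) and the function name below are illustrative assumptions for this note, not the actual settings or code used in the euroFOT project:

```python
# Minimal sketch of a kinematic trigger for harsh braking: flag spans
# where longitudinal acceleration stays below a deceleration threshold.
# The -0.4 g threshold is an illustrative assumption, not euroFOT's.

G = 9.81  # standard gravity, m/s^2

def harsh_braking_events(accel_long, threshold_g=-0.4):
    """Return (start, end) sample-index pairs for each contiguous span of
    the longitudinal-acceleration trace (m/s^2) below threshold_g (in g).
    Each span is one candidate safety-critical event for later review."""
    threshold = threshold_g * G  # convert g to m/s^2
    events, start = [], None
    for i, a in enumerate(accel_long):
        if a < threshold:
            if start is None:
                start = i  # span begins
        elif start is not None:
            events.append((start, i))  # span ends before sample i
            start = None
    if start is not None:  # trace ends while still decelerating hard
        events.append((start, len(accel_long)))
    return events

# Example: a brief harsh-braking span in an acceleration trace (m/s^2)
trace = [0.0, -1.0, -5.0, -6.2, -2.0, 0.5]
print(harsh_braking_events(trace))  # -> [(2, 4)]
```

Triggers of this kind are cheap to run over an entire dataset, but, as the abstract notes, their low sensitivity and specificity mean that each flagged span still needs validation, currently by manual video review.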

Language

  • English

Filing Info

  • Accession Number: 01488246
  • Record Type: Publication
  • Files: TRIS
  • Created Date: Jul 9 2013 9:09AM