Learning Driving Scene Prediction from Environmental Perception of Vehicle Fleet Data

Research on understanding road scenes and drivers' interaction behavior has attracted increasing interest in recent years. Advanced driver assistance systems (ADAS) and highly automated driving (HAD) in particular require an understanding of complex scenarios. While prediction methods that rely on common kinematic motion models are only suitable for short prediction horizons, methods that model the relations and interactions between traffic participants have achieved good results over longer prediction horizons. A driving scene is defined by multiple surrounding vehicles, as provided by the environmental perception of the long-range radars of standard series vehicles. To represent and predict driving scenes with a varying number of surrounding vehicles -- and especially potentially hazardous situations -- the authors choose a grid-based approach. They introduce a novel approach for extracting sparse features from driving scenes using non-negative matrix factorization. From the factorized, sparse feature space they determine the parameters of an auto-regressive (AR) model, which beneficially models the interaction between different vehicles inherently. Using this model, the authors predict driving scenes while maintaining feature sparseness.
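The pipeline described in the abstract -- factorize grid-based scene representations with non-negative matrix factorization, then fit an auto-regressive model in the sparse coefficient space -- can be illustrated with a minimal sketch. This is not the authors' implementation; the toy data, rank, and first-order AR structure are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def nmf(V, r, n_iter=200):
    """Approximate V ~= W @ H with non-negative factors via multiplicative updates."""
    m, n = V.shape
    W = rng.random((m, r)) + 1e-3
    H = rng.random((r, n)) + 1e-3
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# Toy stand-in for a sequence of driving scenes: each column is a flattened
# occupancy grid at one time step (real input would come from radar perception).
T, grid_cells = 60, 25
V = np.abs(np.sin(np.outer(np.arange(grid_cells), 0.3 * np.arange(T))))

W, H = nmf(V, r=4)  # W: spatial basis patterns, H: sparse activations over time

# Fit a first-order AR model in coefficient space: h_{t+1} ~= A @ h_t,
# solved as a least-squares problem over consecutive coefficient pairs.
A, *_ = np.linalg.lstsq(H[:, :-1].T, H[:, 1:].T, rcond=None)
A = A.T

# Predict the next scene from the last observed coefficients.
h_next = np.clip(A @ H[:, -1], 0.0, None)  # clipping keeps features non-negative
scene_next = W @ h_next                    # back-project into the grid representation
print(scene_next.shape)
```

Because the AR dynamics are learned jointly over all activation coefficients, correlations between basis patterns (and hence between vehicles occupying different grid regions) are captured in the transition matrix, which is the sense in which interaction is modeled inherently.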


Language

  • English

Media Info

  • Media Type: Web
  • Features: References
  • Pagination: pp 547-552
  • Monograph Title: 18th International IEEE Conference on Intelligent Transportation Systems (ITSC 2015)

Filing Info

  • Accession Number: 01600964
  • Record Type: Publication
  • ISBN: 9781467365956
  • Files: TRIS
  • Created Date: May 2 2016 3:21PM