Multi-modal user experience evaluation on in-vehicle HMI systems using eye-tracking, facial expression, and finger-tracking for the smart cockpit

The trend toward intelligent connected vehicles (ICVs) has led to more novel and natural human-vehicle relationships, which will bring tremendous changes to smart cockpit functions and interaction methods. However, most in-vehicle human-machine interaction (HMI) systems focus on adding functions, while few focus on the user experience (UX) of the system. This study presents a UX evaluation method based on eye-tracking, finger movement tracking, and facial expressions, and also proposes a pleasantness prediction model based on the multi-layer perceptron (MLP) algorithm using multi-modal data. Through a UX experiment on two in-vehicle HMI systems, the study verified that the proposed method can evaluate in-vehicle HMI systems objectively and efficiently. Using the MLP algorithm, the study trained the pleasantness prediction model on the multi-modal data. In addition, the authors collected new data from a third in-vehicle HMI system to test the trained model and reported excellent test results.
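
The record does not include implementation details, so the following is only a minimal sketch of the kind of pleasantness predictor the abstract describes: a small MLP regressing a pleasantness score from concatenated eye-tracking, finger-tracking, and facial-expression features. All feature names, dimensions, and data below are illustrative assumptions, not the authors' actual setup.

    # Hypothetical sketch: the record does not publish the feature set,
    # network shape, or labels, so everything here is a placeholder.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Assumed per-task feature vector: eye-tracking (e.g., fixation count,
    # mean fixation duration), finger-tracking (e.g., touch count, path
    # length), facial expression (e.g., mean valence, arousal scores).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 6))        # placeholder multi-modal features
    y = rng.uniform(1.0, 5.0, size=200)  # placeholder pleasantness ratings

    # Standardize features, then fit a small two-hidden-layer MLP regressor.
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                     random_state=0),
    )
    model.fit(X, y)
    predicted = model.predict(X[:5])     # predicted pleasantness scores

In practice, the trained model would be evaluated on held-out data from a system not seen during training, which matches the abstract's description of testing on a third in-vehicle HMI system.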

Language

  • English

Filing Info

  • Accession Number: 01864460
  • Record Type: Publication
  • Files: TRIS
  • Created Date: Nov 21 2022 4:19PM