Knowledge Gap: New Studies Highlight Driver Confusion about Automated Systems

This article reports on two recent studies from the Insurance Institute for Highway Safety (IIHS) that investigated misperceptions and gaps in drivers' understanding of automated or autonomous vehicle systems. The first study examined the names manufacturers give these technologies and how those names can lead drivers to misunderstand a system's capabilities; potentially confusing names include Autopilot (Tesla) and Super Cruise (Cadillac). The second study evaluated whether human drivers fully understood the information communicated by system displays. Displays matter because they tell the driver how a system is responding to a situation or when the system is temporarily inactive. Even after a basic orientation to the vehicle's display, most drivers did not grasp what was happening when the system lost detection of a lead vehicle that had moved out of range or disappeared over a hill. Drivers also often failed to recognize when lane centering was inactive. In both situations, the human driver must take full control of steering. The author also reminds readers of the different levels of automation, noting that the automation available in vehicles on the market today is only Level 1 or Level 2, which perform functions only under the supervision of a human driver. Full copies of the research reports are available from IIHS at iihs.org.

Language

  • English

Filing Info

  • Accession Number: 01714538
  • Record Type: Publication
  • Files: TRIS
  • Created Date: Aug 23 2019 5:02PM