Trust in Autonomous Cars: Exploring the Role of Shared Moral Values, Reasoning, and Emotion in Safety-Critical Decisions

Objective: Autonomous cars (ACs) controlled by artificial intelligence are expected to play a significant role in transportation in the near future. This study investigated determinants of trust in ACs. Background: Trust in ACs influences several variables, including the intention to adopt AC technology. Studies on risk perception have shown that shared values determine trust in risk managers, and previous research has confirmed the effect of value similarity on trust in artificial intelligence. The authors focused on moral beliefs, specifically utilitarianism (belief in promoting a greater good) and deontology (belief in condemning deliberate harm), and tested the effects of shared moral beliefs on trust in ACs. Method: They conducted three experiments (N = 128, 71, and 196, respectively), adopting a thought experiment similar to the well-known trolley problem. They manipulated shared moral beliefs (shared vs. unshared) and driver type (AC vs. human), presenting participants with different moral dilemma scenarios. Trust in ACs was measured with a questionnaire. Results: Experiment 1 showed that shared utilitarian belief strongly influenced trust in ACs. In Experiments 2 and 3, however, the authors found no statistical evidence that shared deontological belief affected trust in ACs. Conclusion: The results of the three experiments suggest that the effect of shared moral beliefs on trust varies depending on the values that ACs share with humans. Application: To promote AC implementation, policymakers and developers need to understand which values are shared between ACs and humans in order to enhance trust in ACs.

Language

  • English

Media Info

  • Media Type: Web
  • Features: References
  • Pagination: pp. 1465-1484
  • Serial:

Subject/Index Terms

Filing Info

  • Accession Number: 01789385
  • Record Type: Publication
  • Files: TRIS
  • Created Date: Nov 23 2021 11:51AM