Perceiving Humans: From Monocular 3D Localization to Social Distancing

Perceiving humans in the context of Intelligent Transportation Systems (ITS) often relies on multiple cameras or expensive LiDAR sensors. In this work, the authors present a new cost-effective vision-based method that perceives humans' 3D locations and body orientations from a single image. They address the ill-posed nature of monocular 3D localization by proposing a neural network architecture that predicts confidence intervals instead of point estimates. Their network estimates 3D human body locations and orientations together with a measure of uncertainty. The proposed solution (i) is privacy-safe, (ii) works with any fixed or moving camera, and (iii) does not rely on ground-plane estimation. They demonstrate the performance of their method on three applications: locating humans in 3D, detecting social interactions, and verifying compliance with safety measures introduced during the COVID-19 outbreak. The authors show that it is possible to rethink the concept of "social distancing" as a form of social interaction rather than a simple location-based rule. The source code is publicly released in support of open science.
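
To illustrate the idea of predicting confidence intervals rather than point estimates, a common approach is a regression head that outputs both an estimate and a spread parameter, trained with a Laplace negative log-likelihood. The sketch below follows that assumption; the names (`UncertaintyHead`, `laplace_nll`), the feature dimension, and the architecture are hypothetical and are not taken from the paper.

```python
# Minimal sketch (assumption): a regression head outputs a distance estimate
# mu and a log-spread log_b; a Laplace negative log-likelihood lets the
# predicted spread b act as a per-sample confidence interval width.
import torch
import torch.nn as nn


class UncertaintyHead(nn.Module):
    def __init__(self, in_features: int, hidden: int = 256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_features, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),  # outputs [mu, log_b]
        )

    def forward(self, x: torch.Tensor):
        out = self.mlp(x)
        return out[:, 0], out[:, 1]  # mu, log_b


def laplace_nll(mu, log_b, target):
    """Laplace negative log-likelihood: a larger predicted spread b reduces
    the penalty for errors but is itself penalized by the log term."""
    b = log_b.exp()
    return (torch.abs(target - mu) / b + log_b).mean()


# Usage with random features standing in for 2D-pose embeddings.
head = UncertaintyHead(in_features=34)
mu, log_b = head(torch.randn(8, 34))
loss = laplace_nll(mu, log_b, target=torch.rand(8) * 30.0)
loss.backward()
```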
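
As a rough sketch of treating "social distancing" as a social interaction rather than a pure distance rule, estimated 3D positions can be combined with body orientations so that a pair is flagged only when the two people are both close and roughly facing each other. The thresholds and the facing criterion below are illustrative assumptions, not the paper's exact rule.

```python
# Illustrative sketch (assumption): flag a pair as a potential social
# interaction only if they are close in the ground plane and oriented
# towards each other. Thresholds are hypothetical.
import math


def is_social_interaction(pos_a, pos_b, yaw_a, yaw_b,
                          max_distance=2.0,
                          max_facing_angle=math.radians(45)):
    """pos_* are (x, z) ground-plane coordinates in metres;
    yaw_* are body orientations in radians in the same frame."""
    dx, dz = pos_b[0] - pos_a[0], pos_b[1] - pos_a[1]
    if math.hypot(dx, dz) > max_distance:
        return False

    # Direction of the line from A to B and from B to A.
    angle_ab = math.atan2(dz, dx)
    angle_ba = math.atan2(-dz, -dx)

    def angular_diff(a, b):
        # Smallest absolute difference between two angles.
        return abs(math.atan2(math.sin(a - b), math.cos(a - b)))

    facing_a = angular_diff(yaw_a, angle_ab) < max_facing_angle
    facing_b = angular_diff(yaw_b, angle_ba) < max_facing_angle
    return facing_a and facing_b


# Example: two people 1.5 m apart, turned towards each other.
print(is_social_interaction((0.0, 0.0), (1.5, 0.0), yaw_a=0.0, yaw_b=math.pi))
```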

Language

  • English

Filing Info

  • Accession Number: 01852023
  • Record Type: Publication
  • Files: TRIS
  • Created Date: Jul 21 2022 11:30AM