Extrinsic Calibration of Camera Networks Based on Pedestrians

In this paper, the authors propose a novel extrinsic calibration method for camera networks that analyzes the tracks of walking pedestrians. First, they extract the center line of each walking person by detecting the head and feet in the camera images. From these center lines, they estimate the 3D positions of the head and feet with respect to a local camera coordinate system using a simple and accurate procedure. They then apply a RANSAC-based orthogonal Procrustes approach to compute the relative extrinsic parameters that connect the coordinate systems of the cameras in a pairwise fashion. Finally, they refine the extrinsic calibration matrices by minimizing the reprojection error. Whereas existing state-of-the-art calibration methods rely on epipolar geometry and operate directly on image positions, the proposed method first computes 3D positions per camera and then fuses the data, which results in simpler computations and a more flexible and accurate calibration. A further advantage is that the method can handle persons walking along straight lines, a situation that often occurs in real life but defeats most existing calibration methods, since all head and feet positions are then coplanar.
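The pairwise alignment step described in the abstract can be sketched as an orthogonal Procrustes (Kabsch-style) fit. This is a minimal illustration, not the authors' implementation: it assumes we already have corresponding 3D head/feet positions expressed in two camera coordinate systems, and it omits the RANSAC outlier-rejection loop and the reprojection-error refinement.

```python
import numpy as np

def procrustes_align(P, Q):
    """Estimate rotation R and translation t such that Q ~ R @ P + t.

    P, Q: (3, N) arrays of corresponding 3D points (e.g. head and feet
    positions of a pedestrian, expressed in two camera frames).
    """
    # Center both point sets on their centroids.
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)
    # 3x3 cross-covariance of the centered point sets.
    H = (Q - q_mean) @ (P - p_mean).T
    U, _, Vt = np.linalg.svd(H)
    # Correction term guards against a reflection (det = -1) solution.
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])
    R = U @ D @ Vt
    t = q_mean - R @ p_mean
    return R, t
```

In a RANSAC setting, this closed-form fit would be run on random minimal subsets of correspondences, keeping the rotation/translation with the largest inlier count before the final refinement.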

  • Supplemental Notes:
    • © 2016 Junzhi Guan et al.
  • Authors:
    • Guan, Junzhi
    • Deboeverie, Francis
    • Slembrouck, Maarten
    • Van Haerenborgh, Dirk
    • Van Cauwelaert, Dimitri
    • Veelaert, Peter
    • Philips, Wilfried
  • Publication Date: 2016

Language

  • English

Media Info

  • Media Type: Digital/other
  • Features: Figures; References; Tables
  • Pagination: 24p
  • Publication flags:

    Open Access (libre)

Filing Info

  • Accession Number: 01606992
  • Record Type: Publication
  • Files: TRIS
  • Created Date: Jul 14 2016 11:32AM