A More Robust Method for Digital Video Camera Calibration for Luminance Estimation
Mapping the luminance values of a visual scene is of broad interest to accident reconstructionists, human factors professionals, and lighting experts. Such mappings are useful for a variety of purposes, including determining the effectiveness and appropriateness of lighting installations and performing visibility analyses for accident case studies. Previous work has shown that the pixel intensities captured by consumer-grade digital still cameras can be calibrated to estimate luminance [1-7]. Converting a digital still image directly into a luminance map further reduces the time required for luminance measurement. Suway and Suway previously presented a methodology for estimating luminance from digital images and video of a scene [1]. In this paper, the authors update that methodology for calculating luminance from a digital camera. The updated calibration method yields more accurate luminance estimates over the entire range of the camera’s sensor, particularly in low-light conditions. Further refinements to the fitting methodology produce a calibration that remains robust in the presence of measurement noise. Ultimately, the presented methodology allows the user to mount a camera near the driver’s eye location and drive through a scene capturing video or still images. Still images from the video are then exported and analyzed to create a luminance map. The method previously presented by Suway and Suway [1] is compared to the updated method. It is shown that, throughout the range of pixel values, the updated method accurately estimates luminance. In many cases the results are comparable to the previously published method [1]; however, the updated method is more robust at the extremes of the sensor’s range and more robust to measurement noise.
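The full calibration and fitting procedure is described in the paper itself. As a rough illustration of the general idea only (not the authors' actual method), the sketch below fits an assumed power-law mapping from pixel value to measured luminance using a robust loss, then applies the fit to an exported frame to produce a luminance map. The calibration data, the model form, and all names in this sketch are hypothetical; the robust-loss choice (SciPy's soft_l1) is likewise an assumption, not the paper's fitting technique.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical calibration data: mean pixel values of gray patches captured by
# the camera, paired with luminance measured by a spot photometer (cd/m^2).
# These numbers are illustrative only, not taken from the paper.
pixel_values = np.array([12.0, 35.0, 70.0, 110.0, 160.0, 200.0, 235.0])
measured_luminance = np.array([0.4, 2.1, 8.5, 22.0, 55.0, 95.0, 150.0])

def model(params, pixels):
    # Assumed power-law form L = a * p**g + c, standing in for whatever
    # camera-response model a given calibration actually uses.
    a, g, c = params
    return a * np.power(pixels, g) + c

def residuals(params, pixels, luminance):
    return model(params, pixels) - luminance

# Robust least-squares fit; the soft_l1 loss down-weights outlying
# measurements so noise in individual patches does not dominate the fit.
fit = least_squares(
    residuals,
    x0=[1e-3, 2.0, 0.0],
    args=(pixel_values, measured_luminance),
    loss="soft_l1",
)

def luminance_map(frame, params):
    # Apply the fitted calibration pixel-by-pixel to an exported video frame
    # (assumed to be a 2-D grayscale array) to produce a luminance map.
    return model(params, frame.astype(float))

# Example: estimated luminance for a mid-gray pixel value.
print(luminance_map(np.array([[128.0]]), fit.x))
```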
Availability:
- Order URL: http://worldcat.org/issn/01487191
Supplemental Notes:
- Abstract reprinted with permission of SAE International.
Authors:
- Suway, Jeffrey
- Suway, Steven
Conference:
- WCX SAE World Congress Experience
- Location: Detroit, Michigan, United States (and online)
- Date: 2022-04-05 to 2022-04-07
- Publication Date: 2022-03-29
Language:
- English
Media Info:
- Media Type: Web
- Features: References
Serial:
- SAE Technical Paper
- Publisher: Society of Automotive Engineers (SAE)
- ISSN: 0148-7191
- EISSN: 2688-3627
- Serial URL: http://papers.sae.org/
Subject/Index Terms:
- TRT Terms: Crash reconstruction; Digital video; Human factors; Image processing; Luminance; Visualization
- Subject Areas: Highways; Safety and Human Factors; Vehicles and Equipment
Filing Info:
- Accession Number: 01843136
- Record Type: Publication
- Source Agency: SAE International
- Report/Paper Numbers: 2022-01-0802
- Files: TRIS, SAE
- Created Date: Apr 25 2022 10:05AM