Abstract
Illumination variance is a major challenge for vision-based robotics. Most approaches focus on alleviating illumination changes in images that have already been captured. Despite their utility, camera attributes are typically set in a highly passive manner, causing vision algorithms to fail under radical illumination variance. Recent studies have proposed exposure and gain control schemes that maximize image information while avoiding saturation. In this article, we propose a proactive control scheme for the camera's two dominant attributes: exposure time and gain. Unlike existing approaches, we formulate camera attribute control as an optimization problem in which the underlying objective function is not known a priori. We first define a new image metric over these two attributes that incorporates both image gradients and signal-to-noise ratio. Based on this metric, we introduce a formulation for attribute control via Bayesian optimization (BO) and learn environmental changes from the captured images. During control, to reduce the burden of image acquisition on the BO loop, images are synthesized using a camera response function, avoiding actual frame grabs from the camera. The proposed method was validated in light-flickering indoor, near-sunset outdoor, and indoor-outdoor transient environments where light changes rapidly, supporting 20-40 Hz frame rates.
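The abstract outlines a three-part loop: a quality metric combining gradients and SNR, BO over the (exposure, gain) space, and image synthesis through a camera response function (CRF) in place of real frame grabs. Below is a minimal, self-contained sketch of that loop, not the authors' implementation: the metric, the CRF, the GP surrogate, and the UCB acquisition are all simplified stand-ins, and every function name (`image_metric`, `synthesize_image`, `bo_step`) is hypothetical.

```python
# Minimal sketch of the attribute-control loop described in the abstract.
# All names and models here are illustrative; the paper's actual metric,
# CRF model, and BO details differ.
import numpy as np

def image_metric(img, lam=0.5):
    """Hypothetical quality metric blending gradient magnitude and SNR."""
    gy, gx = np.gradient(img.astype(float))
    grad_score = np.mean(np.hypot(gx, gy))
    snr = img.mean() / (img.std() + 1e-8)
    return lam * grad_score + (1.0 - lam) * snr

def synthesize_image(base_img, exposure, gain,
                     crf=lambda e: np.clip(e, 0, 1) ** (1 / 2.2)):
    """Render a virtual capture via a CRF (assumed gamma curve over a
    linear irradiance model), avoiding an actual frame grab."""
    irradiance = base_img.astype(float) / 255.0
    return crf(irradiance * exposure * gain)

def rbf(a, b, ell=0.3):
    """RBF kernel between two sets of (exposure, gain) points."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return np.exp(-0.5 * (d / ell) ** 2)

def bo_step(X, y, candidates, noise=1e-4):
    """One BO step: fit a GP to observed (exposure, gain) -> metric pairs,
    then pick the next probe by upper confidence bound (UCB)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(candidates, X)
    mu = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    ucb = mu + 2.0 * np.sqrt(np.maximum(var, 0.0))
    return candidates[np.argmax(ucb)]

rng = np.random.default_rng(0)
base = rng.integers(0, 256, size=(120, 160))      # stand-in captured frame
# Normalized (exposure, gain) search space.
cand = np.stack(np.meshgrid(np.linspace(0.1, 1, 20),
                            np.linspace(0.1, 1, 20)), -1).reshape(-1, 2)
X = cand[rng.choice(len(cand), 3, replace=False)]  # initial random probes
y = np.array([image_metric(synthesize_image(base, e, g)) for e, g in X])
for _ in range(10):                                # BO on synthetic frames only
    nxt = bo_step(X, (y - y.mean()) / (y.std() + 1e-8), cand)
    X = np.vstack([X, nxt])
    y = np.append(y, image_metric(synthesize_image(base, *nxt)))
best_exposure, best_gain = X[np.argmax(y)]
print(f"selected exposure={best_exposure:.2f}, gain={best_gain:.2f}")
```

Per the abstract, the synthesis step is what lets each BO iteration score candidate (exposure, gain) pairs without waiting on a real capture, enabling the reported 20-40 Hz operation; the sketch mirrors that by evaluating only synthesized images inside the loop.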
| Original language | English |
|---|---|
| Article number | 9098963 |
| Pages (from-to) | 1256-1271 |
| Number of pages | 16 |
| Journal | IEEE Transactions on Robotics |
| Volume | 36 |
| Issue number | 4 |
| DOIs | |
| State | Published - Aug 2020 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2004-2012 IEEE.
Keywords
- Camera attribute control
- computer vision for other robotic applications
- field robots
- visual-based navigation