Abstract
In this paper, we address the calibration of a 3D mapping system that obtains omnidirectional depth values and images from a 16-channel Velodyne LiDAR and six vision cameras. The mapping system was designed for high-definition (HD) digital map acquisition for autonomous vehicle navigation. Two calibration problems were addressed: temporal calibration (time synchronization) and spatial calibration of the extrinsic parameters between sensors. First, the six cameras and the LiDAR sensor were precisely time-synchronized using the PPS (pulse-per-second) signal and GPRMC messages from a GPS receiver; the PPS triggered all six cameras simultaneously to obtain synchronized images. Second, a 3D plane-matching technique was used to calibrate the extrinsic parameters between the LiDAR sensor and each camera. Treating each camera-LiDAR combination as an independent multi-sensor unit, the rotation and translation between the two sensor coordinate systems were calibrated: 3D planes were fitted to the input chessboard images with respect to the camera and LiDAR coordinate systems, respectively; the rotation was computed by aligning the normal vectors of the fitted 3D planes; and the translation was estimated by projecting an arbitrary point on the 3D camera plane onto the LiDAR plane and iteratively minimizing the distance between the two points. Finally, the estimated transformation was refined using the distances between all chessboard 3D points and the LiDAR plane. In summary, we introduce two new refinement methods in addition to our previously described method. Error analysis was performed using both a simulation tool and real test datasets.
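The plane-matching step described in the abstract can be sketched as follows. This is a minimal illustration under assumed inputs, not the paper's implementation: it assumes each chessboard pose yields a unit plane normal `n` and offset `d` (with `n·x = d`) in both the camera and LiDAR frames, recovers the rotation by aligning the paired normals in closed form (an SVD/Kabsch solution) rather than by the paper's iterative point-projection scheme, and solves for the translation by least squares over the plane constraints. The function name `calibrate_extrinsics` and the plane-parameter format are hypothetical.

```python
import numpy as np

def calibrate_extrinsics(cam_planes, lidar_planes):
    """Estimate (R, t) mapping camera coordinates to LiDAR coordinates
    from paired plane parameters (n, d), where n is a unit normal and
    n . x = d for points x on the plane. Requires >= 3 non-parallel planes.
    """
    n_c = np.array([n for n, _ in cam_planes])    # (N, 3) camera-frame normals
    n_l = np.array([n for n, _ in lidar_planes])  # (N, 3) LiDAR-frame normals

    # Rotation: align camera-frame normals onto LiDAR-frame normals
    # (orthogonal Procrustes / Kabsch solution via SVD).
    H = n_c.T @ n_l
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against an improper reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T

    # Translation: the camera-plane point closest to the origin,
    # p_c = d_c * n_c, must lie on the matching LiDAR plane after
    # transformation: n_l . (R p_c + t) = d_l. Stacking one such linear
    # constraint per plane gives a least-squares problem in t.
    d_c = np.array([d for _, d in cam_planes])
    d_l = np.array([d for _, d in lidar_planes])
    p_c = n_c * d_c[:, None]                          # (N, 3) points on camera planes
    b = d_l - np.einsum('ij,ij->i', n_l, (R @ p_c.T).T)
    t, *_ = np.linalg.lstsq(n_l, b, rcond=None)
    return R, t
```

With three or more chessboard poses whose normals span 3D, the translation system is full rank and the estimate is unique; in practice, more poses average out plane-fitting noise, which is the role the paper's refinement over all chessboard 3D points plays.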
Original language | English |
---|---|
Pages (from-to) | 363-372 |
Number of pages | 10 |
Journal | Journal of Institute of Control, Robotics and Systems |
Volume | 26 |
Issue number | 5 |
DOIs | |
State | Published - 2020 |
Keywords
- 3D mapping
- Calibration
- Camera
- LiDAR
- Plane matching