Awesome-LiDAR-Camera-Calibration
A Collection of LiDAR-Camera-Calibration Papers, Toolboxes and Notes.
Outline
- 0. Introduction
- 1. Target-based methods
- 2. Targetless methods
  - 2.1. Motion-based methods
  - 2.2. Scene-based methods
    - 2.2.1. Traditional methods
    - 2.2.2. Deep-learning methods
- 3. Other toolboxes
0. Introduction
In applications such as autonomous driving, robotics, navigation, and 3-D scene reconstruction, the same scene is often captured with both a LiDAR and a camera. To interpret the objects in the scene accurately, the outputs of the two sensors must be fused. LiDAR-camera calibration estimates the rigid transformation between the two sensors (the extrinsics: rotation + translation, 6 DoF), which establishes the correspondence between 3-D points in the LiDAR coordinate frame and pixels in the image plane.
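Once the extrinsics are known, each LiDAR point can be mapped into the image by applying the rigid transform followed by the camera's intrinsic projection. Below is a minimal sketch of this projection, assuming a pinhole camera with intrinsic matrix K, an undistorted image, and extrinsics (R, t) that map LiDAR coordinates into the camera frame; all numeric values are illustrative.

```python
import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """Project N x 3 LiDAR points to pixel coordinates.

    R (3x3), t (3,): extrinsics mapping LiDAR coordinates into the camera frame.
    K (3x3): pinhole intrinsic matrix (distortion-free image assumed).
    """
    points_cam = points_lidar @ R.T + t      # rigid transform: X_cam = R X_lidar + t
    in_front = points_cam[:, 2] > 0          # keep points in front of the camera
    uv_hom = points_cam[in_front] @ K.T      # perspective projection
    uv = uv_hom[:, :2] / uv_hom[:, 2:3]      # divide by depth -> pixel coordinates
    return uv, in_front

# Illustrative intrinsics, extrinsics, and LiDAR points
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                                # placeholder rotation
t = np.array([0.05, -0.02, 0.10])            # placeholder translation (metres)
pts = np.array([[0.5, 0.2, 4.0], [-1.0, 0.1, 10.0]])
uv, mask = project_lidar_to_image(pts, R, t, K)
```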
1. Target-based methods
| Paper | Target | Feature | Optimization | Toolbox | Note |
| --- | --- | --- | --- | --- | --- |
| Extrinsic Calibration of a Camera and Laser Range Finder (improves camera calibration), 2004 | checkerboard | C: plane (a), L: pts in plane (m) | point-to-plane | CamLaserCalibraTool | CN |
| Fast Extrinsic Calibration of a Laser Rangefinder to a Camera, 2005 | checkerboard | C: plane (a), L: plane (m) | plane (n/d) correspondence, point-to-plane | LCCT | * |
| Extrinsic calibration of a 3D laser scanner and an omnidirectional camera, 2010 | checkerboard | C: plane (a), L: pts in plane (m) | point-to-plane | cam_lidar_calib | * |
| LiDAR-Camera Calibration using 3D-3D Point correspondences, 2017 | cardboard + ArUco | C: 3D corners (a), L: 3D corners (m) | ICP | lidar_camera_calibration | * |
| Reflectance Intensity Assisted Automatic and Accurate Extrinsic Calibration of 3D LiDAR and Panoramic Camera Using a Printed Chessboard, 2017 | checkerboard | C: 2D corners (a), L: 3D corners (a) | PnP, angle difference | ILCC | * |
| Extrinsic Calibration of Lidar and Camera with Polygon, 2018 | regular cardboard | C: 2D edge, corners (a), L: 3D edge, pts in plane (a) | point-to-line, point-inside-polygon | ram-lab/plycal | * |
| Automatic Extrinsic Calibration of a Camera and a 3D LiDAR using Line and Plane Correspondences, 2018 | checkerboard | C: 3D edge, plane (a), L: 3D edge, pts in plane (a) | direction/normal, point-to-line, point-to-plane | Matlab LiDAR Toolbox | * |
| Improvements to Target-Based 3D LiDAR to Camera Calibration, 2020 | cardboard with ArUco | C: 2D corners (a), L: 3D corners (a) | PnP, IoU | github | * |
| ACSC: Automatic Calibration for Non-repetitive Scanning Solid-State LiDAR and Camera Systems, 2020 | checkerboard | C: 2D corners (a), L: 3D corners (a) | PnP | ACSC | * |
| Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups, 2021 | cardboard with circles & ArUco | C: 3D points (a), L: 3D points (a) | ICP | velo2cam_calibration | * |
C: camera, L: LiDAR, a: automatic, m: manual. A minimal sketch of the corner-based PnP formulation used by several of these methods is given below.
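Methods such as ILCC, ACSC, and Improvements to Target-Based 3D LiDAR to Camera Calibration detect target corners in both sensors and reduce the extrinsic estimation to a Perspective-n-Point (PnP) problem. The sketch below shows only that final step with OpenCV's solvePnP, assuming the 3-D corners (in the LiDAR frame) and the 2-D image corners have already been detected and consistently ordered; the correspondences here are synthesised from a made-up pose purely so the snippet runs on its own.

```python
import cv2
import numpy as np

# Pinhole intrinsics; the image is assumed to be undistorted.
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Made-up ground-truth extrinsics, used only to synthesise correspondences.
# In practice corners_lidar comes from the LiDAR target detection and
# corners_image from the image corner detection.
rvec_true = np.array([0.02, -0.4, 0.01])    # axis-angle (Rodrigues) rotation
tvec_true = np.array([0.10, -0.05, 0.20])   # translation in metres

# 5 x 4 checkerboard corners (80 mm squares) expressed in the LiDAR frame.
grid = np.mgrid[0:5, 0:4].T.reshape(-1, 2).astype(np.float64) * 0.08
corners_lidar = np.hstack([grid, np.full((grid.shape[0], 1), 3.0)])

# Matching 2-D pixel detections (synthetic here).
proj, _ = cv2.projectPoints(corners_lidar, rvec_true, tvec_true, K, dist)
corners_image = proj.reshape(-1, 2)

# PnP recovers the rotation (as a Rodrigues vector) and translation that
# map LiDAR coordinates into the camera frame.
ok, rvec, tvec = cv2.solvePnP(corners_lidar, corners_image, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)
T = np.eye(4)
T[:3, :3] = R
T[:3, 3] = tvec.ravel()                     # 4 x 4 LiDAR -> camera extrinsic
print(T)
```

In practice the corners from many board placements are stacked into one estimation problem and then refined with the additional cost terms listed in the table (e.g. angle difference, IoU).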
2. Targetless methods
2.1. Motion-based methods
2.2. Scene-based methods
2.2.1. Traditional methods
| Paper | Feature | Optimization | Toolbox | Note |
| --- | --- | --- | --- | --- |
| Automatic Targetless Extrinsic Calibration of a 3D Lidar and Camera by Maximizing Mutual Information, 2012 | C: grayscale, L: reflectivity | mutual information, BB steepest gradient ascent | Extrinsic Calib | * |
| Automatic Calibration of Lidar and Camera Images using Normalized Mutual Information, 2013 | C: grayscale, L: reflectivity, normal | normalized MI, particle swarm | * | * |
| Automatic Online Calibration of Cameras and Lasers, 2013 | C: Canny edge, L: depth-discontinuous edge | correlation, grid search | * | * |
| SOIC: Semantic Online Initialization and Calibration for LiDAR and Camera, 2020 | semantic centroid | PnP | * | * |
| A Low-cost and Accurate Lidar-assisted Visual SLAM System, 2021 | C: edge (grayscale), L: edge (reflectivity, depth projection) | ICP, coordinate descent | CamVox | * |
| Pixel-level Extrinsic Self Calibration of High Resolution LiDAR and Camera in Targetless Environments, 2021 | C: Canny edge (grayscale), L: depth-continuous edge | point-to-line, Gauss-Newton | livox_camera_calib | * |
| CRLF: Automatic Calibration and Refinement based on Line Feature for LiDAR and Camera in Road Scenes, 2021 | C: straight line, L: straight line | perspective-3-lines (P3L) | * | CN |
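The mutual-information method in the first row (2012) scores a candidate extrinsic by how statistically dependent the image grayscale values and the LiDAR reflectivity values are at the pixels where the points project, and maximises that score over the 6 DoF (Barzilai-Borwein steepest gradient ascent in the paper). Below is a minimal sketch of the scoring function only, assuming an undistorted grayscale image and pinhole intrinsics K; the optimizer is omitted and all function and variable names are illustrative.

```python
import numpy as np

def mutual_information(gray_vals, refl_vals, bins=32):
    """Mutual information between image intensity and LiDAR reflectivity
    samples taken at the pixels where the LiDAR points project."""
    joint, _, _ = np.histogram2d(gray_vals, refl_vals, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal over intensity
    py = pxy.sum(axis=0, keepdims=True)          # marginal over reflectivity
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def mi_score(extrinsic, points, reflectivity, gray_image, K):
    """Score a candidate 4x4 LiDAR->camera extrinsic: project the points,
    sample the image at the projected pixels, and compute MI."""
    pts_cam = points @ extrinsic[:3, :3].T + extrinsic[:3, 3]
    valid = pts_cam[:, 2] > 0                    # points in front of the camera
    uv = pts_cam[valid] @ K.T
    uv = (uv[:, :2] / uv[:, 2:3]).round().astype(int)
    h, w = gray_image.shape
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    if inside.sum() < 100:                       # too little overlap to score (heuristic)
        return -np.inf
    gray = gray_image[uv[inside, 1], uv[inside, 0]]
    refl = reflectivity[valid][inside]
    return mutual_information(gray, refl)
```

Aggregating these statistics over many scan-image pairs makes the objective smoother and less prone to local maxima before the gradient-based maximisation.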
2.2.2. Deep-learning methods
3. Other toolboxes