Zed2 camera calibration -- binocular, IMU, joint calibration
2022-06-11 04:59:00 【Immediately】
To obtain the parameters required by the VINS configuration file, and to make the actual output of the stereo cameras and the IMU sensor more accurate, we calibrate the ZED2 camera: camera calibration, IMU calibration, and camera-IMU joint calibration.
One. Install the calibration tools
1. Use the kalibr toolbox to calibrate the ZED2 stereo camera.
2. Use imu_utils to calibrate the IMU; install and compile code_utils and imu_utils in that order.
These tools were installed before; for the detailed procedure refer to the blog: https://blog.csdn.net/xiaojinger_123/article/details/120849737?spm=1001.2014.3001.5501
However, since the system changed from Ubuntu 18.04 to 20.04, pay attention to the system dependencies and adjust them for the corresponding version.
1 Download and compile kalibr
sudo apt update
sudo apt-get install python3-setuptools python3-rosinstall ipython3 libeigen3-dev libboost-all-dev doxygen libopencv-dev ros-noetic-vision-opencv ros-noetic-image-transport-plugins ros-noetic-cmake-modules python3-software-properties software-properties-common libpoco-dev python3-matplotlib python3-scipy python3-git python3-pip libtbb-dev libblas-dev liblapack-dev libv4l-dev python3-catkin-tools python3-igraph libsuitesparse-dev
pip3 install wxPython
sudo pip3 install python-igraph --upgrade
mkdir ~/kalibr_ws/src
cd ~/kalibr_ws/src
git clone --recursive https://github.com/ori-drs/kalibr
cd ~/kalibr_ws
source /opt/ros/noetic/setup.bash
catkin init
catkin config --extend /opt/ros/noetic
catkin config --merge-devel # Necessary for catkin_tools >= 0.4.
catkin config --cmake-args -DCMAKE_BUILD_TYPE=Release
catkin build -DCMAKE_BUILD_TYPE=Release -j4
2 Download and compile code_utils and imu_utils
1) Error: Could not find a package configuration file provided by "Ceres"
Solution: install Ceres, see https://blog.csdn.net/weixin_48083022/article/details/118282363
2) Ceres compile error: error: 'integer_sequence' is not a member of 'std'
This error occurs when compiling some projects that use Ceres, because newer versions of Ceres require a newer C++ standard. In the CMakeLists.txt of the failing project, change
set(CMAKE_CXX_FLAGS "-std=c++11")
to
set(CMAKE_CXX_STANDARD 14)
For other issues see: https://github.com/gaowenliang/imu_utils/issues/32
Two. Choose a calibration target
For the calibration target, checkerboard and aprilgrid are the two commonly used types. (The camera should be 1-2 m from the target, and the target should fill more than 60% of the field of view.) Because an Aprilgrid provides tag ID information, it prevents jumps in the pose estimation, so using an Aprilgrid is recommended.
Note: during calibration, the target must not leave the camera's field of view; start and finish the motion smoothly, and try to move the target through every corner of the field of view.
checkerboard:
targetCols and targetRows are the numbers of internal corner points
target_type: 'checkerboard' #gridtype
targetCols: 6 #number of internal chessboard corners
targetRows: 8 #number of internal chessboard corners
rowSpacingMeters: 0.17 #size of one chessboard square [m]
colSpacingMeters: 0.17 #size of one chessboard square [m]
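As a quick sanity check of the checkerboard config above (a sketch; the numbers are taken from the yaml, and the corners-to-squares relation is standard): with N internal corners along one dimension there are N+1 squares, so the physical board size follows directly:

```python
# Checkerboard geometry from the kalibr config above.
# Internal corners per dimension -> squares per dimension is corners + 1.
target_cols = 6      # internal corners across
target_rows = 8      # internal corners down
square = 0.17        # square edge length [m]

squares_x = target_cols + 1
squares_y = target_rows + 1
width_m = squares_x * square    # physical board width [m]
height_m = squares_y * square   # physical board height [m]
print(width_m, height_m)
```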
aprilgrid:
tagSpacing = (gap between adjacent tags) / (tag edge length)
target_type: 'aprilgrid' #gridtype
tagCols: 6 #number of apriltags
tagRows: 6 #number of apriltags
tagSize: 0.088 #size of apriltag, edge to edge [m]
tagSpacing: 0.3 #ratio of space between tags to tagSize
Calibration target downloads: https://github.com/ethz-asl/kalibr/wiki/downloads#calibration-targets
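The tagSpacing ratio can be verified with a short computation (illustrative: the 0.0264 m gap is a hypothetical measured value consistent with the config above):

```python
# tagSpacing is the ratio (gap between adjacent tags) / (tag edge length).
tag_size = 0.088    # tag edge length [m], from the aprilgrid config above
gap = 0.0264        # hypothetical measured gap between adjacent tags [m]

tag_spacing = gap / tag_size
print(tag_spacing)  # should match the tagSpacing entry in the yaml
```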
Three. Recording ZED2 calibration data
The current ZED2 resolution is set in common.yaml, found under the ZED2_WS/src/zed-ros-wrapper/zed_wrapper/params folder; resolution is 3, i.e. VGA mode, with an actual image size of 672×376.
Start the ZED2 ROS node:
roscore
roslaunch zed_wrapper zed2.launch
List the topics and open the image viewers to make sure the calibration target is visible in both the left and right images:
rostopic list
rosrun image_view image_view image:=/zed2/zed_node/left/image_rect_color
rosrun image_view image_view image:=/zed2/zed_node/right/image_rect_color
1) Error: Command 'rosrun' not found
Solution: sudo apt install ros-noetic-rosbash
2) Error: [rospack] Error: package 'image_view' not found
Solution: sudo apt-get install ros-noetic-image-view
The ZED camera usually records at about 60 Hz, but kalibr requires that the frequency not be too high; 4 Hz is recommended, otherwise there are too many calibration images and the computation becomes excessive. So the original topics must be throttled: reduce the image topics to 4 Hz (some blogs suggest 20 Hz, which also works but takes longer to process) and the IMU topic to 200 Hz. In ROS this is done by subscribing to a topic and republishing it at a lower rate. The throttling command is:
rosrun topic_tools throttle messages old_topic 4.0 new_topic  # kalibr's recommended frame rate is 4 Hz
One more point: the ZED camera publishes several topics for the raw images. For example, /zed2/zed_node/left/image_rect_color can also be used; its resolution is unchanged, but the original image is cropped at the edges and stretched, which may lose some information.
Throttle the topics, then check the resulting frequencies:
rosrun topic_tools throttle messages /zed2/zed_node/imu/data_raw 200 /zed2/zed_node/imu/data_raw2
rosrun topic_tools throttle messages /zed2/zed_node/left/image_rect_color 4.0 /zed2/zed_node/left/image_rect_color2
rosrun topic_tools throttle messages /zed2/zed_node/right/image_rect_color 4.0 /zed2/zed_node/right/image_rect_color2
rostopic hz /zed2/zed_node/left/image_rect_color2
rostopic hz /zed2/zed_node/right/image_rect_color2
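The throttling performed by topic_tools above can be sketched in plain Python (a simplified model of the subscribe-then-republish logic, not the actual ROS implementation): a message is republished only if at least 1/rate seconds have passed since the last republished one.

```python
def throttle(timestamps, rate_hz):
    """Keep only messages spaced at least 1/rate_hz apart (simplified
    model of `rosrun topic_tools throttle messages ...`)."""
    period = 1.0 / rate_hz
    kept = []
    last = None
    for t in timestamps:
        if last is None or t - last >= period:
            kept.append(t)
            last = t
    return kept

# One second of a 60 Hz image stream throttled down to 4 Hz:
stream_60hz = [i / 60.0 for i in range(60)]
kept = throttle(stream_60hz, 4.0)
print(len(kept))  # roughly 4 of the 60 messages survive
```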
Start recording the calibration bag file (for reference, see the YouTube video: https://youtu.be/puNXsnrYWTY?t=57):
Note: while recording, make sure the target never leaves the frame, keep the images sharp, avoid violent motion, and at the same time excite as many IMU axes and directions as possible.
rosbag record -O Kalib_data_vga.bag /zed2/zed_node/imu/data_raw2 /zed2/zed_node/left/image_rect_color2 /zed2/zed_node/right/image_rect_color2
After recording you will get a Kalib_data_vga.bag file.
View the recorded data: rosbag info Kalib_data_vga.bag
Four. Camera calibration
Here we use the Aprilgrid target; the corresponding parameter file april.yaml (april_6x6_80x80cm.yaml) contains:
target_type: 'aprilgrid' #gridtype
tagCols: 6 #number of apriltags
tagRows: 6 #number of apriltags
tagSize: 0.021 #size of apriltag, edge to edge [m]
tagSpacing: 0.285714 #ratio of space between tags to tagSize
In the kalibr folder, run the calibration; april.yaml is the parameter file downloaded together with the calibration target.
Monocular calibration:
source ~/kalibr_workspace/devel/setup.bash
rosrun kalibr kalibr_calibrate_cameras --bag Kalib_data_vga.bag --topics /zed2/zed_node/left/image_rect_color2 --models pinhole-radtan --target april.yaml
Monocular calibration and monocular+IMU joint calibration often report the following error:
Cameras are not connected through mutual observations, please check the dataset. Maybe adjust the approx. sync. tolerance.
Traceback (most recent call last):
File "/home/ipsg/tool/kalibr_ws/devel/bin/kalibr_calibrate_cameras", line 15, in <module>
exec(compile(fh.read(), python_script, 'exec'), context)
File "/home/ipsg/tool/kalibr_ws/src/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 447, in <module>
main()
File "/home/ipsg/tool/kalibr_ws/src/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 204, in main
graph.plotGraph()
File "/home/ipsg/tool/kalibr_ws/src/aslam_offline_calibration/kalibr/python/kalibr_camera_calibration/MulticamGraph.py", line 311, in plotGraph
edge_label=self.G.es["weight"],
KeyError: 'Attribute does not exist'
Solution:
In the workspace, find the following code in src/Kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras and comment it out (around line 201):
if not graph.isGraphConnected():
obsdb.printTable()
print "Cameras are not connected through mutual observations, please check the dataset. Maybe adjust the approx. sync. tolerance."
graph.plotGraph()
sys.exit(-1)
After commenting it out, the calibration results can be obtained.
Binocular calibration :
source ~/kalibr_workspace/devel/setup.bash
rosrun kalibr kalibr_calibrate_cameras --bag Kalib_data_vga.bag --topics /zed2/zed_node/left/image_rect_color2 /zed2/zed_node/right/image_rect_color2 --models pinhole-radtan pinhole-radtan --target april.yaml
You can add --show-extraction to visualize the corner detection during calibration and spot gross corner reprojection errors, and --approx-sync 0.04 (0.04 can be raised to 0.1) to synchronize the data between the cameras.
The --bag-from-to parameter: if the dataset begins and ends with picking up and putting down the camera, add this parameter to cut out that part of the data, since those motions can affect the calibration. Otherwise it is not needed.
Error 1):
[FATAL] [1636711641.843028]: No corners could be extracted for camera /zed2/zed_node/left/image_rect_color2! Check the calibration target configuration and dataset.
Traceback (most recent call last):
File "/home/sjj/kalibr_workspace/devel/lib/kalibr/kalibr_calibrate_cameras", line 15, in <module>
exec(compile(fh.read(), python_script, 'exec'), context)
File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 447, in <module>
main()
File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 185, in main
if not cam.initGeometryFromObservations(observations):
File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_camera_calibration/CameraCalibrator.py", line 56, in initGeometryFromObservations
success = self.geometry.initializeIntrinsics(observations)
RuntimeError: [Exception] /home/sjj/kalibr_workspace/src/kalibr/aslam_cv/aslam_cameras/include/aslam/cameras/implementation/PinholeProjection.hpp:716: initializeIntrinsics() assert(observations.size() != 0) failed: Need min. one observation
This problem is suspected to occur because the calibration target is too small for the camera resolution; after changing the target, the corners were detected successfully.
Error 2):
During calibration, if the initial focal length cannot be estimated, you can set export KALIBR_MANUAL_FOCAL_LENGTH_INIT=1 and rerun. When the program fails with "Initialization of focal length failed. Provide manual initialization:", enter a focal length manually, e.g. 400. Even a rough, larger value can still converge.
After calibration, the directory contains 3 files:
.yaml - used later for the camera+IMU joint calibration
.pdf - plots of the calibration results
.txt - camera intrinsics and reprojection errors
Appendix: the fields of the .yaml file:
camera_model - camera projection type (pinhole / omni)
T_cam_imu - IMU extrinsics: transformation from IMU to camera coordinates (T_c_i)
intrinsics - vector containing the intrinsic parameters for the given projection type:
pinhole: [fu fv pu pv]
omni: [xi fu fv pu pv]
distortion_model - lens distortion type (radtan / equidistant)
distortion_coeffs - parameter vector for the distortion model
T_cn_cnm1 - relative pose of the left and right cameras: camera extrinsic transformation, always with respect to the last camera in the chain (e.g. cam1: T_cn_cnm1 = T_c1_c0, takes cam0 to cam1 coordinates)
timeshift_cam_imu - time offset between camera and IMU timestamps in seconds (t_imu = t_cam + shift)
rostopic - topic of the camera's image stream
resolution - camera resolution [width, height]
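The timeshift_cam_imu convention (t_imu = t_cam + shift) can be illustrated with a one-liner (the numbers are hypothetical):

```python
# t_imu = t_cam + shift: converting a camera timestamp into the IMU clock.
timeshift_cam_imu = 0.002   # hypothetical value from a camchain yaml [s]
t_cam = 100.000             # camera timestamp [s]
t_imu = t_cam + timeshift_cam_imu
print(t_imu)
```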
Checking the calibration result:
The distortion_coeffs values should be small, even if not equal to zero
The last value of the first row of the T_cn_cnm1 matrix should be very near to -0.06
The last value of the second row of the T_cn_cnm1 matrix should be near to zero
The last value of the third row of the T_cn_cnm1 matrix should be near to zero
The values on the diagonal of the T_cn_cnm1 matrix should be very near to 1.0
The remaining values of the T_cn_cnm1 matrix should be near to zero
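The checks above can be automated with a small script (a sketch; check_stereo_extrinsic is a hypothetical helper, and the -0.06 m baseline is the value quoted above and depends on your camera):

```python
def check_stereo_extrinsic(T, expected_baseline=-0.06, tol=0.01):
    """Sanity-check a 4x4 T_cn_cnm1 matrix (list of 4 rows of 4 floats)
    against the rules listed above; returns a list of problems found."""
    problems = []
    # Translation: x near the expected baseline, y and z near zero.
    if abs(T[0][3] - expected_baseline) > tol:
        problems.append("row 0, last value not near %.2f" % expected_baseline)
    if abs(T[1][3]) > tol:
        problems.append("row 1, last value not near zero")
    if abs(T[2][3]) > tol:
        problems.append("row 2, last value not near zero")
    # Rotation block should be close to the identity matrix.
    for i in range(3):
        for j in range(3):
            target = 1.0 if i == j else 0.0
            if abs(T[i][j] - target) > tol:
                problems.append("rotation entry (%d,%d) far from %s" % (i, j, target))
    return problems

# Example: a nearly ideal extrinsic for a ~6 cm baseline.
T = [[1.0, 0.0, 0.0, -0.0601],
     [0.0, 1.0, 0.0,  0.0003],
     [0.0, 0.0, 1.0, -0.0002],
     [0.0, 0.0, 0.0,  1.0]]
print(check_stereo_extrinsic(T))  # an empty list means all checks pass
```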
Five. IMU parameter calibration
The IMU parameters can be obtained in two ways:
1) Record stationary data (more than 2 hours), modify the launch file, and calibrate with imu_utils to obtain a result file; then create the corresponding imu-params.yaml (fill in the averages of the Acc and Gyr calibration results).
2) Instead of calibrating the IMU yourself, you can directly use the parameters provided on the official website; it has been verified that the system also runs with these.
Here we choose the second method. The official parameters are reasonably credible; if the later experimental results are poor, manual calibration can still be repeated. Reference: https://blog.csdn.net/sinat_16643223/article/details/115416277?spm=1001.2014.3001.5506
Create imu-params.yaml:
gedit imu-params.yaml
and fill in the following:
#Accelerometers
accelerometer_noise_density: 1.4e-03 #Noise density (continuous-time)
accelerometer_random_walk: 8.0e-05 #Bias random walk
#Gyroscopes
gyroscope_noise_density: 8.6e-05 #Noise density (continuous-time)
gyroscope_random_walk: 2.2e-06 #Bias random walk
rostopic: /zed2/zed_node/imu/data_raw2 #the IMU ROS topic
update_rate: 200.0 #Hz (for discretization of the values above)
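For reference, the continuous-time noise densities above can be converted to per-sample (discrete-time) standard deviations at the given update rate, using the usual σ_d = σ_c·√rate relation (a sketch using the values from the yaml):

```python
import math

update_rate = 200.0             # Hz, from the yaml above
acc_noise_density = 1.4e-03     # continuous-time [m/s^2/sqrt(Hz)]
gyr_noise_density = 8.6e-05     # continuous-time [rad/s/sqrt(Hz)]

# Discrete-time (per-sample) noise standard deviations:
acc_noise_discrete = acc_noise_density * math.sqrt(update_rate)
gyr_noise_discrete = gyr_noise_density * math.sqrt(update_rate)
print(acc_noise_discrete, gyr_noise_discrete)
```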
Six. Camera-IMU joint calibration
Joint calibration mainly obtains the transformation between the camera and IMU frames:
rosrun kalibr kalibr_calibrate_imu_camera --bag Kalib_data_vga.bag --cam camchain-Kalib_data_vga.yaml --imu imu-params.yaml --target april.yaml
Error:
[ERROR] [1637115378.140707]: Optimization failed!
Traceback (most recent call last):
File "/home/sjj/kalibr_workspace/devel/lib/kalibr/kalibr_calibrate_imu_camera", line 15, in <module>
exec(compile(fh.read(), python_script, 'exec'), context)
File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_imu_camera", line 246, in <module>
main()
File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_imu_camera", line 209, in main
iCal.optimize(maxIterations=parsed.max_iter, recoverCov=parsed.recover_cov)
File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_imu_camera_calibration/icc_calibrator.py", line 179, in optimize
raise RuntimeError("Optimization failed!")
RuntimeError: Optimization failed!
At this point the program successfully reads the IMU data and the monocular data, so the data itself is probably fine. The solution was eventually found in the kalibr issues.
If you hit a problem while running kalibr, search the project's issue tracker; most problems have already been reported by someone.
Solution:
Open the kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_imu_camera.py file and search for timeOffsetPadding. Its default value is 0.03; increase it. I first raised it to 0.3, which did not work, then to 3, but then the optimization takes too long and appears stuck, so the value can be reduced again; tune it for your own case. (PS: do not make it too large; I set it to 100 and ran out of memory.)
Rerun, and the joint calibration produces the camchain-imucam-Kalibr_data.yaml file and a complete PDF report.