Robot autonomous exploration paper series: environment code
2022-07-07 22:26:00 【Lift the cement together】
I recently came across a series of open-source robot autonomous exploration code released by CMU. Since I happen to be working in this direction myself, I decided to start by studying their work.
They have open-sourced two autonomous exploration projects, and both build on the same underlying work for state estimation and obstacle avoidance, so I will analyze the code of that work first.
The top-level launch file:
<launch>
<arg name="cameraOffsetZ" default="0"/>
<arg name="vehicleX" default="0"/>
<arg name="vehicleY" default="0"/>
<arg name="checkTerrainConn" default="false"/>
<include file="$(find ps3joy)/launch/ps3.launch" />
<include file="$(find local_planner)/launch/local_planner.launch" >
<arg name="cameraOffsetZ" value="$(arg cameraOffsetZ)"/>
<arg name="goalX" value="$(arg vehicleX)"/>
<arg name="goalY" value="$(arg vehicleY)"/>
</include>
<include file="$(find terrain_analysis)/launch/terrain_analysis.launch" />
<include file="$(find terrain_analysis_ext)/launch/terrain_analysis_ext.launch" >
<arg name="checkTerrainConn" value="$(arg checkTerrainConn)"/>
</include>
<include file="$(find sensor_scan_generation)/launch/sensor_scan_generation.launch" />
<include file="$(find loam_interface)/launch/loam_interface.launch" />
<node launch-prefix="nice" pkg="rviz" type="rviz" name="rvizGA" args="-d $(find vehicle_simulator)/rviz/vehicle_simulator.rviz" respawn="true"/>
</launch>
1. loam_interface
This node acts as an interface between the system and the SLAM state-estimation node, so the SLAM algorithm's topics can be changed conveniently in the launch file.
First, look at loam_interface.launch:
<launch>
<node pkg="loam_interface" type="loamInterface" name="loamInterface" output="screen" required="true">
<param name="stateEstimationTopic" type="string" value="/lio_sam/mapping/odometry" />
<param name="registeredScanTopic" type="string" value="/lio_sam/mapping/cloud_registered" />
<param name="flipStateEstimation" type="bool" value="false />
<param name="flipRegisteredScan" type="bool" value="false" />
<param name="sendTF" type="bool" value="true" />
<param name="reverseTF" type="bool" value="false" />
</node>
</launch>
This article uses LIO-SAM as the state-estimation node.
ros::Subscriber subOdometry = nh.subscribe<nav_msgs::Odometry> (stateEstimationTopic, 5, odometryHandler);
ros::Subscriber subLaserCloud = nh.subscribe<sensor_msgs::PointCloud2> (registeredScanTopic, 5, laserCloudHandler);
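For reference, here is a minimal sketch of how these topic names can be read from the parameters set in loam_interface.launch (an illustration only, not verbatim code from the repository; the private node handle is an assumption based on the <param> tags inside the <node> element):
std::string stateEstimationTopic = "/lio_sam/mapping/odometry";
std::string registeredScanTopic = "/lio_sam/mapping/cloud_registered";
ros::NodeHandle nhPrivate("~");
// the <param> tags inside the <node> element become private parameters of this node
nhPrivate.getParam("stateEstimationTopic", stateEstimationTopic);
nhPrivate.getParam("registeredScanTopic", registeredScanTopic);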
1.1 odometryHandler
Subscribes to the LIO-SAM odometry /lio_sam/mapping/odometry (the robot's pose in the world frame) and executes the callback odometryHandler, which republishes the pose under the topic name /state_estimation and also broadcasts the corresponding TF transform.
void odometryHandler(const nav_msgs::Odometry::ConstPtr& odom)
{
double roll, pitch, yaw;
geometry_msgs::Quaternion geoQuat = odom->pose.pose.orientation;
odomData = *odom;
// flipStateEstimation is false in this configuration (see loam_interface.launch)
if (flipStateEstimation) {
tf::Matrix3x3(tf::Quaternion(geoQuat.z, -geoQuat.x, -geoQuat.y, geoQuat.w)).getRPY(roll, pitch, yaw);
pitch = -pitch;
yaw = -yaw;
geoQuat = tf::createQuaternionMsgFromRollPitchYaw(roll, pitch, yaw);
odomData.pose.pose.orientation = geoQuat;
odomData.pose.pose.position.x = odom->pose.pose.position.z;
odomData.pose.pose.position.y = odom->pose.pose.position.x;
odomData.pose.pose.position.z = odom->pose.pose.position.y;
}
// publish odometry messages
odomData.header.frame_id = "/map";
odomData.child_frame_id = "/sensor";
pubOdometryPointer->publish(odomData);
// publish tf messages
odomTrans.stamp_ = odom->header.stamp;
odomTrans.frame_id_ = "/map";
odomTrans.child_frame_id_ = "/sensor";
odomTrans.setRotation(tf::Quaternion(geoQuat.x, geoQuat.y, geoQuat.z, geoQuat.w));
odomTrans.setOrigin(tf::Vector3(odomData.pose.pose.position.x, odomData.pose.pose.position.y, odomData.pose.pose.position.z));
if (sendTF) {
if (!reverseTF) {
tfBroadcasterPointer->sendTransform(odomTrans);
} else {
tfBroadcasterPointer->sendTransform(tf::StampedTransform(odomTrans.inverse(), odom->header.stamp, "/sensor", "/map"));
}
}
}
1.2 laserCloudHandler
Subscribes to the LIO-SAM registered point cloud /lio_sam/mapping/cloud_registered (key-frame point cloud in the world frame) and executes the callback laserCloudHandler.
void laserCloudHandler(const sensor_msgs::PointCloud2ConstPtr& laserCloudIn)
{
laserCloud->clear();
pcl::fromROSMsg(*laserCloudIn, *laserCloud);
// false when using lio-sam
if (flipRegisteredScan) {
int laserCloudSize = laserCloud->points.size();
for (int i = 0; i < laserCloudSize; i++) {
float temp = laserCloud->points[i].x;
laserCloud->points[i].x = laserCloud->points[i].z;
laserCloud->points[i].z = laserCloud->points[i].y;
laserCloud->points[i].y = temp;
}
}
// publish registered scan messages
sensor_msgs::PointCloud2 laserCloud2;
pcl::toROSMsg(*laserCloud, laserCloud2);
laserCloud2.header.stamp = laserCloudIn->header.stamp;
laserCloud2.header.frame_id = "/map";
pubLaserCloudPointer->publish(laserCloud2);
}
This callback republishes the registered (world-frame) point cloud under the topic name /registered_scan.
loam_interface summary:
It continuously receives the robot's current pose and the current registered point cloud (both in the world frame) from the SLAM module, and republishes them under the topic names /state_estimation and /registered_scan respectively.
2. sensor_scan_generation
This node subscribes to /state_estimation and /registered_scan, executes laserCloudAndOdometryHandler, and publishes /state_estimation_at_scan and /sensor_scan.
void laserCloudAndOdometryHandler(const nav_msgs::Odometry::ConstPtr& odometry,
const sensor_msgs::PointCloud2ConstPtr& laserCloud2)
{
laserCloudIn->clear();
laserCLoudInSensorFrame->clear();
pcl::fromROSMsg(*laserCloud2, *laserCloudIn);
odometryIn = *odometry;
transformToMap.setOrigin(tf::Vector3(odometryIn.pose.pose.position.x, odometryIn.pose.pose.position.y, odometryIn.pose.pose.position.z));
transformToMap.setRotation(tf::Quaternion(odometryIn.pose.pose.orientation.x, odometryIn.pose.pose.orientation.y,
odometryIn.pose.pose.orientation.z, odometryIn.pose.pose.orientation.w));
int laserCloudInNum = laserCloudIn->points.size();
pcl::PointXYZ p1;
tf::Vector3 vec;
for (int i = 0; i < laserCloudInNum; i++)
{
p1 = laserCloudIn->points[i];
vec.setX(p1.x);
vec.setY(p1.y);
vec.setZ(p1.z);
vec = transformToMap.inverse() * vec;
p1.x = vec.x();
p1.y = vec.y();
p1.z = vec.z();
laserCLoudInSensorFrame->points.push_back(p1);
}

// The handler then publishes the pose as /state_estimation_at_scan and the
// transformed cloud as /sensor_scan (that part of the code is omitted here).
}
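The callback above receives an odometry message and a point cloud together. A minimal sketch of how such a paired subscription can be set up with message_filters time synchronization, assuming this wiring (the actual node may do it differently):
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>

// inside main(), where nh is the node handle, before ros::spin():
message_filters::Subscriber<nav_msgs::Odometry> subOdometry(nh, "/state_estimation", 5);
message_filters::Subscriber<sensor_msgs::PointCloud2> subLaserCloud(nh, "/registered_scan", 5);
typedef message_filters::sync_policies::ApproximateTime<nav_msgs::Odometry, sensor_msgs::PointCloud2> SyncPolicy;
message_filters::Synchronizer<SyncPolicy> sync(SyncPolicy(100), subOdometry, subLaserCloud);
// pair each odometry message with the point cloud closest in time and call the handler
sync.registerCallback(boost::bind(&laserCloudAndOdometryHandler, _1, _2));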
sensor_scan_generation summary:
Subscribes to /state_estimation and /registered_scan and executes the callback above.
Publishes /state_estimation_at_scan (equivalent to /state_estimation).
Publishes /sensor_scan (the world-frame point cloud transformed into the sensor/lidar coordinate system).
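For comparison only: the per-point loop in laserCloudAndOdometryHandler is equivalent to applying the inverse of the map-to-sensor transform to the whole cloud. A sketch using pcl_ros (the original code does the transform manually, which avoids the extra dependency):
#include <pcl_ros/transforms.h>

// equivalent to the loop above: bring the world-frame cloud into the sensor frame
pcl_ros::transformPointCloud(*laserCloudIn, *laserCLoudInSensorFrame, transformToMap.inverse());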
3. terrain_analysis
This node subscribes to two topics:
3.1 Subscribe to /state_estimation and execute odometryHandler
ros::Subscriber subOdometry = nh.subscribe<nav_msgs::Odometry>("/state_estimation", 5, odometryHandler);
void odometryHandler(const nav_msgs::Odometry::ConstPtr &odom) {
double roll, pitch, yaw;
geometry_msgs::Quaternion geoQuat = odom->pose.pose.orientation;
tf::Matrix3x3(tf::Quaternion(geoQuat.x, geoQuat.y, geoQuat.z, geoQuat.w))
.getRPY(roll, pitch, yaw);
vehicleRoll = roll;
vehiclePitch = pitch;
vehicleYaw = yaw;
vehicleX = odom->pose.pose.position.x;
vehicleY = odom->pose.pose.position.y;
vehicleZ = odom->pose.pose.position.z;
sinVehicleRoll = sin(vehicleRoll);
cosVehicleRoll = cos(vehicleRoll);
sinVehiclePitch = sin(vehiclePitch);
cosVehiclePitch = cos(vehiclePitch);
sinVehicleYaw = sin(vehicleYaw);
cosVehicleYaw = cos(vehicleYaw);
if (noDataInited == 0) {
vehicleXRec = vehicleX;
vehicleYRec = vehicleY;
noDataInited = 1;
}
if (noDataInited == 1) {
float dis = sqrt((vehicleX - vehicleXRec) * (vehicleX - vehicleXRec) +
(vehicleY - vehicleYRec) * (vehicleY - vehicleYRec));
if (dis >= noDecayDis)
noDataInited = 2;
}
}
3.2 Subscribe to /registered_scan and execute the callback laserCloudHandler
void laserCloudHandler(const sensor_msgs::PointCloud2ConstPtr &laserCloud2) {
laserCloudTime = laserCloud2->header.stamp.toSec();
if (!systemInited) {
systemInitTime = laserCloudTime;
systemInited = true;
}
laserCloud->clear();
pcl::fromROSMsg(*laserCloud2, *laserCloud);
pcl::PointXYZI point;
laserCloudCrop->clear();
int laserCloudSize = laserCloud->points.size();
for (int i = 0; i < laserCloudSize; i++) {
point = laserCloud->points[i];
float pointX = point.x;
float pointY = point.y;
float pointZ = point.z;
float dis = sqrt((pointX - vehicleX) * (pointX - vehicleX) +
(pointY - vehicleY) * (pointY - vehicleY));
if (pointZ - vehicleZ > minRelZ - disRatioZ * dis &&
pointZ - vehicleZ < maxRelZ + disRatioZ * dis &&
dis < terrainVoxelSize * (terrainVoxelHalfWidth + 1)) {
point.x = pointX;
point.y = pointY;
point.z = pointZ;
point.intensity = laserCloudTime - systemInitTime;
laserCloudCrop->push_back(point);
}
}
newlaserCloud = true;
}
Points of the current frame that satisfy these crop conditions are saved into laserCloudCrop, and each point's intensity is set to the time of the laser frame relative to system start (its age is computed from this later).
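To make the crop condition concrete, here is a small illustration of the accepted height band; the numeric values are example values, not necessarily the launch-file defaults:
// The accepted height band (relative to the vehicle) widens linearly with horizontal distance.
bool inCropBand(float relZ, float dis) {
  const float minRelZ = -0.5f, maxRelZ = 0.25f, disRatioZ = 0.2f;  // example values
  return relZ > minRelZ - disRatioZ * dis && relZ < maxRelZ + disRatioZ * dis;
}
// e.g. at dis = 0 the band is (-0.5, 0.25) m; at dis = 10 it widens to (-2.5, 2.25) m.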
Back in the main function, for every point in laserCloudCrop, its two-dimensional voxel-grid indices indX and indY are computed:
// stack registered laser scans
pcl::PointXYZI point;
int laserCloudCropSize = laserCloudCrop->points.size();
for (int i = 0; i < laserCloudCropSize; i++) {
point = laserCloudCrop->points[i];
// point.x - vehicleX is the offset of the point from the current vehicle position
int indX = int((point.x - vehicleX + terrainVoxelSize / 2) / terrainVoxelSize) + terrainVoxelHalfWidth;
int indY = int((point.y - vehicleY + terrainVoxelSize / 2) / terrainVoxelSize) + terrainVoxelHalfWidth;
// int() truncates toward zero: e.g. -0.8 + 0.5 = -0.3 truncates to 0, but the index should be -1, so decrement for negative offsets
if (point.x - vehicleX + terrainVoxelSize / 2 < 0)
indX--;
if (point.y - vehicleY + terrainVoxelSize / 2 < 0)
indY--;
// points whose indices are within bounds are stacked into terrainVoxelCloud
if (indX >= 0 && indX < terrainVoxelWidth && indY >= 0 && indY < terrainVoxelWidth) {
terrainVoxelCloud[terrainVoxelWidth * indX + indY]->push_back(point);
terrainVoxelUpdateNum[terrainVoxelWidth * indX + indY]++;
}
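The same index computation, pulled out into a hypothetical standalone helper to show why the extra decrement is needed for negative offsets:
int voxelIndex(float offset, float voxelSize, int halfWidth) {
  // int() truncates toward zero, so negative offsets need one extra decrement
  int ind = int((offset + voxelSize / 2) / voxelSize) + halfWidth;
  if (offset + voxelSize / 2 < 0)
    ind--;
  return ind;
}
// With voxelSize = 1.0 and halfWidth = 10 (example values):
//   voxelIndex(-0.8f, 1.0f, 10) -> int(-0.3) = 0, corrected to -1, index 9
//   voxelIndex( 0.8f, 1.0f, 10) -> int( 1.3) = 1, index 11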
Then every voxel of terrainVoxelCloud is traversed. A voxel is reprocessed when the number of points added to it exceeds a threshold, or when enough time has passed since its last update (so that a voxel is still refreshed periodically even when few new points arrive). The reprocessing is:
1. Downsample the voxel's point cloud terrainVoxelCloudPtr to obtain laserCloudDwz, then clear() terrainVoxelCloudPtr.
2. For every point in laserCloudDwz, push it back into terrainVoxelCloudPtr only if it satisfies the conditions below.
In other words, each voxel of terrainVoxelCloud keeps only the points that meet those conditions.
for (int ind = 0; ind < terrainVoxelNum; ind++) {
// A voxel is reprocessed only if its point count exceeds voxelPointUpdateThre, or if its last update is older than voxelTimeUpdateThre
// systemInitTime is the timestamp of the first laser frame
if (terrainVoxelUpdateNum[ind] >= voxelPointUpdateThre ||
laserCloudTime - systemInitTime - terrainVoxelUpdateTime[ind] >= voxelTimeUpdateThre || clearingCloud) {
pcl::PointCloud<pcl::PointXYZI>::Ptr terrainVoxelCloudPtr = terrainVoxelCloud[ind];
laserCloudDwz->clear();
downSizeFilter.setInputCloud(terrainVoxelCloudPtr);
downSizeFilter.filter(*laserCloudDwz);
terrainVoxelCloudPtr->clear();
int laserCloudDwzSize = laserCloudDwz->points.size();
for (int i = 0; i < laserCloudDwzSize; i++) {
point = laserCloudDwz->points[i];
float dis = sqrt((point.x - vehicleX) * (point.x - vehicleX) + (point.y - vehicleY) * (point.y - vehicleY));
// Within the voxel, the points kept for later ground segmentation (they will be put into terrainCloud) must satisfy:
// 1. height above the minimum threshold
// 2. height below the maximum threshold
// 3. the point's age is less than decayTime, or the point lies within noDecayDis of the vehicle
// 4. the point is not being cleared, i.e. not within clearingDis while clearingCloud is set
if (point.z - vehicleZ > minRelZ - disRatioZ * dis && point.z - vehicleZ < maxRelZ + disRatioZ * dis &&
(laserCloudTime - systemInitTime - point.intensity < decayTime || dis < noDecayDis) &&
!(dis < clearingDis && clearingCloud)) {
// Only the points that meet the above conditions are reserved
terrainVoxelCloudPtr->push_back(point);
}
}
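The downSizeFilter above is a PCL voxel-grid filter; a sketch of how such a filter is typically set up (the leaf size here is illustrative and is presumably read from a launch parameter in the real node):
#include <pcl/filters/voxel_grid.h>
#include <pcl/point_types.h>

pcl::VoxelGrid<pcl::PointXYZI> downSizeFilter;
downSizeFilter.setLeafSize(0.2f, 0.2f, 0.2f);  // keep one representative point per 0.2 m cube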
All points of terrainVoxelCloud are then gathered into terrainCloud.
Every point of terrainCloud is traversed:
1. Compute the point's two-dimensional indices indX and indY (now at the finer planar-voxel resolution).
2. If the point's Z value lies within the threshold band, spread its elevation into the 3x3 neighborhood of its cell, stored in planarPointElev.
3. Take the vector from the current vehicle position to the point, rotate it into the sensor frame, and evaluate its elevation angle within the sensor FOV; when the angle lies between the two thresholds, increment planarVoxelDyObs[planarVoxelWidth * indX + indY].
// Build a second two-dimensional grid that stores elevation information, with the resolution refined from 1 m to 0.2 m
int terrainCloudSize = terrainCloud->points.size();
for (int i = 0; i < terrainCloudSize; i++) {
point = terrainCloud->points[i];
int indX = int((point.x - vehicleX + planarVoxelSize / 2) / planarVoxelSize) + planarVoxelHalfWidth;
int indY = int((point.y - vehicleY + planarVoxelSize / 2) / planarVoxelSize) + planarVoxelHalfWidth;
if (point.x - vehicleX + planarVoxelSize / 2 < 0)
indX--;
if (point.y - vehicleY + planarVoxelSize / 2 < 0)
indY--;
// For points whose Z value lies within the band, record their elevation
if (point.z - vehicleZ > minRelZ && point.z - vehicleZ < maxRelZ) {
// Each point's elevation is not written into a single cell only; it is copied into the 3x3 neighborhood.
for (int dX = -1; dX <= 1; dX++) {
for (int dY = -1; dY <= 1; dY++) {
// planarVoxelWidth = 51
if (indX + dX >= 0 && indX + dX < planarVoxelWidth &&
indY + dY >= 0 && indY + dY < planarVoxelWidth) {
// Elevation information
planarPointElev[planarVoxelWidth * (indX + dX) + indY + dY].push_back(point.z);
}
}
}
}
Finally, every point of laserCloudCrop is traversed again: its indX and indY are computed, and when the point's elevation angle exceeds a threshold, planarVoxelDyObs[planarVoxelWidth * indX + indY] is reset to 0 (see the sketch after this code block).
I understand this as a re-check of the dynamic-obstacle count from the previous step using the current scan.
if (clearDyObs) {
for (int i = 0; i < laserCloudCropSize; i++) {
point = laserCloudCrop->points[i];
int indX = int((point.x - vehicleX + planarVoxelSize / 2) / planarVoxelSize) + planarVoxelHalfWidth;
int indY = int((point.y - vehicleY + planarVoxelSize / 2) / planarVoxelSize) + planarVoxelHalfWidth;
if (point.x - vehicleX + planarVoxelSize / 2 < 0)
indX--;
if (point.y - vehicleY + planarVoxelSize / 2 < 0)
indY--;
if (indX >= 0 && indX < planarVoxelWidth && indY >= 0 && indY < planarVoxelWidth) {
float pointX1 = point.x - vehicleX;
float pointY1 = point.y - vehicleY;
float pointZ1 = point.z - vehicleZ;
float dis1 = sqrt(pointX1 * pointX1 + pointY1 * pointY1);
float angle1 = atan2(pointZ1 - minDyObsRelZ, dis1) * 180.0 / PI;
if (angle1 > minDyObsAngle) {
planarVoxelDyObs[planarVoxelWidth * indX + indY] = 0;
}
}
}
}
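The elevation-angle test in this second pass, rewritten as a hypothetical helper with illustrative threshold values (not necessarily the launch defaults):
#include <cmath>

bool clearsDyObsCounter(float relZ, float dis,
                        float minDyObsRelZ = -0.3f, float minDyObsAngle = 0.5f) {
  // elevation angle of the point as seen from the vehicle, in degrees
  float angle = std::atan2(relZ - minDyObsRelZ, dis) * 180.0f / M_PI;
  return angle > minDyObsAngle;  // if true, the cell's dynamic-obstacle counter is reset to 0
}
// e.g. clearsDyObsCounter(0.2f, 5.0f): atan2(0.5, 5.0) ≈ 5.7 deg > 0.5 deg -> true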