• LIO-SAM Code Reading Notes -- 5. MapOptimization


    The main job of this module is to receive the cloud_info data published by featureExtraction, transform the feature points into the map frame using an initial odometry guess, and then run a scan-to-map matching optimization on the current frame's feature points. As in LOAM, the point-to-line and point-to-plane distances serve as residual terms, and the odometry is refined with a Gauss-Newton method. Finally, following a constant-velocity motion model, an odometry_incremental estimate that runs one lidar period ahead is computed and handed to the IMU preintegration module.

     

    Figure 2. ROS topic publish/subscribe relationship diagram (still labeled Figure 2, since it is the same diagram as before)

     

    Figure 1. Code architecture diagram (data-flow diagram)

    Member variable list

    Variable | Type | Notes
    gtSAMgraph | NonlinearFactorGraph | the factor graph
    initialEstimate | Values | variables to be optimized in the factor graph
    optimizedEstimate | Values | factor-graph variables holding the optimized result
    isam | ISAM2* | the factor-graph optimizer
    isamCurrentEstimate | Values | current estimate of the factor-graph variables
    poseCovariance | Eigen::MatrixXd | covariance of the latest pose from the factor graph
    pubLaserCloudSurround | ros::Publisher | publishes "lio_sam/mapping/map_global"; despite the name this is not the full global map: only the feature-point map within a configured radius (default 1000 m) is published
    pubLaserOdometryGlobal | ros::Publisher | publishes the map-optimized odometry "lio_sam/mapping/odometry"; subscribed by the TransformFusion class in imuPreintegration, which fuses it with the IMU odometry and publishes "odometry/imu_incremental"; that topic is used by imageProjection for (1) lidar motion-distortion correction and (2) extracting the lidar pose packed into cloud_info, which comes back to this class as the pose initial guess
    pubLaserOdometryIncremental | ros::Publisher | publishes "lio_sam/mapping/odometry_incremental"; differs from the odometry above in that it runs one lidar optimization period ahead; subscribed by imuPreintegration and used to initialize the pose variables in its factor graph
    pubKeyPoses | ros::Publisher | publishes "lio_sam/mapping/trajectory" (sensor_msgs::PointCloud2); nothing in the system subscribes to it
    pubPath | ros::Publisher | publishes "lio_sam/mapping/path"
    pubHistoryKeyFrames | ros::Publisher | publishes "lio_sam/mapping/icp_loop_closure_history_cloud"
    pubIcpKeyFrames | ros::Publisher | publishes "lio_sam/mapping/icp_loop_closure_corrected_cloud"
    pubRecentKeyFrames | ros::Publisher | publishes "lio_sam/mapping/map_local" (sensor_msgs::PointCloud2)
    pubRecentKeyFrame | ros::Publisher | publishes "lio_sam/mapping/cloud_registered" (sensor_msgs::PointCloud2)
    pubCloudRegisteredRaw | ros::Publisher | publishes "lio_sam/mapping/cloud_registered_raw"
    pubLoopConstraintEdge | ros::Publisher | publishes "/lio_sam/mapping/loop_closure_constraints" (visualization_msgs::MarkerArray)
    pubSLAMInfo | ros::Publisher | publishes "lio_sam/mapping/slam_info" (lio_sam::cloud_info)
    subCloud | ros::Subscriber | subscribes to "lio_sam/feature/cloud_info" (lio_sam::cloud_info) published by featureExtraction
    subGPS | ros::Subscriber | subscribes to "odometry/gpsz", the GPS odometry topic from navsat (see module_navsat.launch)
    subLoop | ros::Subscriber | subscribes to "lio_loop/loop_closure_detection" (std_msgs::Float64MultiArray)
    srvSaveMap | ros::ServiceServer | service server for saving the map
    gpsQueue | std::deque<nav_msgs::Odometry> | queue of GPS messages
    cloudInfo | lio_sam::cloud_info | the feature message from featureExtraction; note this is not a queue: each message is processed as it arrives, so the processing latency should stay below the lidar publishing period or frames are dropped; the optimization itself only runs every mappingProcessInterval while lidar data arrives at 10 Hz
    cornerCloudKeyFrames | vector<pcl::PointCloud<PointType>::Ptr> | the data actually stored as the map: the corner points of every keyframe
    surfCloudKeyFrames | vector<pcl::PointCloud<PointType>::Ptr> | the data actually stored as the map: the surface points of every keyframe
    cloudKeyPoses3D | pcl::PointCloud<PointType>::Ptr | the data actually stored as the map: keyframe positions (no rotation)
    cloudKeyPoses6D | pcl::PointCloud<PointTypePose>::Ptr | the data actually stored as the map: keyframe poses (with rotation); the positions match cloudKeyPoses3D
    copy_cloudKeyPoses3D | pcl::PointCloud<PointType>::Ptr | temporary copy
    copy_cloudKeyPoses6D | pcl::PointCloud<PointTypePose>::Ptr | temporary copy
    laserCloudCornerLast | pcl::PointCloud<PointType>::Ptr | corner feature set from odoOptimization; the current frame's corner points extracted from cloud_info
    laserCloudSurfLast | pcl::PointCloud<PointType>::Ptr | surf feature set from odoOptimization; the current frame's surface points extracted from cloud_info
    laserCloudCornerLastDS | pcl::PointCloud<PointType>::Ptr | downsampled corner feature set from odoOptimization
    laserCloudSurfLastDS | pcl::PointCloud<PointType>::Ptr | downsampled surf feature set from odoOptimization
    laserCloudOri | pcl::PointCloud<PointType>::Ptr | during scan-to-map optimization, holds the selected feature points in the lidar frame
    coeffSel | pcl::PointCloud<PointType>::Ptr | during optimization, holds the Jacobian of the distance function with respect to each feature point after it has been mapped into the map frame
    laserCloudOriCornerVec | std::vector<PointType> | corner point holder for parallel (OpenMP) computation: the original corner feature points
    coeffSelCornerVec | std::vector<PointType> | corner coefficient holder for parallel computation: the Jacobians of the corner residuals
    laserCloudOriCornerFlag | std::vector<bool> | corner flag holder for parallel computation: marks which corner points produced a valid residual
    laserCloudOriSurfVec | std::vector<PointType> | surf point holder for parallel computation: the original surface feature points
    coeffSelSurfVec | std::vector<PointType> | surf coefficient holder for parallel computation: the Jacobians of the surface residuals
    laserCloudOriSurfFlag | std::vector<bool> | surf flag holder for parallel computation: marks which surface points produced a valid residual
    laserCloudMapContainer | map<int, pair<pcl::PointCloud<PointType>, pcl::PointCloud<PointType>>> | map cache; it is cleared once it holds 1000 keyframes, so it is not the global map; its only purpose is to cache keyframe clouds so the local map can be built quickly
    laserCloudCornerFromMap | pcl::PointCloud<PointType>::Ptr | all corner features of the local map, merged
    laserCloudSurfFromMap | pcl::PointCloud<PointType>::Ptr | all surface features of the local map, merged
    laserCloudCornerFromMapDS | pcl::PointCloud<PointType>::Ptr | downsampled corner features of the local map; this is what the scan-to-map optimization actually uses
    laserCloudSurfFromMapDS | pcl::PointCloud<PointType>::Ptr | downsampled surface features of the local map; this is what the scan-to-map optimization actually uses
    kdtreeCornerFromMap | pcl::KdTreeFLANN<PointType>::Ptr | kd-tree over the downsampled corner features of the local map
    kdtreeSurfFromMap | pcl::KdTreeFLANN<PointType>::Ptr | kd-tree over the downsampled surface features of the local map
    kdtreeSurroundingKeyPoses | pcl::KdTreeFLANN<PointType>::Ptr | kd-tree over the keyframe positions
    kdtreeHistoryKeyPoses | pcl::KdTreeFLANN<PointType>::Ptr | kd-tree over the keyframe positions, used by the loop-closure / visualization part
    downSizeFilterCorner | pcl::VoxelGrid<PointType> | downsampling filter
    downSizeFilterSurf | pcl::VoxelGrid<PointType> | downsampling filter
    downSizeFilterICP | pcl::VoxelGrid<PointType> | downsampling filter
    downSizeFilterSurroundingKeyPoses | pcl::VoxelGrid<PointType> | downsampling filter for the surrounding key poses of scan-to-map optimization
    timeLaserInfoStamp | ros::Time | timestamp of cloud_info; after passing through imageProjection and featureExtraction it is already somewhat stale
    timeLaserInfoCur | double | the same cloud_info timestamp, in seconds
    transformTobeMapped[6] | float | the pose variable being optimized; the whole file revolves around it
    mtx | std::mutex | mutex
    mtxLoopInfo | std::mutex | mutex for the loop-closure data
    isDegenerate | bool | set in LMOptimization when some eigenvalue of J^T J falls below the threshold (100); when set, the pose update is projected through matP
    matP | cv::Mat | projection matrix built from the eigenvectors of J^T J, used to zero the update along degenerate directions
    laserCloudCornerFromMapDSNum | int | number of points in the downsampled local corner map
    laserCloudSurfFromMapDSNum | int | number of points in the downsampled local surface map
    laserCloudCornerLastDSNum | int | number of downsampled corner points in the current frame
    laserCloudSurfLastDSNum | int | number of downsampled surface points in the current frame
    aLoopIsClosed | bool | flag indicating that a loop closure was added, so the historical poses need to be corrected in correctPoses()
    loopIndexContainer | map<int, int> | bookkeeping of detected loop-closure index pairs
    loopPoseQueue | vector<gtsam::Pose3> | queue of loop-closure relative poses
    loopNoiseQueue | vector<gtsam::noiseModel::Diagonal::shared_ptr> | queue of loop-closure noise models
    loopInfoVec | deque<std_msgs::Float64MultiArray> | queue of externally detected loop-closure messages
    globalPath | nav_msgs::Path | the global path that gets published
    transPointAssociateToMap | Eigen::Affine3f | transform used to map lidar points into the map frame; it mirrors the current value of transformTobeMapped
    incrementalOdometryAffineFront | Eigen::Affine3f | the value of transformTobeMapped before this file's optimization runs
    incrementalOdometryAffineBack | Eigen::Affine3f | the value of transformTobeMapped after the optimization; the difference of the two transforms is the per-cycle pose increment (the "velocity" of the constant-velocity model) used when publishing the odometry_incremental topic
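
    The Front/Back pair above is what publishOdometry() (not listed in these notes) uses to build the incremental odometry. A minimal sketch of that composition, with illustrative names rather than the exact source code:

    #include <Eigen/Geometry>

    // Compose the pose change found by one optimization cycle onto a separately
    // accumulated incremental odometry (hypothetical helper, not the author's exact code).
    Eigen::Affine3f composeIncrementalOdom(const Eigen::Affine3f& accumulatedIncreOdom,
                                           const Eigen::Affine3f& affineFront,  // transformTobeMapped before optimization
                                           const Eigen::Affine3f& affineBack)   // transformTobeMapped after optimization
    {
        // per-cycle increment: how much this round of scan-to-map optimization moved the pose
        Eigen::Affine3f affineIncre = affineFront.inverse() * affineBack;
        return accumulatedIncreOdom * affineIncre;  // push the incremental odometry forward
    }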

    Code walkthrough

    The main logic of this file starts from the cloud_info callback:

    /**
     * @brief Callback for cloud_info
     *
     * @param msgIn
     */
    void laserCloudInfoHandler(const lio_sam::cloud_infoConstPtr& msgIn)
    {
        // extract time stamp
        timeLaserInfoStamp = msgIn->header.stamp;
        timeLaserInfoCur = msgIn->header.stamp.toSec();
        // extract info and feature cloud
        cloudInfo = *msgIn;
        pcl::fromROSMsg(msgIn->cloud_corner,  *laserCloudCornerLast); // corner points of the current frame
        pcl::fromROSMsg(msgIn->cloud_surface, *laserCloudSurfLast);   // surface points of the current frame
        std::lock_guard<std::mutex> lock(mtx);
        static double timeLastProcessing = -1;
        // regulate mapping frequency: mapOptimization runs once per mappingProcessInterval, slower than the lidar rate
        if (timeLaserInfoCur - timeLastProcessing >= mappingProcessInterval)
        {
            timeLastProcessing = timeLaserInfoCur;
            // Save the previous optimization result transformTobeMapped into incrementalOdometryAffineFront,
            // then initialize the current lidar pose transformTobeMapped from the IMU-preintegration odometry,
            // or (rotation only) from the raw IMU orientation.
            updateInitialGuess();
            // Search cloudKeyPoses3D for keyframes within 50 m of cloudKeyPoses3D->back(),
            // build the local corner and surface maps,
            // and store them in laserCloudCornerFromMapDS and laserCloudSurfFromMapDS.
            extractSurroundingKeyFrames();
            // Downsample the corner and surface points of the current frame.
            downsampleCurrentScan();
            // Optimize the current frame's pose against the local map.
            scan2MapOptimization();
            // Add factors to the graph and save the optimized keyframe.
            saveKeyFramesAndFactor();
            // If a loop closure happened, the corresponding historical poses need to be updated.
            correctPoses();
            publishOdometry();
            publishFrames();
        }
    }

    Any optimization problem needs an initial value. In LOAM and LeGO-LOAM the initial value for scan-to-map matching comes from scan-to-scan matching; with IMU data available, the IMU preintegration result can serve as the scan-to-map initial value instead.

    /**
     * @brief Save the previous optimization result transformTobeMapped into incrementalOdometryAffineFront,
     *        then initialize the current lidar pose transformTobeMapped from the IMU-preintegration
     *        odometry, or (rotation only) from the raw IMU orientation.
     */
    void updateInitialGuess()
    {
        // save current transformation before any processing
        incrementalOdometryAffineFront = trans2Affine3f(transformTobeMapped); // keep the previous optimization result
        static Eigen::Affine3f lastImuTransformation; // IMU orientation recorded at the previous frame
        // initialization
        if (cloudKeyPoses3D->points.empty()) // no keyframes in the map yet: this is the initial pose
        {
            // at start-up, take the robot orientation from the IMU
            transformTobeMapped[0] = cloudInfo.imuRollInit;
            transformTobeMapped[1] = cloudInfo.imuPitchInit;
            transformTobeMapped[2] = cloudInfo.imuYawInit;
            // parameter from the yaml file: if using GPS data, set to "true";
            // if the IMU heading is not used for initialization, zero the yaw
            if (!useImuHeadingInitialization)
                transformTobeMapped[2] = 0;
            lastImuTransformation = pcl::getTransformation(0, 0, 0, cloudInfo.imuRollInit, cloudInfo.imuPitchInit, cloudInfo.imuYawInit); // save imu before return;
            return;
        }
        // use imu pre-integration estimation for pose guess
        static bool lastImuPreTransAvailable = false;
        // stores the IMU-preintegration odometry, linking two consecutive optimizations
        static Eigen::Affine3f lastImuPreTransformation;
        if (cloudInfo.odomAvailable == true)
        {
            // a XXXFront / XXXBack pair like this always ends up being differenced:
            // it represents the pose change from the previous time to the current time
            Eigen::Affine3f transBack = pcl::getTransformation(cloudInfo.initialGuessX, cloudInfo.initialGuessY, cloudInfo.initialGuessZ,
                                                               cloudInfo.initialGuessRoll, cloudInfo.initialGuessPitch, cloudInfo.initialGuessYaw);
            if (lastImuPreTransAvailable == false)
            {   // first time: just record it
                lastImuPreTransformation = transBack;
                lastImuPreTransAvailable = true;
            } else {
                // pose change between the two optimizations: delta_T = T_i^{-1} * T_j
                Eigen::Affine3f transIncre = lastImuPreTransformation.inverse() * transBack;
                Eigen::Affine3f transTobe = trans2Affine3f(transformTobeMapped);
                // T_j'' = T_i' * delta_T, where T_j'' is the initial value of the variable to be optimized
                Eigen::Affine3f transFinal = transTobe * transIncre;
                pcl::getTranslationAndEulerAngles(transFinal, transformTobeMapped[3], transformTobeMapped[4], transformTobeMapped[5],
                                                              transformTobeMapped[0], transformTobeMapped[1], transformTobeMapped[2]);
                lastImuPreTransformation = transBack; // keep carrying the IMU-preintegration odometry forward
                lastImuTransformation = pcl::getTransformation(0, 0, 0, cloudInfo.imuRollInit, cloudInfo.imuPitchInit, cloudInfo.imuYawInit); // save imu before return;
                return;
            }
        }
        // use imu incremental estimation for pose guess (only rotation)
        if (cloudInfo.imuAvailable == true)
        {
            Eigen::Affine3f transBack = pcl::getTransformation(0, 0, 0, cloudInfo.imuRollInit, cloudInfo.imuPitchInit, cloudInfo.imuYawInit);
            Eigen::Affine3f transIncre = lastImuTransformation.inverse() * transBack;
            Eigen::Affine3f transTobe = trans2Affine3f(transformTobeMapped);
            Eigen::Affine3f transFinal = transTobe * transIncre;
            pcl::getTranslationAndEulerAngles(transFinal, transformTobeMapped[3], transformTobeMapped[4], transformTobeMapped[5],
                                                          transformTobeMapped[0], transformTobeMapped[1], transformTobeMapped[2]);
            lastImuTransformation = pcl::getTransformation(0, 0, 0, cloudInfo.imuRollInit, cloudInfo.imuPitchInit, cloudInfo.imuYawInit); // save imu before return;
            return;
        }
    }

    Before scan-to-map optimization can run we first need a map. Of course it cannot be the global map: that would blow up memory and computation time, and a global map is unnecessary anyway. Instead, the keyframes within 50 m of the most recent keyframe are used as a local map against which the current frame's pose is optimized.

    /**
     * This function simply calls extractNearby().
     */
    void extractSurroundingKeyFrames()
    {
        if (cloudKeyPoses3D->points.empty() == true)
            return;
        // ?
        // if (loopClosureEnableFlag == true)
        // {
        //     extractForLoopClosure();
        // } else {
        //     extractNearby();
        // }
        extractNearby();
    }

    The function above simply calls extractNearby(), which searches cloudKeyPoses3D for keyframes within 50 m of cloudKeyPoses3D->back() and builds the local map from them.

    /**
     * @brief Search cloudKeyPoses3D for keyframes within 50 m of cloudKeyPoses3D->back() and build the local map.
     */
    void extractNearby()
    {   // keyframe poses (positions) found within the 50 m radius are stored in these clouds
        pcl::PointCloud<PointType>::Ptr surroundingKeyPoses(new pcl::PointCloud<PointType>());
        pcl::PointCloud<PointType>::Ptr surroundingKeyPosesDS(new pcl::PointCloud<PointType>());
        // temporary variables for the kd-tree search
        std::vector<int> pointSearchInd;
        std::vector<float> pointSearchSqDis;
        // extract all the nearby key poses and downsample them
        kdtreeSurroundingKeyPoses->setInputCloud(cloudKeyPoses3D); // create kd-tree
        // search around cloudKeyPoses3D->back() within surroundingKeyframeSearchRadius (default 50 m)
        kdtreeSurroundingKeyPoses->radiusSearch(cloudKeyPoses3D->back(), (double)surroundingKeyframeSearchRadius, pointSearchInd, pointSearchSqDis);
        for (int i = 0; i < (int)pointSearchInd.size(); ++i)
        {
            int id = pointSearchInd[i];
            surroundingKeyPoses->push_back(cloudKeyPoses3D->points[id]);
        }
        downSizeFilterSurroundingKeyPoses.setInputCloud(surroundingKeyPoses);
        // after downsampling the points are no longer exactly the original ones (some are merged),
        // so the keyframe index stored in intensity has to be re-fetched
        downSizeFilterSurroundingKeyPoses.filter(*surroundingKeyPosesDS);
        for (auto& pt : surroundingKeyPosesDS->points)
        {   // search again and copy the intensity of the nearest original pose to the new point;
            // intensity here is the keyframe index, which is also the key into laserCloudMapContainer
            kdtreeSurroundingKeyPoses->nearestKSearch(pt, 1, pointSearchInd, pointSearchSqDis);
            pt.intensity = cloudKeyPoses3D->points[pointSearchInd[0]].intensity;
        }
        // also extract some latest key frames in case the robot rotates in one position
        int numPoses = cloudKeyPoses3D->size();
        for (int i = numPoses - 1; i >= 0; --i)
        {
            // after filtering, also append poses that are close in time (within 10 s)
            if (timeLaserInfoCur - cloudKeyPoses6D->points[i].time < 10.0)
                surroundingKeyPosesDS->push_back(cloudKeyPoses3D->points[i]);
            else
                break;
        }
        extractCloud(surroundingKeyPosesDS);
    }

    extractCloud() is then called to collect the corner and surface clouds of the keyframe poses within the 50 m radius and store them in laserCloudCornerFromMapDS and laserCloudSurfFromMapDS.

    /**
     * @brief Collect the corner and surface clouds of the keyframes within the 50 m radius
     *        and store them in laserCloudCornerFromMapDS and laserCloudSurfFromMapDS.
     *
     * @param cloudToExtract
     */
    void extractCloud(pcl::PointCloud<PointType>::Ptr cloudToExtract)
    {
        // fuse the map
        laserCloudCornerFromMap->clear();
        laserCloudSurfFromMap->clear();
        for (int i = 0; i < (int)cloudToExtract->size(); ++i)
        {   // keep only keyframes within the search radius of the latest keyframe
            if (pointDistance(cloudToExtract->points[i], cloudKeyPoses3D->back()) > surroundingKeyframeSearchRadius)
                continue;
            int thisKeyInd = (int)cloudToExtract->points[i].intensity;
            if (laserCloudMapContainer.find(thisKeyInd) != laserCloudMapContainer.end())
            {   // local corner map and local surface map
                // transformed cloud available
                *laserCloudCornerFromMap += laserCloudMapContainer[thisKeyInd].first;
                *laserCloudSurfFromMap   += laserCloudMapContainer[thisKeyInd].second;
            } else {
                // transformed cloud not available
                // transform the clouds from the lidar frame into the world (map) frame
                pcl::PointCloud<PointType> laserCloudCornerTemp = *transformPointCloud(cornerCloudKeyFrames[thisKeyInd], &cloudKeyPoses6D->points[thisKeyInd]);
                pcl::PointCloud<PointType> laserCloudSurfTemp   = *transformPointCloud(surfCloudKeyFrames[thisKeyInd],   &cloudKeyPoses6D->points[thisKeyInd]);
                *laserCloudCornerFromMap += laserCloudCornerTemp;
                *laserCloudSurfFromMap   += laserCloudSurfTemp;
                // add them to the map cache
                laserCloudMapContainer[thisKeyInd] = make_pair(laserCloudCornerTemp, laserCloudSurfTemp);
            }
        }
        // Downsample the surrounding corner key frames (or map)
        downSizeFilterCorner.setInputCloud(laserCloudCornerFromMap);
        downSizeFilterCorner.filter(*laserCloudCornerFromMapDS);
        laserCloudCornerFromMapDSNum = laserCloudCornerFromMapDS->size();
        // Downsample the surrounding surf key frames (or map)
        downSizeFilterSurf.setInputCloud(laserCloudSurfFromMap);
        downSizeFilterSurf.filter(*laserCloudSurfFromMapDS);
        laserCloudSurfFromMapDSNum = laserCloudSurfFromMapDS->size();
        // clear map cache if too large (keeps at most 1000 keyframes)
        if (laserCloudMapContainer.size() > 1000)
            laserCloudMapContainer.clear();
    }
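
    The helper transformPointCloud() used above is not listed in these notes. A simplified, functionally equivalent sketch (the real implementation loops over the points with OpenMP rather than calling pcl::transformPointCloud; PointType and PointTypePose are the project's point types):

    #include <pcl/common/transforms.h>  // pcl::getTransformation, pcl::transformPointCloud

    // Transform one keyframe's feature cloud from its body frame into the map frame
    // using the keyframe's 6-DoF pose (simplified stand-in for the original helper).
    pcl::PointCloud<PointType>::Ptr transformPointCloudSketch(
            const pcl::PointCloud<PointType>::Ptr& cloudIn, const PointTypePose* pose)
    {
        pcl::PointCloud<PointType>::Ptr cloudOut(new pcl::PointCloud<PointType>());
        Eigen::Affine3f T = pcl::getTransformation(pose->x, pose->y, pose->z,
                                                   pose->roll, pose->pitch, pose->yaw);
        pcl::transformPointCloud(*cloudIn, *cloudOut, T);  // keyframe/lidar frame -> map frame
        return cloudOut;
    }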

    With an initial guess and a local map in place, the scan-to-map matching optimization can run.

    It proceeds in three steps (a condensed sketch of the driver loop is given after this list): 1. Using the initial guess, transform the corner points into the map frame, compute the point-to-line residuals and their Jacobians with respect to the transformed points, and store them.

    2. Using the initial guess, transform the surface points into the map frame, compute the point-to-plane residuals and their Jacobians with respect to the transformed points, and store them.

    3. Compute the Jacobian of each transformed point with respect to R and t of the transform, then update the pose variables with a Gauss-Newton step.
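
    The driver that ties these three steps together is the member function scan2MapOptimization(); a condensed sketch (the feature-count sanity check and the final IMU roll/pitch fusion inside transformUpdate() are abridged here):

    void scan2MapOptimization()
    {
        if (cloudKeyPoses3D->points.empty())
            return;
        // kd-trees over the downsampled local maps, used by the two per-feature steps
        kdtreeCornerFromMap->setInputCloud(laserCloudCornerFromMapDS);
        kdtreeSurfFromMap->setInputCloud(laserCloudSurfFromMapDS);
        for (int iterCount = 0; iterCount < 30; iterCount++)  // Gauss-Newton iterations
        {
            laserCloudOri->clear();
            coeffSel->clear();
            cornerOptimization();         // step 1: point-to-line residuals and Jacobians
            surfOptimization();           // step 2: point-to-plane residuals and Jacobians
            combineOptimizationCoeffs();  // gather the valid residuals into laserCloudOri / coeffSel
            if (LMOptimization(iterCount) == true)  // step 3: solve for the 6-DoF update
                break;                              // converged
        }
        transformUpdate();  // fuse IMU roll/pitch into the result (not covered in these notes)
    }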

    Here is a detailed walkthrough of the computation. This part probably comes from LOAM or other papers, which is why the author of the code did not explain it in detail:

    (1) First prepare the following quantities. From the initial value transformTobeMapped we obtain the initial rotation angles \theta_{x},\theta_{y},\theta_{z} and the translation t_x,t_y,t_z,

    so the rotation matrix is

    R=R_{x}R_{y}R_{z}=\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta_x & -\sin\theta_x \\ 0 & \sin\theta_x & \cos\theta_x \end{bmatrix}\begin{bmatrix} \cos\theta_y & 0 & \sin\theta_y \\ 0 & 1 & 0 \\ -\sin\theta_y & 0 & \cos\theta_y \end{bmatrix}\begin{bmatrix} \cos\theta_z & -\sin\theta_z & 0 \\ \sin\theta_z & \cos\theta_z & 0 \\ 0 & 0 & 1 \end{bmatrix}

    (2) The paper gives the point-to-line distance formula, which is reused here directly; LOAM explains where it comes from, and the goal here is only to understand the code and where its variables come from.

    The point-to-line distance is d_{ek} =\frac{\left \| (p^e_{i+1,k} - p^e_{i,u} )\times (p^e_{i+1,k} - p^e_{i,v} )\right \|}{\left \| p^e_{i,u} - p^e_{i,v} \right \|}. To simplify the notation and match the variable names in the code, it is written as d =\frac{\left \| (p_0 - p_1 )\times (p_0 - p_2)\right \|}{\left \| p_1- p_2\right \|}. First we need to know where these points come from: p_0 is the feature point transformed from the lidar frame into the map frame, i.e. p_0 = Rp^e + t, while p_1 and p_2 come from principal component analysis (PCA). The 5 points in the map closest to p_0 are found; if one eigenvalue of their covariance matrix is much larger than the other two, the points are distributed along a line, and the eigenvector of the largest eigenvalue is the line direction. p_1 and p_2 are then taken on either side of the centroid of the 5 points along that direction.
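
    As a standalone illustration of this PCA step (the real cornerOptimization below builds the covariance matrix by hand and uses cv::eigen; this sketch does the same with Eigen, and the names are illustrative):

    #include <Eigen/Dense>
    #include <vector>

    // Fit a line to the 5 nearest map points and return the two anchor points p1, p2.
    bool fitLineFromNeighbors(const std::vector<Eigen::Vector3f>& pts,  // the 5 nearest map points
                              Eigen::Vector3f& p1, Eigen::Vector3f& p2)
    {
        Eigen::Vector3f center = Eigen::Vector3f::Zero();
        for (const auto& p : pts) center += p;
        center /= static_cast<float>(pts.size());

        Eigen::Matrix3f cov = Eigen::Matrix3f::Zero();
        for (const auto& p : pts) {
            Eigen::Vector3f d = p - center;
            cov += d * d.transpose();
        }
        cov /= static_cast<float>(pts.size());

        Eigen::SelfAdjointEigenSolver<Eigen::Matrix3f> es(cov);    // eigenvalues in ascending order
        // line-like distribution: the largest eigenvalue clearly dominates the second one
        if (es.eigenvalues()(2) < 3.0f * es.eigenvalues()(1))
            return false;
        Eigen::Vector3f dir = es.eigenvectors().col(2);            // line direction
        p1 = center + 0.1f * dir;                                  // two anchor points on the line
        p2 = center - 0.1f * dir;
        return true;
    }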

    (3) First, the Jacobian of p_0 with respect to R and t. Both the surface and corner features go through this same transform, so this Jacobian is identical for both, and the code also implements it in this separated way.

    \frac{\partial p_0}{\partial \theta}=\frac{\partial (Rp^{e}+t)}{\partial \theta}=\lim_{\theta \to 0}\frac{\exp([\theta]_{\times})Rp^{e}-Rp^{e}}{\theta}\approx \lim_{\theta \to 0}\frac{(I+[\theta]_{\times})Rp^{e}-Rp^{e}}{\theta}=\frac{[\theta]_{\times}Rp^{e}}{\theta}=\frac{-[Rp^{e}]_{\times}\theta}{\theta}=-[Rp^{e}]_{\times}

    t的雅可比 为I

    (4) In cornerOptimization the point-to-line residual is stored in coeff.intensity, and the Jacobian is stored in the x, y, z fields of coeff. We now derive the Jacobian of the point-to-line residual d with respect to p_0.

    1) First note that \frac{\partial \left \| x \right \|}{\partial x}=\frac{x^{T}}{\left \| x \right \|}; you can verify this by differentiating the norm of (x, y, z) by hand.

    2) \frac{\partial d}{\partial p_0}=\frac{1}{\left \| p_1-p_2 \right \|}\frac{\partial \left \| (p_0-p_1)\times (p_0-p_2) \right \|}{\partial p_0}

    =\frac{1}{\left \| p_1-p_2 \right \|}\frac{\left [ (p_0-p_1)\times (p_0-p_2) \right ]^{T}}{\left \| (p_0-p_1)\times (p_0-p_2) \right \|}\frac{\partial \left [ (p_0-p_1)\times (p_0-p_2) \right ]}{\partial p_0}

    =\frac{1}{\left \| p_1-p_2 \right \|}\frac{\left [ (p_0-p_1)\times (p_0-p_2) \right ]^{T}}{\left \| (p_0-p_1)\times (p_0-p_2) \right \|}\left ( -[p_0-p_2]_{\times}\frac{\partial (p_0-p_1)}{\partial p_0}+[p_0-p_1]_{\times}\frac{\partial (p_0-p_2)}{\partial p_0} \right )=\frac{1}{\left \| p_1-p_2 \right \|}\frac{\left [ (p_0-p_1)\times (p_0-p_2) \right ]^{T}}{\left \| (p_0-p_1)\times (p_0-p_2) \right \|}\left ( -[p_0-p_2]_{\times}+[p_0-p_1]_{\times} \right )

    =-\frac{1}{\left \| p_1-p_2 \right \|}\frac{\left [ (p_0-p_1)\times (p_0-p_2) \right ]^{T}}{\left \| (p_0-p_1)\times (p_0-p_2) \right \|}[p_1-p_2]_{\times}

    Then each term can be matched to the code one by one, which is easier to follow than reading the code alone. The point-to-plane derivation is omitted here (laziness).
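
    Mapping the result onto cornerOptimization below (my reading of the code, worth cross-checking): ld2 is the distance d itself, and (la, lb, lc) are the three components of \frac{\partial d}{\partial p_0}, so that

    coeff = s\cdot\left ( \frac{\partial d}{\partial x_0},\ \frac{\partial d}{\partial y_0},\ \frac{\partial d}{\partial z_0} \right ), \qquad coeff.intensity = s\cdot d,

    where s = 1-0.9\left | d \right | is the kernel-like weight.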

    void cornerOptimization()
    {   // convert the initialized pose transformTobeMapped into the transform matrix transPointAssociateToMap,
        // used to map the current frame's points into the map frame
        updatePointAssociateToMap();
        #pragma omp parallel for num_threads(numberOfCores)
        for (int i = 0; i < laserCloudCornerLastDSNum; i++)
        {
            // pointOri: feature point in the lidar frame
            // pointSel: pointOri transformed into the map frame
            // coeff:    Jacobian of the distance function with respect to pointSel
            PointType pointOri, pointSel, coeff;
            // temporary variables for the kd-tree nearest-neighbour search
            std::vector<int> pointSearchInd;
            std::vector<float> pointSearchSqDis;
            pointOri = laserCloudCornerLastDS->points[i]; // feature point of the current frame
            // map the point from the lidar frame into the map frame
            pointAssociateToMap(&pointOri, &pointSel);
            // find the 5 nearest points to pointSel in the local-map kd-tree
            kdtreeCornerFromMap->nearestKSearch(pointSel, 5, pointSearchInd, pointSearchSqDis);
            cv::Mat matA1(3, 3, CV_32F, cv::Scalar::all(0)); // matA1: covariance matrix of the 5 points
            cv::Mat matD1(1, 3, CV_32F, cv::Scalar::all(0)); // matD1: eigenvalues of the covariance matrix
            cv::Mat matV1(3, 3, CV_32F, cv::Scalar::all(0)); // matV1: eigenvectors of the covariance matrix
            // the 5 neighbours are considered valid only if the farthest of them is within 1 m
            if (pointSearchSqDis[4] < 1.0) {
                // C = [cx, cy, cz]^T is the centroid of the 5 points
                float cx = 0, cy = 0, cz = 0;
                for (int j = 0; j < 5; j++) {
                    cx += laserCloudCornerFromMapDS->points[pointSearchInd[j]].x;
                    cy += laserCloudCornerFromMapDS->points[pointSearchInd[j]].y;
                    cz += laserCloudCornerFromMapDS->points[pointSearchInd[j]].z;
                }
                cx /= 5; cy /= 5; cz /= 5;
                float a11 = 0, a12 = 0, a13 = 0, a22 = 0, a23 = 0, a33 = 0;
                for (int j = 0; j < 5; j++) {
                    float ax = laserCloudCornerFromMapDS->points[pointSearchInd[j]].x - cx;
                    float ay = laserCloudCornerFromMapDS->points[pointSearchInd[j]].y - cy;
                    float az = laserCloudCornerFromMapDS->points[pointSearchInd[j]].z - cz;
                    a11 += ax * ax; a12 += ax * ay; a13 += ax * az;
                    a22 += ay * ay; a23 += ay * az;
                    a33 += az * az;
                }
                a11 /= 5; a12 /= 5; a13 /= 5; a22 /= 5; a23 /= 5; a33 /= 5;
                matA1.at<float>(0, 0) = a11; matA1.at<float>(0, 1) = a12; matA1.at<float>(0, 2) = a13;
                matA1.at<float>(1, 0) = a12; matA1.at<float>(1, 1) = a22; matA1.at<float>(1, 2) = a23;
                matA1.at<float>(2, 0) = a13; matA1.at<float>(2, 1) = a23; matA1.at<float>(2, 2) = a33;
                // compute the eigenvalues and eigenvectors of matA1
                cv::eigen(matA1, matD1, matV1);
                // matD1 is sorted in descending order; if the first eigenvalue is sufficiently larger than the second,
                // the 5 points are distributed along a line, and the corresponding eigenvector is the line direction
                if (matD1.at<float>(0, 0) > 3 * matD1.at<float>(0, 1)) {
                    // construct 3 points so the distance from one point to the line through the other two can be computed
                    // 1. the current-frame point mapped into the map frame
                    float x0 = pointSel.x;
                    float y0 = pointSel.y;
                    float z0 = pointSel.z;
                    // 2. a point 0.1 along the line direction from the centroid of the 5 points
                    float x1 = cx + 0.1 * matV1.at<float>(0, 0);
                    float y1 = cy + 0.1 * matV1.at<float>(0, 1);
                    float z1 = cz + 0.1 * matV1.at<float>(0, 2);
                    // 3. a point 0.1 along the opposite direction from the centroid
                    float x2 = cx - 0.1 * matV1.at<float>(0, 0);
                    float y2 = cy - 0.1 * matV1.at<float>(0, 1);
                    float z2 = cz - 0.1 * matV1.at<float>(0, 2);
                    float a012 = sqrt(((x0 - x1)*(y0 - y2) - (x0 - x2)*(y0 - y1)) * ((x0 - x1)*(y0 - y2) - (x0 - x2)*(y0 - y1))
                                    + ((x0 - x1)*(z0 - z2) - (x0 - x2)*(z0 - z1)) * ((x0 - x1)*(z0 - z2) - (x0 - x2)*(z0 - z1))
                                    + ((y0 - y1)*(z0 - z2) - (y0 - y2)*(z0 - z1)) * ((y0 - y1)*(z0 - z2) - (y0 - y2)*(z0 - z1)));
                    float l12 = sqrt((x1 - x2)*(x1 - x2) + (y1 - y2)*(y1 - y2) + (z1 - z2)*(z1 - z2));
                    float la = ((y1 - y2)*((x0 - x1)*(y0 - y2) - (x0 - x2)*(y0 - y1))
                              + (z1 - z2)*((x0 - x1)*(z0 - z2) - (x0 - x2)*(z0 - z1))) / a012 / l12;
                    float lb = -((x1 - x2)*((x0 - x1)*(y0 - y2) - (x0 - x2)*(y0 - y1))
                               - (z1 - z2)*((y0 - y1)*(z0 - z2) - (y0 - y2)*(z0 - z1))) / a012 / l12;
                    float lc = -((x1 - x2)*((x0 - x1)*(z0 - z2) - (x0 - x2)*(z0 - z1))
                               + (y1 - y2)*((y0 - y1)*(z0 - z2) - (y0 - y2)*(z0 - z1))) / a012 / l12;
                    float ld2 = a012 / l12;
                    // weight the residual with a factor, similar to a robust kernel
                    float s = 1 - 0.9 * fabs(ld2);
                    coeff.x = s * la;
                    coeff.y = s * lb;
                    coeff.z = s * lc;
                    coeff.intensity = s * ld2;
                    if (s > 0.1) {
                        laserCloudOriCornerVec[i] = pointOri;
                        coeffSelCornerVec[i] = coeff;
                        laserCloudOriCornerFlag[i] = true;
                    }
                }
            }
        }
    }

    void surfOptimization()
    {
        updatePointAssociateToMap();
        #pragma omp parallel for num_threads(numberOfCores)
        for (int i = 0; i < laserCloudSurfLastDSNum; i++)
        {
            PointType pointOri, pointSel, coeff;
            std::vector<int> pointSearchInd;
            std::vector<float> pointSearchSqDis;
            pointOri = laserCloudSurfLastDS->points[i];
            pointAssociateToMap(&pointOri, &pointSel);
            kdtreeSurfFromMap->nearestKSearch(pointSel, 5, pointSearchInd, pointSearchSqDis);
            Eigen::Matrix<float, 5, 3> matA0;
            Eigen::Matrix<float, 5, 1> matB0;
            Eigen::Vector3f matX0;
            matA0.setZero();
            matB0.fill(-1);
            matX0.setZero();
            if (pointSearchSqDis[4] < 1.0) {
                for (int j = 0; j < 5; j++) {
                    matA0(j, 0) = laserCloudSurfFromMapDS->points[pointSearchInd[j]].x;
                    matA0(j, 1) = laserCloudSurfFromMapDS->points[pointSearchInd[j]].y;
                    matA0(j, 2) = laserCloudSurfFromMapDS->points[pointSearchInd[j]].z;
                }
                // fit the plane ax + by + cz + 1 = 0 through the 5 points
                matX0 = matA0.colPivHouseholderQr().solve(matB0);
                float pa = matX0(0, 0);
                float pb = matX0(1, 0);
                float pc = matX0(2, 0);
                float pd = 1;
                // normalize the plane coefficients
                float ps = sqrt(pa * pa + pb * pb + pc * pc);
                pa /= ps; pb /= ps; pc /= ps; pd /= ps;
                // the plane is valid only if all 5 points lie within 0.2 m of it
                bool planeValid = true;
                for (int j = 0; j < 5; j++) {
                    if (fabs(pa * laserCloudSurfFromMapDS->points[pointSearchInd[j]].x +
                             pb * laserCloudSurfFromMapDS->points[pointSearchInd[j]].y +
                             pc * laserCloudSurfFromMapDS->points[pointSearchInd[j]].z + pd) > 0.2) {
                        planeValid = false;
                        break;
                    }
                }
                if (planeValid) {
                    // point-to-plane distance
                    float pd2 = pa * pointSel.x + pb * pointSel.y + pc * pointSel.z + pd;
                    float s = 1 - 0.9 * fabs(pd2) / sqrt(sqrt(pointOri.x * pointOri.x
                            + pointOri.y * pointOri.y + pointOri.z * pointOri.z));
                    coeff.x = s * pa;
                    coeff.y = s * pb;
                    coeff.z = s * pc;
                    coeff.intensity = s * pd2;
                    if (s > 0.1) {
                        laserCloudOriSurfVec[i] = pointOri;
                        coeffSelSurfVec[i] = coeff;
                        laserCloudOriSurfFlag[i] = true;
                    }
                }
            }
        }
    }

    void combineOptimizationCoeffs()
    {
        // combine corner coeffs
        for (int i = 0; i < laserCloudCornerLastDSNum; ++i){
            if (laserCloudOriCornerFlag[i] == true){
                laserCloudOri->push_back(laserCloudOriCornerVec[i]);
                coeffSel->push_back(coeffSelCornerVec[i]);
            }
        }
        // combine surf coeffs
        for (int i = 0; i < laserCloudSurfLastDSNum; ++i){
            if (laserCloudOriSurfFlag[i] == true){
                laserCloudOri->push_back(laserCloudOriSurfVec[i]);
                coeffSel->push_back(coeffSelSurfVec[i]);
            }
        }
        // reset flag for next iteration
        std::fill(laserCloudOriCornerFlag.begin(), laserCloudOriCornerFlag.end(), false);
        std::fill(laserCloudOriSurfFlag.begin(), laserCloudOriSurfFlag.end(), false);
    }

    bool LMOptimization(int iterCount)
    {
        // This optimization is from the original loam_velodyne by Ji Zhang, need to cope with coordinate transformation
        // lidar <- camera      ---     camera <- lidar
        // x = z                ---     x = y
        // y = x                ---     y = z
        // z = y                ---     z = x
        // roll = yaw           ---     roll = pitch
        // pitch = roll         ---     pitch = yaw
        // yaw = pitch          ---     yaw = roll

        // lidar -> camera
        float srx = sin(transformTobeMapped[1]);
        float crx = cos(transformTobeMapped[1]);
        float sry = sin(transformTobeMapped[2]);
        float cry = cos(transformTobeMapped[2]);
        float srz = sin(transformTobeMapped[0]);
        float crz = cos(transformTobeMapped[0]);

        int laserCloudSelNum = laserCloudOri->size();
        if (laserCloudSelNum < 50) {
            return false;
        }

        cv::Mat matA(laserCloudSelNum, 6, CV_32F, cv::Scalar::all(0));
        cv::Mat matAt(6, laserCloudSelNum, CV_32F, cv::Scalar::all(0));
        cv::Mat matAtA(6, 6, CV_32F, cv::Scalar::all(0));
        cv::Mat matB(laserCloudSelNum, 1, CV_32F, cv::Scalar::all(0));
        cv::Mat matAtB(6, 1, CV_32F, cv::Scalar::all(0));
        cv::Mat matX(6, 1, CV_32F, cv::Scalar::all(0));

        PointType pointOri, coeff;

        for (int i = 0; i < laserCloudSelNum; i++) {
            // swap the lidar frame (x-y-z: forward-left-up) into the camera-style frame (x-y-z: left-up-forward)
            // used by the original LOAM derivation (is this done purely to reuse the original code?)
            // lidar -> camera
            pointOri.x = laserCloudOri->points[i].y;
            pointOri.y = laserCloudOri->points[i].z;
            pointOri.z = laserCloudOri->points[i].x;
            // lidar -> camera
            coeff.x = coeffSel->points[i].y;
            coeff.y = coeffSel->points[i].z;
            coeff.z = coeffSel->points[i].x;
            coeff.intensity = coeffSel->points[i].intensity;
            // in camera
            float arx = (crx*sry*srz*pointOri.x + crx*crz*sry*pointOri.y - srx*sry*pointOri.z) * coeff.x
                      + (-srx*srz*pointOri.x - crz*srx*pointOri.y - crx*pointOri.z) * coeff.y
                      + (crx*cry*srz*pointOri.x + crx*cry*crz*pointOri.y - cry*srx*pointOri.z) * coeff.z;
            float ary = ((cry*srx*srz - crz*sry)*pointOri.x
                      + (sry*srz + cry*crz*srx)*pointOri.y + crx*cry*pointOri.z) * coeff.x
                      + ((-cry*crz - srx*sry*srz)*pointOri.x
                      + (cry*srz - crz*srx*sry)*pointOri.y - crx*sry*pointOri.z) * coeff.z;
            float arz = ((crz*srx*sry - cry*srz)*pointOri.x + (-cry*crz-srx*sry*srz)*pointOri.y)*coeff.x
                      + (crx*crz*pointOri.x - crx*srz*pointOri.y) * coeff.y
                      + ((sry*srz + cry*crz*srx)*pointOri.x + (crz*sry-cry*srx*srz)*pointOri.y)*coeff.z;
            // camera -> lidar
            matA.at<float>(i, 0) = arz;
            matA.at<float>(i, 1) = arx;
            matA.at<float>(i, 2) = ary;
            matA.at<float>(i, 3) = coeff.z;
            matA.at<float>(i, 4) = coeff.x;
            matA.at<float>(i, 5) = coeff.y;
            matB.at<float>(i, 0) = -coeff.intensity;
        }

        cv::transpose(matA, matAt);
        matAtA = matAt * matA;
        matAtB = matAt * matB;
        cv::solve(matAtA, matAtB, matX, cv::DECOMP_QR);

        if (iterCount == 0) {
            cv::Mat matE(1, 6, CV_32F, cv::Scalar::all(0));
            cv::Mat matV(6, 6, CV_32F, cv::Scalar::all(0));
            cv::Mat matV2(6, 6, CV_32F, cv::Scalar::all(0));
            cv::eigen(matAtA, matE, matV);
            matV.copyTo(matV2);
            isDegenerate = false;
            float eignThre[6] = {100, 100, 100, 100, 100, 100};
            for (int i = 5; i >= 0; i--) {
                if (matE.at<float>(0, i) < eignThre[i]) {
                    for (int j = 0; j < 6; j++) {
                        matV2.at<float>(i, j) = 0;
                    }
                    isDegenerate = true;
                } else {
                    break;
                }
            }
            // projection matrix that zeroes the update along degenerate directions
            matP = matV.inv() * matV2;
        }

        if (isDegenerate)
        {
            cv::Mat matX2(6, 1, CV_32F, cv::Scalar::all(0));
            matX.copyTo(matX2);
            matX = matP * matX2;
        }

        transformTobeMapped[0] += matX.at<float>(0, 0);
        transformTobeMapped[1] += matX.at<float>(1, 0);
        transformTobeMapped[2] += matX.at<float>(2, 0);
        transformTobeMapped[3] += matX.at<float>(3, 0);
        transformTobeMapped[4] += matX.at<float>(4, 0);
        transformTobeMapped[5] += matX.at<float>(5, 0);

        float deltaR = sqrt(
                        pow(pcl::rad2deg(matX.at<float>(0, 0)), 2) +
                        pow(pcl::rad2deg(matX.at<float>(1, 0)), 2) +
                        pow(pcl::rad2deg(matX.at<float>(2, 0)), 2));
        float deltaT = sqrt(
                        pow(matX.at<float>(3, 0) * 100, 2) +
                        pow(matX.at<float>(4, 0) * 100, 2) +
                        pow(matX.at<float>(5, 0) * 100, 2));

        if (deltaR < 0.05 && deltaT < 0.05) {
            return true; // converged
        }
        return false; // keep optimizing
    }
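
    The degeneracy handling in LMOptimization (matP = matV.inv() * matV2) follows Ji Zhang's "On Degeneracy of Optimization-based State Estimation": directions of J^T J whose eigenvalues fall below the threshold are treated as unobservable, and the update is projected so that no step is taken along them. An Eigen-only restatement of that projection (illustrative names; the threshold of 100 is assumed to match the code above):

    #include <Eigen/Dense>

    // Project a 6-DoF Gauss-Newton update so that its component along degenerate
    // directions of J^T J is suppressed (same construction as matP above).
    Eigen::Matrix<float, 6, 1> projectUpdate(const Eigen::Matrix<float, 6, 6>& JtJ,
                                             const Eigen::Matrix<float, 6, 1>& dx,
                                             float eigThreshold = 100.0f)
    {
        Eigen::SelfAdjointEigenSolver<Eigen::Matrix<float, 6, 6>> es(JtJ);
        Eigen::Matrix<float, 6, 6> V = es.eigenvectors().transpose(); // rows = eigenvectors, as cv::eigen returns them
        Eigen::Matrix<float, 6, 6> V2 = V;
        bool degenerate = false;
        for (int i = 0; i < 6; ++i) {
            if (es.eigenvalues()(i) < eigThreshold) {  // eigenvalue too small: degenerate direction
                V2.row(i).setZero();
                degenerate = true;
            }
        }
        if (!degenerate)
            return dx;
        Eigen::Matrix<float, 6, 6> P = V.inverse() * V2;  // projector onto the well-observed subspace
        return P * dx;                                    // suppress motion along degenerate directions
    }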

    The factor-graph code that follows will not be covered here: it was already discussed in the imuPreintegration notes, and frankly this file is long enough already.

  • Original article: https://blog.csdn.net/fuzi2012/article/details/126429165