• 【SLAM】LiDAR-camera extrinsic calibration (HKU MARS Lab): no checkerboard or QR-code target required


    Overview

    Calibration is itself a pose-estimation problem: at its core it is the same problem that LiDAR odometry and visual odometry solve, and the techniques used in calibration and in odometry can borrow from each other. They all share a common pipeline:

    a. Feature extraction

    b. Feature matching

    c. Pose estimation

    Paper: Pixel-level Extrinsic Self Calibration of High Resolution LiDAR and Camera in Targetless Environments

    GitHub: https://github.com/hku-mars/livox_camera_calib

    This paper comes from the well-known HKU MARS Lab and addresses the extrinsic calibration between a Livox LiDAR and a camera. Notably, it requires no prior objects such as calibration boards or QR codes placed in the scene! The paper is long, and much of it explains the reasoning behind the approach; we will skip that part and look only at what the code actually does.

    1. Feature extraction from the camera image

    Extracting robust, highly distinctive features is the most important step. For camera intrinsic calibration we use a checkerboard because its features are salient and its structure is known, so data association across different photos is easy. LiDAR and camera data are heterogeneous, and finding features that can be matched between the two is the first difficulty, so the authors chose line features. For extracting line features from an image there are OpenCV methods as well as learning-based ones (example). The 2D line extraction in this paper is fairly simple: they just use OpenCV's built-in Canny algorithm. Canny only extracts edge information, i.e. it tells you whether a pixel is an edge point, but not which line that pixel belongs to. To identify distinct lines in an image, one usually runs the Hough transform or the LSD detector on top of Canny; all of these are built into OpenCV.
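    As an aside, here is a minimal sketch of going one step beyond Canny to explicit line segments with OpenCV's probabilistic Hough transform; this is not what the repo does, and the thresholds below are illustrative only:

    #include <opencv2/imgproc.hpp>
    #include <vector>

    // Sketch: Canny edges -> explicit line segments via probabilistic Hough.
    std::vector<cv::Vec4i> extractLineSegments(const cv::Mat &gray)
    {
      cv::Mat edges;
      cv::Canny(gray, edges, 100, 300); // low/high hysteresis thresholds
      std::vector<cv::Vec4i> segments;  // each entry: (x1, y1, x2, y2)
      // rho = 1 px, theta = 1 deg, votes >= 80, min length 30 px, max gap 10 px
      cv::HoughLinesP(edges, segments, 1, CV_PI / 180, 80, 30, 10);
      return segments;
    }

    The repo's own edge extractor, shown next, stops at Canny and merely groups the edge pixels with findContours.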

    void Calibration::edgeDetector(
        const int &canny_threshold, const int &edge_threshold,
        const cv::Mat &src_img, cv::Mat &edge_img,
        pcl::PointCloud<pcl::PointXYZ>::Ptr &edge_cloud)
    {
      // Gaussian blur to suppress noise
      int gaussian_size = 5;
      cv::GaussianBlur(src_img, src_img, cv::Size(gaussian_size, gaussian_size), 0, 0);
      // extract edge pixels
      cv::Mat canny_result = cv::Mat::zeros(height_, width_, CV_8UC1);
      cv::Canny(src_img, canny_result, canny_threshold, canny_threshold * 3, 3, true);
      // group the edge pixels into contours
      std::vector<std::vector<cv::Point>> contours;
      std::vector<cv::Vec4i> hierarchy;
      cv::findContours(canny_result, contours, hierarchy, cv::RETR_EXTERNAL,
                       cv::CHAIN_APPROX_NONE, cv::Point(0, 0));
      // drop contours with too few points: a simple filtering step
      edge_img = cv::Mat::zeros(height_, width_, CV_8UC1);
      edge_cloud = pcl::PointCloud<pcl::PointXYZ>::Ptr(new pcl::PointCloud<pcl::PointXYZ>);
      for (size_t i = 0; i < contours.size(); i++)
      {
        if (contours[i].size() > edge_threshold)
        {
          for (size_t j = 0; j < contours[i].size(); j++)
          {
            // p is only a scratch variable; the sign flip on y matches the
            // (x, -y) convention used for the projected LiDAR points in buildVPnp()
            pcl::PointXYZ p;
            p.x = contours[i][j].x;
            p.y = -contours[i][j].y;
            p.z = 0;
            edge_img.at<uchar>(-p.y, p.x) = 255;
          }
        }
      }
      // save the pixel coordinates of the surviving edge points into a PCL cloud;
      // edge_cloud is the real output
      for (int x = 0; x < edge_img.cols; x++)
      {
        for (int y = 0; y < edge_img.rows; y++)
        {
          if (edge_img.at<uchar>(y, x) == 255)
          {
            pcl::PointXYZ p;
            p.x = x;
            p.y = -y; // same (x, -y) convention as above
            p.z = 0;
            edge_cloud->points.push_back(p);
          }
        }
      }
      edge_cloud->width = edge_cloud->points.size();
      edge_cloud->height = 1;
    }

    In the code, they first run a Gaussian blur on the photo to remove noise, then Canny to extract the edge pixels. The reason the pixels are finally stored in a PCL point cloud is that a kd-tree will later be built on these points: during feature matching, PCL's kd-tree is used to look up nearest neighbors.
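    As a reference for that lookup, a minimal, self-contained PCL kd-tree example with toy values (not the repo's code):

    #include <pcl/kdtree/kdtree_flann.h>
    #include <pcl/point_cloud.h>
    #include <pcl/point_types.h>
    #include <vector>

    int main()
    {
      // edge pixels stored as (x, -y, 0), the same convention as edge_cloud above
      pcl::PointCloud<pcl::PointXYZ>::Ptr edge_cloud(new pcl::PointCloud<pcl::PointXYZ>);
      edge_cloud->push_back(pcl::PointXYZ(100.0f, -50.0f, 0.0f));
      edge_cloud->push_back(pcl::PointXYZ(101.0f, -51.0f, 0.0f));
      pcl::KdTreeFLANN<pcl::PointXYZ> kdtree;
      kdtree.setInputCloud(edge_cloud);
      // query point: e.g. a projected LiDAR edge point in the same convention
      pcl::PointXYZ query(100.5f, -50.5f, 0.0f);
      int K = 2;
      std::vector<int> idx(K);
      std::vector<float> sq_dist(K); // squared pixel distances
      int found = kdtree.nearestKSearch(query, K, idx, sq_dist);
      return found > 0 ? 0 : 1;
    }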

    2. Feature extraction from the LiDAR cloud

    Next, the problem becomes how to find the points in the cloud that belong to line features. Generally, to extract lines from a 3D point cloud one first extracts planes; the intersections between pairs of planes are then the lines. There is also point-cloud line-extraction code on GitHub, for example this one, but the extracted line positions are not particularly accurate.

    PCL has an off-the-shelf plane segmentation algorithm, which also fits planes one by one with RANSAC. But when a cloud contains many planes, RANSAC tends to fail. So in this code they first split the cloud into voxels with 1 m side length; each voxel then contains only a few planes, which reduces the chance of bad fits. PCL's segmenter is used to extract planes per voxel with RANSAC; pairs of planes that intersect at an angle between 30 and 150 degrees are kept, and their intersection lines are extracted. The paper also introduces the notion of a depth-continuous edge and spends a lot of space on it. Roughly speaking, the extracted intersection line is not kept in its entirety; only the short segments close to the fitted plane clouds survive: if a location on the intersection line is close to both plane clouds, it is selected. These short segments are the paper's depth-continuous edges, and the corresponding code is actually very short.

    So when calibrating, do not do it in the middle of nowhere: the surroundings should contain structures with sharp, clean edges, such as the HKU campus.
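    The 1 m voxelization itself happens elsewhere in the repo, keyed by VOXEL_LOC. As a rough illustration of the idea (the key type and hash function below are made up, not the repo's):

    #include <cmath>
    #include <cstdint>
    #include <unordered_map>
    #include <pcl/point_cloud.h>
    #include <pcl/point_types.h>

    // Illustrative voxel key + hash; the repo's VOXEL_LOC plays this role.
    struct VoxelKey
    {
      int64_t x, y, z;
      bool operator==(const VoxelKey &o) const { return x == o.x && y == o.y && z == o.z; }
    };
    struct VoxelKeyHash
    {
      size_t operator()(const VoxelKey &k) const
      {
        return static_cast<size_t>((k.x * 73856093) ^ (k.y * 19349663) ^ (k.z * 83492791));
      }
    };

    // Bucket every point into a cube of side voxel_size (1 m in the repo).
    void voxelize(const pcl::PointCloud<pcl::PointXYZI> &cloud, double voxel_size,
                  std::unordered_map<VoxelKey, pcl::PointCloud<pcl::PointXYZI>::Ptr,
                                     VoxelKeyHash> &voxels)
    {
      for (const auto &p : cloud.points)
      {
        VoxelKey k{static_cast<int64_t>(std::floor(p.x / voxel_size)),
                   static_cast<int64_t>(std::floor(p.y / voxel_size)),
                   static_cast<int64_t>(std::floor(p.z / voxel_size))};
        auto &cell = voxels[k];
        if (!cell)
          cell.reset(new pcl::PointCloud<pcl::PointXYZI>);
        cell->push_back(p);
      }
    }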

    void Calibration::LiDAREdgeExtraction(
        const std::unordered_map<VOXEL_LOC, Voxel *> &voxel_map,
        const float ransac_dis_thre, const int plane_size_threshold, // 0.02, 60
        pcl::PointCloud<pcl::PointXYZI>::Ptr &lidar_line_cloud_3d)
    {
      ROS_INFO_STREAM("Extracting Lidar Edge");
      ros::Rate loop(5000);
      lidar_line_cloud_3d = pcl::PointCloud<pcl::PointXYZI>::Ptr(new pcl::PointCloud<pcl::PointXYZI>);
      for (auto iter = voxel_map.begin(); iter != voxel_map.end(); iter++)
      {
        if (iter->second->cloud->size() > 50)
        {
          std::vector<Plane> plane_list;
          // 1. working copy of this voxel's cloud
          pcl::PointCloud<pcl::PointXYZI>::Ptr cloud_filter(new pcl::PointCloud<pcl::PointXYZI>);
          pcl::copyPointCloud(*iter->second->cloud, *cloud_filter);
          // model coefficients of the fitted plane
          pcl::ModelCoefficients::Ptr coefficients(new pcl::ModelCoefficients);
          // inliers: indices of the points within the tolerance
          pcl::PointIndices::Ptr inliers(new pcl::PointIndices);
          // set up the segmenter
          pcl::SACSegmentation<pcl::PointXYZI> seg;
          seg.setOptimizeCoefficients(true);
          // mandatory: the target model is a plane
          seg.setModelType(pcl::SACMODEL_PLANE);
          // method: RANSAC
          seg.setMethodType(pcl::SAC_RANSAC);
          // inlier distance tolerance; note that both branches currently set the
          // same threshold, so the near/far distinction has no effect
          if (iter->second->voxel_origin[0] < 10)
          {
            seg.setDistanceThreshold(ransac_dis_thre);
          }
          else
          {
            seg.setDistanceThreshold(ransac_dis_thre);
          }
          // 2. segment the cloud, peeling off one plane per iteration
          pcl::PointCloud<pcl::PointXYZRGB> color_planner_cloud;
          int plane_index = 0;
          while (cloud_filter->points.size() > 10)
          {
            pcl::PointCloud<pcl::PointXYZI> planner_cloud;
            pcl::ExtractIndices<pcl::PointXYZI> extract;
            seg.setInputCloud(cloud_filter);
            seg.setMaxIterations(500);
            seg.segment(*inliers, *coefficients);
            if (inliers->indices.size() == 0)
            {
              ROS_INFO_STREAM("Could not estimate a planner model for the given dataset");
              break;
            }
            extract.setIndices(inliers);
            extract.setInputCloud(cloud_filter);
            extract.filter(planner_cloud);
            if (planner_cloud.size() > plane_size_threshold)
            {
              // color the plane randomly for visualization; record its centroid
              // and normal for the later intersection step
              pcl::PointCloud<pcl::PointXYZRGB> color_cloud;
              std::vector<unsigned int> colors;
              colors.push_back(static_cast<unsigned int>(rand() % 256));
              colors.push_back(static_cast<unsigned int>(rand() % 256));
              colors.push_back(static_cast<unsigned int>(rand() % 256));
              pcl::PointXYZ p_center(0, 0, 0);
              for (size_t i = 0; i < planner_cloud.points.size(); i++)
              {
                pcl::PointXYZRGB p;
                p.x = planner_cloud.points[i].x;
                p.y = planner_cloud.points[i].y;
                p.z = planner_cloud.points[i].z;
                p_center.x += p.x;
                p_center.y += p.y;
                p_center.z += p.z;
                p.r = colors[0];
                p.g = colors[1];
                p.b = colors[2];
                color_cloud.push_back(p);
                color_planner_cloud.push_back(p);
              }
              p_center.x = p_center.x / planner_cloud.size();
              p_center.y = p_center.y / planner_cloud.size();
              p_center.z = p_center.z / planner_cloud.size();
              Plane single_plane;
              single_plane.cloud = planner_cloud;
              single_plane.p_center = p_center;
              single_plane.normal << coefficients->values[0],
                  coefficients->values[1], coefficients->values[2];
              single_plane.index = plane_index;
              plane_list.push_back(single_plane);
              plane_index++;
            }
            // 3. keep only the points left after removing this plane
            extract.setNegative(true);
            pcl::PointCloud<pcl::PointXYZI> cloud_f;
            extract.filter(cloud_f);
            *cloud_filter = cloud_f;
          }
          if (plane_list.size() >= 2)
          {
            sensor_msgs::PointCloud2 planner_cloud2;
            pcl::toROSMsg(color_planner_cloud, planner_cloud2);
            planner_cloud2.header.frame_id = "livox";
            planner_cloud_pub_.publish(planner_cloud2);
            // loop.sleep();
          }
          // 4. intersect the planes pairwise to get the edge line clouds
          std::vector<pcl::PointCloud<pcl::PointXYZI>> line_cloud_list;
          calcLine(plane_list, voxel_size_, iter->second->voxel_origin, line_cloud_list);
          // ouster 5, normal 3
          // if a voxel yields too many lines, they are more likely to be fake
          if (line_cloud_list.size() > 0 && line_cloud_list.size() <= 8)
          {
            for (size_t cloud_index = 0; cloud_index < line_cloud_list.size(); cloud_index++)
            {
              for (size_t i = 0; i < line_cloud_list[cloud_index].size(); i++)
              {
                pcl::PointXYZI p = line_cloud_list[cloud_index].points[i];
                plane_line_cloud_->points.push_back(p);
                sensor_msgs::PointCloud2 pub_cloud;
                pcl::toROSMsg(line_cloud_list[cloud_index], pub_cloud);
                pub_cloud.header.frame_id = "livox";
                line_cloud_pub_.publish(pub_cloud);
                // loop.sleep();
                plane_line_number_.push_back(line_number_); // line id of each edge point
              }
              line_number_++;
            }
          }
        }
      }
    }

    Once all the planes inside a voxel have been segmented, finding the lines is fairly brute force: a voxel only contains a handful of planes, so every pair is simply intersected. This part lives in the following function:

    void Calibration::calcLine(
        const std::vector<Plane> &plane_list, const double voxel_size,
        const Eigen::Vector3d origin,
        std::vector<pcl::PointCloud<pcl::PointXYZI>> &line_cloud_list)
    {
      ...
      // 5. if the current location is close to BOTH plane clouds, the point lies
      // on a depth-continuous edge; dis1/dis2 are squared distances to the
      // nearest point of each plane cloud
      if ((dis1 < min_line_dis_threshold_ * min_line_dis_threshold_ &&
           dis2 < max_line_dis_threshold_ * max_line_dis_threshold_) ||
          (dis1 < max_line_dis_threshold_ * max_line_dis_threshold_ &&
           dis2 < min_line_dis_threshold_ * min_line_dis_threshold_))
      {
        line_cloud.push_back(p);
      }
      ...
    }

    That if statement is exactly the depth-continuous edge test.
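    The pairwise intersection itself is elided above. As a sketch under our own assumptions (this is not the repo's calcLine, which additionally samples points along the line and applies the depth-continuous test), the intersection line of two planes n·p + d = 0 can be computed like this:

    #include <cmath>
    #include <Eigen/Dense>

    // Intersect planes n1.p + d1 = 0 and n2.p + d2 = 0. Returns false when the
    // dihedral angle is outside the 30-150 degree window mentioned in the text.
    bool intersectPlanes(const Eigen::Vector3d &n1, double d1,
                         const Eigen::Vector3d &n2, double d2,
                         Eigen::Vector3d &point, Eigen::Vector3d &dir)
    {
      double angle = std::acos(n1.normalized().dot(n2.normalized())) * 180.0 / M_PI;
      if (angle < 30.0 || angle > 150.0)
        return false;
      dir = n1.cross(n2); // the line direction is perpendicular to both normals
      // pick the unique point that lies on both planes and satisfies dir.p = 0
      Eigen::Matrix3d A;
      A.row(0) = n1.transpose();
      A.row(1) = n2.transpose();
      A.row(2) = dir.transpose();
      Eigen::Vector3d b(-d1, -d2, 0.0);
      point = A.colPivHouseholderQr().solve(b);
      dir.normalize();
      return true;
    }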

    3. LiDAR-camera feature association

    Neither the 3D lines extracted from the cloud nor the 2D lines extracted from the image are described in a parametric form such as Ax+By+Cz+D=0; both are kept simply as sets of points. When the 3D line points are reprojected onto the image plane, the kd-tree built from the image edge pixels is queried for nearest neighbors, and those form the associated pairs. No clever tricks: simple and efficient. In the code, void Calibration::buildVPnp() is the function that builds the feature matches.

    // find the correspondences between 3D points and 2D points, with their directions
    void Calibration::buildVPnp(
        const Vector6d &extrinsic_params, const int dis_threshold,
        const bool show_residual,
        const pcl::PointCloud<pcl::PointXYZ>::Ptr &cam_edge_cloud_2d,
        const pcl::PointCloud<pcl::PointXYZI>::Ptr &lidar_line_cloud_3d,
        std::vector<VPnPData> &pnp_list)
    {
      // 1. one container per pixel: several nearby 3D points may project onto the
      // same pixel, and the container lets us average them later
      pnp_list.clear();
      std::vector<std::vector<std::vector<pcl::PointXYZI>>> img_pts_container;
      for (int y = 0; y < height_; y++)
      {
        std::vector<std::vector<pcl::PointXYZI>> row_pts_container;
        for (int x = 0; x < width_; x++)
        {
          std::vector<pcl::PointXYZI> col_pts_container;
          row_pts_container.push_back(col_pts_container);
        }
        img_pts_container.push_back(row_pts_container);
      }
      // 2. gather the 3D line points, the initial pose and the intrinsics
      std::vector<cv::Point3d> pts_3d;
      Eigen::AngleAxisd rotation_vector3;
      rotation_vector3 =
          Eigen::AngleAxisd(extrinsic_params[0], Eigen::Vector3d::UnitZ()) *
          Eigen::AngleAxisd(extrinsic_params[1], Eigen::Vector3d::UnitY()) *
          Eigen::AngleAxisd(extrinsic_params[2], Eigen::Vector3d::UnitX());
      for (size_t i = 0; i < lidar_line_cloud_3d->size(); i++)
      {
        pcl::PointXYZI point_3d = lidar_line_cloud_3d->points[i];
        pts_3d.emplace_back(cv::Point3d(point_3d.x, point_3d.y, point_3d.z));
      }
      cv::Mat camera_matrix =
          (cv::Mat_<double>(3, 3) << fx_, 0.0, cx_, 0.0, fy_, cy_, 0.0, 0.0, 1.0);
      cv::Mat distortion_coeff = (cv::Mat_<double>(1, 5) << k1_, k2_, p1_, p2_, k3_);
      cv::Mat r_vec =
          (cv::Mat_<double>(3, 1)
               << rotation_vector3.angle() * rotation_vector3.axis().transpose()[0],
           rotation_vector3.angle() * rotation_vector3.axis().transpose()[1],
           rotation_vector3.angle() * rotation_vector3.axis().transpose()[2]);
      cv::Mat t_vec = (cv::Mat_<double>(3, 1) << extrinsic_params[3],
                       extrinsic_params[4], extrinsic_params[5]);
      // debug
      // std::cout << "camera_matrix:" << camera_matrix << std::endl;
      // std::cout << "distortion_coeff:" << distortion_coeff << std::endl;
      // std::cout << "r_vec:" << r_vec << std::endl;
      // std::cout << "t_vec:" << t_vec << std::endl;
      // std::cout << "pts 3d size:" << pts_3d.size() << std::endl;
      // 3. project the 3D points into the image and drop them into the per-pixel
      // containers; note that all of these points come from the 3D lines
      std::vector<cv::Point2d> pts_2d;
      cv::projectPoints(pts_3d, r_vec, t_vec, camera_matrix, distortion_coeff, pts_2d);
      pcl::PointCloud<pcl::PointXYZ>::Ptr line_edge_cloud_2d(new pcl::PointCloud<pcl::PointXYZ>);
      std::vector<int> line_edge_cloud_2d_number;
      for (size_t i = 0; i < pts_2d.size(); i++)
      {
        pcl::PointXYZ p;
        p.x = pts_2d[i].x;
        p.y = -pts_2d[i].y;
        p.z = 0;
        pcl::PointXYZI pi_3d;
        pi_3d.x = pts_3d[i].x;
        pi_3d.y = pts_3d[i].y;
        pi_3d.z = pts_3d[i].z;
        pi_3d.intensity = 1;
        if (p.x > 0 && p.x < width_ && pts_2d[i].y > 0 && pts_2d[i].y < height_)
        {
          if (img_pts_container[pts_2d[i].y][pts_2d[i].x].size() == 0)
          {
            line_edge_cloud_2d->points.push_back(p);
            line_edge_cloud_2d_number.push_back(plane_line_number_[i]);
            img_pts_container[pts_2d[i].y][pts_2d[i].x].push_back(pi_3d);
          }
          else
          {
            img_pts_container[pts_2d[i].y][pts_2d[i].x].push_back(pi_3d);
          }
        }
      }
      if (show_residual)
      {
        cv::Mat residual_img = getConnectImg(dis_threshold, cam_edge_cloud_2d, line_edge_cloud_2d);
        cv::imshow("residual", residual_img);
        cv::waitKey(50);
      }
      // 4. build kd-trees to find the closest 2D edge pixel of each projected 3D point
      pcl::search::KdTree<pcl::PointXYZ>::Ptr kdtree(new pcl::search::KdTree<pcl::PointXYZ>());
      pcl::search::KdTree<pcl::PointXYZ>::Ptr kdtree_lidar(new pcl::search::KdTree<pcl::PointXYZ>());
      pcl::PointCloud<pcl::PointXYZ>::Ptr search_cloud = pcl::PointCloud<pcl::PointXYZ>::Ptr(new pcl::PointCloud<pcl::PointXYZ>);
      pcl::PointCloud<pcl::PointXYZ>::Ptr tree_cloud = pcl::PointCloud<pcl::PointXYZ>::Ptr(new pcl::PointCloud<pcl::PointXYZ>);
      pcl::PointCloud<pcl::PointXYZ>::Ptr tree_cloud_lidar = pcl::PointCloud<pcl::PointXYZ>::Ptr(new pcl::PointCloud<pcl::PointXYZ>);
      kdtree->setInputCloud(cam_edge_cloud_2d);
      kdtree_lidar->setInputCloud(line_edge_cloud_2d);
      tree_cloud = cam_edge_cloud_2d;
      tree_cloud_lidar = line_edge_cloud_2d;
      search_cloud = line_edge_cloud_2d;
      // number of nearest neighbors
      int K = 5;
      // two pairs of vectors holding the neighbor indices and squared distances
      std::vector<int> pointIdxNKNSearch(K);
      std::vector<float> pointNKNSquaredDistance(K);
      std::vector<int> pointIdxNKNSearchLidar(K);
      std::vector<float> pointNKNSquaredDistanceLidar(K);
      int match_count = 0;
      double mean_distance;
      int line_count = 0;
      std::vector<cv::Point2d> lidar_2d_list;
      std::vector<cv::Point2d> img_2d_list;
      std::vector<Eigen::Vector2d> camera_direction_list;
      std::vector<Eigen::Vector2d> lidar_direction_list;
      std::vector<int> lidar_2d_number;
      // scan each projected 3D point
      for (size_t i = 0; i < search_cloud->points.size(); i++)
      {
        pcl::PointXYZ searchPoint = search_cloud->points[i];
        if ((kdtree->nearestKSearch(searchPoint, K, pointIdxNKNSearch, pointNKNSquaredDistance) > 0) &&
            (kdtree_lidar->nearestKSearch(searchPoint, K, pointIdxNKNSearchLidar, pointNKNSquaredDistanceLidar) > 0))
        {
          bool dis_check = true;
          for (int j = 0; j < K; j++)
          {
            float distance = sqrt(pow(searchPoint.x - tree_cloud->points[pointIdxNKNSearch[j]].x, 2) +
                                  pow(searchPoint.y - tree_cloud->points[pointIdxNKNSearch[j]].y, 2));
            if (distance > dis_threshold)
            {
              dis_check = false;
            }
          }
          // 5. estimate the local direction of the 2D edge and of the projected 3D
          // edge in the pixel frame; the match is kept only if all K nearest 2D
          // points lie within dis_threshold
          if (dis_check)
          {
            Eigen::Vector2d direction_cam(0, 0);
            std::vector<Eigen::Vector2d> points_cam;
            for (size_t i = 0; i < pointIdxNKNSearch.size(); i++)
            {
              Eigen::Vector2d p(tree_cloud->points[pointIdxNKNSearch[i]].x,
                                tree_cloud->points[pointIdxNKNSearch[i]].y);
              points_cam.push_back(p);
            }
            calcDirection(points_cam, direction_cam);
            Eigen::Vector2d direction_lidar(0, 0);
            std::vector<Eigen::Vector2d> points_lidar;
            for (size_t i = 0; i < pointIdxNKNSearch.size(); i++)
            {
              Eigen::Vector2d p(tree_cloud_lidar->points[pointIdxNKNSearchLidar[i]].x,
                                tree_cloud_lidar->points[pointIdxNKNSearchLidar[i]].y);
              points_lidar.push_back(p);
            }
            calcDirection(points_lidar, direction_lidar);
            // direction.normalize();
            cv::Point p_l_2d(search_cloud->points[i].x, -search_cloud->points[i].y);
            cv::Point p_c_2d(tree_cloud->points[pointIdxNKNSearch[0]].x,
                             -tree_cloud->points[pointIdxNKNSearch[0]].y);
            if (checkFov(p_l_2d))
            {
              lidar_2d_list.push_back(p_l_2d);                         // projection of the 3D point
              img_2d_list.push_back(p_c_2d);                           // corresponding 2D point
              camera_direction_list.push_back(direction_cam);          // direction of the 2D edge
              lidar_direction_list.push_back(direction_lidar);         // direction of the projected 3D edge
              lidar_2d_number.push_back(line_edge_cloud_2d_number[i]); // id of the 3D line
            }
          }
        }
      }
      // 6. build the correspondences
      for (size_t i = 0; i < lidar_2d_list.size(); i++)
      {
        int y = lidar_2d_list[i].y;
        int x = lidar_2d_list[i].x;
        int pixel_points_size = img_pts_container[y][x].size();
        if (pixel_points_size > 0)
        {
          VPnPData pnp;
          pnp.x = 0;
          pnp.y = 0;
          pnp.z = 0;
          // corresponding 2D point
          pnp.u = img_2d_list[i].x;
          pnp.v = img_2d_list[i].y;
          // 3D point: the average of all 3D points that fell onto this pixel
          for (size_t j = 0; j < pixel_points_size; j++)
          {
            pnp.x += img_pts_container[y][x][j].x;
            pnp.y += img_pts_container[y][x][j].y;
            pnp.z += img_pts_container[y][x][j].z;
          }
          pnp.x = pnp.x / pixel_points_size;
          pnp.y = pnp.y / pixel_points_size;
          pnp.z = pnp.z / pixel_points_size;
          // direction of the corresponding 2D edge
          pnp.direction = camera_direction_list[i];
          // direction of the projected 3D edge
          pnp.direction_lidar = lidar_direction_list[i];
          // index of the 3D line this point belongs to
          pnp.number = lidar_2d_number[i];
          // theta is the cosine of the angle between the two directions; keep the
          // pair only if the edges are roughly parallel or anti-parallel (~30 deg)
          float theta = pnp.direction.dot(pnp.direction_lidar);
          if (theta > direction_theta_min_ || theta < direction_theta_max_)
          {
            pnp_list.push_back(pnp);
          }
        }
      }
    }

    For each 3D line in the cloud, all of its points are projected into the image using the current initial extrinsic Tcl and the intrinsics K, yielding pixel coordinates. The kd-tree is then queried for the few nearest 2D pixels previously extracted by Canny; from these neighbors a mean point and a line direction are computed, giving one feature match:

    3D point + 2D point / 2D direction vector.

    At the end of the day, what we obtain are matches between (LiDAR) points and (image) points, although they can also be viewed as matches between (LiDAR) points and (image) lines, since each 2D point stores its line direction. Other approaches describe both the 3D and 2D lines by their two endpoints, which gives line(LiDAR)-to-line(image) matches in the true sense.
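    calcDirection itself is not shown above. A plausible sketch of what such a helper can do, assuming it fits the dominant direction of the K neighboring pixels by PCA (our illustration, not the repo's implementation):

    #include <vector>
    #include <Eigen/Dense>

    // Fit the dominant direction of a handful of 2D edge points by PCA.
    void calcDirectionPCA(const std::vector<Eigen::Vector2d> &points,
                          Eigen::Vector2d &direction)
    {
      Eigen::Vector2d mean = Eigen::Vector2d::Zero();
      for (const auto &p : points)
        mean += p;
      mean /= static_cast<double>(points.size());
      Eigen::Matrix2d cov = Eigen::Matrix2d::Zero();
      for (const auto &p : points)
        cov += (p - mean) * (p - mean).transpose();
      // eigenvalues come out in increasing order: col(1) is the major axis
      Eigen::SelfAdjointEigenSolver<Eigen::Matrix2d> es(cov);
      direction = es.eigenvectors().col(1).normalized();
    }

    With a direction attached to each match, the residual can be scored as point-to-line rather than point-to-point, which is exactly how the cost functor below uses it.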

    4. Optimization model

    The loss function is the familiar point-to-line or point-to-point distance. Each 3D point is transformed into the camera frame with the Tcl being optimized and projected to pixel coordinates; the distortion model is then applied (so the projection can be compared against edge pixels detected in the raw, distorted image), and the pixel distance to the matched 2D point is the residual. The distortion terms in the code are the usual Brown-Conrady model: with normalized coordinates (x, y) and r^2 = x^2 + y^2, x_d = x(1 + k1 r^2 + k2 r^4) + 2 p1 x y + p2 (r^2 + 2 x^2), and symmetrically for y_d with p1 and p2 swapped.

    vpnp_calib(VPnPData p) { pd = p; }
    // Ceres auto-diff cost functor: _q (quaternion, stored x y z w) and _t are
    // the extrinsic parameters being optimized; inner, distor and pd are members
    template <typename T>
    bool operator()(const T *_q, const T *_t, T *residuals) const
    {
      // camera intrinsics and distortion (k1, k2, p1, p2)
      Eigen::Matrix<T, 3, 3> innerT = inner.cast<T>();
      Eigen::Matrix<T, 4, 1> distorT = distor.cast<T>();
      const T &fx = innerT.coeffRef(0, 0);
      const T &cx = innerT.coeffRef(0, 2);
      const T &fy = innerT.coeffRef(1, 1);
      const T &cy = innerT.coeffRef(1, 2);
      // current estimate of Tcl
      Eigen::Quaternion<T> q_incre{_q[3], _q[0], _q[1], _q[2]};
      Eigen::Matrix<T, 3, 1> t_incre{_t[0], _t[1], _t[2]};
      // transform the 3D point into the camera frame and project it
      Eigen::Matrix<T, 3, 1> p_l(T(pd.x), T(pd.y), T(pd.z));
      Eigen::Matrix<T, 3, 1> p_c = q_incre.toRotationMatrix() * p_l + t_incre;
      Eigen::Matrix<T, 3, 1> p_2 = innerT * p_c;
      T uo = p_2[0] / p_2[2];
      T vo = p_2[1] / p_2[2];
      // apply the distortion model to match the raw (distorted) image
      T xo = (uo - cx) / fx;
      T yo = (vo - cy) / fy;
      T r2 = xo * xo + yo * yo;
      T r4 = r2 * r2;
      T distortion = 1.0 + distorT[0] * r2 + distorT[1] * r4;
      T xd = xo * distortion + (distorT[2] * xo * yo + distorT[2] * xo * yo) +
             distorT[3] * (r2 + xo * xo + xo * xo); // = 2*p1*x*y + p2*(r2 + 2*x^2)
      T yd = yo * distortion + distorT[3] * xo * yo + distorT[3] * xo * yo +
             distorT[2] * (r2 + yo * yo + yo * yo); // = 2*p2*x*y + p1*(r2 + 2*y^2)
      T ud = fx * xd + cx;
      T vd = fy * yd + cy;
      if (T(pd.direction(0)) == T(0.0) && T(pd.direction(1)) == T(0.0))
      {
        // no direction available: plain point-to-point pixel residual
        residuals[0] = ud - T(pd.u);
        residuals[1] = vd - T(pd.v);
      }
      else
      {
        // project the residual onto the normal space of the 2D line direction,
        // i.e. a point-to-line distance
        residuals[0] = ud - T(pd.u);
        residuals[1] = vd - T(pd.v);
        Eigen::Matrix<T, 2, 2> I = Eigen::Matrix<float, 2, 2>::Identity().cast<T>();
        Eigen::Matrix<T, 2, 1> n = pd.direction.cast<T>();
        Eigen::Matrix<T, 1, 2> nt = pd.direction.transpose().cast<T>();
        Eigen::Matrix<T, 2, 2> V = n * nt;
        V = I - V; // projector that removes the component along the line
        Eigen::Matrix<T, 2, 2> R = Eigen::Matrix<float, 2, 2>::Zero().cast<T>();
        R.coeffRef(0, 0) = residuals[0];
        R.coeffRef(1, 1) = residuals[1];
        R = V * R * V.transpose();
        residuals[0] = R.coeffRef(0, 0);
        residuals[1] = R.coeffRef(1, 1);
      }
      return true;
    }
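    For completeness, a hedged sketch of how such a functor is typically registered with Ceres; solveExtrinsic is a made-up name, the solver options are illustrative, and the quaternion block is stored (x, y, z, w) to match q_incre{_q[3], _q[0], _q[1], _q[2]} above:

    #include <vector>
    #include <ceres/ceres.h>

    // pnp_list comes from buildVPnp(); q = (x, y, z, w), t = (tx, ty, tz).
    void solveExtrinsic(const std::vector<VPnPData> &pnp_list, double q[4], double t[3])
    {
      ceres::Problem problem;
      // keeps q on the unit-quaternion manifold; Ceres >= 2.1 would use
      // ceres::EigenQuaternionManifold instead
      ceres::LocalParameterization *quat = new ceres::EigenQuaternionParameterization();
      problem.AddParameterBlock(q, 4, quat);
      problem.AddParameterBlock(t, 3);
      for (const auto &pd : pnp_list)
      {
        // 2 residuals, 4-dof quaternion block, 3-dof translation block
        auto *cost = new ceres::AutoDiffCostFunction<vpnp_calib, 2, 4, 3>(new vpnp_calib(pd));
        problem.AddResidualBlock(cost, nullptr, q, t);
      }
      ceres::Solver::Options options;
      options.minimizer_progress_to_stdout = true;
      ceres::Solver::Summary summary;
      ceres::Solve(options, &problem, &summary);
    }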



     

• Original post: https://blog.csdn.net/iwanderu/article/details/125457311