PCL: Scale two Point-Clouds to the same size

Posted 2020-07-25 10:37

Question:

I have two point clouds and am trying to scale them to the same size. My first approach was to simply take the ratio of the square roots of the first eigenvalues:

pcl::PCA<pcl::PointNormal> pca;
pca.setInputCloud(model_cloud_ptr);
Eigen::Vector3f ev_M = pca.getEigenValues();

pca.setInputCloud(segmented_cloud_ptr);
Eigen::Vector3f ev_S = pca.getEigenValues();

double s = sqrt(ev_M[0])/sqrt(ev_S[0]);

This lets me scale my model cloud to approximately the same size as my segmented cloud, but the result is really not perfect; it is just a rough estimate. I also tried TransformationEstimationSVDScale and SampleConsensusModelRegistration like in this tutorial, but when doing this I get the message that the number of source points/indices differs from the number of target points/indices.
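For reference, a minimal sketch of how pcl::registration::TransformationEstimationSVDScale is usually invoked (the helper name below is made up). It estimates a similarity transform from index-wise correspondences, i.e. it assumes that point i of the source corresponds to point i of the target, which is presumably why clouds of different sizes and without explicit correspondences trigger that message:

#include <pcl/point_types.h>
#include <pcl/registration/transformation_estimation_svd_scale.h>

// Illustrative helper: both clouds must already contain the SAME number of points,
// with point i in the source corresponding to point i in the target.
Eigen::Matrix4f estimateScaledTransform(
    const pcl::PointCloud<pcl::PointNormal>::Ptr& source,
    const pcl::PointCloud<pcl::PointNormal>::Ptr& target)
{
    pcl::registration::TransformationEstimationSVDScale<pcl::PointNormal, pcl::PointNormal> estimator;
    Eigen::Matrix4f transform; // rotation, translation and a uniform scale factor
    estimator.estimateRigidTransformation(*source, *target, transform);
    return transform;
}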

What would be the best approach to scale the clouds to the same size when they contain different numbers of points?

Edit: I tried what @dspeyer proposed, but this gives me a scaling factor of almost 1.0:

pcl::PCA<pcl::PointNormal> pca;
pca.setInputCloud(model_cloud_ptr);
Eigen::Matrix3f ev_M = pca.getEigenVectors();
Eigen::Vector3f ev_M1 = ev_M.col(0);
Eigen::Vector3f ev_M2 = ev_M.col(1);

auto dist_M1 = ev_M1.maxCoeff()-ev_M1.minCoeff();
auto dist_M2 = ev_M2.maxCoeff()-ev_M2.minCoeff();  
auto distM_max = std::max(dist_M1, dist_M2);

pca.setInputCloud(segmented_cloud_ptr);
Eigen::Matrix3f ev_S = pca.getEigenVectors();
Eigen::Vector3f ev_S1 = ev_S.col(0);
Eigen::Vector3f ev_S2 = ev_S.col(1);

auto dist_S1 = ev_S1.maxCoeff()-ev_S1.minCoeff();
auto dist_S2 = ev_S2.maxCoeff()-ev_S2.minCoeff();
auto distS_max = std::max(dist_S1, dist_S2);

double s = distS_max / distM_max;

Answer 1:

I would suggest using the eigenvectors of each cloud to identify each one's primary axis of variation and then scaling them based on each cloud's variation along that axis. In my example I used an oriented bounding box (max/min in eigenspace), but the mean value or standard deviation along the primary axis (the x axis in eigenspace) could be better metrics depending on the application.

I left some debug flags in the function in case they are helpful to you, but gave them the defaults I expect you will use. I tested with per-axis stretching and arbitrary rotations of the sample and golden clouds, and the function handles all of that just fine.

One caveat of this method: if the warping varies per axis AND it causes one axis to overtake another as the primary axis of variation, this function could scale the clouds improperly. I am not sure whether this edge case is relevant to you; as long as the scaling between your clouds is uniform, it should never occur.

Debug flags: debugOverlay leaves both input clouds scaled and in their respective eigen orientations (allows easier comparison). If primaryAxisOnly is true, only the primary axis of variation is used for scaling; if false, all 3 axes of variation are scaled independently.

Function:

#include <pcl/common/common.h>      // pcl::getMinMax3D
#include <pcl/common/pca.h>         // pcl::PCA
#include <pcl/common/transforms.h>  // pcl::transformPointCloud

void rescaleClouds(pcl::PointCloud<pcl::PointXYZ>::Ptr& goldenCloud, pcl::PointCloud<pcl::PointXYZ>::Ptr& sampleCloud, bool debugOverlay = false, bool primaryAxisOnly = true)
{
    //analyze golden cloud
    pcl::PCA<pcl::PointXYZ> pcaGolden;
    pcaGolden.setInputCloud(goldenCloud);
    Eigen::Matrix3f goldenEVs_Dir = pcaGolden.getEigenVectors();
    Eigen::Vector4f goldenMidPt = pcaGolden.getMean();
    Eigen::Matrix4f goldenTransform = Eigen::Matrix4f::Identity();
    goldenTransform.block<3, 3>(0, 0) = goldenEVs_Dir;
    goldenTransform.block<4, 1>(0, 3) = goldenMidPt;
    pcl::PointCloud<pcl::PointXYZ>::Ptr orientedGolden(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::transformPointCloud(*goldenCloud, *orientedGolden, goldenTransform.inverse());
    pcl::PointXYZ goldenMin, goldenMax;
    pcl::getMinMax3D(*orientedGolden, goldenMin, goldenMax);

    //analyze sample cloud
    pcl::PCA<pcl::PointXYZ> pcaSample;
    pcaSample.setInputCloud(sampleCloud);
    Eigen::Matrix3f sampleEVs_Dir = pcaSample.getEigenVectors();
    Eigen::Vector4f sampleMidPt = pcaSample.getMean();
    Eigen::Matrix4f sampleTransform = Eigen::Matrix4f::Identity();
    sampleTransform.block<3, 3>(0, 0) = sampleEVs_Dir;
    sampleTransform.block<4, 1>(0, 3) = sampleMidPt;
    pcl::PointCloud<pcl::PointXYZ>::Ptr orientedSample(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::transformPointCloud(*sampleCloud, *orientedSample, sampleTransform.inverse());
    pcl::PointXYZ sampleMin, sampleMax;
    pcl::getMinMax3D(*orientedSample, sampleMin, sampleMax);

    //apply scaling to oriented sample cloud 
    double xScale = (sampleMax.x - sampleMin.x) / (goldenMax.x - goldenMin.x);
    double yScale = (sampleMax.y - sampleMin.y) / (goldenMax.y - goldenMin.y);
    double zScale = (sampleMax.z - sampleMin.z) / (goldenMax.z - goldenMin.z);

    if (primaryAxisOnly) { std::cout << "scale: " << xScale << std::endl; }
    else { std::cout << "xScale: " << xScale << " yScale: " << yScale << " zScale: " << zScale << std::endl; }


    for (int i = 0; i < orientedSample->points.size(); i++)
    {
        if (primaryAxisOnly)
        {
            orientedSample->points[i].x = orientedSample->points[i].x / xScale;
            orientedSample->points[i].y = orientedSample->points[i].y / xScale;
            orientedSample->points[i].z = orientedSample->points[i].z / xScale;
        }
        else
        {
            orientedSample->points[i].x = orientedSample->points[i].x / xScale;
            orientedSample->points[i].y = orientedSample->points[i].y / yScale;
            orientedSample->points[i].z = orientedSample->points[i].z / zScale;
        }
    }
    //depending on your next step, it may be reasonable to leave this cloud at its eigen orientation, but this transformation will allow this function to scale in place.

    if (debugOverlay)
    {
        goldenCloud = orientedGolden;
        sampleCloud = orientedSample;
    }
    else
    {
        pcl::transformPointCloud(*orientedSample, *sampleCloud, sampleTransform);
    }
}

Test Code (you will need your own clouds and visualizers):

pcl::PointCloud<pcl::PointXYZ>::Ptr golden(new pcl::PointCloud<pcl::PointXYZ>);
fileIO::loadFromPCD(golden, "CT_Scan_Nov_7_fullSpine.pcd");
CloudVis::simpleVis(golden);

double xStretch = 1.75;
double yStretch = 1.65;
double zStretch = 1.5;
pcl::PointCloud<pcl::PointXYZ>::Ptr stretched(new pcl::PointCloud<pcl::PointXYZ>);
for (int i = 0; i < golden->points.size(); i++)
{
    pcl::PointXYZ pt = golden->points[i];
    stretched->push_back(pcl::PointXYZ(pt.x * xStretch, pt.y * yStretch, pt.z * zStretch)); // push_back on the cloud keeps width/height consistent
}
Eigen::Affine3f arbRotation = Eigen::Affine3f::Identity();
arbRotation.rotate(Eigen::AngleAxisf(M_PI / 4.0, Eigen::Vector3f::UnitY()));
pcl::transformPointCloud(*stretched, *stretched, arbRotation);

CloudVis::rgbClusterVis(golden, stretched);

rescaleClouds(golden, stretched, true, false);
CloudVis::rgbClusterVis(golden, stretched);


Answer 2:

Seems like you should be able to:

  • Project everything onto the first two eigenvectors
  • Take the min and max for each
  • Compute max minus min for each eigenvector/dataset pair
  • Take the max of the two ranges (usually, but not always, the first eigenvector -- when it isn't, you'll want to rotate the final display)
  • Use the ratio of those maxes as the scaling constant (a rough sketch of this follows below)
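
A minimal sketch of that recipe, using pcl::PCA::project to move each cloud into its own eigenspace; the helper name estimateScale and its overall structure are only illustrative:

#include <algorithm>               // std::max
#include <pcl/common/common.h>     // pcl::getMinMax3D
#include <pcl/common/pca.h>        // pcl::PCA
#include <pcl/point_types.h>

// Illustrative helper: returns the factor by which the model cloud would have to be
// multiplied to match the segmented cloud's size along its dominant axis.
double estimateScale(const pcl::PointCloud<pcl::PointNormal>::Ptr& modelCloud,
                     const pcl::PointCloud<pcl::PointNormal>::Ptr& segmentedCloud)
{
    auto rangeAlongPrimaryAxes = [](const pcl::PointCloud<pcl::PointNormal>::Ptr& cloud)
    {
        pcl::PCA<pcl::PointNormal> pca;
        pca.setInputCloud(cloud);

        // Project the points into eigenspace: x/y/z of the projected points now lie
        // along the 1st/2nd/3rd eigenvectors respectively.
        pcl::PointCloud<pcl::PointNormal> projected;
        pca.project(*cloud, projected);

        pcl::PointNormal minPt, maxPt;
        pcl::getMinMax3D(projected, minPt, maxPt);

        // Extent along the first two eigenvectors; keep the larger one.
        double range1 = maxPt.x - minPt.x;
        double range2 = maxPt.y - minPt.y;
        return std::max(range1, range2);
    };

    return rangeAlongPrimaryAxes(segmentedCloud) / rangeAlongPrimaryAxes(modelCloud);
}

Because each cloud is projected with its own PCA, the two clouds do not need the same number of points; only their extents along the principal axes are compared.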