findFundamentalMat doesn't find the fundamental matrix

Posted 2020-05-29 00:04

Question:

I am trying to recover the movement of a camera by using the fundamental matrix, following the algorithm as given on Wikipedia. For this I need to find the fundamental matrix, and I am using cv::findFundamentalMat.

Two unexpected behaviours:

  1. Using different fitting algorithms produces different results, especially FM_8POINT is different.
  2. Given a set of point pairs (y, x), yFx = 0 is not fulfilled and is always larger than 0.

Have I misunderstood something here? Is my example wrong, or what is going on? Can anyone suggest a better test example?

Below is a minimal example: create 12 artificial points, shift each of those points 10 pixels to the right, find the fundamental matrix from these two sets of points, and print yFx for each point.

Example:

#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main(int argc, const char* argv[])
{
    // Create two sets of points. Points in pts2 are moved 10 pixels to the right of the points in pts1.
    std::vector<cv::Point2f> pts1, pts2;
    for(double y = 0; y < 460; y += 150)
    {
        for(double x = 0; x < 320; x += 150)
        {
            pts1.push_back(cv::Point2f(x, y));
            pts2.push_back(cv::Point2f(x + 10.0, y));
        }
    }

    cv::Mat F = cv::findFundamentalMat(pts1, pts2);

    for(std::size_t i = 0; i < pts1.size(); i++)
    {
        // Create p1, p2, the two points as homogeneous column vectors.
        // Please let me know if this can be done in fewer lines.
        cv::Mat p1(3, 1, CV_64FC1), p2(3, 1, CV_64FC1);

        p1.at<double>(0) = pts1.at(i).x;
        p1.at<double>(1) = pts1.at(i).y;
        p1.at<double>(2) = 1.0;

        p2.at<double>(0) = pts2.at(i).x;
        p2.at<double>(1) = pts2.at(i).y;
        p2.at<double>(2) = 1.0;

        // Print yFx for each pair of points. This should be 0 for all.
        std::cout << p1.t() * F * p2 << std::endl;
    }

    return 0;
}

For FM_RANSAC I get

[1.999], [2], [2], [1.599], [1.599], [1.599], [1.198], [1.198], [1.198], [0.798], [0.798], [0.798]

For FM_8POINT the fundamental matrix is zeros(3,3) and thus yFx is 0 for all y, x.

I only found: T and R estimation from essential matrix but that didn't help much.

Edit: yFx is the wrong way round (p1/p2 are switched in the cout line). This example is also not working because all points lie on a plane.

Answer 1:

I believe that the fundamental matrix solves the equation p2.t() * F * p1 = 0, i.e. you have p1 and p2 reversed in your code. As to why the 8-point algorithm is returning the zero matrix, I have no idea, sorry.
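
For illustration, a minimal sketch of the corrected check, reusing F, pts1 and pts2 from the question's code (this only reverses the evaluation order, it does not fix the rest of the example):

for(std::size_t i = 0; i < pts1.size(); i++)
{
    // Evaluate the epipolar constraint with the order p2^T * F * p1.
    cv::Mat p1 = (cv::Mat_<double>(3, 1) << pts1[i].x, pts1[i].y, 1.0);
    cv::Mat p2 = (cv::Mat_<double>(3, 1) << pts2[i].x, pts2[i].y, 1.0);

    // With a valid F this residual should be close to 0 for every pair.
    std::cout << p2.t() * F * p1 << std::endl;
}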

Edit: Okay, I believe I recall why the 8-point algorithm is producing a bad result here. Your motion between the two sets of points is a pure translation without rotation, i.e. it only has three degrees of freedom. The fundamental matrix has 7 degrees of freedom, so it is impossible to estimate; this is called a degenerate case. See this paper for a further description of degenerate cases in fundamental/essential matrix estimation.

It might also be the case that there is no rigid transformation between the two viewpoints you get by artificially moving pixel coordinates, so there is no fundamental matrix satisfying the constraint. A better test case might be to use a function such as cv::warpPerspective with a known warp matrix.
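
Note that the question's edit already points out that points lying on a single plane (equivalently, correspondences related by one homography) are themselves a degenerate configuration. So, as an alternative well-posed test, here is a sketch that projects synthetic non-planar 3D points through two known camera poses with cv::projectPoints and then verifies the epipolar constraint. The intrinsics, poses and 3D points below are arbitrary illustration values, not taken from the question.

#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main()
{
    // Hypothetical camera intrinsics, chosen only for illustration.
    cv::Mat K = (cv::Mat_<double>(3, 3) << 500, 0, 320,
                                           0, 500, 240,
                                           0, 0, 1);

    // Non-planar 3D points (varying depth), so the configuration is not degenerate.
    std::vector<cv::Point3f> obj;
    for(int i = 0; i < 20; i++)
        obj.push_back(cv::Point3f(i % 5, i / 5, 5.0f + (i % 7)));

    // Camera 1: identity pose. Camera 2: small rotation plus translation (made-up values).
    cv::Mat rvec1 = cv::Mat::zeros(3, 1, CV_64F), tvec1 = cv::Mat::zeros(3, 1, CV_64F);
    cv::Mat rvec2 = (cv::Mat_<double>(3, 1) << 0.0, 0.1, 0.0);
    cv::Mat tvec2 = (cv::Mat_<double>(3, 1) << 0.5, 0.0, 0.1);

    // Project the same 3D points into both views (no lens distortion).
    std::vector<cv::Point2f> img1, img2;
    cv::projectPoints(obj, rvec1, tvec1, K, cv::noArray(), img1);
    cv::projectPoints(obj, rvec2, tvec2, K, cv::noArray(), img2);

    cv::Mat F = cv::findFundamentalMat(img1, img2, cv::FM_8POINT);

    // Check the epipolar constraint x2^T * F * x1 for each correspondence.
    for(std::size_t i = 0; i < img1.size(); i++)
    {
        cv::Mat x1 = (cv::Mat_<double>(3, 1) << img1[i].x, img1[i].y, 1.0);
        cv::Mat x2 = (cv::Mat_<double>(3, 1) << img2[i].x, img2[i].y, 1.0);
        std::cout << x2.t() * F * x1 << std::endl;  // should be close to 0
    }
    return 0;
}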



Answer 2:

1) Using different fitting algorithms produces different results, especially FM_8POINT is different.

It is true that different methods do not give the same result:

  • For instance, RANSAC (RANdom SAmple Consensus), the default method in findFundamentalMat(), estimates the parameters of the transform from a point set that is assumed to contain some outliers, by fitting on random subsets; it therefore produces a correct result only with a certain probability.
  • FM_8POINT, on the other hand, is designed to find the parameters from 8 points by solving a system of linearly independent equations (see the sketch after this list).
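
As a quick way to see this, a minimal sketch (reusing pts1 and pts2 from the question's code; the chosen methods are just the ones discussed here) that runs the same correspondences through several methods and prints each resulting F:

std::vector<int> methods = {cv::FM_8POINT, cv::FM_LMEDS, cv::FM_RANSAC};
for(int method : methods)
{
    // Each method estimates F from the same correspondences; with the
    // question's degenerate input the results differ noticeably.
    cv::Mat F = cv::findFundamentalMat(pts1, pts2, method);
    std::cout << "method " << method << ":\n" << F << std::endl;
}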

2) Given a set of point pairs (y, x), yFx = 0 is not fulfilled and is always larger than 0.

It means that the fundamental matrix you found is not correct (a bad estimation). This is due to the pure translation you gave as input, which corresponds to a degenerate case: two views whose epipoles lie at infinity (see epipolar geometry).

I hope it helped you... Julien,