I am trying to recover the movement of a camera using the fundamental matrix and the algorithm given on Wikipedia. For this I need to find the fundamental matrix, and I am using OpenCV's cv::findFundamentalMat for that.
Two unexpected behaviours:
- Using different fitting algorithms produces different results; in particular, FM_8POINT differs from the others.
- Given a pair of corresponding points (y, x), the constraint yFx = 0 is not fulfilled; the value is always larger than 0.
Have I not understood something here? Is my example flawed, or what is going on? Can anyone suggest a better test example?
Below is a minimal example. It creates 12 artificial points, shifts each of those points 10 pixels to the right, finds the fundamental matrix from these two sets of points, and prints yFx for each point pair.
Example:
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main(int argc, const char* argv[])
{
    // Create two sets of points. Points in pts2 are moved 10 pixels to the right of the points in pts1.
    std::vector<cv::Point2f> pts1, pts2;
    for(double y = 0; y < 460; y += 150)
    {
        for(double x = 0; x < 320; x += 150)
        {
            pts1.push_back(cv::Point2f(x, y));
            pts2.push_back(cv::Point2f(x + 10.0, y));
        }
    }

    cv::Mat F = cv::findFundamentalMat(pts1, pts2);

    for(size_t i = 0; i < pts1.size(); i++)
    {
        // Creating p1, p2, the two homogeneous points. Please let me know if this can be done in fewer lines.
        cv::Mat p1(3, 1, CV_64FC1), p2(3, 1, CV_64FC1);
        p1.at<double>(0) = pts1.at(i).x;
        p1.at<double>(1) = pts1.at(i).y;
        p1.at<double>(2) = 1.0;
        p2.at<double>(0) = pts2.at(i).x;
        p2.at<double>(1) = pts2.at(i).y;
        p2.at<double>(2) = 1.0;

        // Print yFx for each pair of points. This should be 0 for all of them.
        std::cout << p1.t() * F * p2 << std::endl;
    }
}
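(As a side note on the comment inside the loop: a shorter way to build p1 and p2 might be the cv::Mat_ comma initializer, a sketch that should produce the same 3x1 double matrices as the explicit assignments above:

cv::Mat p1 = (cv::Mat_<double>(3, 1) << pts1[i].x, pts1[i].y, 1.0);
cv::Mat p2 = (cv::Mat_<double>(3, 1) << pts2[i].x, pts2[i].y, 1.0);
)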
For FM_RANSAC I get
[1.999], [2], [2], [1.599], [1.599], [1.599], [1.198], [1.198], [1.198], [0.798], [0.798], [0.798]
For FM_8POINT the fundamental matrix is zeros(3,3), and thus yFx is 0 for all y, x.
I only found T and R estimation from essential matrix, but that didn't help much.
Edit: yFx is the wrong way round (p1 and p2 are switched in the cout line; the correct order is p2.t() * F * p1). This example also does not work because all the points lie on a plane, which is a degenerate configuration for estimating the fundamental matrix. A corrected sketch follows below.
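For reference, here is a minimal sketch of a corrected test, assuming the OpenCV convention points2^T * F * points1 = 0 and using a synthetic non-planar scene; the intrinsics, baseline and depth values below are made up purely for illustration:

#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main()
{
    // Hypothetical intrinsics, only used to generate synthetic data.
    const double f = 500.0, cx = 320.0, cy = 240.0;

    // 3D points at varying depths, so the scene is not planar.
    std::vector<cv::Point3d> world;
    for(double x = -1.0; x <= 1.0; x += 0.5)
        for(double y = -1.0; y <= 1.0; y += 0.5)
            world.push_back(cv::Point3d(x, y, 4.0 + 0.5 * x * y));

    // Camera 2 is translated along x relative to camera 1.
    const double tx = 0.2;

    std::vector<cv::Point2f> pts1, pts2;
    for(const auto& P : world)
    {
        pts1.push_back(cv::Point2f(f * P.x / P.z + cx, f * P.y / P.z + cy));
        pts2.push_back(cv::Point2f(f * (P.x - tx) / P.z + cx, f * P.y / P.z + cy));
    }

    cv::Mat F = cv::findFundamentalMat(pts1, pts2, cv::FM_8POINT);

    for(size_t i = 0; i < pts1.size(); i++)
    {
        cv::Mat p1 = (cv::Mat_<double>(3, 1) << pts1[i].x, pts1[i].y, 1.0);
        cv::Mat p2 = (cv::Mat_<double>(3, 1) << pts2[i].x, pts2[i].y, 1.0);
        // Note the ordering: the point from the second image goes on the left.
        std::cout << p2.t() * F * p1 << std::endl;
    }
}

With exact correspondences and a non-planar point set, these products should come out close to zero, up to numerical precision.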