I am working on a project for my university where we want a quadrocopter to stabilize itself with its camera. Unfortunately, the fundamental matrix reacts very sensitively to small changes in the feature points; I will give you examples later.
I think my matching already works quite well thanks to OpenCV. I use SURF features and match them with the knn method:
//detect keypoints
SurfFeatureDetector surf_detect(400);
surf_detect.detect(fr_one.img, fr_one.kp);
surf_detect.detect(fr_two.img, fr_two.kp);

//extract descriptors
SurfDescriptorExtractor surf_extract;
surf_extract.compute(fr_one.img, fr_one.kp, fr_one.descriptors);
surf_extract.compute(fr_two.img, fr_two.kp, fr_two.descriptors);

//match keypoints in both directions
vector<vector<DMatch> > matches1, matches2;
vector<DMatch> symMatches, goodMatches;
FlannBasedMatcher flann_match;
flann_match.knnMatch(fr_one.descriptors, fr_two.descriptors, matches1, 2);
flann_match.knnMatch(fr_two.descriptors, fr_one.descriptors, matches2, 2);

//test matches in both ways
symmetryTest(matches1, matches2, symMatches);

std::vector<cv::Point2f> points1, points2;
for (std::vector<cv::DMatch>::const_iterator it = symMatches.begin();
     it != symMatches.end(); ++it)
{
    //left keypoints
    float x = fr_one.kp[it->queryIdx].pt.x;
    float y = fr_one.kp[it->queryIdx].pt.y;
    points1.push_back(cv::Point2f(x, y));
    //right keypoints
    x = fr_two.kp[it->trainIdx].pt.x;
    y = fr_two.kp[it->trainIdx].pt.y;
    points2.push_back(cv::Point2f(x, y));
}

//kill outliers with ransac
vector<uchar> inliers(points1.size(), 0);
findFundamentalMat(Mat(points1), Mat(points2), inliers, CV_FM_RANSAC, 3.f, 0.99f);

std::vector<uchar>::const_iterator itIn = inliers.begin();
std::vector<cv::DMatch>::const_iterator itM = symMatches.begin();
for (; itIn != inliers.end(); ++itIn, ++itM)
{
    if (*itIn)
    {
        goodMatches.push_back(*itM);
    }
}
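For reference, symmetryTest just keeps the matches that agree in both directions. Roughly like this (a minimal sketch of the cross-check idea, not my exact code):

//keep a match only if the best match from image one to image two
//points back to the same keypoint when matching from two to one
void symmetryTest(const vector<vector<DMatch> >& matches1,
                  const vector<vector<DMatch> >& matches2,
                  vector<DMatch>& symMatches)
{
    for (size_t i = 0; i < matches1.size(); ++i)
    {
        if (matches1[i].empty())
            continue;
        const DMatch& m1 = matches1[i][0];
        if (m1.trainIdx < 0 || (size_t)m1.trainIdx >= matches2.size()
            || matches2[m1.trainIdx].empty())
            continue;
        const DMatch& m2 = matches2[m1.trainIdx][0];
        if (m2.trainIdx == m1.queryIdx)
            symMatches.push_back(m1);
    }
}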
Now I want to calculate the fundamental matrix from these matches. I use the 8-point method (CV_FM_8POINT) for this example - I already tried LMEDS and RANSAC as well, but they only make it worse, because the set of matches they select changes between runs.
vector<int> pointIndexes1, pointIndexes2;
for (vector<DMatch>::const_iterator it = goodMatches.begin();
     it != goodMatches.end(); ++it)
{
    pointIndexes1.push_back(it->queryIdx);
    pointIndexes2.push_back(it->trainIdx);
}

vector<Point2f> selPoints1, selPoints2;
KeyPoint::convert(fr_one.kp, selPoints1, pointIndexes1);
KeyPoint::convert(fr_two.kp, selPoints2, pointIndexes2);

Mat F = findFundamentalMat(Mat(selPoints1), Mat(selPoints2), CV_FM_8POINT);
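A side note: as far as I understand, F is only defined up to scale, so the raw matrix entries can differ between runs even when the underlying epipolar geometry is identical. To rule that out I normalize F before comparing runs, e.g. like this (a minimal sketch, assuming F.at<double>(2,2) is not close to zero):

//F is only defined up to scale; normalize before comparing runs
Mat Fn = F / F.at<double>(2, 2);
//alternatively, divide by the Frobenius norm and fix the sign:
//Mat Fn = F / norm(F);
//if (Fn.at<double>(2, 2) < 0) Fn = -Fn;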
When I run these calculations in a loop on the same pair of images, the resulting F changes drastically - it is impossible to extract motion from such results.
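My suspicion so far: the 8-point step itself is deterministic, but the stages before it are not necessarily - FlannBasedMatcher is an approximate matcher built on randomized kd-trees, so it can return slightly different matches on every run even for identical descriptors, and the RANSAC outlier rejection then sees a different point set each time. To rule the matcher out, I could swap it for a brute-force matcher, which is exact and deterministic (a sketch using OpenCV's BFMatcher, which takes the same knnMatch calls):

//BFMatcher does exhaustive, deterministic matching, unlike FLANN's
//randomized kd-trees - useful to rule out matcher nondeterminism
BFMatcher bf_match(NORM_L2);
bf_match.knnMatch(fr_one.descriptors, fr_two.descriptors, matches1, 2);
bf_match.knnMatch(fr_two.descriptors, fr_one.descriptors, matches2, 2);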
I put together an example in which I filtered out some of the matches so you can see the effect I mentioned:
http://abload.de/img/div_c_01ascel.png
http://abload.de/img/div_c_02zpflj.png
Is there something wrong with my code, or do I have to look for other causes, such as image quality?
Thanks in advance for your help! derfreak