Matlab and OpenCV compute different image moments (m00) for the same image

For exactly the same image

OpenCV Code:

Mat img = imread("testImg.png", 0);   // load as 8-bit grayscale
Mat img_bw;
threshold(img, img_bw, 0, 255, CV_THRESH_BINARY | CV_THRESH_OTSU);

Mat tmp;
img_bw.copyTo(tmp);                   // findContours modifies its input
vector<vector<Point> > contours;
vector<Vec4i> hierarchy;
findContours(tmp, contours, hierarchy, CV_RETR_EXTERNAL, CV_CHAIN_APPROX_NONE);

// Get the moments
vector<Moments> mu(contours.size());
for (size_t i = 0; i < contours.size(); i++)
{
    mu[i] = moments(contours[i], false);
}

// Display the area (m00)
for (size_t i = 0; i < contours.size(); i++)
{
    cout << mu[i].m00 << endl;
    // I also tried
    // cout << contourArea(contours.at(i)) << endl;
    // but the result is the same
}

Matlab Code:

Img = imread('testImg.png');
lvl = graythresh(Img);
bw = im2bw(Img,lvl);
stats = regionprops(bw,'Area');
for k = 1:length(stats)
    Area = stats(k).Area; %m00
end

Does anyone have any thoughts on how to make them agree? I suspect the two libraries use different methods to trace the outlines.

I uploaded the test image at the link below so that anyone who is interested can reproduce the procedure.

It is a small 100 x 100 8-bit grayscale image containing only pixel intensities 0 and 255. For simplicity there is only one blob in it. OpenCV reports a contour area (image moment m00) of 609.5 (an oddly non-integer value), while Matlab reports an area (image moment m00) of 763.

Thanks.


The two results differ because the area is computed in two different ways. Matlab's regionprops simply counts the pixels that belong to the blob. OpenCV's findContours() instead traces the blob's outline as a polygon whose vertices are the centres of the boundary pixels (an edge pixel being a foreground pixel with a background pixel among its four nearest neighbours, N4), and moments() / contourArea() then compute the area of that polygon, which is why a non-integer value such as 609.5 can appear. The strip of roughly half a pixel along the whole outline is therefore missing from the OpenCV number.

A simple example: take a 100x100 image containing a filled right triangle whose hypotenuse runs along the diagonal. The true outline of that pixel region follows the pixel edges: along the diagonal it is a staircase of 200 unit steps, (0,0), (1,0), (1,1), (2,1), (2,2), ..., (100,99), (100,100), and it closes through (0,100) back to the start. The area it encloses is exactly the number of pixels in the triangle, 5050, which is what Matlab counts. The contour OpenCV works with, however, reduces to just 3 vertices at the centres of the corner pixels: (0,0), (99,99), (0,99). The area of that polygon is 99*99/2 = 4900.5, noticeably smaller, because the half-pixel band along the boundary is not included.
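Here is a minimal, self-contained sketch of that triangle example (my own illustration, assuming the OpenCV C++ API; the values in the comments are what the reasoning above predicts):

#include <opencv2/opencv.hpp>
#include <iostream>
using namespace cv;
using namespace std;

int main()
{
    // Build a 100x100 binary image with a filled right triangle:
    // pixel (x, y) is foreground when x <= y.
    Mat bw = Mat::zeros(100, 100, CV_8UC1);
    for (int y = 0; y < bw.rows; y++)
        for (int x = 0; x <= y; x++)
            bw.at<uchar>(y, x) = 255;

    // Matlab-style area: count the foreground pixels (what regionprops does).
    cout << "pixel count: " << countNonZero(bw) << endl;           // 5050

    // OpenCV-style area: trace the boundary through pixel centres and take
    // the area of the resulting polygon (what moments()/contourArea() do).
    vector<vector<Point> > contours;
    findContours(bw.clone(), contours, RETR_EXTERNAL, CHAIN_APPROX_NONE);
    cout << "contourArea: " << contourArea(contours[0]) << endl;   // 4900.5
    return 0;
}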

The missing amount is roughly half a pixel for every boundary pixel (plus a little extra at the corners), so the gap between the two numbers grows with the length of the outline rather than with the area. For a small blob like yours the effect is large, which is why OpenCV gives 609.5 while Matlab gives 763; for a big, compact blob the relative difference becomes negligible.

To unify MATLAB and OpenCV you have to compute the area the same way on both sides: either count pixels on the OpenCV side too (fill the contour into a mask and count the non-zero pixels, or simply apply countNonZero to the thresholded image when there is a single blob), or compute the polygon area from the traced boundary on the MATLAB side. Since OpenCV reduces the outline to a polygon through pixel centres (3 points in the triangle example) while MATLAB works with whole pixels, the numbers will only match once you pick one of the two conventions and use it in both tools.
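As a sketch of the first option (counting pixels on the OpenCV side), the loop below rasterises each contour found by the question's code back into a mask and counts its pixels; img_bw and contours are the variables from the question, and the result can still deviate slightly for blobs with holes or one-pixel-wide parts:

// Pixel-count area per contour, comparable to regionprops 'Area'
Mat mask = Mat::zeros(img_bw.size(), CV_8UC1);
for (size_t i = 0; i < contours.size(); i++)
{
    mask.setTo(0);
    // thickness -1 fills the contour instead of drawing its outline
    drawContours(mask, contours, (int)i, Scalar(255), -1);
    cout << "pixel-count area of blob " << i << ": " << countNonZero(mask) << endl;
}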
