Calculate distance with OpenCV (disparity)

- Update 2 -

The following article is really useful (although it uses Python rather than C++) if you use a single camera to calculate distance: Find distance from camera to object/marker using Python and OpenCV

Best link - Depth detection with a stereo webcam. The implementation in this open-source project is really clear.

Below is the original question.


For my project, I use two cameras (stereo vision) to track objects and calculate the distance. I calibrated them with the OpenCV sample code and generated a disparity map.

I have already implemented a color-based object-tracking method (which produces a threshold image).

My question is: how can I calculate the distance to the tracked colored objects using the disparity map/matrix?

Below you can find a piece of code that gets the x, y, and z coordinates of each pixel. Question: is Point.z in cm, pixels, or mm?

Is it possible to get the distance to the tracked object using this code?

Thank you in advance!

 cvReprojectImageTo3D(disparity, Image3D, _Q);

 vector<CvPoint3D32f> PointArray;
 CvPoint3D32f Point;

 for (int y = 0; y < Image3D->rows; y++)
 {
     // Each row holds interleaved (x, y, z) float triplets.
     float *data = (float *)(Image3D->data.ptr + y * Image3D->step);
     for (int x = 0; x < Image3D->cols * 3; x = x + 3)
     {
         Point.x = data[x];
         Point.y = data[x + 1];
         Point.z = data[x + 2];
         PointArray.push_back(Point);

         // Depth > 10
         if (Point.z > 10)
         {
             printf("%f %f %f\n", Point.x, Point.y, Point.z);
         }
     }
 }
 cvReleaseMat(&Image3D);

- Update 1 -

For example, I created this threshold image (left camera). The right-camera image is almost identical.


In addition to the above threshold image, the application creates a disparity map. How can I get the z-coordinates of the pixels on the disparity map?

In fact, I want to get the Z-coordinates of all the pixels on my hand in order to calculate the average Z (distance), using the disparity map.


The math for converting disparity (in pixels, or as a fraction of the image width) to actual distance is pretty well documented (and not very complicated), but I will write it down here as well.

The following example assumes a disparity image measured in pixels and an input image that is 2K wide (2048 pixels across):

The convergence distance is determined by the rotation (toe-in angle) between the camera lenses. In this example it is 5 meters; a convergence distance of 5 meters means that objects at 5 meters have a disparity of 0.

 CD = 5 (meters) 

Inverse Convergence Distance: 1 / CD

 IZ = 1/5 = 0.2 (units of 1/m) 

Camera sensor size in meters

 SS = 0.035 (meters) //35mm camera sensor 

The width of a single pixel on the sensor in meters

 PW = SS/image resolution = 0.035 / 2048(image width) = 0.00001708984 

The focal length of your cameras in meters

 FL = 0.07 //70mm lens 

Interaxial distance: the distance from the center of the left lens to the center of the right lens

 IA = 0.0025 //2.5mm 

A combined term for the physical settings of your camera

 A = FL * IA / PW 

Camera-adjusted disparity (this is for the left view only; the right view would use a positive [disparity value])

 AD = 2 * (-[disparity value] / A) 

From here you can calculate the actual distance using the following equation:

 realDistance = 1 / (IZ - AD) 

This equation only works for "toed-in" camera systems; parallel camera rigs use a slightly different equation to avoid infinite values, but I will leave it at that for now. If you need the parallel-camera version, just let me know.


Source: https://habr.com/ru/post/947242/

