To measure distances with a single camera, you need to know some additional quantities. To measure the height of something, say a chair, the only thing you have is its size as seen by the camera (which is in pixels and can be converted to a physical size using the sensor or screen dimensions), and that's all. One option is to measure height and width against a reference of known size, say a person 6 feet tall standing next to the chair.
Working in reverse, if you have an object of known size, say 10 feet, you can use its size as seen by the camera to derive the sizes of other things at the same distance. On a surface that is not flat, though, even establishing that two objects are at the same distance is a problem.
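The reference-object idea above boils down to computing a scale factor (real units per pixel) from the known object and applying it to anything at the same distance. A minimal sketch, with made-up pixel counts for illustration:

```python
def size_from_reference(ref_real_height, ref_pixel_height, target_pixel_height):
    """Scale a target's pixel size using a reference object of known size
    at the same distance from the camera."""
    units_per_pixel = ref_real_height / ref_pixel_height
    return target_pixel_height * units_per_pixel

# Hypothetical numbers: a 72-inch-tall person spans 480 px in the image;
# a chair next to them spans 240 px.
chair_height = size_from_reference(72.0, 480.0, 240.0)  # -> 36.0 inches
```

The same scale factor only holds for objects at the same distance, which is exactly the limitation noted above.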
In short, a single camera by itself is not enough. You need to know the distance somehow, or you need a reference object.
If the application can determine the distance to the object, for example via GPS, then the rest is just math.
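The "rest is just math" is the pinhole-camera similar-triangles relation: real height = pixel height × distance ÷ focal length (expressed in pixels). A sketch with hypothetical values — the focal length in pixels depends on the specific camera and would come from calibration:

```python
def object_height(pixel_height, distance, focal_length_px):
    """Pinhole camera model: real_height / distance = pixel_height / focal_length.
    All lengths in consistent units; focal length expressed in pixels."""
    return pixel_height * distance / focal_length_px

# Hypothetical numbers: the object spans 300 px, is 20 ft away (from GPS),
# and the camera's focal length is 1500 px.
h = object_height(300.0, 20.0, 1500.0)  # -> 4.0 ft
```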
A Google search turns up some links that may help. They can show you what other information is needed beyond what the camera alone provides, so you can think through your application, what it can do, and what its limitations are.
One way is to emulate multiple cameras by taking several shots at known distances, which can then be used for calibration. The application could ask the user to take several images while it tracks the distance traveled via GPS, and that might work.
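The multi-shot idea can be sketched as a calibration step: photograph one object of known size at several GPS-tracked distances, estimate the focal length (in pixels) from each shot, and average to smooth out GPS noise. All numbers below are hypothetical:

```python
def calibrate_focal_px(real_height, samples):
    """Estimate focal length in pixels from several (pixel_height, distance)
    observations of one object of known real height. Each shot gives an
    independent estimate f = pixel_height * distance / real_height;
    averaging reduces GPS measurement noise."""
    estimates = [px * d / real_height for px, d in samples]
    return sum(estimates) / len(estimates)

# Hypothetical: a 6 ft reference photographed at three GPS-tracked distances.
f_px = calibrate_focal_px(6.0, [(450.0, 20.0), (300.0, 30.0), (225.0, 40.0)])
```

Once the focal length is calibrated, the pinhole relation can be used to measure other objects at GPS-known distances.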
See also these links:
SpeedBirdNine — Jan 28 '18