For those wondering where this calibration tool comes from: it seems you need to build it from source. This is what I did on Linux:
git clone https://github.com/opencv/opencv.git
cd opencv
git checkout -b 3.1.0 3.1.0
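Cloning alone doesn't produce the binary; the samples have to be enabled when configuring the build. A sketch of the build step, assuming an out-of-tree build directory (the directory name is arbitrary) and the standard BUILD_EXAMPLES CMake switch:

```shell
# from the opencv checkout; "build" is an arbitrary directory name
mkdir build && cd build
# BUILD_EXAMPLES=ON is what compiles bin/cpp-example-calibration
cmake -DBUILD_EXAMPLES=ON ..
make -j4
```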
Then for calibration:
./bin/cpp-example-calibration -w=8 -h=6 -o=camera.yml -op -oe -su image_list.xml
-su lets you check how images look after undistortion. The -w and -h options take the number of *inner corners*, not the number of squares in the checkerboard pattern: along each dimension there is one fewer inner corner than there are squares, so a board of 9×7 squares has 8×6 inner corners, hence -w=8 -h=6 above.
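The image_list.xml argument is an OpenCV "image list" storage file, the format produced by the imagelist_creator sample that ships alongside the calibration example. A minimal sketch, with placeholder file names:

```xml
<?xml version="1.0"?>
<opencv_storage>
<images>
checkerboard_01.jpg
checkerboard_02.jpg
checkerboard_03.jpg
</images>
</opencv_storage>
```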
Here's how the undistortion is applied at the end, using Scala and JavaCV:
import org.bytedeco.javacpp.indexer.FloatRawIndexer
import org.bytedeco.javacpp.opencv_core.Mat
import org.bytedeco.javacpp.{opencv_core, opencv_imgcodecs, opencv_imgproc}

import java.io.File

// from the camera_matrix > data part of the yml:
val cameraFocal = 1.4656877976320607e+03
val cameraCX    = 1920.0 / 2
val cameraCY    = 1080.0 / 2

val cameraMatrixData = Array[Double](
  cameraFocal, 0.0        , cameraCX,
  0.0        , cameraFocal, cameraCY,
  0.0        , 0.0        , 1.0
)

// from the distortion_coefficients of the yml:
val distMatrixData = Array[Double](
  -4.016824381742e-01, 4.368842493074e-02, 0.0, 0.0, 1.096412142704e-01
)

def run(in: File, out: File): Unit = {
  val matOut = new Mat

  // fill the 3x3 camera matrix
  val camMat = new Mat(3, 3, opencv_core.CV_32FC1)
  val camIdx = camMat.createIndexer[FloatRawIndexer]
  for (row <- 0 until 3; col <- 0 until 3)
    camIdx.put(row, col, cameraMatrixData(row * 3 + col).toFloat)

  // fill the 1x5 distortion-coefficient vector
  val distVec = new Mat(1, 5, opencv_core.CV_32FC1)
  val distIdx = distVec.createIndexer[FloatRawIndexer]
  for (col <- 0 until 5)
    distIdx.put(0, col, distMatrixData(col).toFloat)

  val matIn = opencv_imgcodecs.imread(in.getPath)
  opencv_imgproc.undistort(matIn, matOut, camMat, distVec)
  opencv_imgcodecs.imwrite(out.getPath, matOut)
}
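A minimal way to exercise run, assuming the JavaCV presets (a 1.3.x-era org.bytedeco:javacv-platform, matching the opencv_* package layout above) and the native OpenCV libraries are on the classpath; the file names are placeholders:

```scala
import java.io.File

object UndistortDemo {
  def main(args: Array[String]): Unit = {
    // frame.png / frame-undistorted.png are placeholder file names;
    // frame.png should be an image taken with the calibrated camera
    run(new File("frame.png"), new File("frame-undistorted.png"))
  }
}
```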