I am trying to implement a geometry modeling feature. One part of it takes a prototype polygonal mesh and aligns an instance of it with some points in a larger object.
So the problem is this: given 3D target positions for some (possibly all) vertices of a polygonal mesh, find a scaled rotation that minimizes the difference between the transformed vertices and those target points. I also have a center point that can stay fixed if that helps. The correspondence between vertices and target points is known and fixed.
I think this can be done by solving for the coefficients of the transformation matrix, but I'm a little unsure how to set up the system of equations to solve.
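For example, I imagine the unconstrained version of that system looks roughly like the sketch below (just placeholder numbers, using numpy), but it recovers a general 3x3 matrix rather than the single scale and rotation I actually want:

```python
import numpy as np

# Placeholder prototype vertices (N x 3) and target positions (N x 3);
# the numbers are made up purely to show the shapes involved.
P = np.array([[-0.5,  0.5, -0.5],
              [ 0.5,  0.5, -0.5],
              [-0.5, -0.5,  0.5],
              [ 0.5, -0.5,  0.5]])
Q = np.array([[ 1.0,  2.0, -3.0],
              [ 4.0, -0.5, -0.5],
              [-2.0, -1.5, -3.4],
              [ 1.0, -4.0, -0.5]])

# Least-squares solve of P @ M ~ Q for a 3x3 matrix M (the prototype is
# centered at the origin, so no translation term is needed). This yields an
# arbitrary linear map, not necessarily a rotation times a uniform scale.
M, residuals, rank, sv = np.linalg.lstsq(P, Q, rcond=None)
fitted = P @ M   # approximates Q
```

What I don't know is how to constrain this (or set the problem up differently) so the result is a single uniform scale plus a rotation rather than an arbitrary matrix.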
As an example, take a cube. The prototype would be a unit cube centered at the origin, with vertex indices:
4----5
|\   |\
| 6----7
| |  | |
0-|--1 |
 \|   \|
  2----3
Example target positions for the vertices (also repeated as arrays after the list):
- v0: 1.243, 2.163, -3.426
- v1: 4.190, -0.408, -0.485
- v2: -1.974, -1.525, -3.426
- v3: 0.974, -4.096, -0.485
- v5: 1.974, 1.525, 3.426
- v7: -1.243, -2.163, 3.426
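For reference, here are the same points as numpy arrays so the example is easy to reproduce. The prototype coordinates assume an axis convention of x = right, y = up, z = toward the viewer in the diagram above, with corners at ±0.5; only the six vertices that have targets are listed.

```python
import numpy as np

# Prototype corners of the unit cube centered at the origin
# (assumed axis convention: x = right, y = up, z = toward the viewer).
proto = np.array([
    [-0.5, -0.5, -0.5],   # v0
    [ 0.5, -0.5, -0.5],   # v1
    [-0.5, -0.5,  0.5],   # v2
    [ 0.5, -0.5,  0.5],   # v3
    [ 0.5,  0.5, -0.5],   # v5
    [ 0.5,  0.5,  0.5],   # v7
])

# Target positions from the list above.
targets = np.array([
    [ 1.243,  2.163, -3.426],   # v0
    [ 4.190, -0.408, -0.485],   # v1
    [-1.974, -1.525, -3.426],   # v2
    [ 0.974, -4.096, -0.485],   # v3
    [ 1.974,  1.525,  3.426],   # v5
    [-1.243, -2.163,  3.426],   # v7
])
```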
So, given this prototype and these points, how can I find a single scale factor and rotations around x, y, and z that minimize the distance between the transformed vertices and these positions? Ideally the method would generalize to an arbitrary mesh, not just a cube.