I have two functions that describe two curves in 2D.
p1 = f1(t1)
p2 = f2(t2)
where p1 and p2 are vectors, and t1 and t2 are scalars ranging from 0.0 to 1.0.
The curves are convex, and their "bellies" face each other. They can be rotated, and new functions y = h(x) can be defined, so that their derivatives with respect to x increase / decrease monotonically.
I am trying to find an efficient algorithm to find the minimum distance between these curves.
A possible approach, I think, could be to define a distance function:
g(t1, t2) = |f1(t1) - f2(t2)|
and then use the generalization of Newton's method to solve the system of equations
0 = ∂g(t1, t2)/∂t1 // partial derivative of g with respect to t1
0 = ∂g(t1, t2)/∂t2 // partial derivative of g with respect to t2
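Before committing to Newton's method (which needs the derivatives of g), the setup can be sanity-checked with a derivative-free sketch: alternately minimize g over t1 and t2 with a golden-section search. The curves f1 and f2 below are hypothetical examples I made up, not taken from the problem:

```python
import math

# Hypothetical example curves (not from the question): two parabolas whose
# convex sides ("bellies") face each other.
def f1(t1):
    """Lower curve: vertex at (0, 0), belly pointing up."""
    x = -1.0 + 2.0 * t1          # map t1 in [0, 1] to x in [-1, 1]
    return (x, -x * x)

def f2(t2):
    """Upper curve: vertex at (0.3, 2), belly pointing down."""
    x = -1.0 + 2.0 * t2
    return (x + 0.3, x * x + 2.0)

def g(t1, t2):
    """Distance between curve points: g(t1, t2) = |f1(t1) - f2(t2)|."""
    p1, p2 = f1(t1), f2(t2)
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def golden_min(fun, lo=0.0, hi=1.0, tol=1e-9):
    """Golden-section search for the minimum of a unimodal fun on [lo, hi]."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    fc, fd = fun(c), fun(d)
    while b - a > tol:
        if fc < fd:
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = fun(c)
        else:
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = fun(d)
    return 0.5 * (a + b)

# Coordinate descent: alternately minimize g over t1 (t2 fixed) and t2 (t1 fixed).
t1, t2 = 0.5, 0.5
for _ in range(50):
    t1 = golden_min(lambda t: g(t, t2))
    t2 = golden_min(lambda t: g(t1, t))

print(t1, t2, g(t1, t2))
```

Each 1D search assumes g is unimodal in each parameter separately, which the convexity / monotone-derivative condition is meant to guarantee. Newton's method on the two stationarity equations would converge faster near the minimum, but at the cost of needing the derivatives of f1 and f2.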
However, computing the partial derivatives of g may be difficult or expensive, and I am not sure this is the best way to minimize g.
Is there a more efficient approach, or is this one reasonable?