The distance between a point and a line is defined as follows:
d = | (x_2 - x_1) (y_1 - y_0) - (x_1 - x_0) (y_2 - y_1) | / sqrt ((x_2 - x_1) ^ 2 + (y_2 - y_1) ^ 2),
which comes from the cross product, where (x_0, y_0) are the coordinates of the point, and (x_1, y_1) and (x_2, y_2) are the endpoints of the line.
It would be fairly simple to compute this for each segment and then just take the smallest result. There may well be a more elegant way to do it, but I don't know of one. I'd love to see someone post one here, though!
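Here is a minimal sketch of that brute-force idea in Python (the function name `point_line_distance` and the sample point/segments are just mine for illustration, not anything from the question): it evaluates the formula above for each segment and keeps the one with the smallest distance.

```python
import math

def point_line_distance(px, py, x1, y1, x2, y2):
    """Distance from point (px, py) to the infinite line through (x1, y1) and (x2, y2)."""
    num = abs((x2 - x1) * (y1 - py) - (x1 - px) * (y2 - y1))
    den = math.hypot(x2 - x1, y2 - y1)  # sqrt((x2-x1)^2 + (y2-y1)^2)
    return num / den

# Brute force: compute the distance to every segment and take the closest one.
point = (3.0, 4.0)
segments = [((0.0, 0.0), (0.0, 1.0)), ((1.0, 1.0), (5.0, 1.0))]
closest = min(segments, key=lambda s: point_line_distance(*point, *s[0], *s[1]))
print(closest)
```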
Edit: sorry, the math looks pretty messy here without formatting. Here's an image of what this equation looks like, nicely rendered:
Point-Line Distance: http://mathworld.wolfram.com/images/equations/Point-LineDistance2-Dimensional/NumberedEquation8.gif
Another edit: as Chris noted in his post, this doesn't work if the point is collinear with the segment, e.g. if the line is defined by (0,0)-(0,1) and the point is (0,10). As he explains, you need to check that the point in question isn't actually on the "extended path" of the segment itself; if it is, the answer is simply the distance from the point to the nearest endpoint. All credit to Chris!
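A quick sketch of that check, again in Python (the helper name `point_segment_distance` is mine, and this is just one way to do the endpoint test, not necessarily Chris's exact method): project the point onto the line, and if the projection parameter falls outside [0, 1] the closest point on the segment is an endpoint, so return the distance to that endpoint instead of the perpendicular distance.

```python
import math

def point_segment_distance(px, py, x1, y1, x2, y2):
    """Distance from (px, py) to the segment (x1, y1)-(x2, y2), not the infinite line."""
    dx, dy = x2 - x1, y2 - y1
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0:
        # Degenerate segment: both endpoints coincide.
        return math.hypot(px - x1, py - y1)
    # Parameter of the projection onto the line; values in [0, 1] land on the segment.
    t = ((px - x1) * dx + (py - y1) * dy) / seg_len_sq
    if t < 0:
        return math.hypot(px - x1, py - y1)   # closest to the first endpoint
    if t > 1:
        return math.hypot(px - x2, py - y2)   # closest to the second endpoint
    # Projection lies on the segment: use the perpendicular-distance formula.
    return abs(dx * (y1 - py) - (x1 - px) * dy) / math.sqrt(seg_len_sq)

print(point_segment_distance(0, 10, 0, 0, 0, 1))  # 9.0, where the line formula alone would give 0
```

For the (0,0)-(0,1) segment and the point (0,10), this returns 9 rather than the 0 you'd get from the infinite-line formula.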