I implemented this method in JavaScript, and my computed distance comes out about 2.5% too long, and I would like to understand why.
My input is an array of points given as latitude, longitude, and height above the WGS84 ellipsoid. The points were recorded by a wrist-worn GPS device during a marathon.
My algorithm is to convert each point to Cartesian geocentric coordinates and then sum the Euclidean (Pythagorean) distances between consecutive points. Cartesian geocentric is also known as Earth-Centered, Earth-Fixed (ECEF): the X, Y, Z coordinate system that rotates with the Earth.
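For reference, this is the standard geodetic-to-ECEF conversion I'm describing, as a minimal sketch (function names and the shape of the point array are my own, not from any library):

```javascript
// WGS84 ellipsoid constants
const A = 6378137.0;             // semi-major axis (m)
const F = 1 / 298.257223563;     // flattening
const E2 = F * (2 - F);          // first eccentricity squared

// Geodetic (degrees, metres) -> ECEF (metres)
function llhToEcef(latDeg, lonDeg, h) {
  const lat = latDeg * Math.PI / 180;
  const lon = lonDeg * Math.PI / 180;
  const sinLat = Math.sin(lat), cosLat = Math.cos(lat);
  const n = A / Math.sqrt(1 - E2 * sinLat * sinLat); // prime vertical radius
  return [
    (n + h) * cosLat * Math.cos(lon),
    (n + h) * cosLat * Math.sin(lon),
    (n * (1 - E2) + h) * sinLat,
  ];
}

// Sum straight-line (chord) distances between consecutive track points
function trackLength(points) {   // points: [{lat, lon, h}, ...]
  let total = 0;
  let prev = null;
  for (const p of points) {
    const xyz = llhToEcef(p.lat, p.lon, p.h);
    if (prev) {
      total += Math.hypot(xyz[0] - prev[0], xyz[1] - prev[1], xyz[2] - prev[2]);
    }
    prev = xyz;
  }
  return total;
}
```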
My test data came from a marathon, so the total distance should be very close to 42.195 km. However, my computed distance comes to 43.4 km. I tried several variations, and none of them changed the result by more than a meter: for example, I replaced the altitude data with heights from the NASA SRTM mission, I set the altitude to zero, and so on.
Using Google, I found two worked examples in the literature where lat/lon/height were converted to geocentric coordinates, and my conversion matches both.
What could explain this? Am I expecting too much from JavaScript's double-precision numbers? (The X, Y, Z values are very large, while the differences between two adjacent points are very small.)
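For what it's worth, a quick back-of-the-envelope check suggests double precision should not be the issue: a JavaScript number carries roughly 15–16 significant decimal digits, so even at ECEF magnitudes (~6.4 × 10^6 m) the spacing between representable values is on the order of a nanometre, far below GPS accuracy:

```javascript
// Approximate spacing between adjacent doubles near an ECEF-sized
// coordinate (~6 378 137 m). Number.EPSILON is the gap at 1.0, so
// scaling it by the magnitude gives the gap at that magnitude.
const step = Number.EPSILON * 6378137;
console.log(step); // on the order of 1e-9 m, i.e. about a nanometre
```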
My fallback is to switch to computing geodesic distances over the WGS84 ellipsoid using Vincenty's algorithm (or similar), and then combine each geodesic with the height difference via Pythagoras, but that seems like overkill.
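In case it's useful for comparison, here is a sketch of Vincenty's inverse formula transcribed directly from the published equations (my own transcription, not a vetted library; note that Vincenty's iteration is known to fail to converge for nearly antipodal points):

```javascript
// Vincenty inverse formula: geodesic distance on the WGS84 ellipsoid (metres).
function vincentyDistance(lat1, lon1, lat2, lon2) {
  const a = 6378137.0, f = 1 / 298.257223563, b = (1 - f) * a;
  const rad = Math.PI / 180;
  const L = (lon2 - lon1) * rad;
  const U1 = Math.atan((1 - f) * Math.tan(lat1 * rad)); // reduced latitudes
  const U2 = Math.atan((1 - f) * Math.tan(lat2 * rad));
  const sinU1 = Math.sin(U1), cosU1 = Math.cos(U1);
  const sinU2 = Math.sin(U2), cosU2 = Math.cos(U2);

  let lambda = L, sinSigma, cosSigma, sigma, cos2Alpha, cos2SigmaM;
  for (let i = 0; i < 100; i++) {
    const sinL = Math.sin(lambda), cosL = Math.cos(lambda);
    sinSigma = Math.hypot(cosU2 * sinL, cosU1 * sinU2 - sinU1 * cosU2 * cosL);
    if (sinSigma === 0) return 0;               // coincident points
    cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosL;
    sigma = Math.atan2(sinSigma, cosSigma);
    const sinAlpha = cosU1 * cosU2 * sinL / sinSigma;
    cos2Alpha = 1 - sinAlpha * sinAlpha;
    cos2SigmaM = cos2Alpha === 0 ? 0            // equatorial line
               : cosSigma - 2 * sinU1 * sinU2 / cos2Alpha;
    const C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha));
    const prev = lambda;
    lambda = L + (1 - C) * f * sinAlpha *
      (sigma + C * sinSigma *
        (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM * cos2SigmaM)));
    if (Math.abs(lambda - prev) < 1e-12) break; // converged
  }

  const u2 = cos2Alpha * (a * a - b * b) / (b * b);
  const A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)));
  const B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)));
  const dSigma = B * sinSigma * (cos2SigmaM + B / 4 *
    (cosSigma * (-1 + 2 * cos2SigmaM * cos2SigmaM) -
     B / 6 * cos2SigmaM * (-3 + 4 * sinSigma * sinSigma) *
       (-3 + 4 * cos2SigmaM * cos2SigmaM)));
  return b * A * (sigma - dSigma);
}
```

Since per-leg heights change by only a few metres over GPS sample intervals, combining each geodesic with the height difference via Pythagoras should agree with the chord method to well under the 2.5% discrepancy.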
Thanks in advance for your help!