Given a list of geocoded locations with unknown error, plus a database of less-noisy public corrections that lie closer to the true location (most of which are reliable), how should I design an algorithm that takes all the corrections into account and produces the most accurate estimate of the true location?
Both the stationary coordinates and the sensor readings are noisy, so this looks like a geographic estimation problem. It reminds me of the classic problem of fusing several noisy sensors, where you model the noise and compute the most probable value, but I don't remember the solution.
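For reference, the classic fusion result mentioned above is: for independent Gaussian errors, the maximum-likelihood estimate is the inverse-variance weighted mean of the readings. Here is a minimal sketch, assuming each reading carries an (estimated) standard deviation and that the points are close enough together that averaging raw lat/lon degrees is acceptable; `fuse_readings` and the example coordinates are hypothetical, not from the question.

```python
def fuse_readings(readings):
    """Fuse noisy (lat, lon, sigma) readings into a single estimate.

    Assumes independent Gaussian errors, so the most probable location
    is the inverse-variance weighted mean; the fused variance is the
    reciprocal of the summed precisions. Lat/lon are averaged directly,
    which is only valid for points that are near one another.
    """
    total_w = sum(1.0 / (s * s) for _, _, s in readings)
    lat = sum(la / (s * s) for la, _, s in readings) / total_w
    lon = sum(lo / (s * s) for _, lo, s in readings) / total_w
    fused_sigma = (1.0 / total_w) ** 0.5
    return lat, lon, fused_sigma


# Example: two readings disagree; the more precise one dominates,
# and the fused sigma is smaller than either input sigma.
lat, lon, sigma = fuse_readings([(55.75, 37.62, 0.01),
                                 (55.76, 37.60, 0.05)])
print(lat, lon, sigma)
```

A trusted "correction" is just a reading with a small sigma, so unreliable corrections can be down-weighted by assigning them a larger sigma instead of discarding them outright.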
All coordinates are stored as `geography::POINT` values in SQL Server 2008, so a solution that works efficiently on that platform would be most useful.
Clarification: the coordinates are not a time series. Each reading comes from a distinct sensor, with no repeated measurements.