This class of algorithms is called Nearest Neighbor or K Nearest Neighbor.
Cosine similarity, as the other answer says, will work if the vector's direction is what matters. If the vector represents a position in space, then any metric that measures distance in space will work.
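As a quick illustration (a minimal sketch using NumPy; the function name is mine), cosine similarity is just the dot product of the two vectors divided by the product of their norms, so it depends only on direction, not magnitude:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between a and b: 1.0 means same direction,
    # 0.0 means orthogonal, -1.0 means opposite direction.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
```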
For example, Euclidean distance: take the square root of the sum of the squared differences in each dimension. This gives you the distance from your query to each vector; then sort your set of vectors in increasing order of that distance.
This brute-force process is O(N) in the number of vectors (a full sort adds an O(N log N) factor, but selecting only the k smallest avoids it). If this is too slow for you, you can look at the common specialized K Nearest Neighbor algorithms.
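Here is a minimal sketch of that brute-force search in NumPy (the function name and the random example data are mine, just for illustration):

```python
import numpy as np

def k_nearest(vectors, query, k):
    # Euclidean distance from the query to every vector:
    # square root of the sum of squared per-dimension differences.
    dists = np.sqrt(((vectors - query) ** 2).sum(axis=1))
    # Indices of the k smallest distances. A full argsort is shown for
    # clarity; np.argpartition would skip the O(N log N) sort when k << N.
    order = np.argsort(dists)
    return order[:k], dists[order[:k]]

# Example usage with random data
vectors = np.random.rand(1000, 3)   # 1000 points in 3-D
query = np.random.rand(3)
idx, d = k_nearest(vectors, query, k=5)
```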