Suppose you have a magic data structure representing a linear sequence of elements that supports finding, inserting, and deleting at any index in worst-case O(1) time. (I'm sure such a structure isn't possible given the memory model of modern machines, but let's assume we have it for fun.)
A friend of mine pointed out that given this data structure, you can build a neat integer sorting algorithm that runs in expected O(n lg lg n) time, as sketched below. First, create one of the magic data structures described above. Next, for each element of the input array, use interpolation search to find the index in the magic array where that element belongs, in expected O(lg lg n) time, then insert the element there in O(1) time. Finally, read the sorted contents out of the magic data structure in O(n) time. That's n calls to interpolation search at O(lg lg n) each, for a total expected runtime of O(n lg lg n).
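Here's a minimal sketch of that algorithm, assuming a hypothetical `MagicSequence` class with O(1) index-based get and insert. The names `MagicSequence`, `interpolation_insert_index`, and `magic_sort` are my own for illustration; the stand-in below is backed by a Python list, so its insert is really O(n), it just models the interface:

```python
class MagicSequence:
    """Stand-in backed by a list; the real magic structure would
    make insert O(1) worst case instead of O(n)."""
    def __init__(self):
        self._xs = []

    def __len__(self):
        return len(self._xs)

    def get(self, i):
        return self._xs[i]

    def insert(self, i, x):
        self._xs.insert(i, x)  # O(n) here; O(1) by assumption

    def to_list(self):
        return list(self._xs)

def interpolation_insert_index(seq, x):
    """Find where x belongs in the sorted seq via interpolation search:
    expected O(lg lg n) probes on uniformly distributed keys."""
    lo, hi = 0, len(seq) - 1
    while lo <= hi:
        a, b = seq.get(lo), seq.get(hi)
        if x <= a:
            return lo
        if x > b:
            return hi + 1
        # Interpolate the probe position from the key values
        # (a < x <= b here, so b - a is nonzero).
        mid = lo + (x - a) * (hi - lo) // (b - a)
        v = seq.get(mid)
        if v < x:
            lo = mid + 1
        elif v > x:
            hi = mid - 1
        else:
            return mid
    return lo

def magic_sort(arr):
    seq = MagicSequence()
    for x in arr:
        seq.insert(interpolation_insert_index(seq, x), x)
    return seq.to_list()

print(magic_sort([5, 3, 8, 1]))  # [1, 3, 5, 8]
```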
I understand that this approach does not give a worst-case O(n lg lg n) sort, since there are pathologically bad input arrays on which interpolation search degenerates to O(n) time per lookup, making the overall sort O(n²). My question is: given this magic data structure, what is the fastest integer sorting algorithm that can be built on top of it if we only care about the algorithm's worst-case running time?
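For concreteness, here's one classic bad case (my own illustrative example, using the sketch above): keys that grow exponentially. The interpolated probe position stays near the low end of the range, so each search scans roughly one index per probe, costing O(n) per lookup and O(n²) for the whole sort:

```python
# Hypothetical worst-case input: exponentially growing keys.
# Interpolation probes advance about one index per step on this data.
pathological = [2 ** i for i in range(20)]
print(magic_sort(pathological[::-1]))  # still correct, just slow
```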
(This may be better suited for cstheory, but I thought I'd ask here first, since I've gotten great answers to algorithms questions here in the past.)