O(1), of course, is impossible.

Some pseudocode I would use as a starting point:
    sort(B1)
    sort(B2)
    i1 = 0
    i2 = 0
    mindist = MAX_INT   // if one of the buckets is empty, we simply return MAX_INT
    while (i1 < B1.size() && i2 < B2.size())
        t = B1[i1] - B2[i2]
        mindist = min(mindist, abs(t))
        if t > 0
            i2++
        else
            i1++
    return mindist
This is O(n log n), since the initial sorting dominates. If your buckets are already sorted, it is O(n).
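A runnable Python sketch of the pseudocode above (the function name and the use of `float('inf')` in place of `MAX_INT` are my choices):

```python
def min_dist(b1, b2):
    """Smallest absolute difference between any element of b1 and any of b2.

    Returns float('inf') (standing in for MAX_INT) if either bucket is empty.
    """
    b1 = sorted(b1)
    b2 = sorted(b2)
    i1 = i2 = 0
    mindist = float('inf')
    while i1 < len(b1) and i2 < len(b2):
        t = b1[i1] - b2[i2]
        mindist = min(mindist, abs(t))
        if t > 0:
            i2 += 1  # b2's current element is smaller: advance it
        else:
            i1 += 1  # b1's current element is smaller or equal: advance it
    return mindist
```

The two-pointer walk works because, once both arrays are sorted, advancing the pointer behind the smaller element is the only move that can reduce the gap.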
Edit:
Given the new information that the elements arrive almost sorted, I would suggest insertion sort. Binary-search insertion is not a good fit here: just append the new item and swap it backward until it falls into place. Usually no swaps are needed at all, and for the ~1% of insertions that do need swaps, in 99% of cases a single swap suffices. The worst case per insertion is O(n), but the average is close to O(1).
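A minimal sketch of that insertion step in Python (the function name is mine):

```python
def insert_almost_sorted(bucket, item):
    """Append item to a sorted list, then swap it backward until it fits.

    For almost-sorted streams (e.g. timestamps arriving nearly in order)
    the loop body rarely runs, giving ~O(1) average cost per insertion.
    """
    bucket.append(item)
    i = len(bucket) - 1
    while i > 0 and bucket[i - 1] > bucket[i]:
        bucket[i - 1], bucket[i] = bucket[i], bucket[i - 1]
        i -= 1
```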
If you need mindist for all pairs of buckets, store i1, i2, and mindist for each pair. Say B1 is the bucket that receives a new item. Insert it in sorted position, then decrease i2 until it is 0 or B2[i2] < B1[i1]. Since the items are timestamps, this is at most one step in most cases. Then re-run the while loop, which will usually also take just one step. This gives O(k) computation per insertion for k buckets, and O(k^2) memory for the pairwise state.
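A hedged sketch of that bookkeeping for one pair of buckets (the `PairState` name and fields are mine, and I also rewind i1 to the inserted position, which the answer leaves implicit; I have only checked this on in-order insertions, not arbitrary out-of-order input):

```python
import bisect

class PairState:
    """Cursors and running minimum kept for one pair of buckets."""
    def __init__(self):
        self.i1 = 0
        self.i2 = 0
        self.mindist = float('inf')

def insert_and_update(state, b1, b2, item):
    """Insert item into sorted bucket b1, then repair this pair's state."""
    pos = bisect.bisect_left(b1, item)
    b1.insert(pos, item)
    # Rewind i1 to the inserted item if it landed before the cursor.
    state.i1 = min(state.i1, pos)
    # Decrease i2 until it is 0 or B2[i2] < B1[i1], as the answer describes.
    while state.i2 > 0 and (state.i2 >= len(b2) or b2[state.i2] >= b1[state.i1]):
        state.i2 -= 1
    # Re-run the merge walk; with near-sorted timestamps this is ~1 step.
    while state.i1 < len(b1) and state.i2 < len(b2):
        t = b1[state.i1] - b2[state.i2]
        state.mindist = min(state.mindist, abs(t))
        if t > 0:
            state.i2 += 1
        else:
            state.i1 += 1
```

For k buckets you would keep one such state per ordered pair, which is where the O(k^2) memory figure comes from.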