Listing the values in a binary heap in sorted order using breadth-first search?

I am currently reading this paper, and page 5 discusses properties of binary heaps that the authors consider common knowledge. However, one of the claims they make is something I have not seen before and cannot understand. The authors claim that, given a balanced binary heap, you can list the elements of the heap in sorted order in O(log n) time per element using a standard breadth-first search. Here is their original wording:

In a balanced heap, any new element can be inserted in logarithmic time. We can list the elements of the heap in weight order, taking logarithmic time to generate each element, simply by using breadth-first search.

I'm not sure what the authors mean by this. The first thing that comes to mind when they say "breadth-first search" is a standard BFS over the tree's nodes starting at the root, but that is not guaranteed to list the elements in sorted order, nor does it take logarithmic time per element. For example, running BFS on this mini-heap produces the elements out of order no matter how you break ties:

          1
         / \
       10   100
       / \
     11   12

This always lists 100 before 11 and 12, which is clearly wrong.
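
To make the failure concrete, here is a plain level-order traversal in Python over that heap stored in array form as [1, 10, 100, 11, 12] (a sketch; plain_bfs is just an illustrative name):

    from collections import deque

    def plain_bfs(heap):
        """Plain breadth-first (level-order) traversal of a heap in array layout."""
        order = []
        queue = deque([0])                         # start at the root index
        while queue:
            i = queue.popleft()
            order.append(heap[i])
            for child in (2 * i + 1, 2 * i + 2):   # children of index i, if present
                if child < len(heap):
                    queue.append(child)
        return order

    print(plain_bfs([1, 10, 100, 11, 12]))  # [1, 10, 100, 11, 12]: 100 comes before 11 and 12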

Am I missing something? Is there some breadth-first search you can perform on a heap that yields the elements in sorted order using logarithmic time per element? Clearly you could do this by destructively modifying the heap, removing the minimum element each time, but the authors' intent seems to be that it can be done non-destructively.

2 answers

You can get the elements in sorted order by traversing the heap with a priority queue (which requires another heap!). I guess this is what the paper calls "breadth-first search."

I think you should be able to figure this out (given your reputation in algorithms), but basically the priority-queue key is the node's weight. You push the heap's root onto the priority queue. Then:

    while pq isn't empty:
        pop the minimum node off pq
        append it to the output list (the sorted elements)
        push its children (if any) onto pq
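
A minimal sketch of that traversal in Python, assuming the heap sits in the usual array layout (children of index i at 2i+1 and 2i+2); the name sorted_from_heap is mine, not from the paper:

    import heapq

    def sorted_from_heap(heap):
        """Yield the elements of a valid min-heap (array layout) in sorted order.

        A secondary priority queue of (value, index) pairs drives the search;
        the original heap is never modified.
        """
        if not heap:
            return
        pq = [(heap[0], 0)]                        # frontier starts at the root
        while pq:
            value, i = heapq.heappop(pq)           # smallest element on the frontier
            yield value
            for child in (2 * i + 1, 2 * i + 2):   # push the node's children, if present
                if child < len(heap):
                    heapq.heappush(pq, (heap[child], child))

On the mini-heap above, list(sorted_from_heap([1, 10, 100, 11, 12])) returns [1, 10, 11, 12, 100]. The output comes out sorted because, by the heap property, every not-yet-output element has an ancestor on the frontier with a value no larger than its own; and since after k pops the frontier holds at most k + 1 entries, each element costs O(log n) priority-queue work at worst.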

I'm not entirely sure this is what the paper was talking about, but it roughly matches the description, and since there wasn't much activity here I thought I might as well put it out there.


If you knew that all the elements smaller than 100 were on the left, you could go left; but even if you do descend to 100, you will see that there is nothing smaller below it and back out. Either way, you pass through any given node at worst twice before realizing that the element you are looking for is not there. That means you visit no more than 2 * log(N) nodes in this tree, which simplifies to O(log N) complexity.

The point is that even if you "screw up" and descend into the "wrong" node, you visit that node at most once in the worst case.

EDIT
This is how heapsort works. You can think of it as restoring the heap invariant, at O(log n) cost, every time you take out the top element.
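
For comparison, a sketch of that destructive approach with Python's heapq (working on a copy is my addition, so the caller's heap survives):

    import heapq

    def destructive_sorted(heap):
        """Pop the minimum repeatedly, re-heapifying in O(log n) per removal."""
        heap = list(heap)                    # copy so the input heap isn't consumed
        out = []
        while heap:
            out.append(heapq.heappop(heap))  # each pop sifts down in O(log n)
        return out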


Source: https://habr.com/ru/post/896016/

