Argument for O(1) average-case heap insertion complexity

The Wikipedia page on binary heaps states that insertion is O(log n) in the worst case, but O(1) on average:

The number of operations required depends only on the number of levels the new element must rise to satisfy the heap property, so the insertion operation has a worst-case time complexity of O(log n) but an average-case complexity of O(1).

The linked page tries to justify this as follows:

However, on average, a newly inserted element does not travel very far up the tree. In particular, assuming an even distribution of keys, it has a one-half chance of being greater than its parent; a one-half chance of being greater than its grandparent, given that it is greater than its parent; a one-half chance of being greater than its great-grandparent, given that it is greater than its grandparent, and so on [...], so in the average case insertion takes constant time.
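For reference, the insertion the quote is analyzing is the standard sift-up procedure; here is a generic sketch (not code from the article), written so that it returns the quantity under discussion, the number of levels the new element rises:

```python
def heap_insert(heap, key):
    """Insert key into a max-heap stored as a list; return levels risen."""
    heap.append(key)
    i = len(heap) - 1
    levels = 0
    # Sift up: swap with the parent while the heap property is violated.
    # The loop body runs once per level the new element rises.
    while i > 0 and heap[(i - 1) // 2] < heap[i]:
        parent = (i - 1) // 2
        heap[i], heap[parent] = heap[parent], heap[i]
        i = parent
        levels += 1
    return levels
```

The worst case is a key larger than everything in the heap, which rises all the way to the root: that is the O(log n) bound. The dispute is about how many iterations the loop performs on average.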

Surely this is nonsense? It seems to me that if the tree were randomly ordered, there would be a 50/50 chance that the new element is greater than its parent; but since, roughly speaking, the large elements sink to the bottom, the chance is much less than 50/50 as the heap grows.

Is that right?

It has been like this on Wikipedia for several months now...


Yes, the O(1) average-case claim is correct. A better reference for it is the 1991 paper "Average Case Analysis of Heap Building by Repeated Insertion" by Hayward and McDiarmid (it is reference 4 in the Wikipedia article), which in turn refers to a 1975 paper by Porter and Simon, "Random insertion into a priority queue structure", establishing the O(1) result for a single insertion into a heap.

Here is the intuition. The new element starts at the bottom level. If keys are randomly distributed, the probability that it must rise at least one level, i.e. that it is greater than its parent, is roughly 0.5. Given that it has risen one level, the probability that it rises a second one is again roughly 0.5, so the chance of rising two levels is 0.25; for three levels it is 0.125, and so on. The expected number of steps is then at most 1 * 0.5 + 2 * 0.25 + 3 * 0.125 + ..., which converges to 2, a constant independent of the heap's size.
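The series really does converge to 2; a quick numerical check (a sketch, not part of the original answer):

```python
# Expected steps if rising each additional level has probability 1/2:
# sum over k >= 1 of k * (1/2)^k = 2.
expected = sum(k * 0.5**k for k in range(1, 60))
print(round(expected, 6))  # -> 2.0
```

Truncating at k = 60 is harmless: the tail of the series is astronomically small compared to the rounding precision.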

Of course, the probability is not exactly 0.5, for precisely the reason the question gives: the elements already in the heap are not uniformly distributed, and the large ones tend to sit near the bottom. But that changes only the constant, not the asymptotics; carrying out the analysis against the actual distribution still yields a constant expected cost, about 2.6.
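Both the questioner's intuition (the 50/50 estimate is too optimistic) and the answer's conclusion (the average is nonetheless a constant) are easy to check empirically. A minimal simulation sketch, assuming uniformly random keys; the helper reimplements sift-up so the block is self-contained:

```python
import random

def levels_risen(heap, key):
    """Insert key into a max-heap (a list); return how many levels it rises."""
    heap.append(key)
    i, levels = len(heap) - 1, 0
    while i > 0 and heap[(i - 1) // 2] < heap[i]:
        parent = (i - 1) // 2
        heap[i], heap[parent] = heap[parent], heap[i]
        i, levels = parent, levels + 1
    return levels

random.seed(1)
for n in (1_000, 10_000, 100_000):
    heap, total = [], 0
    for _ in range(n):
        total += levels_risen(heap, random.random())
    # The average stays a small constant even as the heap's
    # height, log2(n), keeps growing.
    print(n, round(total / n, 2))
```

The printed averages stay essentially flat across a 100-fold increase in n, while a Θ(log n) average would roughly double over that range.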

Note, by the way, that this argument does not carry over to a BST, where insertion remains O(log n) even on average.


Source: https://habr.com/ru/post/1654713/
