Build a binary search tree from a permutation in O(n log n) time

The numbers 1 through n are inserted into a binary search tree in a given order p_1, p_2, ..., p_n. Describe an O(n log n)-time algorithm for constructing the resulting binary search tree.

Note that:

  • I need O(n log n) worst-case time, not just average time.
  • I need the exact tree that results from inserting with the usual naive rules; AVL or red-black trees are not allowed.
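(Simply simulating the insertions is not enough: if the values arrive in sorted order, the tree degenerates into a path and the straightforward construction takes quadratic time.)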

This task is very nontrivial; at first glance it seemed impossible to me. I have thought about it a lot. My observations:

  • The argument used to prove that sorting requires Ω(n log n) time does not rule out the existence of such an algorithm here.
  • If one could always find, in O(n) time, a subtree whose size lies between fixed fractions of the whole tree's size, the problem could be solved easily.
  • Choosing the median, or the left child of the root, as the root of such a subtree does not work.
6 answers

Since David Eisenstat does not have time to expand his answer, I will try to describe a similar algorithm in more detail.

Intuition

The intuition behind the algorithm rests on the following statements:

Statement #1: if a BST contains values a and b (a < b) and there are no values between them, then either a (the node holding value a) is a (possibly indirect) parent of b (the node holding value b), or b is a (possibly indirect) parent of a.

This statement is obviously true: if the lowest common ancestor C of a and b were some node other than a or b, then the value of C would have to lie between a and b. Note that statement #1 holds for any BST (balanced or unbalanced).

Statement #2: if a plain (unbalanced) BST contains the values a and b (a < b), there are no values between them, and we insert a value x such that a < x < b, then x (the node holding value x) becomes either the direct right (greater) child of a or the direct left (smaller) child of b, depending on which of the two nodes is lower in the tree.

Assume that the lower of the two nodes is a (the other case is symmetric). During insertion, the value x follows the same path as a did when it was inserted, because the tree contains no values between a and x, so every comparison treats a and x identically. This means the value x travels down the tree toward node a, passing through node b at some earlier point (by statement #1, the higher node b is an ancestor of a here). Since x > a, it must become the right child of a. The right child of a must be empty at this point: a lies inside the subtree of b, so every value in that subtree is less than b, and since the tree contains no values between a and b, no value can already occupy the right child of node a.

Note that statement #2 could be violated for a balanced BST after rebalancing, which is why it is formulated for plain (unbalanced) BSTs only.

Statement #3: in a balanced BST, for any value x that is not in the tree, the closest greater and closest smaller values can be found in O(log(N)) time.

This follows directly from statements #1 and #2: all you need to do is find the potential insertion point for the value x in the BST (takes O(log(N))); one of the two neighboring values will be the direct parent of that insertion point, and to find the other you walk from there back up toward the root (again O(log(N))).

So the idea behind the algorithm now becomes clear: to insert quickly into an unbalanced BST, we need to find the nodes holding the closest smaller and closest greater values. We can do this easily if we additionally maintain a balanced BST with the same keys as our target (unbalanced) BST, with the corresponding nodes of the target BST stored as values. Using this auxiliary data structure, we can find the insertion point for each new value in O(log(N)) and update the auxiliary structure with the new value in O(log(N)) time as well.

Algorithm

  • Initiate "main" root and balancedRoot with null .
  • for each x value in the list do:
  • if this is the first value, just add it as root nodes for both trees and go to # 2
  • in the tree indicated by balancedRoot , find the nodes that correspond to the nearest smaller ( BalancedA , points to node a in the main BST) and the closest to the most ( BalancedB , points to node b in the main BST values).
    • If there is no nearest lower value, i.e. we add the minimum element, add it as the left child in node b
    • If there is no next larger value, that is, we add the maximum element, add it as the right child in node a
    • Find which of the nodes a or b smaller in the tree. You can use the explicit level stored in node. If the bottom node is a (less than node), add x as the direct right child of a else add x as the direct left child of b (more than node). Alternatively (and more smartly), you can notice that from statements No. 1 and 2, it follows that exactly one of the two candidate insertion positions ( a right child or b left child) will be empty, and that is where you want to insert your x value.
  • Add the x value to the balanced tree (can be reused from step 4).

  • Go to step # 2

Since no step inside the loop takes more than O(log(N)), the total complexity is O(N*log(N)).

Java implementation

I'm too lazy to implement a balanced BST myself, so I used the standard Java TreeMap, which implements a red-black tree and has the convenient lowerEntry and higherEntry methods corresponding to step 4 of the algorithm (you can check its source code to see that both are indeed O(log(N))).

import java.util.Map;
import java.util.TreeMap;

public class BSTTest {

    static class Node {
        public final int value;
        public Node left;
        public Node right;

        public Node(int value) {
            this.value = value;
        }

        public boolean compareTree(Node other) {
            return compareTrees(this, other);
        }

        public static boolean compareTrees(Node n1, Node n2) {
            if ((n1 == null) && (n2 == null))
                return true;
            if ((n1 == null) || (n2 == null))
                return false;
            if (n1.value != n2.value)
                return false;
            return compareTrees(n1.left, n2.left) && compareTrees(n1.right, n2.right);
        }

        public void assignLeftSafe(Node child) {
            if (this.left != null)
                throw new IllegalStateException("left child is already set");
            this.left = child;
        }

        public void assignRightSafe(Node child) {
            if (this.right != null)
                throw new IllegalStateException("right child is already set");
            this.right = child;
        }

        @Override
        public String toString() {
            return "Node{" + "value=" + value + '}';
        }
    }

    // Naive construction: insert each value with the usual BST rules (O(n^2) in the worst case).
    static Node insertToBst(Node root, int value) {
        if (root == null)
            root = new Node(value);
        else if (value < root.value)
            root.left = insertToBst(root.left, value);
        else
            root.right = insertToBst(root.right, value);
        return root;
    }

    static Node buildBstDirect(int[] values) {
        Node root = null;
        for (int v : values) {
            root = insertToBst(root, v);
        }
        return root;
    }

    // O(n log n) construction using an auxiliary TreeMap from value to its node in the main BST.
    static Node buildBstSmart(int[] values) {
        Node root = null;
        TreeMap<Integer, Node> balancedTree = new TreeMap<Integer, Node>();
        for (int v : values) {
            Node node = new Node(v);
            if (balancedTree.isEmpty()) {
                root = node;
            } else {
                Map.Entry<Integer, Node> lowerEntry = balancedTree.lowerEntry(v);
                Map.Entry<Integer, Node> higherEntry = balancedTree.higherEntry(v);
                if (lowerEntry == null) {
                    // adding minimum value
                    higherEntry.getValue().assignLeftSafe(node);
                } else if (higherEntry == null) {
                    // adding max value
                    lowerEntry.getValue().assignRightSafe(node);
                } else {
                    // adding some middle value
                    Node lowerNode = lowerEntry.getValue();
                    Node higherNode = higherEntry.getValue();
                    if (lowerNode.right == null)
                        lowerNode.assignRightSafe(node);
                    else
                        higherNode.assignLeftSafe(node);
                }
            }
            // update balancedTree
            balancedTree.put(v, node);
        }
        return root;
    }

    public static void main(String[] args) {
        int[] input = new int[]{7, 6, 9, 4, 1, 8, 2, 5, 3};
        Node directRoot = buildBstDirect(input);
        Node smartRoot = buildBstSmart(input);
        System.out.println(directRoot.compareTree(smartRoot));
    }
}
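Note that buildBstDirect is just the naive construction, used here as a reference: main builds the same tree both ways and prints true if the two trees are identical.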
---

The trick is not to use the constructed BST itself for lookups. Instead, keep an extra, balanced BST just for lookups. Link the leaves.

For example, we could have

 Constructed            Balanced

       3                    2
      / \                  / \
     2   D                1   3
    / \                  /|   |\
   1   C                a b   c d
  / \
 A   B

where a, b, c, d are pointers to A, B, C, D respectively, and A, B, C, D are the slots that would ordinarily be null pointers.

To insert, first insert into the balanced BST (O(log n)), follow the pointer into the constructed tree (O(1)), do the insertion into the constructed tree (O(1)), and splice in the new leaves (O(1)).
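For instance, inserting a value x with 2 < x < 3 into the example above: the search in the balanced tree goes right at 2, then left at 3, and ends at leaf c, which points at slot C, so the new node becomes the right child of 2 in the constructed tree. The slot C then splits into the two empty child slots of x, and leaf c is replaced in the balanced tree by a node for x with two fresh leaves pointing at those slots.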

---

Here is my attempt at an O(n log^2 n) algorithm, which does not require building a balanced tree.

Place the nodes in an array in natural order (1 to n), and also link them into a linked list in insertion order. Each node stores its insertion index along with its key.

The algorithm is as follows.

The input is a node of the linked list together with a range (low, high) of indices into the node array:

  • Call the input node root and its key rootkey. Unlink it from the list.
  • Determine which of the two subtrees of root is smaller, by comparing the sizes of the key ranges (low, rootkey-1) and (rootkey+1, high).
  • Walk over the corresponding range of the array, unlink each of those nodes from the linked list, collect them into a separate linked list, and sort that list back into insertion order.
  • The heads of the two resulting lists are the children of root.
  • Run the algorithm recursively for the two children of root, passing (low, rootkey-1) and (rootkey+1, high) as the index ranges.

The sorting operation at each level of the recursion gives the algorithm an extra factor of log n in its complexity.
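The answer contains no code, so here is a rough sketch of how this recursion might look (this is my own illustration, not the answerer's implementation; the names Item, byKey, unlink and relink, and the assumption that the keys are exactly 1..n, are mine):

// Sketch (not from the original answer): recursive construction that always
// extracts the smaller side of the current range from the insertion-order list.
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class SmallSideBuild {

    static final class Item {
        final int key;        // value stored in the BST
        final int insertedAt; // position in the insertion order
        Item prev, next;      // doubly linked list in insertion order
        Item left, right;     // children in the constructed BST

        Item(int key, int insertedAt) { this.key = key; this.insertedAt = insertedAt; }
    }

    static Item[] byKey;      // byKey[k] is the node with key k, for O(1) access by key range

    // order[i] is the (i+1)-th inserted key; keys are assumed to be exactly 1..n.
    public static Item build(int[] order) {
        int n = order.length;
        byKey = new Item[n + 1];
        Item head = null, tail = null;
        for (int i = 0; i < n; i++) {             // build the insertion-order list
            Item it = new Item(order[i], i);
            byKey[order[i]] = it;
            if (tail == null) head = it; else { tail.next = it; it.prev = tail; }
            tail = it;
        }
        return build(head, 1, n);
    }

    // head is the earliest-inserted node among the keys in [low, high]; the list
    // reachable from head contains exactly those keys, in insertion order.
    static Item build(Item head, int low, int high) {
        if (head == null) return null;
        Item root = head;                         // first inserted key of the range becomes the subtree root
        Item remaining = unlink(root, root.next);

        int leftSize = root.key - low, rightSize = high - root.key;
        boolean takeLeft = leftSize <= rightSize; // extract the smaller key range from the list
        int from = takeLeft ? low : root.key + 1;
        int to = takeLeft ? root.key - 1 : high;

        List<Item> small = new ArrayList<>();
        for (int k = from; k <= to; k++) {        // O(size of smaller side) unlink operations
            remaining = unlink(byKey[k], remaining);
            small.add(byKey[k]);
        }
        small.sort(Comparator.comparingInt((Item node) -> node.insertedAt)); // back to insertion order
        Item smallHead = relink(small);

        Item leftHead = takeLeft ? smallHead : remaining;
        Item rightHead = takeLeft ? remaining : smallHead;
        root.left = build(leftHead, low, root.key - 1);
        root.right = build(rightHead, root.key + 1, high);
        return root;
    }

    // Splice 'it' out of its list; if it was the tracked head, advance the head first.
    static Item unlink(Item it, Item trackedHead) {
        if (it == trackedHead) trackedHead = it.next;
        if (it.prev != null) it.prev.next = it.next;
        if (it.next != null) it.next.prev = it.prev;
        it.prev = null;
        it.next = null;
        return trackedHead;
    }

    // Rebuild a doubly linked list from the extracted nodes and return its head.
    static Item relink(List<Item> items) {
        Item head = null, tail = null;
        for (Item it : items) {
            if (tail == null) head = it; else { tail.next = it; it.prev = tail; }
            tail = it;
        }
        return head;
    }

    public static void main(String[] args) {
        Item root = build(new int[]{3, 1, 4, 2});
        // Plain insertion of 3, 1, 4, 2 gives: root 3, left child 1, right child 4, and 2 as the right child of 1.
        System.out.println(root.key + " " + root.left.key + " " + root.right.key + " " + root.left.right.key);
    }
}

Each call does O(min(leftSize, rightSize)) unlink operations plus a sort of that many items, which is where the O(n log^2 n) bound comes from.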

---

This is an O(n log n) algorithm; it can also be adapted to O(n log log m) time, where m is the range of the keys, by using a Y-fast trie instead of a balanced binary tree.

In a binary search tree, lower values are to the left of higher values. The insertion order determines the left or right turns taken as we walk down the tree. The parent of any node x is either the smallest previously inserted number greater than x, or the largest previously inserted number smaller than x, whichever of the two was inserted later.
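For example, with insertion order 3, 1, 4, 2: when 2 is inserted, the closest previously inserted smaller value is 1 and the closest larger one is 3; of these, 1 was inserted later, and indeed 2 ends up as the right child of 1 (its insertion path goes left at 3, then right at 1).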

We can therefore pair each node with its correct parent in worst-case O(n log n) time using the logic above, by maintaining a balanced binary tree of the nodes visited so far while we walk through the insertion order.

Explanation:

Imagine a proposed lower parent p, and suppose some number l with p < l < x was inserted before p. Either (1) p passed l during its insertion, in which case x would also have to pass through l to get to p, contradicting the fact that x, being greater than l, would have to turn right at l; or (2) p did not pass l, in which case p lies in a subtree to the left of l, separated from it at some earlier-inserted value between p and l; since x is greater than that value too, x would turn right there and could never reach p, again a contradiction.

Similarly, a number l greater than p (but smaller than x) that was inserted after p would also rule out p as the parent of x: either (1) l passed p during its insertion, which means that p's right child was already occupied by the time x was inserted; or (2) l lies in a subtree to the right of p, separated from it at some earlier-inserted value between p and l; since x is greater than that value as well, x would turn right there and could never reach p, a contradiction.

Therefore, for any node x with a lower parent, that parent must be the largest number smaller than x that was inserted before x. Similar logic covers the case of a higher proposed parent.

Now suppose the parent of x is p < x, and p was inserted before h, the lowest previously inserted number greater than x. Then either (1) h passed p during its insertion, in which case p's right child was already occupied by the time x was inserted; or (2) h lies in a subtree to the right of p, in which case the value at which their paths separate lies strictly between p and h and was inserted before x; that value is either greater than x, contradicting our assumption that h is the lowest previously inserted number greater than x, or smaller than x, contradicting the fact that p is the largest previously inserted number smaller than x.

---

Here is a linear-time algorithm. (I said that I would not work on this question, so if you like this answer, please award the bounty to SergGr.)

Create a doubly linked list containing the nodes 1..n in key order and compute the inverse permutation p^-1. For i from n down to 2 (p_1 is the root and has no parent), let q be the left neighbor of p_i in the list and r its right neighbor. If r is missing, or q was inserted later than r (that is, p^-1(q) > p^-1(r)), make p_i the right child of q; otherwise make p_i the left child of r. Remove p_i from the list. (The Python code below indexes the values from 0 to n-1.)

In Python:

class Node(object):
    __slots__ = ('left', 'key', 'right')

    def __init__(self, key):
        self.left = None
        self.key = key
        self.right = None


def construct(p):
    # Validate the input.
    p = list(p)
    n = len(p)
    assert set(p) == set(range(n))  # 0 .. n-1

    # Compute p^-1.
    p_inv = [None] * n
    for i in range(n):
        p_inv[p[i]] = i

    # Set up the list.
    nodes = [Node(i) for i in range(n)]
    for i in range(n):
        if i >= 1:
            nodes[i].left = nodes[i - 1]
        if i < n - 1:
            nodes[i].right = nodes[i + 1]

    # Process p.
    for i in range(n - 1, 0, -1):  # n-1, n-2 .. 1
        q = nodes[p[i]].left
        r = nodes[p[i]].right
        if r is None or (q is not None and p_inv[q.key] > p_inv[r.key]):
            print(p[i], 'is the right child of', q.key)
        else:
            print(p[i], 'is the left child of', r.key)
        if q is not None:
            q.right = r
        if r is not None:
            r.left = q


construct([1, 3, 2, 0])
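Each loop iteration does only a constant number of list and array operations, and the setup (computing p^-1 and linking the nodes) is linear, so the whole construction runs in O(n) time.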
---

Since this is an assignment, I am posting a hint instead of a full answer.

Sort the numbers, keeping track of the insertion order. Say you have the input [1,7,3,5,8,2,4]. After sorting you will have [[1,0], [2,5], [3,2], [4,6], [5,3], [7,1], [8,4]]. This is precisely the in-order traversal of the resulting tree. Think about how to reconstruct the tree given its in-order traversal and the insertion order (that part can be done in linear time).

More hints if you really need them.

---

Source: https://habr.com/ru/post/1015872/

