Weight Balanced Trees

Searching for self balancing trees on the internet will yield plenty of results for red-black and AVL trees, but almost no results for weight balanced trees. This is unfortunate, as weight balanced trees provide the same logarithmic performance as red-black and AVL trees, with one important benefit: they can also index elements in logarithmic time.

For instance, finding the i-th element of an AVL tree will take O(n) time, but with a weight balanced tree it will only take O(log(n)) time.

The full implementation can be downloaded here: WBTree.py

And this is an example of using the tree:

tree=WBTree()
tree.add((5,"Friday"))
tree.add((3,"Wednesday"))
tree.add((1,"Monday"))
tree.add((6,"Saturday"))
tree.add((2,"Tuesday"))
tree.add((4,"Thursday"))
tree.remove((6,"Saturday"))
print("Iterate:")
for node in tree:
    print("{0}: {1}".format(*node.value))
print("Index:")
for i in range(len(tree)):
    node=tree[i]
    print("{0}: {1}".format(*node.value))

Console output:

Iterate:
1: Monday
2: Tuesday
3: Wednesday
4: Thursday
5: Friday
Index:
1: Monday
2: Tuesday
3: Wednesday
4: Thursday
5: Friday

The tree also accepts custom sorting functions and can be toggled to accept (or reject) duplicate elements.
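For example, a tree that sorts date tuples by day number alone, and rejects duplicates, might be constructed like this. This is only a sketch: the cmp and unique parameter names are assumptions, so check WBTree.py for the actual signature.

def daycmp(l,r):
    # Compare tuples by their first element only.
    return l[0]-r[0]

tree=WBTree(cmp=daycmp,unique=True)
tree.add((1,"Monday"))
tree.add((1,"Mon"))  # Duplicate day; rejected when unique=True.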

Performance

We've compared WBTree's performance against various other self balancing trees. The data was generated by running speedtest.py (included in testing.zip) under the python3 runtime. Our own timings are denoted by "Dee".

Timing was conducted by:

1. Adding 100,000 nodes to a tree
2. Searching for each node
3. Removing all nodes from the tree

If an implementation didn't support searching or removing, those timings were omitted.
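The harness looked roughly like the sketch below. It's simplified (speedtest.py in testing.zip is the authoritative version) and assumes the add/search/remove methods used elsewhere in this article.

import time
from WBTree import WBTree

n=100000
tree=WBTree()
t0=time.perf_counter()
for v in range(n): tree.add(v)
t1=time.perf_counter()
for v in range(n): tree.search(v)
t2=time.perf_counter()
for v in range(n): tree.remove(v)
t3=time.perf_counter()
print("add   : {0:.2f}s".format(t1-t0))
print("search: {0:.2f}s".format(t2-t1))
print("remove: {0:.2f}s".format(t3-t2))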

Algorithm     Rank  Add (s)  Remove (s)  Search (s)
Pathak AVL    11    2.25     -           -
Yoo RB        10    0.82     -           -
Girish AVL    9     5.27     3.55        -
Reid RB       8     1.46     -           0.77
Stanisl RB    7     1.28     1.78        1.19
Dee WBT       6     1.82     1.63        0.49
Pgrafov AVL   5     1.95     1.39        0.46
Mozman RB     4     1.21     1.98        0.25
Mozman AVL    3     1.47     1.02        0.26
Msingh RB     2     1.02     0.84        0.61
Dee AVL       1     1.03     0.77        0.47

Dee WBT is the algorithm this article focuses on, and it clearly doesn't perform as well as the fastest trees tested. There are a few reasons for this:

It allows for a custom search function, which slows searching down.
Rebalancing is slightly slower due to updating weights and the way it compares sibling nodes.

It's important to remember that the weight balanced tree is still bounded by O(log(n)) for all of the normal add(), remove(), and search() operations. So even though its relative performance isn't good, it will never become pathologically bad.

Explanation

Imagine keeping track of the residents in a city. As people decide to move to the city, or move elsewhere, we'd like to update our ledger of residents and also query basic information about them as fast as possible. An array would work, but its O(n) update time would make it extremely slow as more people decide to move to the city. This is where self balancing trees come in. Most of the common operations we'll perform will take O(log(n)) time, so even a city of 1 million people will only need about 20 operations (log2(1,000,000) ≈ 19.9) to update its ledger.

All of the common self balancing trees have a similar structure: nodes with 2 children per node, a root node, and some balancing mechanism.

[Figure: a perfectly balanced tree. Root: 7; second level: 3, 11; third level: 1, 5, 9, 13; leaves: 0, 2, 4, 6, 8, 10, 12, 14.]

Where the types of trees differ is in their balancing mechanism:

Red-Black: Traveling to any descendant node will cross the same number of black nodes.
AVL: The heights of child nodes will be kept within ±1 of each other.
Weight: The weights of child nodes will be kept within a ratio of 2.5.

In our weight balanced tree we define the weight as the number of descendants of a node, plus the node itself. Mathematically, we can define the weight recursively as left.weight + right.weight + 1. We also define null nodes to have a weight of 0.

[Figure: node A has children B and C; B has one child, D; every other child is null.]

Node  Weight  Calculation
A     4       2 + 1 + 1
B     2       1 + 0 + 1
C     1       0 + 0 + 1
D     1       0 + 0 + 1
NUL   0       -
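The code in this article assumes a node structure along these lines. This is a minimal sketch; the actual class in WBTree.py may carry more machinery, and the add() code later refers to it as self.Node, suggesting it's nested inside the WBTree class.

class Node:
    # A value, parent/child links, and the weight of the subtree rooted here.
    def __init__(self,value):
        self.value=value
        self.parent=None
        self.left=None
        self.right=None
        self.weight=1  # A lone node has no descendants: 0 + 0 + 1.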

We will omit drawing null nodes from now on in order to avoid clutter. If a node is drawn without a child, then it can be assumed that the child is null.

The tree will focus on maintaining two properties: the relative weights of 2 children should have a ratio of 2.5 or less, and nodes will be in ascending order from left to right. For instance, in the tree below, all children to the left of the root node are less than 7, and all children to the right are greater than 7.

[Figure: the same tree rooted at 7. Everything in the left subtree is less than or equal to 7; everything in the right subtree is greater than or equal to 7.]

The importance of a self balancing tree is that we may add and remove nodes from the tree while maintaining a logarithmic bound on the tree's height and a sorted ordering on the nodes. This allows us to dynamically track and query data relatively quickly even on large data sets.

We are now prepared to implement the various operations that will maintain the tree's logarithmic height, and also the operations that can make use of the tree once it's populated.

Rotations

Rotations are the fundamental operation that trees use to rebalance nodes. They work by swapping a node's position with one of its children, raising one child and lowering the other in the process. There are two types of rotations that can be performed: left rotations and right rotations.

Let us consider a right rotation on the node A below.

[Figure: right rotation of node A. Before: A has left child B and right child z; B has children x and y. After: B takes A's place, with left child x and right child A; A has children y and z.]

Note that node B will take the place of node A, wherever A was in the tree. It can be seen that this right rotation has the effect of shifting the majority of nodes from the left side of the subtree to the right. By doing this, the relative weights of 2 children can be kept close, and it is the crux of the tree's logarithmic performance. The rotation above is what we'll use to rebalance a left-leaning node. The rotation also preserves the sorted order of the nodes by maintaining the following order:

x is to the left of B
B is to the left of y
y is to the left of A
A is to the left of z

Likewise, a left rotation on node A looks like so:

[Figure: left rotation of node A. Before: A has left child x and right child B; B has children y and z. After: B takes A's place, with left child A and right child z; A has children x and y.]

Left rotations can conversely be used to rebalance a node which is leaning to the right.

The python code to perform these rotations is given below. Note that the weights of nodes A and B need to be recalculated after each rotation, since their children's weights may have changed.

def rotleft(a):
    b=a.right
    r=b.left
    b.parent=a.parent
    b.left=a
    a.parent=b
    a.right=r
    if r: r.parent=a
    a.calcweight()
    b.calcweight()
    return b

def rotright(a):
    b=a.left
    l=b.right
    b.parent=a.parent
    b.right=a
    a.parent=b
    a.left=l
    if l: l.parent=a
    a.calcweight()
    b.calcweight()
    return b

def calcweight(self):
    l,r,weight=self.left,self.right,1
    if l: weight+=l.weight
    if r: weight+=r.weight
    self.weight=weight

We will show a concrete example of how rotations are used to rebalance a tree in the next section.

Balancing

As we add and remove nodes from the tree, the tree will become lopsided, with some branches becoming extremely heavy and some extremely light. In order to maintain an O(log(n)) worst-case performance bound on tree operations, the tree will need to be rebalanced periodically.

Ideally, in a perfectly balanced tree, all children will have the same weight. However, since we're working with a tree that can be updated dynamically, we will have to allow some slack in balancing the tree in order to balance in a timely manner. That is, if we're too strict with rebalancing, the act of rebalancing will take longer than any time we might save by using a self balancing tree.

For our tree, we will define a node to be balanced if its left and right child weights are within a ratio of 2.5. That is, 1 / 2.5 ≤ left.weight / right.weight ≤ 2.5. However, defining the actual inequality in python is fairly tricky, and even slight errors can lead to trees that become imbalanced. The actual inequality we will use to detect an imbalance only involves integer math:

if right.weight * 5 + 2 < left.weight * 2:
    ... # Node is leaning to the left.
elif left.weight * 5 + 2 < right.weight * 2:
    ... # Node is leaning to the right.
else:
    ... # Otherwise, the node is balanced.

Choosing other values could lead to the balancing invariant being broken over time, or lead to trees with worse balancing ratios. Hirai et al. cover research into other invariants that can be used.
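When experimenting with an invariant, it helps to verify it over the entire tree after every operation. The sketch below is a testing aid only (it's not part of WBTree.py) and uses the same integer inequality as above.

def checkbalance(node):
    # Recursively verify subtree weights and the balance invariant.
    # Returns the subtree's weight; raises AssertionError on a violation.
    if node is None:
        return 0
    lw=checkbalance(node.left)
    rw=checkbalance(node.right)
    assert not (rw*5+2<lw*2)  # Must not lean to the left.
    assert not (lw*5+2<rw*2)  # Must not lean to the right.
    assert node.weight==lw+rw+1
    return node.weight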

[Figure: node 5 has left child 3 (children 2 and 4; 2 has left child 1) and right child 6. weight(3) = 4 and weight(6) = 1, so weight(6)*5+2 < weight(3)*2: node 5 is leaning to the left.]

To actually rebalance a node, we will make use of rotations. Note from the previous section that left and right rotations shift weights differently. In particular, rotating to the left tilts the node's children to the left, and rotating to the right tilts them to the right. In some situations, where the middle node (denoted by y above) is too heavy, simply performing one rotation won't balance the tree. In those situations, we'll need to perform a double rotation. Thus, for a node leaning to the left, we have the following cases:

if right.weight * 5 + 2 < left.weight * 2:
    # Node is leaning to the left.
    if left.left.weight * 5 < left.weight * 2:
        # The middle node is too heavy. Double rotation:
        # rotate node.left to the left first.
        node.left = left.rotleft()
    # Rotate node to the right.
    node = node.rotright()

Single and double rotations will handle every type of imbalance we'll encounter, so there's no need to introduce any more. We'll first show what a single rotation looks like:

[Figure: single rotation. Node 5 has left child 3 (children 2 and 4; 2 has left child 1) and right child 6. weight(6)*5+2 < weight(3)*2, so right rotate node 5. Result: 3 becomes the root with children 2 (left child 1) and 5 (children 4 and 6). Balanced!]

Single rotations can fail to balance the node if the middle node is too heavy. If this is the case, then we must left rotate the node's left child before actually rotating the node. The example below shows what would happen if we didn't make this correction.

[Figure: single rotation, incorrect. Node 5 has left child 2 (children 1 and 3; 3 has right child 4) and right child 6. weight(6)*5+2 < weight(2)*2, so right rotate node 5. Result: 2 becomes the root with children 1 and 5; 5 has children 3 (right child 4) and 6. Now weight(1)*5+2 < weight(5)*2: not balanced!]

Rotating the left child beforehand results in a correctly balanced tree.

[Figure: double rotation, correct. Node 5 has left child 2 (children 1 and 3; 3 has right child 4) and right child 6. Since weight(1)*5 < weight(2)*2, first left rotate node 2, making 3 the left child of 5 (with children 2 and 4; 2 keeps left child 1). Then right rotate node 5. Result: 3 becomes the root with children 2 (left child 1) and 5 (children 4 and 6). Balanced!]

The cases where a node is leaning to the right are handled exactly the same, only all "left" and "right" references are swapped, including rotations.

The code below is our full rebalancing function. In order to make sure all node weights are correct, we need to rebalance from the node we've added (or removed) all the way up to the root.

def rebalance(self,next):
    # Rebalance from next upward. If 2 children differ in weight by a ratio of 2.5 or
    # more, we can rotate to rebalance.
    def Weight(n):
        return n.weight if n else 0
    while next:
        node,orig=next,next
        next=node.parent
        l,r=node.left,node.right
        lw,rw=Weight(l),Weight(r)
        if rw*5+2<lw*2:
            # Leaning to the left.
            if Weight(l.left)*5<lw*2:
                node.left=l.rotleft()
            node=node.rotright()
        elif lw*5+2<rw*2:
            # Leaning to the right.
            if Weight(r.right)*5<rw*2:
                node.right=r.rotright()
            node=node.rotleft()
        else:
            # Balanced.
            node.weight=lw+rw+1
            continue
        if next is None:
            self.root=node
        elif next.left is orig:
            next.left=node
        else:
            next.right=node

Searching

When defining self balancing trees, we stated that any child to the left of a node will satisfy node.value >= child.value, and any child to the right will satisfy node.value <= child.value. While this property allows us to search for nodes, it also presents a problem since the tree can have multiple nodes with the same key. In such a case, which node should we return when searching for a key?

For the purposes of this article, we will have the search function return the last node going by sorted order. Thus, the rules it will follow are:

1. Start at the root node.
2. If node.value <= value, then move to the node's right child.
3. Otherwise, move to the node's left child.
4. Continue until we reach a null node.
5. Return the last node with node.value = value.

For example, assume we want to search for value = 5 in the following tree.

[Figure: searching for value = 5. Start at the root: 7 > 5, go left; 3 <= 5, go right; 5 <= 5, record the node and go right; 6 > 5, go left; null node, abort and return the recorded node.]

The python code below shows how our basic search function works.

def search(tree,value):
    node,ret=tree.root,None
    while node:
        if node.value<=value:
            if value==node.value:
                ret=node
            node=node.right
        else:
            node=node.left
    return ret

In WBTree.py, the search function has been expanded to allow us to query different ranges in the tree (a sketch of one variant follows the list below). The supported queries are:

Return the least node = value.
Return the least node > value.
Return the least node >= value.
Return the greatest node = value.
Return the greatest node < value.
Return the greatest node <= value.
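As an example of how these variants work, the "least node >= value" query flips the basic search's direction rules: record a candidate and go left whenever node.value >= value. The function name findge below is made up for illustration; see WBTree.py for the actual API.

def findge(tree,value):
    # Return the least node with node.value>=value, or None if none exists.
    node,ret=tree.root,None
    while node:
        if node.value>=value:
            ret=node       # Candidate; a smaller match may exist to the left.
            node=node.left
        else:
            node=node.right
    return ret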

Adding

Adding nodes is as simple as finding where the node would be if we searched for it, adding it as a child, and then balancing up. Because we defined our search function to go right whenever node.value <= value, this will stably (in the sorting sense) add the node to the tree.

For instance, assume we want to add value = 4 to the following tree:

[Figure: adding value = 4. Start at the root: 1 <= 4, go right; 2 <= 4, go right; 3 <= 4, go right; null node, place 4 here. The tree is now root 1 (left child 0, right child 2), with 2 having right child 3, and 3 having right child 4.]

We would then balance from node 3 upwards until we reach the root node.

[Figure: rebalancing from node 3 upward. Node 2 is leaning right: rotate it left, making 3 the root's right child with children 2 and 4. The remaining nodes are balanced: continue upward. Reached the root: abort.]

The code for adding values is given below. Note that this function will always add a new value. Implementations that require duplicate additions to replace old values will need to modify the add() function.

def add(self,value):
    # Add a value to the tree and return its new node.
    prev,node,right=None,self.root,False
    while node:
        prev=node
        right=node.value<=value
        if right: node=node.right
        else    : node=node.left
    node=self.Node(value)
    node.parent=prev
    if prev is None:
        self.root=node
    elif right:
        prev.right=node
    else:
        prev.left=node
    self.rebalance(prev)
    return node

Traversal

Given a particular node in the tree, we may be interested in finding the next or previous node going by sorted order. That is, if this tree was a sorted array, what would the next and previous elements be? Answering this question will help us not only with navigating a tree, but also with removing nodes in the next section.

Finding next(node) for some node can be split into 2 cases:

1. If node.right is not null, take node.right's left children until we reach a node with no left child.
2. If node.right is null, follow the node up the tree until we move up a left branch.

We can illustrate these movements in the following diagram:

[Figure: two traversal cases on the tree with root 4, children 2 and 6, and leaves 1, 3, 5, 7. Case 1, down: next(4) = 5. Right is not null: go right to 6, then go left; left is null: return node 5. Case 2, up: next(3) = 4. Right is null: go up; 3 is not a left child: go up; 2 is a left child: return node 4.]

This procedure yields the following python function for next():

def next(node):
    child=node.right
    if child:
        while child:
            node,child=child,child.left
    else:
        while node and node.right is child:
            child,node=node,node.parent
    return node

The prev() function is exactly the same as next(), except we swap all instances of node.left with node.right:

def prev(node):
    child=node.left
    if child:
        while child:
            node,child=child,child.right
    else:
        while node and node.left is child:
            child,node=node,node.parent
    return node
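These two functions are enough to support the for-node-in-tree iteration from the opening example. Below is a sketch of how __iter__ might be built on top of next(), assuming next() is the module-level function defined above; WBTree.py's actual iterator may differ.

def __iter__(self):
    # Start at the smallest node, then follow next() in sorted order.
    node=self.root
    while node and node.left:
        node=node.left
    while node:
        yield node
        node=next(node)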

Removing

Removing a node involves replacing all links to the node with that of its successor, next(node), and then rebalancing. It would be simpler to swap the node's value with that of its successor, and then remove the successor, but this would invalidate any external pointers a user might have to the successor node. Indeed, a user might find it odd that the node that was supposed to be removed still exists, and another node has been deallocated without warning.

There are 3 cases to consider when removing a node.

1. The node has no right child.
2. The successor is the right child of the node.
3. The successor is a distant descendant of the node.

If the node's right child is null, then we simply replace the node with its left child and rebalance from the node's parent upwards. For instance, suppose we want to remove node 5 in the diagram below.

[Figure: case 1. Root 1 has children 0 and 5; 5 has left child 4; 4 has left child 2; 2 has right child 3. Node 5 has no right child, so removing it replaces it with its left child, node 4.]

If a right child exists, then we know that the next node will be a child on the right-hand side of the node. We also know that, by definition, the next node will have no left child. Thus, for case 2, if the next node is the node's right child, then we can replace the node with the next node without having to worry about shuffling around either nodes' children.

[Figure: case 2. Root 1 has children 0 and 3; 3 has children 2 and 4; 4 has right child 5. Removing node 3: next(3) = 4 is node 3's right child, so node 4 directly replaces node 3.]

For case 3, the next node will be a distant child of the node. So, we must first remove the next node (as in case 1), and then replace the node with the next node.

[Figure: case 3. Root 1 has children 0 and 4; 4 has children 2 and 5; 2 has right child 3. Removing node 1: next(1) = 2 is not node 1's right child, so node 2 is first removed (replaced by its child 3), and then node 2 takes node 1's place.]

The complete python code for removing a given node is shown below.

def removenode(self,node):
    # Remove a specific node.
    p=node.parent
    l,r=node.left,node.right
    if r is None:
        # Case 1
        bal,next,l=p,l,None
    elif r.left:
        # Case 3
        next=r
        while next.left:
            bal,next=next,next.left
        c=next.right
        bal.left=c
        if c: c.parent=bal
    else:
        # Case 2
        bal,next,r=r,r,None
    # Replace node with next.
    if p is None:
        self.root=next
    elif p.left is node:
        p.left=next
    else:
        p.right=next
    if next: next.parent=p
    if l: next.left,l.parent=l,next
    if r: next.right,r.parent=r,next
    self.rebalance(bal)
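The remove() method used in the opening example can then be a thin wrapper that searches for a matching node and delegates to removenode(). A sketch, assuming search() is exposed as a method; WBTree.py may handle the not-found case differently.

def remove(self,value):
    # Remove the last node matching value, if any, and return it.
    node=self.search(value)
    if node:
        self.removenode(node)
    return node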

Indexing

The biggest benefit of balancing by weights (instead of by height or color, as AVL and red-black trees do) is that the weights grant us the ability to index nodes in O(log(n)) time, even as we are modifying the tree.

We'll index nodes in the same way that we would index values if they were sorted from least to greatest. So index 0 would be the smallest value in the tree, and index n-1 would be the greatest. We also need to remember that children to the left of the node have values equal to or less than the node. Putting all that together, we'll follow these simple rules:

1. If index > left.weight, go to the right and subtract index -= left.weight + 1.
2. If index = left.weight, return the current node.
3. Otherwise, go to the left.

Using the tree below, we can show how to find the 10th element in the tree. Note that all of the numbers in the walkthrough refer to indices and weights.

[Figure: finding index 10. At the root, 10 > 7 (the left child's weight), so index = 10 - 7 - 1 = 2, go right to 11; 2 < 3, go left to 9; 2 > 1, so index = 2 - 1 - 1 = 0, go right; 0 = 0, return the node holding value 10.]

The code for this ends up being fairly simple.

def __len__(self):
    # Return the number of nodes in the tree.
    return self.root.weight if self.root else 0

def __getitem__(self,i):
    # Index nodes like an array.
    node=self.root
    weight=node.weight if node else 0
    if i<0: i+=weight
    if i<0 or i>=weight: return None
    while True:
        left=node.left
        lw=left.weight if left else 0
        if i>lw:
            i-=lw+1
            node=node.right
        elif i==lw:
            break
        else:
            node=left
    return node

To find the index of a specific node, we simply do the opposite. We trace the path from a node to the root. Every time our node is a right descendant, we add the weight of the left child plus 1 to our index.

def index(node):
    # Returns the node's index within the tree. Ex: tree[node.index()]=node.
    i=-1
    prev=node.right
    while node:
        if node.right is prev:
            i+=node.left.weight+1 if node.left else 1
        prev=node
        node=node.parent
    return i
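Indexing a node and finding a node's index are inverses of each other, which makes a handy sanity check. This assumes index() is attached as a node method, as its comment suggests.

node=tree[3]
assert node.index()==3
assert tree[node.index()] is node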

Time Complexity

Proving that the weights of two children will always stay within a ratio of 2.5 is fairly difficult. We will instead take this balance as fact, and simply prove that the height of the tree is bounded by h <= 2.07 * log2(n).

Let N be the weight of the node, and let L and R be the weights of its two children. For simplicity, we'll assume N = L + R. Our first step is to bound the weight of either of the two children by N.

L / R <= 2.5
L / (N - L) <= 2.5
L <= 2.5 * (N - L)
3.5 * L <= 2.5 * N
1.4 * L <= N
L <= N / 1.4

Thus, the maximum weight of a child is 1 / 1.4 times the weight of the parent. We now need to see at what height the weight of a child becomes 1.

n * (1 / 1.4) ^ h >= 1
(1 / 1.4) ^ h >= 1 / n
1.4 ^ h <= n
log2(1.4) * h <= log2(n)
h <= 2.07 * log2(n)

Thus the height is logarithmically bound, as desired.

Operational Time Complexity

Now, we can show that the worst-case time complexity for our operations is also O(log(n)).

Rebalancing only moves up the tree, which is O(log(n)), and performs 1 or 2 rotations per node, which are O(1). Thus rebalancing is O(log(n)).

Searching only moves down the tree, which is O(log(n)), and performs one comparison per node, which is O(1). Thus searching is O(log(n)).

Next/prev must, in the worst case, travel all the way up or down the tree, which is O(log(n)). The average number of operations per next/prev is 2, however, since iterating over the whole tree crosses each edge at most twice.

Adding a node involves searching for a node's destined position, which is O(log(n)), and rebalancing, which is O(log(n)). Thus adding nodes is O(log(n)).

Removing a node involves at most finding the node, which is O(log(n)), finding the next node, which is O(log(n)), swapping, which is O(1), and balancing, which is O(log(n)). Thus, removing a node is O(log(n)).

Finding a node given an index involves comparing weights and moving down the tree, which is O(log(n)). Finding an index given a node only moves up the tree, which is also O(log(n)). Thus indexing is O(log(n)).

Notes

Better balancing is possible by avoiding ratios entirely. Instead, we can attempt a double or single rotation and keep whichever one results in a better balance. Research is still needed to simplify when this balancing should be performed and to determine its worst-case balance. I've left some notes in testing.zip.

Unrolling rebalance() speeds up the tree dramatically, but I omitted it from the article and reference implementation because it's much more complicated. The unrolled version can also be found in testing.zip.

For environments with finite integer sizes, performing the balance comparison without overflows is possible, but it's slower and obscures the reference implementation too much.

I've also implemented an AVL tree for education and to compare speeds. Its ability to terminate early gives it a speed advantage over weight balanced trees.

I'd like to implement set theory operations in the future. Until I actually need them, I don't know what I'd like the API to look like, so I'll hold off for now.