Weak Heap

  • Last Updated : 05 May, 2020

A weak heap is a binary tree with the following properties:
(1) Every key in the right sub-tree of a node is greater than or equal to the key stored in the node itself,
(2) The root has no left child, and
(3) Leaves are only found on the last two levels of the tree.

It is used for implementing priority queues. A user might prefer a weak heap over a binary heap because a weak heap performs fewer element comparisons in the worst case.

The worst-case time complexities of insertion and deletion in an array-based implementation of a weak heap are:
1. insert : O(lg n)
2. delete : O(lg n)

The weak heap construction uses a buffer that supports constant-time insertion. A new element is inserted into the buffer as long as the buffer size is below a threshold. Once the buffer is full, all the elements of the buffer are moved to the weak heap.

A weak heap is obtained by loosening the requirements of a binary heap. To represent a weak heap in memory, two arrays are used: the element array a and an array r of reverse bits.

We use a_i to refer either to the element at index i of array a or to the corresponding node in the tree structure. A weak heap is laid out so that, for a_i, the index of its left child is 2i + r_i, the index of its right child is 2i + 1 − r_i, and (where i != 0) the index of its parent is ⌊i/2⌋. Flipping the bit r_i thus swaps the roles of the two children of a_i.
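This index arithmetic can be sketched in Python (a minimal illustration; the helper names left_child, right_child, and parent are ours, not part of the article):

```python
# Illustrative helpers for navigating a weak heap stored as an
# element array `a` plus a reverse-bit array `r` (names are ours).

def left_child(i, r):
    return 2 * i + r[i]

def right_child(i, r):
    return 2 * i + 1 - r[i]

def parent(i):
    # defined for i != 0; the root has no parent
    return i // 2

# Flipping r[i] swaps the roles of the two children of node i.
r = [0, 0, 0, 0]
print(left_child(1, r), right_child(1, r))   # children of node 1: 2 and 3
r[1] = 1
print(left_child(1, r), right_child(1, r))   # swapped: 3 and 2
```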

Weak heap Example:
If 10 integers are given as input, e.g. 8, 7, 4, 5, 2, 6, 9, 3, 11, 1, then the weak heap constructed from this input is as shown in the diagram below.

Operations on a weak heap and how the desired time complexities are achieved
The basic weak heap operations, along with pseudocode, are as follows:

1. Distinguished Ancestor: The distinguished ancestor of a_j, j != 0, is the parent of a_j if a_j is a right child, and the distinguished ancestor of the parent of a_j if a_j is a left child. We use d-ancestor(j) to denote the index of this ancestor. The weak-heap ordering enforces that no element is smaller than the one at its distinguished ancestor.

// Finding the distinguished ancestor in a weak heap
procedure: d-ancestor
input: j: index
while (j & 1) = r_⌊j/2⌋
    j ← ⌊j/2⌋
return ⌊j/2⌋
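As a hedged sketch, the same procedure in Python, assuming the 0-based array layout described above with r as a list of bits:

```python
def d_ancestor(j, r):
    """Index of the distinguished ancestor of node j (j != 0)."""
    # j is a left child of its parent exactly when (j & 1) == r[j // 2];
    # keep climbing as long as we follow left-child links.
    while (j & 1) == r[j >> 1]:
        j >>= 1
    return j >> 1

# With all reverse bits 0, node 1 is the root's right child,
# so its distinguished ancestor is the root itself.
print(d_ancestor(1, [0, 0, 0, 0]))   # 0
```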

2. Join: This subroutine combines two weak heaps into one, conditioned on the setting that a_i is no larger than every element in the left sub-tree of a_j. Join requires O(1) time and involves exactly one element comparison.

// Joining two weak heaps
procedure: join
input: i, j: indices
if a_j < a_i
    swap(a_i, a_j)
    r_j ← 1 − r_j
    return false
return true
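A minimal Python version of join under the same min-heap convention (mutating the element and bit arrays in place):

```python
def join(a, r, i, j):
    """One conditional swap; O(1) time, exactly one comparison.

    Assumed precondition (per the text): a[i] is no larger than
    every element in the left sub-tree of a[j].
    Returns True when no swap was needed.
    """
    if a[j] < a[i]:
        a[i], a[j] = a[j], a[i]
        r[j] = 1 - r[j]   # the old left sub-tree of j becomes its right sub-tree
        return False
    return True

a, r = [3, 1], [0, 0]
print(join(a, r, 0, 1), a, r)   # False [1, 3] [0, 1]
```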

3. Construct: A weak heap of size n can be constructed using n − 1 element comparisons by performing n − 1 calls to the join subroutine.

// Constructing a weak heap
procedure: construct
input: a: array of n elements; r: array of n bits
for i ∈ {0, 1, ..., n − 1}
    r_i ← 0
for j ∈ {n − 1, n − 2, ..., 1}
    i ← d-ancestor(j)
    join(i, j)
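Putting the pieces together, a self-contained Python sketch of construct (d-ancestor and join as above; min-heap convention), tried on the article's ten-integer example:

```python
def d_ancestor(j, r):
    while (j & 1) == r[j >> 1]:
        j >>= 1
    return j >> 1

def join(a, r, i, j):
    if a[j] < a[i]:
        a[i], a[j] = a[j], a[i]
        r[j] = 1 - r[j]
        return False
    return True

def construct(a):
    """Turn array a into a weak heap in place; returns the reverse-bit array."""
    n = len(a)
    r = [0] * n
    # Bottom-up: join every node with its distinguished ancestor,
    # n - 1 joins and hence n - 1 element comparisons in total.
    for j in range(n - 1, 0, -1):
        join(a, r, d_ancestor(j, r), j)
    return r

a = [8, 7, 4, 5, 2, 6, 9, 3, 11, 1]   # the example input above
r = construct(a)
print(a[0])   # 1 -- the minimum ends up at the root
```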

4. Sift-up: The subroutine sift-up(j) is used to re-establish the weak-heap ordering between the element e, initially at location j, and those at the ancestors of a_j. Starting from location j, while e is not at the root and is smaller than the element at its distinguished ancestor, we swap the two elements, flip the bit of the node that previously contained e, and repeat from the new location of e.

// Re-establishing the weak-heap ordering
// on the path from a_j upwards
procedure: sift-up
input: j: index
while j != 0
    i ← d-ancestor(j)
    if join(i, j)
        break
    j ← i

5. Insert: To insert an element e, we first add e to the next available array entry, making it a leaf in the heap. If this leaf is the only child of its parent, we make it a left child by updating the reverse bit at the parent. To re-establish the weak-heap ordering, we call the sift-up subroutine starting from the location of e. It follows that insert requires O(lg n) time and involves at most ⌈lg n⌉ element comparisons.

// Inserting an element into a weak heap
procedure: insert
input: a: array of n elements; r: array of n bits; e: element
a_n ← e
r_n ← 0
if (n & 1) = 0
    r_⌊n/2⌋ ← 0
sift-up(n)
n ← n + 1
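A runnable sketch of insert and sift-up under the same conventions (d_ancestor and join repeated so the block stands alone; n is tracked implicitly by the list length):

```python
def d_ancestor(j, r):
    while (j & 1) == r[j >> 1]:
        j >>= 1
    return j >> 1

def join(a, r, i, j):
    if a[j] < a[i]:
        a[i], a[j] = a[j], a[i]
        r[j] = 1 - r[j]
        return False
    return True

def sift_up(a, r, j):
    # Climb toward the root, swapping with the distinguished ancestor
    # until no swap is needed (join returns True) or the root is reached.
    while j != 0:
        i = d_ancestor(j, r)
        if join(a, r, i, j):
            break
        j = i

def insert(a, r, e):
    j = len(a)
    a.append(e)
    r.append(0)
    if (j & 1) == 0:
        # the new leaf is the only child of its parent: make it a left child
        r[j >> 1] = 0
    sift_up(a, r, j)

a, r = [], []
for x in [8, 7, 4, 5, 2, 6, 9, 3, 11, 1]:
    insert(a, r, x)
print(a[0])   # 1
```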

6. Sift-down: The subroutine sift-down(j) is used to re-establish the weak-heap ordering between the element at location j and those in the right sub-tree of a_j. Starting from the right child of a_j, the last node on the left spine of the right sub-tree of a_j is identified; this is done by repeatedly visiting left children until reaching a node that has no left child. The path from this node to the right child of a_j is traversed upwards, and join operations are repeatedly performed between a_j and the nodes along this path. After each join, the element at location j is less than or equal to every element in the left sub-tree of the node considered in the next join.

7. Delete-min: To perform delete-min, the element stored at the root of the weak heap is replaced with the one stored at the last occupied array entry. To restore the weak-heap ordering, sift-down is called for the new root. Thus, delete-min requires O(lg n) time and involves at most lg n element comparisons.
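Sift-down and delete-min can be sketched as follows (a self-contained min-heap illustration reusing the earlier subroutines; repeatedly deleting the minimum drains the heap in sorted order, as in weak-heapsort):

```python
def d_ancestor(j, r):
    while (j & 1) == r[j >> 1]:
        j >>= 1
    return j >> 1

def join(a, r, i, j):
    if a[j] < a[i]:
        a[i], a[j] = a[j], a[i]
        r[j] = 1 - r[j]
        return False
    return True

def construct(a):
    r = [0] * len(a)
    for j in range(len(a) - 1, 0, -1):
        join(a, r, d_ancestor(j, r), j)
    return r

def sift_down(a, r, j):
    n = len(a)
    k = 2 * j + 1 - r[j]          # right child of j
    if k >= n:                    # no right sub-tree: nothing to do
        return
    while 2 * k + r[k] < n:       # descend the left spine of that sub-tree
        k = 2 * k + r[k]
    while k != j:                 # join back upwards toward j
        join(a, r, j, k)
        k >>= 1

def delete_min(a, r):
    smallest = a[0]
    last = a.pop()                # move the last entry to the root
    r.pop()
    if a:
        a[0] = last
        sift_down(a, r, 0)
    return smallest

a = [8, 7, 4, 5, 2, 6, 9, 3, 11, 1]
r = construct(a)
print([delete_min(a, r) for _ in range(10)])   # ascending order
```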

Relation Of Weak Heap with Binary Heap and Binomial Heap

The structure of a weak heap follows the usual binary-tree array arrangement. A perfect weak heap that stores exactly 2^r elements is a binary-tree representation of a heap-ordered binomial tree of rank r. Node k has its left child at index 2k and its right child at index 2k + 1, assuming the root is at index 0 (in a binary heap these were 2k + 1 for the left child and 2k + 2 for the right child). The only difference is that the root of a weak heap has no left child, only a right child, which is stored at index 2·0 + 1 = 1.

The structure of a weak heap is also very similar to that of a binomial heap, with a tree of height h being composed of a root plus trees of heights h − 1, h − 2, ..., 1. As with binomial heaps, the fundamental operation on weak heaps is merging two heaps of equal height h to make a weak heap of height h + 1. This requires exactly one comparison, between the roots. Whichever root is greater (assuming a max-heap) is the final root. The first child of the final root is the losing root, which retains its children (right sub-tree). The winning root's children are inserted as siblings of the losing root.

The distinguishing properties of a weak heap are:
1) It can be imperfect (in contrast to a binomial tree);
2) It is a single tree (in contrast to a binomial queue, which is a collection of perfect trees);
3) It is fairly balanced.

Applications of weak heap
1. It can be used as an intermediate step for efficient construction of binary heaps.
2. A weak heap variant, which allows some of the nodes to violate the weak-heap ordering, is used for graph search and network optimization, and is known to be provably better than a Fibonacci Heap.
