Introduction to Algorithms: Heaps

Paul Taylor

12 & 19 March 2001

A heap is a binary tree, represented as an array. Each node carries a value that is ≥ the value of its parent. This is what we mean by heap order. Any sorted array is in heap order, but not conversely. For example, the sequence 1,3,2 is in heap order, but it is not sorted.
This means that the root carries the least value. Insertion and deletion can be performed in O(log n) time.
A heap can be used to implement a priority queue, i.e. a queue in which the entries need not be processed in precisely the order in which they arrived, but according to some assigned priority measurement.

1  The place of this topic in the course

The following instructions assume that you are familiar with Richard Bornat's notes about the heap data structure and the sifting up, sifting down, heapifying and sorting algorithms. See also Chapter 20 of Weiss's book.
Heap sort does seem to be an excellent teaching example, because the common misunderstandings are instructive: some of the common errors obviously make the program logically incorrect, but others lead to larger time complexities, and in particular to a quadratic sorting algorithm. This was one of the things that inspired me to write the TESTER.
The operations of moving things up and down the tree are called by different names by the various authors. I call it sifting1; this follows Peter Burton, who has taught the same algorithm in his third year course, Algorithms and Complexity. Mark Allen Weiss calls it percolation. Richard Bornat calls it bubbling, but this has led students to confuse it with bubble sort.

2  The tree as an array

Draw a 15-node binary tree on paper. Number the nodes, starting with 1 at the root, and going from left to right across each level of the tree.
If a ("parent") node is numbered p, what is the number of its left child?
If a ("child") node is numbered c, what is the number of its parent?
These are paper exercises - move away from your computer to do them!
However, the first element of an array in JAVA is numbered 0. Draw another tree, starting the numbering from 0, and work out the formulae for the parent and left child.
In each of the JAVA classes that you implement (for various X), you will need a copy of the following code, with the formulae that you have just found inserted:
   class HeapX implements HeapXing {
      final static int ROOT = 0;
      public int root () { return ROOT; }
      public int parent      (int child)  { return ???; }
      public int left_child  (int parent) { return ???; }
      public int right_child (int parent) { return ???; }
      public int sibling     (int child)  { return ???; }
      public void choice (int c) {}
      public void heap_X (...) {
A sibling is either a sister or a brother: this method should convert between the left and right children of the same parent.
We shall also need to refer to the contents of the heap and capacity of the array used to implement it, as numbers.
Think of the array as a jug: its capacity may be 1 litre, but it may currently contain just 0.6 litres of water. The capacity is the greatest amount of water that it can contain. We may pour some of the water out, or add some more water to the jug, but if we want to store more than 1 litre of water, we shall need a bigger jug.
At first we shall pass around the array and its contents as program variables (of course, capacity=array.length), but at the end these will become instance variables in an object-oriented implementation of heaps.
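Once you have worked out the formulae on paper, you can check them against the standard 0-based ones, which the following snippet encodes (the class name HeapIndex is illustrative, not one of the course's files):

```java
// Standard 0-based index formulae for a binary heap stored in an array.
// HeapIndex is an illustrative name, not one of the course's classes.
class HeapIndex {
    static int parent(int child)      { return (child - 1) / 2; }
    static int leftChild(int parent)  { return 2 * parent + 1; }
    static int rightChild(int parent) { return 2 * parent + 2; }
    // Left children have odd indices, right children even ones, so the
    // sibling is one step away.  The root (index 0) has no sibling.
    static int sibling(int child)     { return child % 2 == 0 ? child - 1 : child + 1; }
}
```

For example, the root's children are at indices 1 and 2, and both of them have parent 0.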

3  Sifting up and insertion

To insert a new value in a heap, you first put it at the end (increasing the contents by 1) and then sift it up.
If the new value is < that of its parent (violating heap order), you swap parent and child. The new value is now in the parent position, but it may still violate heap order (by being < the value of the grandparent of its original position), so you carry on swapping.
This makes the algorithm sound like bubble sort, but it is not: see Richard Bornat's lecture notes for a pictorial explanation.
You can also watch my code in action by doing
   trace sift-up
in the usual way. The array will be shown as a tree, with the branches drawn in different ways depending on whether heap order is satisfied or not.
Sifting up very often terminates after one generation, so you may have to run it several times to see it working.
The sift_up method has the following signature, and belongs in a class that implements HeapSiftingUp. It also needs the parent and child formulae above.
   class SiftUp implements HeapSiftingUp {
       public void sift_up (int[] array, int position, int contents) {
The sift_up method does not itself do an insertion, or increment contents - that will be done in the object-oriented version in Section 9.
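The loop just described can be sketched as follows, assuming the 0-based numbering; the contents argument of the template is not needed for sifting up, so it is omitted here, and the class name is mine rather than the course's:

```java
// Sketch of sifting up: repair a possible heap-order violation at
// `position` by swapping towards the root.  Assumes 0-based numbering.
class SiftUpSketch {
    static void siftUp(int[] array, int position) {
        while (position > 0) {
            int parent = (position - 1) / 2;
            if (array[position] >= array[parent]) break; // heap order holds
            int tmp = array[parent];                     // swap child and parent
            array[parent] = array[position];
            array[position] = tmp;
            position = parent;                           // carry on from the parent
        }
    }
}
```

Note that the loop stops as soon as heap order is restored, which is why sifting up so often terminates after one generation.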
What the TESTER actually does here is to create a random heap, choose some (leaf) position in it, reduce the value there so that heap order is violated, and give the array and the faulty position to your method to fix.
If the heap has n elements, what is the (worst case) time complexity for inserting and sifting up a new value?
The TESTER will not do the timings for you here, because the time taken to restore the array between runs is greater than that taken to do the run of your code - it is far too delicate a measurement! You saw how difficult it was to get good results for binary search, but that doesn't alter the array, so it is not necessary for the TESTER to restore it between runs.
The tracing output from my code uses the method
   Trace.printheap (int root, boolean root_is_max,
                 int[] array, int contents);
for which the first two arguments should be 0 and false if you have followed the instructions above.

4  Sifting down and deletion

In the applications of heaps to priority queues and sorting, it is only the root element (which is at array[ROOT]) that we need to delete, so we shall only consider that case. This element is at the head of the queue, so it is deleted from the queue to be "dealt with".
However, we cannot simply discard array[ROOT] - something has to go in its place! Although it may perhaps seem the least appropriate value to choose, we move the last value in the heap, array[contents-1], to the root, and then sift it down to restore heap order.
Sifting up involved comparing the position that potentially violates heap order with its parent. Sifting down is similar, except that now there are (potentially) two children.
As before, you can trace sift-down to watch my code in action.
The code for sifting down is the most difficult part of the programming in this topic. Usually, students write complex, deeply nested if statements to handle all of the possibilities. It is rather common in programming - especially when doing filthy things like form-processing for Web sites - to have proliferations of cases like this. So it is an important programming skill to be able to prune tangled trees of if statements.
In fact, there is a trick in this case that makes sift_down only a couple of lines longer than sift_up. The idea is to decide between the two children (if there are indeed two) before looking at the parent at all.
You will, of course, find this trick implemented in the many versions of the code that can be found on the Web. Needless to say, these versions do not explain how they work. So, if you manage to work out this trick, use comments to explain it. I hardly need to say that it is good programming practice to do so, but it will also tell me whether you have found out how to do it for yourself or just copied the code from the Web.
Even though we shall only be deleting the root value, we shall later need to sift down from other positions. As usual, this generalisation is only a matter of using the argument as the initial value of the loop variable.
The template is as follows, together with the parent and child formulae above.
   class SiftDown implements HeapSiftingDown {
      void sift_down (int[] array, int position, int contents) {
Once again, the sift_down method does not itself do a deletion. The TESTER creates a random heap, with one of the values near the root increased, so that heap order is violated. The array and the faulty position are given to your method to fix.
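A sketch of sifting down, using the choose-the-smaller-child-first idea described above (again with the 0-based formulae, and with an illustrative class name), might look like this:

```java
// Sketch of sifting down from `position`.  The trick: pick the smaller
// of the two children (if both exist) before comparing with the value
// being sifted, so only one comparison against the parent is needed.
class SiftDownSketch {
    static void siftDown(int[] array, int position, int contents) {
        while (true) {
            int child = 2 * position + 1;                // left child
            if (child >= contents) break;                // no children: done
            if (child + 1 < contents && array[child + 1] < array[child])
                child = child + 1;                       // right child is smaller
            if (array[position] <= array[child]) break;  // heap order holds
            int tmp = array[position];                   // swap with smaller child
            array[position] = array[child];
            array[child] = tmp;
            position = child;
        }
    }
}
```

Swapping with the smaller child is essential: swapping with the larger one would put a value above its sibling that violates heap order.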
Again, what is the (worst case) time complexity?
The TESTER cannot, unfortunately, tell you, for the same reason as before.

5  Making an array into a heap - standard way

The way that you normally construct a data structure is to use its constructor methods.
In this case, you turn an array into a heap (i.e. impose heap order on it) by inserting the values one by one into the heap. Do
   trace heapify-by-insertion
to watch the model answer in the TESTER doing this.
The template is as follows, together with the parent and child formulae above.
   class Heapify implements Heapifying {
      public int[] heapify (int[] array, int contents) {
      private HeapSiftingUp sifter_up;
      public Heapify () {
         sifter_up = new TestHeapUp(); // or SiftUp() to use your own code
The contents argument for the sift_up method is not that of the heapify method: in fact it is heapify's loop counter.
You may allocate a new array if you wish, but the input and output can share the same array, with the output (the heap) first, essentially as you did with the sorted and unsorted segments in insertion sort and selection sort.
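This way of heapifying can be sketched as follows; for brevity the sketch inlines the sifting-up loop rather than delegating to a sifter_up object as the template does, and the class name is illustrative:

```java
// Sketch: impose heap order by inserting one value at a time.
// Loop invariant: array[0..i-1] is already a heap; each step sifts
// the next value up into it, sharing the same array throughout.
class HeapifyByInsertion {
    static void heapify(int[] array, int contents) {
        for (int i = 1; i < contents; i++) {
            int position = i;                 // the newly "inserted" value
            while (position > 0) {
                int parent = (position - 1) / 2;
                if (array[position] >= array[parent]) break;
                int tmp = array[parent];      // swap child and parent
                array[parent] = array[position];
                array[position] = tmp;
                position = parent;
            }
        }
    }
}
```

Notice how the heap and the not-yet-processed values share the array, just as the sorted and unsorted segments did in insertion sort.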
Considering the time complexity of each insertion operation, what would you expect the time complexity of heapify to be?
Use the TESTER to check your answer.

6  Making an array into a heap - fast way

Although what you have just done is the standard way to build a data structure, there is in fact a faster way to turn a random array into a heap.
Think of it as a tree. Each node also defines a tree, consisting of its descendants.
In particular, the leaves of the big tree (which, remember, constitute half of its nodes) define one-element trees.
Are these one-element trees in heap order?
Now consider a node whose tree of descendants is not, perhaps, in heap order, but assume that its two subtrees are. (This is the loop invariant.) What operation (that you have already programmed) do you need to perform to turn the bigger subtree (the node under consideration plus its two subtrees) into a heap?
So, working up (backwards) from (the parent of) the last element in the tree, you can impose heap order on the whole tree (array). Do
   trace heapify-by-sifting-down
to watch the model answer in the TESTER doing this.
Implement this new way of heapifying an array.
Considering the time complexity of the operation that you performed on each node, what would you expect the time complexity of heapify to be?
Use the TESTER to check your answer.
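The faster method can be sketched as follows (again with the 0-based formulae and an illustrative class name); the loop works backwards from the parent of the last element, so that whenever a node is sifted down, its two subtrees are already heaps:

```java
// Sketch of the faster heapify: sift each internal node down, working
// backwards from the parent of the last element.  The leaves are
// one-element heaps already, so they need no work at all.
class HeapifyBySiftingDown {
    static void heapify(int[] array, int contents) {
        for (int i = (contents - 2) / 2; i >= 0; i--)  // parent of last element
            siftDown(array, i, contents);
    }
    static void siftDown(int[] array, int position, int contents) {
        while (true) {
            int child = 2 * position + 1;
            if (child >= contents) break;
            if (child + 1 < contents && array[child + 1] < array[child]) child++;
            if (array[position] <= array[child]) break;
            int tmp = array[position];
            array[position] = array[child];
            array[child] = tmp;
            position = child;
        }
    }
}
```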

7  Some more difficult questions to think about

(a) Explain why the complexity of the faster heapify is what the TESTER says, not what you originally thought. The analysis is the same as that for the "early exit" from binary search and the hybrid merge sort algorithm.
(b) Prove that this way to do heapify works, i.e. that the result is in heap order and is a permutation of the original array. The appropriate diagram is a tree, not a rectangle!
(c) If you look carefully at the graph, you will see a bend in it, just as you did in the logarithmic plot for binary search. The explanation is the same: fetching stuff from the secondary memory cache within the CPU or from RAM. Why, considering the way in which modern microprocessors are designed, is heap sort not such a good idea?
(d) If you make an arbitrary change to the value at a particular node, or if you want to delete some node other than the root, what combination of sift_up and sift_down does the job?

8  Sorting

Use heapify to turn a given array into a heap, and then the other heap operations to extract a sorted array from the heap. As before, you can use the same array for input and output.
Your class HeapSort implements Sorting as before.
What do you expect the complexity to be? What does the TESTER say?
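One way of putting the pieces together in a single array, sketched here under the same 0-based assumptions (the class name and the final reversal are my choices, not necessarily the model answer's):

```java
// Sketch of heap sort with a minimum-root heap: heapify, then repeatedly
// swap the root (the least remaining value) to the end of the shrinking
// heap and sift the displaced value down.  The minima accumulate at the
// back in descending order, so the array is reversed at the end.
class HeapSortSketch {
    static int[] sort(int[] array) {
        int n = array.length;
        for (int i = (n - 2) / 2; i >= 0; i--) siftDown(array, i, n); // heapify
        for (int c = n; c > 1; c--) {
            int tmp = array[0]; array[0] = array[c - 1]; array[c - 1] = tmp;
            siftDown(array, 0, c - 1);        // heap has shrunk by one
        }
        for (int i = 0, j = n - 1; i < j; i++, j--) { // descending -> ascending
            int tmp = array[i]; array[i] = array[j]; array[j] = tmp;
        }
        return array;
    }
    static void siftDown(int[] array, int position, int contents) {
        while (true) {
            int child = 2 * position + 1;
            if (child >= contents) break;
            if (child + 1 < contents && array[child + 1] < array[child]) child++;
            if (array[position] <= array[child]) break;
            int tmp = array[position]; array[position] = array[child]; array[child] = tmp;
            position = child;
        }
    }
}
```

A maximum-root heap would avoid the final reversal; with the minimum-root convention used throughout these notes, the reversal keeps everything in one array.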

9  Heaps as Objects

Use the heap methods that you have written and already tested to fill in the code in the following template.
When you insert a new element, check whether contents would exceed capacity. If so, create a new bigger array and copy the old one into it (cf. the jug). What is the best new capacity to choose?
   class Heap {
      private int capacity;
      private int contents;
      private int [ ] array;
      final static int ROOT = 0;
      public int root () { return ROOT; }
      private int parent      (int child)  { return ???; }
      private int left_child  (int parent) { return ???; }
      private int right_child (int parent) { return ???; }
      private int sibling     (int child)  { return ???; }
      private int choice = 0;
      public void choice (int c) { choice=c; }
      // make a new heap with contents=0 and capacity=size
      public Heap (int size) {
      // make a new heap with default size
      public Heap () { this (10); }
      // make a new heap by heapifying a given array
      public Heap (int[ ] data) {
      // transfer the heap to a bigger array (cf. jugs)
      private void rebuild (int new_size) {
      private void sift_up (int position) {
      private void sift_down (int position) {
      public void insert (int new_value) {
      public int delete_root () {
      private void heapify () {
      public int[ ] sort_me () {
      // use a "new Heap" so that your class "implements Sorting"
      public int[ ] sort (int[ ] array) {
      // maybe write your own (command line) user interface
      public static void main (String[ ] args) {
You can add other methods to make this class implement the various TESTER interfaces and thereby re-test your code. However, the TESTER will only allow you to do these one at a time.
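The growing-array part of the object-oriented version can be sketched like this; doubling the capacity is one reasonable answer to the question above, since it amortises the copying cost over the insertions (the class name is illustrative, and the sketch shows only insert, rebuild and sift_up):

```java
// Sketch of the growing-array behaviour of the object-oriented heap:
// when the array is full, insert() moves to a jug of twice the capacity
// before appending the new value and sifting it up.
class GrowingHeapSketch {
    private int capacity = 10;            // the default "jug" size
    private int contents = 0;
    private int[] array = new int[capacity];

    void insert(int newValue) {
        if (contents == capacity) rebuild(2 * capacity); // a bigger jug
        array[contents] = newValue;
        contents++;
        siftUp(contents - 1);
    }
    private void rebuild(int newSize) {   // transfer the heap to a bigger array
        int[] bigger = new int[newSize];
        System.arraycopy(array, 0, bigger, 0, contents);
        array = bigger;
        capacity = newSize;
    }
    private void siftUp(int position) {
        while (position > 0) {
            int parent = (position - 1) / 2;
            if (array[position] >= array[parent]) break;
            int tmp = array[parent]; array[parent] = array[position]; array[position] = tmp;
            position = parent;
        }
    }
    int root() { return array[0]; }
}
```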


1Sifting (not "shifting") is what you might do with a sieve to flour or soil to separate the finer grains from the lumps.

This is www.PaulTaylor.EU/algorithms/heap.html and it was derived from algorithms/heap.tex which was last modified on 21 July 2007.