
CS 332: Algorithms

Introduction to heapsort

Review: The Master Theorem
• Given: a divide-and-conquer algorithm
  – An algorithm that divides a problem of size n into a subproblems, each of size n/b
  – Let the cost of each stage (i.e., the work to divide the problem + combine solved subproblems) be described by the function f(n)
• Then, the Master Theorem gives us a cookbook for the algorithm’s running time:

Review: The Master Theorem
• if T(n) = aT(n/b) + f(n), then

    T(n) = Θ(n^(log_b a))          if f(n) = O(n^(log_b a − ε)) for some ε > 0
    T(n) = Θ(n^(log_b a) · lg n)   if f(n) = Θ(n^(log_b a))
    T(n) = Θ(f(n))                 if f(n) = Ω(n^(log_b a + ε)) for some ε > 0,
                                      AND a·f(n/b) ≤ c·f(n) for some c < 1 and all large n
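A quick worked example (my addition, not on the original slide): merge sort has T(n) = 2T(n/2) + Θ(n), so a = 2, b = 2, and f(n) = Θ(n) = Θ(n^(log_2 2)); case 2 applies and T(n) = Θ(n lg n).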
Sorting Revisited
• So far we’ve talked about two algorithms to sort an array of numbers
  – What is the advantage of merge sort?
    • Answer: O(n lg n) worst-case running time
  – What is the advantage of insertion sort?
    • Answer: sorts in place
    • Also: when the array is “nearly sorted”, it runs fast in practice
• Next on the agenda: Heapsort
  – Combines advantages of both previous algorithms

Heaps
• A heap can be seen as a complete binary tree:
16

14 10

8 7 9 3

2 4 1

• What makes a binary tree complete?
• Is the example above complete?

Heaps
• A heap can be seen as a complete binary tree:
16

14 10

8 7 9 3

2 4 1 1 1 1 1 1

• The book calls them “nearly complete” binary trees; can think of unfilled slots as null pointers

Heaps
• In practice, heaps are usually implemented as arrays:
16
14 10
8 7 9 3
2 4 1

A = 16 14 10 8 7 9 3 2 4 1

Heaps
• To represent a complete binary tree as an array:
  – The root node is A[1]
  – Node i is A[i]
  – The parent of node i is A[⌊i/2⌋] (note: integer divide)
  – The left child of node i is A[2i]
  – The right child of node i is A[2i + 1]
16
14 10
8 7 9 3
2 4 1

A = 16 14 10 8 7 9 3 2 4 1
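For instance (example added for clarity): node 2 holds 14; its parent is A[1] = 16, its left child is A[4] = 8, and its right child is A[5] = 7.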

Referencing Heap Elements
• So…
    Parent(i) { return i/2; }
    Left(i)   { return 2*i; }
    Right(i)  { return 2*i + 1; }
• An aside: how would you implement this most efficiently?
  – Trick question; I was looking for “i << 1”, etc. (see the sketch below)
  – But any modern compiler is smart enough to do this for you (and it makes the code hard to follow)
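A minimal C sketch of these helpers (my addition, not from the slides), assuming the slides’ 1-based indexing; the shift forms compute exactly the same values as the multiply/divide forms:

/* Hypothetical C helpers for 1-based heap indexing (not part of the
 * original slides).  A modern compiler emits identical code for both forms. */
static inline int Parent(int i) { return i / 2;      /* same as i >> 1       */ }
static inline int Left(int i)   { return 2 * i;      /* same as i << 1       */ }
static inline int Right(int i)  { return 2 * i + 1;  /* same as (i << 1) | 1 */ }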

The Heap Property
• Heaps also satisfy the heap property:
    A[Parent(i)] ≥ A[i]  for all nodes i > 1
• In other words, the value of a node is at most the value of its parent
• Where is the largest element in a heap stored?
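A small C sketch (my addition, not from the slides) that checks this property over the array representation, assuming 1-based indexing with A[0] unused:

/* Returns 1 if A[1..heap_size] satisfies the max-heap property, 0 otherwise.
 * Illustrative only; assumes 1-based indexing (A[0] unused). */
static int is_max_heap(const int A[], int heap_size)
{
    for (int i = 2; i <= heap_size; i++)
        if (A[i / 2] < A[i])    /* parent must be >= child */
            return 0;
    return 1;
}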

Heap Height
• Definitions:
  – The height of a node in the tree = the number of edges on the longest downward path to a leaf
  – The height of a tree = the height of its root
• What is the height of an n-element heap? Why?
• This is nice: basic heap operations take at most time proportional to the height of the heap

Heap Operations: Heapify()
• Heapify(): maintain the heap property
  – Given: a node i in the heap with children l and r
  – Given: two subtrees rooted at l and r, assumed to be heaps
  – Problem: the subtree rooted at i may violate the heap property (How?)
  – Action: let the value of the parent node “float down” so the subtree at i satisfies the heap property
• What do you suppose will be the basic operation between i, l, and r?

Heap Operations: Heapify()
Heapify(A, i)
{
    l = Left(i); r = Right(i);

    // find the largest of A[i], A[l], and A[r]
    if (l <= heap_size(A) && A[l] > A[i])
        largest = l;
    else
        largest = i;
    if (r <= heap_size(A) && A[r] > A[largest])
        largest = r;

    // if a child was larger, swap it up and fix the affected subtree
    if (largest != i)
    {
        Swap(A, i, largest);
        Heapify(A, largest);
    }
}

Heapify() Example

16

4 10

14 7 9 3

2 8 1

A = 16 4 10 14 7 9 3 2 8 1

(Heapify(A, 2): node 2 holds 4, which is smaller than its larger child 14, so it must float down)

Heapify() Example

16

14 10

4 7 9 3

2 8 1

A = 16 14 10 4 7 9 3 2 8 1

(after swapping A[2] and A[4], i.e., 4 and 14: node 4 now holds 4, which is smaller than its child A[9] = 8, so it floats down again)

Heapify() Example

16

14 10

8 7 9 3

2 4 1

A = 16 14 10 8 7 9 3 2 4 1

(after swapping A[4] and A[9], i.e., 4 and 8: node 9 is a leaf, so the recursion stops and the heap property is restored)

Analyzing Heapify(): Informal
• Aside from the recursive call, what is the running time of Heapify()?
• How many times can Heapify() recursively call itself?
• What is the worst-case running time of Heapify() on a heap of size n?

Analyzing Heapify(): Formal
• Fixing up the relationships between i, l, and r takes Θ(1) time
• If the heap at i has n elements, how many elements can the subtrees at l or r have?
  – Draw it
  – Answer: 2n/3 (worst case: bottom row 1/2 full)
• So the time taken by Heapify() is given by

    T(n) ≤ T(2n/3) + Θ(1)

Analyzing Heapify(): Formal
• So we have
    T(n) ≤ T(2n/3) + Θ(1)
• By case 2 of the Master Theorem,
    T(n) = O(lg n)
• Thus, Heapify() takes logarithmic time
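To spell out why case 2 applies (my addition): here a = 1 and b = 3/2, so n^(log_b a) = n^0 = Θ(1) = f(n), and the theorem gives T(n) = Θ(n^0 · lg n) = Θ(lg n).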

Heap Operations: BuildHeap()
• We can build a heap in a bottom-up manner by running Heapify() on successive subarrays
• Fact: for an array of length n, all elements in the range A[⌊n/2⌋ + 1 .. n] are heaps (Why?)
• So:
  – Walk backwards through the array from ⌊n/2⌋ to 1, calling Heapify() on each node
  – The order of processing guarantees that the children of node i are heaps when i is processed

BuildHeap()
// given an unsorted array A, make A a heap
BuildHeap(A)
{
    heap_size(A) = length(A);
    for (i = length(A)/2 downto 1)
        Heapify(A, i);
}

BuildHeap() Example
• Work through the example
    A = {4, 1, 3, 2, 16, 9, 10, 14, 8, 7}

4
1 3
2 16 9 10
14 8 7
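Tracing the calls (worked trace added here; not on the original slide), with Heapify() applied at i = 5, 4, 3, 2, 1:

    i = 5: A[5] = 16 ≥ its only child A[10] = 7, so no change
    i = 4: swap 2 and 14                               → A = {4, 1, 3, 14, 16, 9, 10, 2, 8, 7}
    i = 3: swap 3 and 10                               → A = {4, 1, 10, 14, 16, 9, 3, 2, 8, 7}
    i = 2: swap 1 and 16, then 1 and 7                 → A = {4, 16, 10, 14, 7, 9, 3, 2, 8, 1}
    i = 1: swap 4 and 16, then 4 and 14, then 4 and 8  → A = {16, 14, 10, 8, 7, 9, 3, 2, 4, 1}

The final array is the heap shown on the earlier slides.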

Analyzing BuildHeap()
• Each call to Heapify() takes O(lg n) time
• There are O(n) such calls (specifically, ⌊n/2⌋)
• Thus the running time is O(n lg n)
  – Is this a correct asymptotic upper bound?
  – Is this an asymptotically tight bound?
• A tighter bound is O(n)
  – How can this be? Is there a flaw in the above reasoning?

Analyzing BuildHeap(): Tight
• To Heapify() a subtree takes O(h) time, where h is the height of the subtree
  – h = O(lg m), m = # nodes in subtree
  – The height of most subtrees is small
• Fact: an n-element heap has at most ⌈n/2^(h+1)⌉ nodes of height h
• CLR 7.3 uses this fact to prove that BuildHeap() takes O(n) time
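Sketch of that argument (added here; CLR 7.3 gives the full proof):

    total time ≤ Σ_{h = 0 .. ⌊lg n⌋} ⌈n/2^(h+1)⌉ · O(h) = O(n · Σ_{h ≥ 0} h/2^h) = O(2n) = O(n)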

Heapsort
• Given BuildHeap(), an in-place sorting algorithm is easily constructed:
  – The maximum element is at A[1]
  – Discard it by swapping it with the element at A[n]
    • Decrement heap_size(A)
    • A[n] now contains the correct value
  – Restore the heap property at A[1] by calling Heapify()
  – Repeat, always swapping A[1] with A[heap_size(A)]

Heapsort
Heapsort(A)
{
    BuildHeap(A);
    for (i = length(A) downto 2)
    {
        Swap(A[1], A[i]);       // move the current maximum to its final slot
        heap_size(A) -= 1;      // shrink the heap
        Heapify(A, 1);          // restore the heap property at the root
    }
}
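Putting the pieces together, here is a self-contained C sketch of the same algorithm (my adaptation, not verbatim from the slides); it uses 0-based indexing, so the index arithmetic shifts to Left(i) = 2i + 1, Right(i) = 2i + 2, Parent(i) = (i − 1)/2:

/* Self-contained heapsort sketch (assumptions: 0-based indexing, int keys). */
#include <stdio.h>

static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

/* Float A[i] down until the subtree rooted at i is a max-heap.
 * heap_size is the number of elements currently in the heap. */
static void heapify(int A[], int heap_size, int i)
{
    int l = 2 * i + 1, r = 2 * i + 2, largest = i;

    if (l < heap_size && A[l] > A[largest]) largest = l;
    if (r < heap_size && A[r] > A[largest]) largest = r;

    if (largest != i) {
        swap(&A[i], &A[largest]);
        heapify(A, heap_size, largest);   /* recurse on the affected subtree */
    }
}

/* Build a max-heap bottom-up: leaves are already heaps, so start at the
 * last internal node and walk back to the root. */
static void build_heap(int A[], int n)
{
    for (int i = n / 2 - 1; i >= 0; i--)
        heapify(A, n, i);
}

/* Repeatedly move the maximum (A[0]) to the end of the array, shrink the
 * heap, and restore the heap property at the root. */
static void heap_sort(int A[], int n)
{
    build_heap(A, n);
    for (int i = n - 1; i >= 1; i--) {
        swap(&A[0], &A[i]);
        heapify(A, i, 0);                 /* heap now holds i elements */
    }
}

int main(void)
{
    int A[] = {4, 1, 3, 2, 16, 9, 10, 14, 8, 7};
    int n = (int)(sizeof A / sizeof A[0]);

    heap_sort(A, n);
    for (int i = 0; i < n; i++)
        printf("%d ", A[i]);              /* prints: 1 2 3 4 7 8 9 10 14 16 */
    printf("\n");
    return 0;
}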

Analyzing Heapsort
• The call to BuildHeap() takes O(n) time
• Each of the n − 1 calls to Heapify() takes O(lg n) time
• Thus the total time taken by HeapSort()
    = O(n) + (n − 1) · O(lg n)
    = O(n) + O(n lg n)
    = O(n lg n)

Priority Queues
• Heapsort is a nice algorithm, but in practice Quicksort (coming up) usually wins
• But the heap data structure is incredibly useful for implementing priority queues
  – A data structure for maintaining a set S of elements, each with an associated value or key
  – Supports the operations Insert(), Maximum(), and ExtractMax()
  – What might a priority queue be useful for?

Priority Queue Operations
• Insert(S, x) inserts the element x into set S
• Maximum(S) returns the element of S with the maximum key
• ExtractMax(S) removes and returns the element of S with the maximum key
• How could we implement these operations using a heap? (one possible sketch follows)
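One possible answer, as a hedged C sketch (my addition, not from the slides), again 0-based and reusing heapify() from the earlier sketch; capacity checks and error handling are omitted:

int heap_maximum(const int A[]) { return A[0]; }   /* the max is at the root */

int heap_extract_max(int A[], int *heap_size)
{
    int max = A[0];
    A[0] = A[*heap_size - 1];      /* move the last element to the root */
    (*heap_size)--;
    heapify(A, *heap_size, 0);     /* float it down to restore the heap */
    return max;
}

void heap_insert(int A[], int *heap_size, int key)
{
    int i = (*heap_size)++;        /* assumes the array has spare capacity */
    A[i] = key;
    /* Float the new key up while it is larger than its parent. */
    while (i > 0 && A[(i - 1) / 2] < A[i]) {
        int t = A[i]; A[i] = A[(i - 1) / 2]; A[(i - 1) / 2] = t;
        i = (i - 1) / 2;
    }
}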

