Divide and Conquer: Slides by Kevin Wayne. All Rights Reserved
Chapter 5
Divide and Conquer
Slides by Kevin Wayne.
Copyright 2005 Pearson-Addison Wesley.
All rights reserved.
Divide-and-Conquer
Divide-and-conquer.
Break up problem into several parts.
Solve each part recursively.
Combine solutions to sub-problems into overall solution.
Most common usage.
Break up problem of size n into two equal parts of size n/2.
Solve two parts recursively.
Combine two solutions into overall solution in linear time.
Consequence.
Brute force: n².
Divide-and-conquer: n log n.
Divide et impera.
Veni, vidi, vici.
- Julius Caesar
5.1 Mergesort
Sorting
Sorting. Given n elements, rearrange in ascending order.

Obvious sorting applications.
List files in a directory.
Organize an MP3 library.
List names in a phone book.
Display Google PageRank results.

Problems become easier once sorted.
Find the median.
Find the closest pair.
Binary search in a database.
Identify statistical outliers.
Find duplicates in a mailing list.

Non-obvious sorting applications.
Data compression.
Computer graphics.
Interval scheduling.
Computational biology.
Minimum spanning tree.
Supply chain management.
Simulate a system of particles.
Book recommendations on Amazon.
Load balancing on a parallel computer.
. . .
Mergesort
Mergesort. [John von Neumann, 1945]
Divide array into two halves.                      O(1)
Recursively sort each half.                        2T(n/2)
Merge two halves to make sorted whole.             O(n)

Example.
input:   A L G O R I T H M S
divide:  A L G O R   I T H M S
sort:    A G L O R   H I M S T
merge:   A G H I L M O R S T
Merging
Merging. Combine two pre-sorted lists into a sorted whole.

How to merge efficiently?
Linear number of comparisons.
Use temporary array.

Challenge for the bored. In-place merge, using only a constant amount
of extra storage. [Kronrod, 1969]
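The merge-then-recurse scheme above can be sketched in Python (a minimal illustration with a temporary output list; the function names are my own):

```python
def merge(a, b):
    """Combine two pre-sorted lists into a sorted whole.

    Uses a temporary array and a linear number of comparisons."""
    out = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i])
            i += 1
        else:
            out.append(b[j])
            j += 1
    out.extend(a[i:])   # at most one of these two is non-empty
    out.extend(b[j:])
    return out

def mergesort(lst):
    """Divide in half, sort each half recursively, merge."""
    if len(lst) <= 1:
        return list(lst)
    mid = len(lst) // 2
    return merge(mergesort(lst[:mid]), mergesort(lst[mid:]))
```

For example, `mergesort(list("ALGORITHMS"))` returns the letters in the order A G H I L M O R S T.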
A Useful Recurrence Relation

Def. T(n) = number of comparisons to mergesort an input of size n.

Mergesort recurrence.

  T(n) ≤  0                                  if n = 1
          T(⌈n/2⌉) + T(⌊n/2⌋) + n            otherwise
          [solve left half + solve right half + merging]

Solution. T(n) = O(n log₂ n).

Assorted proofs. We describe several ways to prove this recurrence.
Initially we assume n is a power of 2 and replace ≤ with =.
Proof by Recursion Tree

  T(n) =  0                      if n = 1
          2T(n/2) + n            otherwise
          [sorting both halves + merging]

Level k of the recursion tree has 2^k subproblems of size n/2^k,
so each level costs 2^k (n/2^k) = n:

  n
  2(n/2)
  4(n/4)
  . . .
  2^k (n/2^k)
  . . .
  (n/2)(2)

There are log₂ n levels, for a total of n log₂ n.
Proof by Telescoping

Claim. If T(n) satisfies this recurrence, then T(n) = n log₂ n.

  T(n) =  0                      if n = 1
          2T(n/2) + n            otherwise
  (assumes n is a power of 2)

Pf. For n > 1:

  T(n)/n = 2T(n/2)/n + 1
         = T(n/2)/(n/2) + 1
         = T(n/4)/(n/4) + 1 + 1
         . . .
         = T(n/n)/(n/n) + 1 + . . . + 1     [log₂ n ones]
         = log₂ n
Proof by Induction

Claim. If T(n) satisfies this recurrence, then T(n) = n log₂ n.

  T(n) =  0                      if n = 1
          2T(n/2) + n            otherwise
  (assumes n is a power of 2)

Pf. (by induction on n)
Base case: n = 1.
Inductive hypothesis: T(n) = n log₂ n.
Goal: show that T(2n) = 2n log₂(2n).

  T(2n) = 2T(n) + 2n
        = 2n log₂ n + 2n
        = 2n (log₂(2n) − 1) + 2n
        = 2n log₂(2n)
Analysis of Mergesort Recurrence

Claim. If T(n) satisfies the following recurrence, then T(n) ≤ n ⌈lg n⌉
(where lg n denotes log₂ n).

  T(n) ≤  0                                  if n = 1
          T(⌈n/2⌉) + T(⌊n/2⌋) + n            otherwise
          [solve left half + solve right half + merging]

Pf. (by induction on n)
Base case: n = 1.
Define n₁ = ⌊n/2⌋ and n₂ = ⌈n/2⌉.
Induction step: assume true for 1, 2, ..., n−1.

  T(n) ≤ T(n₁) + T(n₂) + n
       ≤ n₁ ⌈lg n₁⌉ + n₂ ⌈lg n₂⌉ + n
       ≤ n₁ ⌈lg n₂⌉ + n₂ ⌈lg n₂⌉ + n
       = n ⌈lg n₂⌉ + n
       ≤ n (⌈lg n⌉ − 1) + n
       = n ⌈lg n⌉

The last inequality uses

  n₂ = ⌈n/2⌉ ≤ 2^⌈lg n⌉ / 2   ⇒   ⌈lg n₂⌉ ≤ ⌈lg n⌉ − 1.
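A quick sanity check of this bound: the recurrence can be evaluated directly and compared against n ⌈lg n⌉ (a small Python sketch; the function name is my own):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Mergesort comparison recurrence from the slides:
    T(1) = 0;  T(n) = T(ceil(n/2)) + T(floor(n/2)) + n  otherwise."""
    if n == 1:
        return 0
    return T((n + 1) // 2) + T(n // 2) + n

# the claimed bound T(n) <= n * ceil(lg n), including n not a power of 2
for n in range(2, 1000):
    assert T(n) <= n * math.ceil(math.log2(n))
```

For powers of 2 the bound is tight, e.g. T(8) = 24 = 8 lg 8.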
5.3 Counting Inversions
Counting Inversions

Music site tries to match your song preferences with others.
You rank n songs.
Music site consults database to find people with similar tastes.

Similarity metric: number of inversions between two rankings.
My rank: 1, 2, ..., n.
Your rank: a₁, a₂, ..., aₙ.
Songs i and j inverted if i < j, but aᵢ > aⱼ.

  Songs   A  B  C  D  E
  Me      1  2  3  4  5
  You     1  3  4  2  5

  Inversions: 3-2, 4-2

Brute force: check all O(n²) pairs i and j.
Applications
Applications.
Voting theory.
Collaborative filtering.
Measuring the "sortedness" of an array.
Sensitivity analysis of Google's ranking function.
Rank aggregation for meta-searching on the Web.
Nonparametric statistics (e.g., Kendall's Tau distance).
Counting Inversions: Divide-and-Conquer

Divide-and-conquer.
Divide: separate list into two pieces.                       Divide: O(1).
Conquer: recursively count inversions in each half.          Conquer: 2T(n/2).
Combine: count inversions where aᵢ and aⱼ are in different
halves, and return sum of three quantities.                  Combine: ???

  1 5 4 8 10 2  |  6 9 12 11 3 7

5 blue-blue inversions: 5-4, 5-2, 4-2, 8-2, 10-2
8 green-green inversions: 6-3, 9-3, 9-7, 12-3, 12-7, 12-11, 11-3, 11-7
9 blue-green inversions: 5-3, 4-3, 8-6, 8-3, 8-7, 10-6, 10-9, 10-3, 10-7

Total = 5 + 8 + 9 = 22.
Counting Inversions: Combine

Combine: count blue-green inversions.
Assume each half is sorted.
Count inversions where aᵢ and aⱼ are in different halves:    Count: O(n)
each time an element of the right half is output during the
merge, it is inverted with every element of the left half
not yet output.
Merge two sorted halves into sorted whole                    Merge: O(n)
(to maintain the sorted invariant).

  3 7 10 14 18 19  |  2 11 16 17 23 25

13 blue-green inversions: 6 + 3 + 2 + 2 + 0 + 0.

  T(n) ≤ T(⌊n/2⌋) + T(⌈n/2⌉) + O(n)   ⇒   T(n) = O(n log n)
Counting Inversions: Implementation

Pre-condition. [Merge-and-Count] A and B are sorted.
Post-condition. [Sort-and-Count] L is sorted.

Sort-and-Count(L) {
   if list L has one element
      return 0 and the list L

   Divide the list into two halves A and B
   (rA, A) ← Sort-and-Count(A)
   (rB, B) ← Sort-and-Count(B)
   (r, L)  ← Merge-and-Count(A, B)

   return r = rA + rB + r and the sorted list L
}
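The Sort-and-Count scheme translates directly into Python (a minimal sketch; the names mirror the pseudocode, the helper signatures are my own):

```python
def merge_and_count(a, b):
    """Merge sorted lists a and b; count pairs (x, y) with x in a,
    y in b, and x > y (the blue-green inversions)."""
    merged, inv = [], 0
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            merged.append(a[i])
            i += 1
        else:
            merged.append(b[j])
            j += 1
            inv += len(a) - i   # every remaining element of a beats b[j]
    merged.extend(a[i:])
    merged.extend(b[j:])
    return inv, merged

def sort_and_count(lst):
    """Return (number of inversions in lst, sorted copy of lst)."""
    if len(lst) <= 1:
        return 0, list(lst)
    mid = len(lst) // 2
    ra, a = sort_and_count(lst[:mid])
    rb, b = sort_and_count(lst[mid:])
    r, merged = merge_and_count(a, b)
    return ra + rb + r, merged
```

On the 12-element example 1 5 4 8 10 2 6 9 12 11 3 7, `sort_and_count` reports 5 + 8 + 9 = 22 inversions.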
5.4 Closest Pair of Points
Closest Pair of Points

Closest pair. Given n points in the plane, find a pair with smallest
Euclidean distance between them.

Fundamental geometric primitive.
Graphics, computer vision, geographic information systems,
molecular modeling, air traffic control.
Special case of nearest neighbor, Euclidean MST, Voronoi.
(fast closest pair inspired fast algorithms for these problems)

Brute force. Check all pairs of points p and q with O(n²) comparisons.

1-D version. O(n log n) easy if points are on a line.

Assumption. No two points have same x coordinate.
(to make presentation cleaner)
Closest Pair of Points: First Attempt

Divide. Sub-divide region into 4 quadrants.
Obstacle. Impossible to ensure n/4 points in each piece.
Closest Pair of Points

Algorithm.
Divide: draw vertical line L so that roughly ½n points on each side.
Conquer: find closest pair in each side recursively.
Combine: find closest pair with one point in each side.  (seems like O(n²))
Return best of 3 solutions.

(In the running example: closest pair on the left at distance 12,
on the right at distance 21, straddling L at distance 8.)
Closest Pair of Points

Find closest pair with one point in each side, assuming that distance < δ,
where δ = min(12, 21) is the best distance found on either side.
Observation: only need to consider points within δ of line L.
Sort points in 2δ-strip by their y coordinate.
Only check distances of those within 11 positions in sorted list!
Closest Pair of Points

Def. Let sᵢ be the point in the 2δ-strip with the iᵗʰ smallest
y-coordinate.

Claim. If |i − j| ≥ 12, then the distance between sᵢ and sⱼ
is at least δ.

Pf.
No two points lie in the same ½δ-by-½δ box.
Two points at least 2 rows apart have distance ≥ 2(½δ) = δ.  ▪

Fact. Still true if we replace 12 with 7.
Closest Pair Algorithm

Closest-Pair(p₁, ..., pₙ) {
   Compute separation line L such that half the points       O(n log n)
   are on one side and half on the other side.

   δ₁ = Closest-Pair(left half)                              2T(n/2)
   δ₂ = Closest-Pair(right half)
   δ  = min(δ₁, δ₂)

   Delete all points further than δ from separation line L   O(n)

   Sort remaining points by y-coordinate.                    O(n log n)

   Scan points in y-order and compare distance between       O(n)
   each point and next 11 neighbors. If any of these
   distances is less than δ, update δ.

   return δ.
}
Closest Pair of Points: Analysis

Running time.

  T(n) ≤ 2T(n/2) + O(n log n)   ⇒   T(n) = O(n log² n)

Q. Can we achieve O(n log n)?

A. Yes. Don't sort points in strip from scratch each time.
Each recursive call returns two lists: all points sorted by y coordinate,
and all points sorted by x coordinate.
Sort by merging two pre-sorted lists.

  T(n) ≤ 2T(n/2) + O(n)   ⇒   T(n) = O(n log n)
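The divide / strip-scan structure can be sketched in Python. This is the simpler O(n log² n) variant that re-sorts the strip by y at every level of the recursion; all names are my own:

```python
import math

def closest_pair(points):
    """Smallest Euclidean distance between any two points in the plane.

    O(n log^2 n) variant: the strip is re-sorted by y at each level."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def rec(pts):
        n = len(pts)
        if n <= 3:   # base case: brute force
            return min(dist(p, q)
                       for i, p in enumerate(pts) for q in pts[i + 1:])
        mid = n // 2
        x_mid = pts[mid][0]                        # vertical line L
        d = min(rec(pts[:mid]), rec(pts[mid:]))    # best of the two sides
        # keep only points within d of L, sorted by y-coordinate
        strip = sorted((p for p in pts if abs(p[0] - x_mid) < d),
                       key=lambda p: p[1])
        for i, p in enumerate(strip):
            for q in strip[i + 1:i + 12]:          # next 11 neighbors suffice
                d = min(d, dist(p, q))
        return d

    return rec(sorted(points))   # pre-sort once by x-coordinate
```

Returning the achieved O(n log n) bound would additionally require threading the y-sorted lists through the recursion, as described above.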
5.5 Integer Multiplication
Integer Arithmetic

Add. Given two n-digit integers a and b, compute a + b.
O(n) bit operations.

Multiply. Given two n-digit integers a and b, compute a × b.
Brute force solution: O(n²) bit operations.

(Figure: worked examples of binary addition and of grade-school
binary multiplication with its shifted partial products.)
Divide-and-Conquer Multiplication: Warmup

To multiply two n-digit integers:
Multiply four ½n-digit integers.
Add two ½n-digit integers, and shift to obtain result.

  x  = 2^(n/2) x₁ + x₀
  y  = 2^(n/2) y₁ + y₀
  xy = (2^(n/2) x₁ + x₀) (2^(n/2) y₁ + y₀)
     = 2ⁿ x₁y₁ + 2^(n/2) (x₁y₀ + x₀y₁) + x₀y₀

(assumes n is a power of 2)

  T(n) = 4T(n/2) + O(n)   ⇒   T(n) = O(n²)
         [recursive calls]  [add, shift]
Karatsuba Multiplication

To multiply two n-digit integers:
Add two ½n-digit integers.
Multiply three ½n-digit integers.
Add, subtract, and shift ½n-digit integers to obtain result.

  x  = 2^(n/2) x₁ + x₀
  y  = 2^(n/2) y₁ + y₀
  xy = 2ⁿ x₁y₁ + 2^(n/2) (x₁y₀ + x₀y₁) + x₀y₀
     = 2ⁿ A + 2^(n/2) ((x₁ + x₀)(y₁ + y₀) − A − C) + C
  where A = x₁y₁ and C = x₀y₀.

Theorem. [Karatsuba-Ofman, 1962] Can multiply two n-digit integers
in O(n^1.585) bit operations.

  T(n) ≤ T(⌊n/2⌋) + T(⌈n/2⌉) + T(1 + ⌈n/2⌉) + O(n)
         [recursive calls]                   [add, subtract, shift]
  ⇒   T(n) = O(n^(log₂ 3)) = O(n^1.585)
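Karatsuba's three-multiplication trick is easy to state directly on Python integers (a sketch; the cutoff constant and names are my own, not part of the slides):

```python
def karatsuba(x, y, cutoff=64):
    """Multiply nonnegative integers x and y: one product of the high
    halves (A), one of the low halves (C), and one of the half-sums
    replaces four recursive products."""
    if x < cutoff or y < cutoff:
        return x * y                              # small case
    half = max(x.bit_length(), y.bit_length()) // 2
    x1, x0 = x >> half, x & ((1 << half) - 1)     # x = 2^half * x1 + x0
    y1, y0 = y >> half, y & ((1 << half) - 1)
    a = karatsuba(x1, y1)                         # A = x1 * y1
    c = karatsuba(x0, y0)                         # C = x0 * y0
    b = karatsuba(x1 + x0, y1 + y0) - a - c       # x1*y0 + x0*y1
    return (a << (2 * half)) + (b << half) + c
```

(CPython already uses Karatsuba internally for large ints, so this is purely illustrative.)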
Karatsuba: Recursion Tree

  T(n) =  0                    if n = 1
          3T(n/2) + n          otherwise

Level k of the recursion tree has 3^k subproblems of size n/2^k:

  n
  3(n/2)
  9(n/4)
  . . .
  3^k (n/2^k)
  . . .

  T(n) = n Σ_{k=0}^{log₂ n} (3/2)^k
       = n · ((3/2)^(1 + log₂ n) − 1) / (3/2 − 1)
       = 3 n^(log₂ 3) − 2n
Matrix Multiplication
Matrix Multiplication

Matrix multiplication. Given two n-by-n matrices A and B,
compute C = AB, where

  c_ij = Σ_{k=1}^{n} a_ik b_kj.

Brute force. O(n³) arithmetic operations.

Fundamental question. Can we improve upon brute force?
Matrix Multiplication: Warmup

Divide-and-conquer.
Divide: partition A and B into ½n-by-½n blocks.
Conquer: multiply 8 ½n-by-½n matrices recursively.
Combine: add appropriate products using 4 matrix additions.

  [C11 C12]   [A11 A12]   [B11 B12]
  [C21 C22] = [A21 A22] × [B21 B22]

  C11 = (A11 × B11) + (A12 × B21)
  C12 = (A11 × B12) + (A12 × B22)
  C21 = (A21 × B11) + (A22 × B21)
  C22 = (A21 × B12) + (A22 × B22)

  T(n) = 8T(n/2) + O(n²)   ⇒   T(n) = O(n³)
         [recursive calls]  [add, form submatrices]
Matrix Multiplication: Key Idea

Key idea. Multiply 2-by-2 block matrices with only 7 multiplications
(and 18 = 10 + 8 additions or subtractions).

  [C11 C12]   [A11 A12]   [B11 B12]
  [C21 C22] = [A21 A22] × [B21 B22]

  P1 = A11 × (B12 − B22)
  P2 = (A11 + A12) × B22
  P3 = (A21 + A22) × B11
  P4 = A22 × (B21 − B11)
  P5 = (A11 + A22) × (B11 + B22)
  P6 = (A12 − A22) × (B21 + B22)
  P7 = (A11 − A21) × (B11 + B12)

  C11 = P5 + P4 − P2 + P6
  C12 = P1 + P2
  C21 = P3 + P4
  C22 = P5 + P1 − P3 − P7
Fast Matrix Multiplication

Fast matrix multiplication. (Strassen, 1969)
Divide: partition A and B into ½n-by-½n blocks.
Compute: 14 ½n-by-½n matrices via 10 matrix additions.
Conquer: multiply 7 ½n-by-½n matrices recursively.
Combine: 7 products into 4 terms using 8 matrix additions.

Analysis.
Assume n is a power of 2.
T(n) = # arithmetic operations.

  T(n) = 7T(n/2) + O(n²)   ⇒   T(n) = O(n^(log₂ 7)) = O(n^2.81)
         [recursive calls]  [add, subtract]
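Strassen's seven-product scheme can be written out for power-of-2 sizes using plain lists (an illustrative sketch, not a tuned implementation; all names are my own):

```python
def mat_add(A, B):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_sub(A, B):
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def strassen(A, B):
    """Multiply n-by-n matrices with 7 recursive block products
    (assumes n is a power of 2)."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    m = n // 2
    def split(M):   # four m-by-m blocks
        return ([r[:m] for r in M[:m]], [r[m:] for r in M[:m]],
                [r[:m] for r in M[m:]], [r[m:] for r in M[m:]])
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    P1 = strassen(A11, mat_sub(B12, B22))
    P2 = strassen(mat_add(A11, A12), B22)
    P3 = strassen(mat_add(A21, A22), B11)
    P4 = strassen(A22, mat_sub(B21, B11))
    P5 = strassen(mat_add(A11, A22), mat_add(B11, B22))
    P6 = strassen(mat_sub(A12, A22), mat_add(B21, B22))
    P7 = strassen(mat_sub(A11, A21), mat_add(B11, B12))
    C11 = mat_add(mat_sub(mat_add(P5, P4), P2), P6)
    C12 = mat_add(P1, P2)
    C21 = mat_add(P3, P4)
    C22 = mat_sub(mat_sub(mat_add(P5, P1), P3), P7)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]   # glue the blocks
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```

As the practice slide notes, a real implementation would cross over to the classical algorithm below some block size instead of recursing down to 1-by-1.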
Fast Matrix Multiplication in Practice
Implementation issues.
Sparsity.
Caching effects.
Numerical stability.
Odd matrix dimensions.
Crossover to classical algorithm around n = 128.
Common misperception: "Strassen is only a theoretical curiosity."
Advanced Computation Group at Apple Computer reports 8x speedup
on G4 Velocity Engine when n ~ 2,500.
Range of instances where it's useful is a subject of controversy.
Remark. Can "Strassenize" Ax=b, determinant, eigenvalues, and other
matrix ops.
Fast Matrix Multiplication in Theory

Q. Multiply two 2-by-2 matrices with only 7 scalar multiplications?
A. Yes! [Strassen, 1969]                     O(n^(log₂ 7)) = O(n^2.81)

Q. Multiply two 2-by-2 matrices with only 6 scalar multiplications?
A. Impossible. [Hopcroft and Kerr, 1971]     O(n^(log₂ 6)) = O(n^2.59)

Q. Two 3-by-3 matrices with only 21 scalar multiplications?
A. Also impossible.                          O(n^(log₃ 21)) = O(n^2.77)

Q. Two 70-by-70 matrices with only 143,640 scalar multiplications?
A. Yes! [Pan, 1980]                          O(n^(log₇₀ 143640)) = O(n^2.80)

Decimal wars.
December, 1979: O(n^2.521813).
January, 1980: O(n^2.521801).
Fast Matrix Multiplication in Theory

Best known. O(n^2.376). [Coppersmith-Winograd, 1987]

Conjecture. O(n^(2+ε)) for any ε > 0.

Caveat. Theoretical improvements to Strassen are progressively less
practical.