Selection sort
From Wikipedia, the free encyclopedia
The algorithm divides the input list into two parts: the sublist of items
already sorted, which is built up from left to right at the front (left) of
the list, and the sublist of items remaining to be sorted that occupy the
rest of the list. Initially, the sorted sublist is empty and the unsorted
sublist is the entire input list. The algorithm proceeds by finding the
smallest (or largest, depending on sorting order) element in the
unsorted sublist, exchanging it with the leftmost unsorted element
(putting it in sorted order), and moving the sublist boundaries one
element to the right.

Class: Sorting algorithm
Data structure: Array
Worst-case performance: O(n²)

Here is an example of this sort algorithm sorting five elements:

64 25 12 22 11 // this is the initial, starting state of the array
11 25 12 22 64 // sorted sublist = {11}
11 12 25 22 64 // sorted sublist = {11, 12}
11 12 22 25 64 // sorted sublist = {11, 12, 22}
11 12 22 25 64 // sorted sublist = {11, 12, 22, 25}
11 12 22 25 64 // sorted sublist = {11, 12, 22, 25, 64}
Selection sort can also be used on list structures that make add and remove efficient, such as a linked list. In this case it is
more common to remove the minimum element from the remainder of the list, and then insert it at the end of the values
sorted so far. For example:
64 25 12 22 11
11 64 25 12 22
11 12 64 25 22
11 12 22 64 25
11 12 22 25 64
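The remove-minimum-and-append approach above can be sketched in C on a singly linked list; the node type and function name below are our own, not from the article:

```c
#include <stdlib.h>

struct node { int value; struct node *next; };

/* Selection sort on a singly linked list: repeatedly unlink the minimum
   node from the unsorted remainder and append it to the tail of the
   sorted list.  Only links are rewired; no payloads are copied. */
struct node *selection_sort_list(struct node *head) {
    struct node *sorted = NULL, *tail = NULL;
    while (head) {
        /* find the link that points at the minimum node */
        struct node **min_link = &head;
        for (struct node **p = &head; *p; p = &(*p)->next)
            if ((*p)->value < (*min_link)->value)
                min_link = p;
        /* unlink the minimum node */
        struct node *min = *min_link;
        *min_link = min->next;
        min->next = NULL;
        /* append it to the end of the values sorted so far */
        if (tail) tail->next = min; else sorted = min;
        tail = min;
    }
    return sorted;
}
```

Because nodes are unlinked and appended rather than swapped in place, this variant is a natural fit for list structures where removal and insertion are cheap.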
http://en.wikipedia.org/wiki/Selection_sort 1/5
2/26/2015 Selection sort - Wikipedia, the free encyclopedia
[Animation: selection sort. Red is current min; yellow is sorted list; blue is current item.]
// swap step from the inner loop: move the minimum found (at index iMin)
// into position j of the sorted prefix
if (iMin != j) {
    swap(a[j], a[iMin]);
}
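For context, the swap step above can be embedded in a complete in-place implementation; this is a sketch in C, not the article's exact code:

```c
#include <stddef.h>

/* In-place selection sort: a[0..j-1] is the sorted prefix.  Each pass
   scans the unsorted suffix for its minimum and swaps it into slot j. */
void selection_sort(int a[], size_t n) {
    for (size_t j = 0; j + 1 < n; j++) {
        size_t iMin = j;                  /* index of the smallest seen so far */
        for (size_t i = j + 1; i < n; i++)
            if (a[i] < a[iMin])
                iMin = i;
        if (iMin != j) {                  /* the swap step shown above */
            int tmp = a[j];
            a[j] = a[iMin];
            a[iMin] = tmp;
        }
    }
}
```

The `iMin != j` test skips the swap when the minimum is already in place, saving writes at the cost of one extra comparison per pass.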
Contents
1 Mathematical definition
2 Analysis
3 Comparison to other sorting algorithms
4 Variants
5 See also
6 References
7 External links
Mathematical definition
Let L be a non-empty list and let sort(L) = L′ denote its sorted result, where:
1. L′ is a permutation of L,
2. L′ is non-decreasing: L′[i] ≤ L′[i+1] for all valid indices i,
3. sort(L) = min(L) followed by sort(L with one occurrence of min(L) removed), and the sort of the empty list is the empty list.
Analysis
Selection sort is not difficult to analyze compared to other sorting algorithms since none of the loops depend on the data
in the array. Selecting the lowest element requires scanning all n elements (this takes n − 1 comparisons) and then
swapping it into the first position. Finding the next lowest element requires scanning the remaining n − 1 elements and so
on, for (n − 1) + (n − 2) + ... + 2 + 1 = n(n − 1) / 2 ∈ Θ(n²) comparisons (see arithmetic progression). Each of these
scans requires at most one swap, for at most n − 1 swaps in total (the final element is already in place).
Simple calculation shows that insertion sort will therefore usually perform about half as many comparisons as selection
sort, although it can perform just as many or far fewer depending on the order the array was in prior to sorting. It can be
seen as an advantage for some real-time applications that selection sort will perform identically regardless of the order of
the array, while insertion sort's running time can vary considerably. However, this is more often an advantage for
insertion sort in that it runs much more efficiently if the array is already sorted or "close to sorted."
While selection sort is preferable to insertion sort in terms of number of writes (Θ(n) swaps versus Ο(n²) swaps), it
almost always far exceeds (and never beats) the number of writes that cycle sort makes, as cycle sort is theoretically
optimal in the number of writes. This can be important if writes are significantly more expensive than reads, such as with
EEPROM or Flash memory, where every write lessens the lifespan of the memory.
Finally, selection sort is greatly outperformed on larger arrays by Θ(n log n) divide-and-conquer algorithms such as
mergesort. However, insertion sort and selection sort are both typically faster for small arrays (i.e. fewer than 10–20
elements). A useful optimization in practice for the recursive algorithms is to switch to insertion sort or selection sort for
"small enough" sublists.
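That optimization can be sketched as a top-down merge sort that falls back to insertion sort below a cutoff; the cutoff value and all names here are illustrative, not prescribed by the article:

```c
#include <stddef.h>
#include <stdlib.h>

#define CUTOFF 16   /* illustrative threshold for "small enough" sublists */

/* Insertion sort on the inclusive range a[lo..hi]. */
static void insertion_sort_range(int a[], size_t lo, size_t hi) {
    for (size_t i = lo + 1; i <= hi; i++) {
        int key = a[i];
        size_t j = i;
        while (j > lo && a[j - 1] > key) { a[j] = a[j - 1]; j--; }
        a[j] = key;
    }
}

/* Top-down merge sort that switches to insertion sort on small ranges. */
static void merge_sort_range(int a[], int tmp[], size_t lo, size_t hi) {
    if (hi - lo < CUTOFF) { insertion_sort_range(a, lo, hi); return; }
    size_t mid = lo + (hi - lo) / 2;
    merge_sort_range(a, tmp, lo, mid);
    merge_sort_range(a, tmp, mid + 1, hi);
    size_t i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi) tmp[k++] = (a[j] < a[i]) ? a[j++] : a[i++];
    while (i <= mid) tmp[k++] = a[i++];
    while (j <= hi) tmp[k++] = a[j++];
    for (k = lo; k <= hi; k++) a[k] = tmp[k];
}

void hybrid_sort(int a[], size_t n) {
    if (n < 2) return;
    int *tmp = malloc(n * sizeof *tmp);
    if (!tmp) { insertion_sort_range(a, 0, n - 1); return; }  /* degrade gracefully */
    merge_sort_range(a, tmp, 0, n - 1);
    free(tmp);
}
```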
Variants
Heapsort greatly improves the basic algorithm by using an implicit heap data structure to speed up finding and removing
the lowest datum. If implemented correctly, the heap will allow finding the next lowest element in Θ(log n) time instead
of Θ(n) for the inner loop in normal selection sort, reducing the total running time to Θ(n log n).
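A heap-based version might look like the following sift-down heapsort sketch (names are ours); the heap replaces the linear scan for the next extreme element:

```c
#include <stddef.h>

/* Restore the max-heap property for the subtree rooted at i in a[0..n-1]. */
static void sift_down(int a[], size_t n, size_t i) {
    for (;;) {
        size_t largest = i, l = 2 * i + 1, r = 2 * i + 2;
        if (l < n && a[l] > a[largest]) largest = l;
        if (r < n && a[r] > a[largest]) largest = r;
        if (largest == i) return;
        int t = a[i]; a[i] = a[largest]; a[largest] = t;
        i = largest;
    }
}

/* Heapsort: selection sort with an implicit max-heap, so finding and
   removing each extreme element costs O(log n) instead of O(n). */
void heap_sort(int a[], size_t n) {
    if (n < 2) return;
    for (size_t i = n / 2; i-- > 0; )       /* build the heap bottom-up */
        sift_down(a, n, i);
    for (size_t end = n - 1; end > 0; end--) {
        int t = a[0]; a[0] = a[end]; a[end] = t;   /* max into final slot */
        sift_down(a, end, 0);                       /* re-heapify the rest */
    }
}
```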
A bidirectional variant of selection sort, called cocktail sort, is an algorithm which finds both the minimum and
maximum values in the list in every pass. This reduces the number of scans of the list by a factor of 2, eliminating some
loop overhead but not actually decreasing the number of comparisons or swaps. Note, however, that cocktail sort more
often refers to a bidirectional variant of bubble sort.
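The bidirectional variant might be sketched like this in C (names are ours); each pass shrinks the unsorted middle from both ends:

```c
#include <stddef.h>

/* Bidirectional selection sort: each pass over the unsorted middle finds
   both the minimum and the maximum, placing one at each end.  This halves
   the number of passes but not the number of comparisons. */
void double_selection_sort(int a[], size_t n) {
    if (n == 0) return;
    size_t lo = 0, hi = n - 1;
    while (lo < hi) {
        size_t iMin = lo, iMax = lo;
        for (size_t i = lo + 1; i <= hi; i++) {
            if (a[i] < a[iMin]) iMin = i;
            if (a[i] > a[iMax]) iMax = i;
        }
        int t = a[lo]; a[lo] = a[iMin]; a[iMin] = t;   /* min to the front */
        if (iMax == lo) iMax = iMin;  /* the max was just moved by that swap */
        t = a[hi]; a[hi] = a[iMax]; a[iMax] = t;       /* max to the back */
        lo++; hi--;
    }
}
```

The `iMax == lo` correction handles the case where the first swap displaced the maximum before the second swap runs.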
Selection sort can be implemented as a stable sort. If, rather than swapping the minimum into place, it is inserted into
the first position (that is, all intervening items are moved down), the algorithm is stable. However, this modification either
requires a data structure that supports efficient insertions or deletions, such as a linked list, or it leads to performing
Θ(n²) writes.
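On an array, the insert-and-shift variant might look like this sketch (our own function name); stability follows because equal items are never reordered by the shift:

```c
#include <stddef.h>

/* Stable selection sort: instead of swapping, remove the minimum and
   re-insert it at the front of the unsorted part, shifting the
   intervening items one slot right.  Equal items keep their relative
   order, at the cost of Θ(n²) writes on arrays. */
void stable_selection_sort(int a[], size_t n) {
    for (size_t j = 0; j + 1 < n; j++) {
        size_t iMin = j;
        for (size_t i = j + 1; i < n; i++)
            if (a[i] < a[iMin])           /* strict <, so ties keep order */
                iMin = i;
        int min = a[iMin];
        for (size_t k = iMin; k > j; k--) /* shift a[j..iMin-1] right */
            a[k] = a[k - 1];
        a[j] = min;
    }
}
```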
In the bingo sort variant, items are ordered by repeatedly looking through the remaining items to find the greatest value
and moving all items with that value to their final location.[1] Like counting sort, this is an efficient variant if there are
many duplicate values. Indeed, selection sort does one pass through the remaining items for each item moved. Bingo sort
does one pass for each value (not item): after an initial pass to find the biggest value, the next passes can move every item
with that value to its final location while finding the next value as in the following pseudocode (arrays are zero-based
and the for-loop includes both the top and bottom limits, as in Pascal):
bingo(array A)
{ Sorts A in ascending order; max indexes the last unsorted slot. }
begin
    max := length(A) - 1;
    { The first iteration is written to look very similar to the subsequent ones, but without swaps. }
    nextValue := A[max];
    for i := max - 1 downto 0 do
        if A[i] > nextValue then nextValue := A[i];
    while (max > 0) and (A[max] = nextValue) do max := max - 1;
    while max > 0 do begin
        value := nextValue;  nextValue := A[max];
        for i := max - 1 downto 0 do
            if A[i] = value then begin swap(A[i], A[max]); max := max - 1 end
            else if A[i] > nextValue then nextValue := A[i];
        while (max > 0) and (A[max] = nextValue) do max := max - 1
    end
end;
Thus, if on average there are more than two items with the same value, bingo sort can be expected to be faster because it
executes the inner loop fewer times than selection sort.
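A direct C transliteration of the bingo sort idea might look like this sketch (our own function name; `last` plays the role of `max` above):

```c
#include <stddef.h>

/* Bingo sort: repeatedly find the largest remaining value, then move
   every item equal to it to the end of the unsorted part in one pass.
   One pass per distinct value rather than one per item. */
void bingo_sort(int a[], size_t n) {
    if (n < 2) return;
    size_t last = n - 1;
    int next = a[last];
    for (size_t i = 0; i < last; i++)     /* find the largest value */
        if (a[i] > next) next = a[i];
    while (last > 0 && a[last] == next) last--;
    while (last > 0) {
        int value = next;                 /* value being moved this pass */
        next = a[last];
        for (size_t i = last; i-- > 0; ) {
            if (a[i] == value) {
                int t = a[i]; a[i] = a[last]; a[last] = t;
                last--;
            } else if (a[i] > next) {
                next = a[i];              /* track the next-largest value */
            }
        }
        while (last > 0 && a[last] == next) last--;
    }
}
```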
See also
Selection algorithm
References
1. ^ Paul E. Black, Bingo sort (http://www.nist.gov/dads/HTML/bingosort.html) at the NIST Dictionary of Algorithms and Data
Structures.
Donald Knuth. The Art of Computer Programming, Volume 3: Sorting and Searching, Third Edition. Addison–Wesley, 1997.
ISBN 0-201-89685-0. Pages 138–141 of Section 5.2.3: Sorting by Selection.
Anany Levitin. Introduction to the Design & Analysis of Algorithms, 2nd Edition. ISBN 0-321-35828-7. Section 3.1: Selection
Sort, pp 98–100.
Robert Sedgewick. Algorithms in C++, Parts 1–4: Fundamentals, Data Structures, Sorting, Searching, Second Edition.
Addison–Wesley Longman, 1998. ISBN 0-201-35088-2. Pages 273–274.
External links
Animated Sorting Algorithms: Selection Sort (http://www.sorting-algorithms.com/selection-sort) – graphical
demonstration and discussion of selection sort
The Wikibook Algorithm implementation has a page on the topic of: Selection sort