Shell sort
From Wikipedia, the free encyclopedia
Shell sort is a sorting algorithm that, with its original implementation, requires O(n^2) comparisons and exchanges in the worst case. A minor change given in V. Pratt's book produces an implementation with worst-case performance of O(n log^2 n). This is better than the Ω(n^2) comparisons required by naive algorithms, but worse than the optimal O(n log n) (see comparison sort). Although it is easy to develop an intuitive sense of how this algorithm works, it is very difficult to analyze its execution time.
Shell sort is a generalization of insertion sort, with two observations in mind:
- Insertion sort is efficient if the input is "almost sorted".
- Insertion sort is inefficient, on average, because it moves values just one position at a time.
Shell sort improves insertion sort by comparing elements separated by a gap of several positions. This lets an element take "bigger steps" toward its expected position. Multiple passes over the data are taken with smaller and smaller gap sizes. The last step of Shell sort is a plain insertion sort, but by then, the array of data is guaranteed to be almost sorted.
Consider a small value that is initially stored in the wrong end of the array. Using an O(n^2) sort such as bubble sort or insertion sort, it will take roughly n comparisons and exchanges to move this value all the way to the other end of the array. Shell sort first moves values using giant step sizes, so a small value will move a long way towards its final position, with just a few comparisons and exchanges.
One can visualize Shellsort in the following way: arrange the list into a table and sort the columns (using an insertion sort). Repeat this process, each time with a smaller number of longer columns. At the end, the table has only one column. While transforming the list into a table makes it easier to visualize, the algorithm itself does its sorting in-place (by incrementing the index by the step size, i.e. using i += step_size instead of i++).
For example, consider a list of numbers like [ 13 14 94 33 82 25 59 94 65 23 45 27 73 25 39 10 ]. If we started with a step-size of 5, we could visualize this as breaking the list of numbers into a table with 5 columns. This would look like this:
13 14 94 33 82
25 59 94 65 23
45 27 73 25 39
10
We then sort each column, which gives us:
10 14 73 25 23
13 27 94 33 39
25 59 94 65 82
45
When read back as a single list of numbers, we get [ 10 14 73 25 23 13 27 94 33 39 25 59 94 65 82 45 ]. Here, the 10, which was all the way at the end, has moved all the way to the beginning. This list is then again sorted using a 3-gap sort, and then a 1-gap sort (simple insertion sort).
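To connect the table view with the in-place version, here is a small sketch in C (not part of the article's own code; gap_pass is an illustrative helper name) that performs a single 5-gap insertion pass over the list above. Running it reproduces the 5-sorted list just shown.
#include <stdio.h>

/* One gap-insertion pass: insertion sort applied to each of the
   `gap` interleaved subsequences (the "columns" of the table above). */
static void gap_pass(int a[], int n, int gap) {
    for (int i = gap; i < n; i++) {
        int temp = a[i];
        int j = i;
        while (j >= gap && a[j - gap] > temp) {
            a[j] = a[j - gap];
            j -= gap;
        }
        a[j] = temp;
    }
}

int main(void) {
    int a[] = {13, 14, 94, 33, 82, 25, 59, 94, 65, 23,
               45, 27, 73, 25, 39, 10};
    int n = sizeof a / sizeof a[0];

    gap_pass(a, n, 5);   /* the 5-gap pass, i.e. sorting the 5 columns */

    /* Prints: 10 14 73 25 23 13 27 94 33 39 25 59 94 65 82 45 */
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");
    return 0;
}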
The Shell sort is named after its inventor, Donald Shell, who published it in 1959. Some older textbooks and references call this the "Shell-Metzner" sort after Marlene Metzner Norton, but according to Metzner, "I had nothing to do with the sort, and my name should never have been attached to it." [1]
Gap sequence
The gap sequence is an integral part of the shellsort algorithm. Any increment sequence will work, so long as the last element is 1. The algorithm begins by performing a gap insertion sort, with the gap being the first number in the gap sequence. It continues to perform a gap insertion sort for each number in the sequence, until it finishes with a gap of 1. When the gap is 1, the gap insertion sort is simply an ordinary insertion sort, guaranteeing that the final list is sorted.
The gap sequence that was originally suggested by Donald Shell was to begin with N/2 and to halve the number until it reaches 1. While this sequence provides significant performance enhancements over quadratic algorithms such as insertion sort, it can be changed slightly to further decrease the average and worst-case running times. Weiss's textbook demonstrates that this sequence allows a worst-case O(n^2) sort, if the data are initially in the array as (small_1, large_1, small_2, large_2, ...) - that is, the upper half of the numbers are placed, in sorted order, in the even index locations and the lower half of the numbers are placed similarly in the odd index locations.
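As an illustration of this worst case, the following sketch (in C; build_bad_input is a hypothetical helper, not taken from Weiss's book) constructs such an interleaved array for a power-of-two length. For such lengths every increment in Shell's sequence except the final 1 is even, so the small and large halves are never compared with each other until the last, 1-gap pass.
#include <stdio.h>

/* Build the pattern (small_1, large_1, small_2, large_2, ...) described
   above, using 0-based array indices: the n/2 smallest values interleaved
   with the n/2 largest.  Because even gaps only compare positions of the
   same parity, the two halves stay separated until the final 1-gap pass. */
static void build_bad_input(int a[], int n) {
    for (int i = 0; i < n / 2; i++) {
        a[2 * i]     = i + 1;          /* small_1, small_2, ... */
        a[2 * i + 1] = n / 2 + i + 1;  /* large_1, large_2, ... */
    }
}

int main(void) {
    int a[16];
    build_bad_input(a, 16);
    /* Prints: 1 9 2 10 3 11 4 12 5 13 6 14 7 15 8 16 */
    for (int i = 0; i < 16; i++)
        printf("%d ", a[i]);
    printf("\n");
    return 0;
}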
Perhaps the most crucial property of Shellsort is that the elements remain k-sorted even as the gap diminishes. For instance, if a list was 5-sorted and then 3-sorted, the list is now not only 3-sorted, but both 5- and 3-sorted. If this were not true, the algorithm would undo work that it had done in previous iterations, and would not achieve such a low running time.
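This property can be checked directly. The sketch below (in C, reusing the illustrative gap_pass helper from the earlier sketch together with a hypothetical is_k_sorted check) 5-sorts and then 3-sorts the example list and confirms that it is still 5-sorted afterwards.
#include <assert.h>
#include <stdio.h>

/* Insertion sort applied to each of the `gap` interleaved subsequences. */
static void gap_pass(int a[], int n, int gap) {
    for (int i = gap; i < n; i++) {
        int temp = a[i];
        int j = i;
        while (j >= gap && a[j - gap] > temp) {
            a[j] = a[j - gap];
            j -= gap;
        }
        a[j] = temp;
    }
}

/* True when every element is <= the element k positions after it. */
static int is_k_sorted(const int a[], int n, int k) {
    for (int i = 0; i + k < n; i++)
        if (a[i] > a[i + k])
            return 0;
    return 1;
}

int main(void) {
    int a[] = {13, 14, 94, 33, 82, 25, 59, 94, 65, 23,
               45, 27, 73, 25, 39, 10};
    int n = sizeof a / sizeof a[0];

    gap_pass(a, n, 5);               /* 5-sort ...         */
    gap_pass(a, n, 3);               /* ... then 3-sort    */

    assert(is_k_sorted(a, n, 3));    /* 3-sorted, and ...  */
    assert(is_k_sorted(a, n, 5));    /* ... still 5-sorted */
    printf("still 5-sorted after the 3-sort\n");
    return 0;
}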
Depending on the choice of gap sequence, Shellsort has a proven worst-case running time of O(n^2) (using Shell's increments that start with half the array size and divide by 2 each time), O(n^(3/2)) (using Hibbard's increments of 2^k − 1), O(n^(4/3)) (using Sedgewick's increments of 9·4^i − 9·2^i + 1, or 4^(i+1) + 3·2^i + 1), or O(n log^2 n), and possibly unproven better running times. The existence of an O(n log n) worst-case implementation of Shellsort remains an open research question.
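For concreteness, here is a sketch in C that prints two of the gap sequences mentioned above for a given array length; shell_increments and hibbard_increments are illustrative helper names, not standard functions.
#include <stdio.h>

/* Shell's original increments for an array of length n: n/2, n/4, ..., 1. */
static void shell_increments(int n) {
    for (int gap = n / 2; gap > 0; gap /= 2)
        printf("%d ", gap);
    printf("\n");
}

/* Hibbard's increments 2^k - 1 below n, from largest to smallest. */
static void hibbard_increments(int n) {
    int gap = 1;
    while (2 * gap + 1 < n)          /* largest 2^k - 1 below n */
        gap = 2 * gap + 1;
    for (; gap >= 1; gap = (gap - 1) / 2)
        printf("%d ", gap);
    printf("\n");
}

int main(void) {
    shell_increments(100);    /* 50 25 12 6 3 1 */
    hibbard_increments(100);  /* 63 31 15 7 3 1 */
    return 0;
}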
Shell sort algorithm in C/C++
Shell sort is straightforward to implement in most programming languages; the following is an implementation of the algorithm in C/C++ for sorting an array of integers. The increment sequence used in this example code gives an O(n^2) worst-case running time.
void shell_sort(int A[], int size)
{
    int i, j, incrmnt, temp;

    /* Shell's original sequence: size/2, size/4, ..., 1 */
    incrmnt = size / 2;
    while (incrmnt > 0) {
        /* Gap insertion sort with the current increment */
        for (i = incrmnt; i < size; i++) {
            j = i;
            temp = A[i];
            while ((j >= incrmnt) && (A[j - incrmnt] > temp)) {
                A[j] = A[j - incrmnt];
                j = j - incrmnt;
            }
            A[j] = temp;
        }
        incrmnt /= 2;
    }
}
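A brief usage sketch (the driver below is illustrative, not part of the original code, and assumes the shell_sort function above is in the same file), sorting the example list from earlier in the article:
#include <stdio.h>

int main(void) {
    int a[] = {13, 14, 94, 33, 82, 25, 59, 94, 65, 23,
               45, 27, 73, 25, 39, 10};
    int size = sizeof a / sizeof a[0];

    shell_sort(a, size);

    /* Prints: 10 13 14 23 25 25 27 33 39 45 59 65 73 82 94 94 */
    for (int i = 0; i < size; i++)
        printf("%d ", a[i]);
    printf("\n");
    return 0;
}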
Shell sort algorithm in Java
A Java implementation of Shell sort is as follows:
public static void shellSort(int[] a) {
    // Start with half the array length; divide by 2.2 (rounding) until the gap reaches 1.
    for (int increment = a.length / 2;
         increment > 0;
         increment = (increment == 2 ? 1 : (int) Math.round(increment / 2.2))) {
        // Gap insertion sort with the current increment.
        for (int i = increment; i < a.length; i++) {
            for (int j = i; j >= increment && a[j - increment] > a[j]; j -= increment) {
                int temp = a[j];
                a[j] = a[j - increment];
                a[j - increment] = temp;
            }
        }
    }
}
For reasons that are not well understood, dividing the increment by 2.2, rather than simply halving it, makes this algorithm run about 25 to 30 per cent faster on large inputs.[citation needed]
References
- Weiss, Mark Allen (2002). Data Structures & Problem Solving using Java. Addison Wesley. ISBN 0-201-74835-5.
- Pratt, V. (1979). Shellsort and sorting networks (Outstanding dissertations in the computer sciences). Garland. ISBN 0-824-04406-1. (This was originally presented as the author's Ph.D. thesis, Stanford University, 1971.)