CSC 153: Computer Science Fundamentals  Grinnell College  Spring, 2005 
Laboratory Exercise Reading  
This laboratory exercise introduces and analyzes the insertion sort and the quicksort as mechanisms to order numbers in an array.
Many applications require the maintenance of ordered data. In Java (and Scheme), a simple way to structure ordered data is in an array (or vector), such as the following ordered array A of integers:
One common sorting approach is based on code that assumes that the first part of an array is ordered and then adds successive items to this array segment until the entire array is sorted. To understand this approach, we first consider how to add one item to an ordered array segment. We then apply this work to each array element in turn to yield an ordered array.
Suppose items A[0], ..., A[k-1] are ordered in array A:
The following code inserts an item into the array, so that items A[0], ..., A[k] become ordered:
int i = k-1;
while ((i >= 0) && a[i] > item) {
   a[i+1] = a[i];
   i--;
}
a[i+1] = item;
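As a concrete illustration of this insertion step, the fragment above may be packaged as a small method and applied to sample data (the class name, method name, and data below are chosen here for illustration; they are not part of the lab):

```java
import java.util.Arrays;

public class InsertDemo {
   // insert item so that a[0], ..., a[k] become ordered,
   // assuming a[0], ..., a[k-1] are already ordered
   public static void insertStep (int [] a, int k, int item) {
      int i = k-1;
      while ((i >= 0) && a[i] > item) {
         a[i+1] = a[i];   // shift larger items one place right
         i--;
      }
      a[i+1] = item;
   }

   public static void main (String [] args) {
      int [] a = {2, 5, 9, 12, 0};   // a[0]..a[3] ordered; a[4] is free space
      insertStep (a, 4, 7);          // insert the sample value 7
      System.out.println (Arrays.toString (a));  // prints [2, 5, 7, 9, 12]
   }
}
```

Here 12 and 9 are shifted right one place each, after which 7 drops into position a[2].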
Using this basic insertion step, an array A can be sorted iteratively according to the following outline:
This outline gives rise to the following code, called an insertion sort.
public static void insertionSort (int [] a) {
// method to sort using the insertion sort
   for (int k = 1; k < a.length; k++) {
      int item = a[k];
      int i = k-1;
      while ((i >= 0) && a[i] > item) {
         a[i+1] = a[i];
         i--;
      }
      a[i+1] = item;
   }
}
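As a quick check, the method may be exercised from a small main program; the wrapper class name and sample data here are chosen for illustration:

```java
import java.util.Arrays;

public class InsertionSortDemo {
   // insertion sort, as developed above
   public static void insertionSort (int [] a) {
      for (int k = 1; k < a.length; k++) {
         int item = a[k];
         int i = k-1;
         while ((i >= 0) && a[i] > item) {
            a[i+1] = a[i];
            i--;
         }
         a[i+1] = item;
      }
   }

   public static void main (String [] args) {
      int [] data = {9, 3, 7, 1, 4};   // sample unsorted data
      insertionSort (data);
      System.out.println (Arrays.toString (data));  // prints [1, 3, 4, 7, 9]
   }
}
```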
Initial Notes:
The quicksort is a recursive approach to sorting, and we begin by outlining the principal recursive step. In this step, we make a guess at the value that should end up in the middle of the array. In particular, given the array a[0], ..., a[N-1] of data, arranged at random, we might guess that the first data item a[0] often should end up in about the middle of the array when the array is finally ordered. (a[0] is easy to locate, and it is as good a guess at the median value as any other.) This suggests the following steps:
A specific example is shown below:
Outline to Move the First Array Element to the Appropriate Middle
With the above outline, we now consider how to move the first array element into its appropriate location in the array. The basic approach is to work from the ends of the array toward the middle, comparing data elements to the first element and rearranging the array as necessary. The outline details follow:
These steps are illustrated in the following diagram:
The following code implements this basic step:
int left = first+1;
int right = last;
int temp;
while (right >= left) {
   // scan leftward from the right end to find a small array item
   while ((right >= left) && (a[first] <= a[right]))
      right--;
   // scan rightward from the left end to find a large array item
   while ((right >= left) && (a[first] >= a[left]))
      left++;
   // swap large left item and small right item, if needed
   if (right > left) {
      temp = a[left];
      a[left] = a[right];
      a[right] = temp;
   }
}
// put a[first] in its place
temp = a[first];
a[first] = a[right];
a[right] = temp;
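To see the effect of this partitioning step in isolation, the code above may be packaged as a helper method that returns the final position of the original first element (the class name, method name, and sample data below are chosen here for illustration):

```java
import java.util.Arrays;

public class PartitionDemo {
   // place a[first] into its proper position within a[first..last],
   // with smaller items to its left and larger items to its right;
   // returns that final position
   public static int partition (int [] a, int first, int last) {
      int left = first+1;
      int right = last;
      int temp;
      while (right >= left) {
         // scan leftward from the right end to find a small array item
         while ((right >= left) && (a[first] <= a[right]))
            right--;
         // scan rightward from the left end to find a large array item
         while ((right >= left) && (a[first] >= a[left]))
            left++;
         // swap large left item and small right item, if needed
         if (right > left) {
            temp = a[left];
            a[left] = a[right];
            a[right] = temp;
         }
      }
      // put a[first] in its place
      temp = a[first];
      a[first] = a[right];
      a[right] = temp;
      return right;
   }

   public static void main (String [] args) {
      int [] a = {6, 9, 2, 8, 1, 5};   // sample data; 6 serves as the pivot
      int pos = partition (a, 0, a.length-1);
      // prints [1, 5, 2, 6, 8, 9], pivot at index 3
      System.out.println (Arrays.toString (a) + ", pivot at index " + pos);
   }
}
```

After the call, every item left of the pivot's final position is smaller than the pivot, and every item to its right is larger, although neither segment is itself ordered yet.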
Given the above code to place the first element of an array segment appropriately and rearrange small and large items, the full array may be sorted by applying the algorithm recursively to the first part and the last part of the array. The base case of the recursion arises if there are no further elements in an array segment to sort.
This gives rise to the following code, called a quicksort.
public static void quicksort (int [] a) {
// method to sort using the quicksort
   // segments of 0 or 1 items are already sorted
   if (a.length > 1)
      quicksortKernel (a, 0, a.length-1);
}

private static void quicksortKernel (int [] a, int first, int last) {
   int left = first+1;
   int right = last;
   int temp;
   while (right >= left) {
      // scan leftward from the right end to find a small array item
      while ((right >= left) && (a[first] <= a[right]))
         right--;
      // scan rightward from the left end to find a large array item
      while ((right >= left) && (a[first] >= a[left]))
         left++;
      // swap large left item and small right item, if needed
      if (right > left) {
         temp = a[left];
         a[left] = a[right];
         a[right] = temp;
      }
   }
   // put a[first] in its place
   temp = a[first];
   a[first] = a[right];
   a[right] = temp;
   // recursively apply algorithm to a[first]..a[right-1]
   // and a[right+1]..a[last], provided these segments contain >= 2 items
   if (first < right-1)
      quicksortKernel (a, first, right-1);
   if (right+1 < last)
      quicksortKernel (a, right+1, last);
}
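As with the insertion sort, the quicksort may be exercised from a small main program (the wrapper class name and sample data are chosen here for illustration):

```java
import java.util.Arrays;

public class QuicksortDemo {
   public static void quicksort (int [] a) {
      // segments of 0 or 1 items are already sorted
      if (a.length > 1)
         quicksortKernel (a, 0, a.length-1);
   }

   private static void quicksortKernel (int [] a, int first, int last) {
      int left = first+1;
      int right = last;
      int temp;
      while (right >= left) {
         // scan leftward from the right end to find a small array item
         while ((right >= left) && (a[first] <= a[right]))
            right--;
         // scan rightward from the left end to find a large array item
         while ((right >= left) && (a[first] >= a[left]))
            left++;
         // swap large left item and small right item, if needed
         if (right > left) {
            temp = a[left];  a[left] = a[right];  a[right] = temp;
         }
      }
      // put a[first] in its place
      temp = a[first];  a[first] = a[right];  a[right] = temp;
      // recurse on the two segments, if each holds >= 2 items
      if (first < right-1)
         quicksortKernel (a, first, right-1);
      if (right+1 < last)
         quicksortKernel (a, right+1, last);
   }

   public static void main (String [] args) {
      int [] data = {6, 9, 2, 8, 1, 5, 3};   // sample unsorted data
      quicksort (data);
      System.out.println (Arrays.toString (data));  // prints [1, 2, 3, 5, 6, 8, 9]
   }
}
```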
This code illustrates the husk-and-kernel programming style which arose frequently earlier in CSC 153.
The quicksort is called a divide-and-conquer algorithm, because the first step normally divides the array into two pieces and the approach is applied recursively to each piece.
Suppose this code is applied to an array containing n randomly ordered data items. For the most part, we might expect that the quicksort's divide-and-conquer strategy will divide the array into two pieces, each of size n/2, after the first main step. Applying the algorithm to each half, in turn, will divide the array further into roughly 4 pieces, each of size n/4. Continuing a third time, we would expect to get about 8 pieces, each of size n/8.
Applying this process i times, we would expect to get about 2^{i} pieces, each of size n/2^{i}.
This process continues until each array piece has just 1 element in it, so 1 = n/2^{i} or 2^{i} = n or i = log_{2} n. Thus, the total number of main steps for this algorithm should be about log_{2} n. For each main step, we must examine the various array elements to move the relevant first items of each array segment into their correct places, and this requires us to examine roughly n items.
Altogether, for random data, this suggests that the quicksort requires about log_{2} n main steps with n operations per main step. Combining these results, quicksort on random data runs in O(n log_{2} n) time.
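To get a feel for how much faster n log_{2} n grows than n^2 (the order of the insertion sort's worst-case comparison count, a standard result not derived in this reading), the following sketch tabulates both quantities for a few sample sizes chosen here:

```java
public class GrowthDemo {
   public static void main (String [] args) {
      // compare n log2 n (quicksort estimate, random data)
      // with n^2 (order of insertion sort's worst case)
      int [] sizes = {16, 256, 4096, 65536};
      for (int j = 0; j < sizes.length; j++) {
         int n = sizes[j];
         double nLogN = n * (Math.log (n) / Math.log (2));   // n log2 n
         System.out.printf ("n = %6d   n log2 n = %10.0f   n^2 = %12d%n",
                            n, nLogN, (long) n * n);
      }
   }
}
```

For n = 65536, for example, n log_{2} n is about one million while n^2 exceeds four billion, which suggests why the quicksort is preferred for large arrays of random data.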
This document is available on the World Wide Web as
http://www.walker.cs.grinnell.edu/courses/153.sp05/readings/readingsorting.shtml
created April 24, 2001 last revised March 11, 2005 

For more information, please contact Henry M. Walker at walker@cs.grinnell.edu. 