CSC 153: Computer Science Fundamentals Grinnell College Spring, 2005
 
Laboratory Exercise Reading
 

Sorting

Abstract

This laboratory exercise introduces and analyzes the insertion sort and the quicksort as mechanisms for ordering numbers in an array.

Introduction

Many applications require the maintenance of ordered data. In Java (and Scheme), a simple way to structure ordered data is in an array (or vector), such as the following ordered array A of integers:

A:  2 3 5 7 9 10 13 18 24 27 33 35 37

The Insertion Sort

One common sorting approach is based on code that assumes that the first part of an array is ordered and then adds successive items to this array segment until the entire array is sorted. To understand this approach, we first consider how to add one item to an ordered array segment. We then apply this work to each array element in turn to yield an ordered array.

Maintaining An Ordered Array Segment

Suppose items a[0], ..., a[k-1] are ordered in array a:

A:  3 7 9 10 18 27 33 37

The following code inserts a new value, item, into the array, so that items a[0], ..., a[k] become ordered:


int i = k-1;
// shift items larger than the new value one position to the right
while ((i >= 0) && a[i] > item) {
   a[i+1] = a[i];
   i--;
}
// place the new value in the vacated slot
a[i+1] = item;
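
As a quick check, this insertion step can be exercised on the ordered segment shown above. The class InsertStepDemo, the helper method insertInOrder, and the driver below are illustrative assumptions made for this sketch, not part of the reading's code.

public class InsertStepDemo {
   // insert item so that a[0], ..., a[k] become ordered,
   // assuming a[0], ..., a[k-1] are already ordered
   // (illustrative helper; name chosen for this sketch only)
   private static void insertInOrder (int [] a, int k, int item) {
      int i = k-1;
      while ((i >= 0) && a[i] > item) {
         a[i+1] = a[i];
         i--;
      }
      a[i+1] = item;
   }

   public static void main (String [] args) {
      // ordered segment from the example above, with one unused slot at the end
      int [] a = {3, 7, 9, 10, 18, 27, 33, 37, 0};
      insertInOrder (a, 8, 24);   // insert 24 into the segment a[0], ..., a[7]
      for (int i = 0; i < a.length; i++)
         System.out.print (a[i] + " ");
      System.out.println ();
      // prints:  3 7 9 10 18 24 27 33 37
   }
}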

Using this basic insertion step, an array a can be sorted iteratively according to the following outline:

  1. Observe that the one-element segment consisting of a[0] alone is already ordered.

  2. For each k from 1 to a.length-1 in turn, insert a[k] into the ordered segment a[0], ..., a[k-1], so that a[0], ..., a[k] become ordered.

This outline gives rise to the following code, called an insertion sort.


public static void insertionSort (int [] a) {
// method to sort using the insertion sort
   for (int k = 1; k < a.length; k++) {
      // insert a[k] into the ordered segment a[0], ..., a[k-1]
      int item = a[k];
      int i = k-1;
      while ((i >= 0) && a[i] > item){
         a[i+1] = a[i];
         i--;
      }
      a[i+1] = item;
   }
}
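
For example, the following driver applies insertionSort to a scrambled copy of the array shown in the introduction. The main method is an illustrative sketch, not part of the reading; it assumes insertionSort is defined in the same class.

public static void main (String [] args) {
   int [] a = {10, 3, 27, 2, 35, 5, 13, 7, 24, 9, 37, 18, 33};
   insertionSort (a);
   for (int i = 0; i < a.length; i++)
      System.out.print (a[i] + " ");
   System.out.println ();
   // prints:  2 3 5 7 9 10 13 18 24 27 33 35 37
}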

Quicksort: An Example of a Divide-and-Conquer Algorithm



The quicksort is a recursive approach to sorting, and we begin by outlining the principal recursive step. In this step, we make a guess at the value that should end up in the middle of the array. In particular, given the array a[0], ..., a[N-1] of data arranged at random, we might guess that the first data item a[0] should end up at about the middle of the array when the array is finally ordered. (a[0] is easy to locate, and it is as good a guess at the median value as any other.) This suggests the following steps:

  1. Rearrange the data in array a, so that a[0] is moved to its proper position. In other words, move a[0] to a[mid] and rearrange the other elements so that:
    a[0], a[1], ..., a[mid-1] < a[mid]
    and
    a[mid] < a[mid+1], ..., a[N-1].


  2. Repeat this process on the smaller lists
    a[0], a[1], ..., a[mid-1]
    and
    a[mid+1], ..., a[N-1].

A specific example is shown below:


[Figure: Main Steps in a Quicksort]


Outline to Move the First Array Element to the Appropriate Middle

With the above outline, we now consider how to move the first array element into its appropriate location in the array. The basic approach is to work from the ends of the array toward the middle, comparing data elements to the first element and rearranging the array as necessary. The outline details follow:

  1. Compare a[first] to a[last], a[last-1], etc. until an element a[right] is found where a[right] < a[first].

  2. Compare a[first] to a[first+1], a[first+2], etc. until an element a[left] is found where a[left] > a[first].

  3. Swap a[left] and a[right].
    At this point, the smaller item has moved toward the front of the array segment and the larger item toward the back, so each lies on the correct side of where a[first] will finally go.

  4. Continue steps 1 through 3, working from both ends of the array segment toward the middle and comparing elements against the original first element, until all elements of the segment have been checked.

  5. Swap a[first] with a[right], to put it in its correct location.

These steps are illustrated in the following diagram:


[Figure: Putting the First Array Element in its Place]

The following code implements this basic step:


int left=first+1;
int right=last;
int temp;

while (right >= left) {
    // search from the right end, moving left, for an item smaller than a[first]
    while ((right >= left) && (a[first] <= a[right]))
        right--;
    // search from the left end, moving right, for an item larger than a[first]
    while ((right >= left) && (a[first] >= a[left]))
        left++;
    // swap large left item and small right item, if needed
    if (right > left) {
        temp = a[left];
        a[left] = a[right];
        a[right] = temp;
    }
}
// put a[first] in its place
temp = a[first];
a[first] = a[right];
a[right] = temp;
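
To see the effect of this rearrangement step, the fragment above can be wrapped in a small, self-contained test program. The class name PartitionDemo, the helper method moveFirstToPlace, and the driver below are assumptions made for this sketch only; the reading itself embeds the fragment in quicksortKernel, shown later.

public class PartitionDemo {
   // move a[first] to its proper place within a[first], ..., a[last],
   // returning the index where it lands
   // (illustrative wrapper around the fragment above)
   private static int moveFirstToPlace (int [] a, int first, int last) {
      int left = first+1;
      int right = last;
      int temp;

      while (right >= left) {
         // search from the right end, moving left, for an item smaller than a[first]
         while ((right >= left) && (a[first] <= a[right]))
            right--;
         // search from the left end, moving right, for an item larger than a[first]
         while ((right >= left) && (a[first] >= a[left]))
            left++;
         // swap the large left item and the small right item, if needed
         if (right > left) {
            temp = a[left];
            a[left] = a[right];
            a[right] = temp;
         }
      }
      // put a[first] in its place
      temp = a[first];
      a[first] = a[right];
      a[right] = temp;
      return right;
   }

   public static void main (String [] args) {
      int [] a = {9, 4, 12, 7, 3, 15, 10, 6};
      int mid = moveFirstToPlace (a, 0, a.length-1);
      System.out.println ("9 lands at index " + mid);
      for (int i = 0; i < a.length; i++)
         System.out.print (a[i] + " ");
      System.out.println ();
      // prints:  9 lands at index 4
      //          3 4 6 7 9 15 10 12
   }
}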

Given the above code to place the first element of an array segment appropriately and rearrange small and large items, the full array may be sorted by applying the algorithm recursively to the first part and the last part of the array. The base case of the recursion arises if there are no further elements in an array segment to sort.

This gives rise to the following code, called a quicksort.


public static void quicksort (int [] a) {
// method to sort using the quicksort
    // nothing to do for an empty array (where a.length-1 would be -1)
    if (a.length > 1)
        quicksortKernel (a, 0, a.length-1);
}

private static void quicksortKernel (int [] a, int first, int last) {
    int left=first+1;
    int right=last;
    int temp;

    while (right >= left) {
        // search from the right end, moving left, for an item smaller than a[first]
        while ((right >= left) && (a[first] <= a[right]))
            right--;
        // search from the left end, moving right, for an item larger than a[first]
        while ((right >= left) && (a[first] >= a[left]))
            left++;
        // swap large left item and small right item, if needed
        if (right > left) {
            temp = a[left];
            a[left] = a[right];
            a[right] = temp;
        }
    }
    // put a[first] in its place
    temp = a[first];
    a[first] = a[right];
    a[right] = temp;

    // recursively apply algorithm to a[first]..a[right-1] 
    // and a[right+1]..a[last], provided these segments contain >= 2 items
    if (first < right-1)
        quicksortKernel (a, first, right-1);
    if (right+1 < last)
        quicksortKernel (a, right+1, last);   
}
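
As with the insertion sort, a short driver can show the method in action. The main method below is an illustrative sketch, not part of the reading; it assumes quicksort is defined in the same class.

public static void main (String [] args) {
    int [] data = {9, 4, 12, 7, 3, 15, 10, 6};
    quicksort (data);
    for (int i = 0; i < data.length; i++)
        System.out.print (data[i] + " ");
    System.out.println ();
    // prints:  3 4 6 7 9 10 12 15
}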

This code illustrates the husk-and-kernel programming style which arose frequently earlier in CSC 153.

Analysis and Timing

The quicksort is called a divide-and-conquer algorithm, because the first step normally divides the array into two pieces and the approach is applied recursively to each piece.

Suppose this code is applied to an array containing n randomly ordered items. For the most part, we might expect that the quicksort's divide-and-conquer strategy will divide the array into two pieces, each of size n/2, after the first main step. Applying the algorithm to each half, in turn, will divide the array further -- into roughly 4 pieces, each of size n/4. Continuing a third time, we would expect to get about 8 pieces, each of size n/8.

Applying this process i times, we would expect to get about 2^i pieces, each of size n/2^i.

This process continues until each array piece has just 1 element in it, so 1 = n/2^i, or 2^i = n, or i = log_2 n. Thus, the total number of main steps for this algorithm should be about log_2 n. For each main step, we must examine the various array elements to move the relevant first item of each array segment into its correct place, and this requires us to examine roughly n items.

Altogether, for random data, this suggests that the quicksort requires about log_2 n main steps, with about n operations per main step. Combining these results, the quicksort on random data runs in O(n log_2 n) time.
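
For example, with n = 1024 = 2^10 randomly ordered items, we would expect about log_2 1024 = 10 rounds of partitioning, each examining roughly 1024 items, or on the order of 10 * 1024 = 10,240 element examinations altogether.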


This document is available on the World Wide Web as

http://www.walker.cs.grinnell.edu/courses/153.sp05/readings/reading-sorting.shtml

created April 24, 2001
last revised March 11, 2005
For more information, please contact Henry M. Walker at walker@cs.grinnell.edu.