
7.2 Performance of quicksort

7.2-1

Use the substitution method to prove that the recurrence $T(n) = T(n - 1) + \Theta(n)$ has the solution $T(n) = \Theta(n^2)$, as claimed at the beginning of section 7.2.

We represent the $\Theta(n)$ term by $c_2n$ for some positive constant $c_2$, and we guess that $T(n) \le c_1n^2$ for a suitably large constant $c_1$:

$$ \begin{aligned} T(n) & = T(n - 1) + c_2n \\ & \le c_1(n - 1)^2 + c_2n \\ & = c_1n^2 - 2c_1n + c_1 + c_2n \\ & \le c_1n^2 & (\text{for } 2c_1 > c_2 \text{ and } n \ge c_1 / (2c_1 - c_2)). \end{aligned} $$

The last step holds because $c_1 + c_2n \le 2c_1n$ under these conditions. A symmetric argument with the guess $T(n) \ge c_3n^2$, for a sufficiently small positive constant $c_3$, gives the matching lower bound, so $T(n) = \Theta(n^2)$.
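
As a quick numerical sanity check (our own illustration, not part of the proof), the sketch below iterates the recurrence with arbitrarily chosen constants $c_1$ and $c_2$ and confirms that the bound $c_1n^2$ holds once $n \ge c_1 / (2c_1 - c_2)$:

```python
# Sanity check for T(n) = T(n - 1) + c2*n <= c1*n^2.
# The constants c1, c2 and the base case T(1) are illustrative assumptions.
c1, c2 = 3.0, 2.0                        # chosen so that 2*c1 > c2
threshold = c1 / (2 * c1 - c2)

T = 1.0                                  # assumed base case T(1)
for n in range(2, 10_001):
    T += c2 * n                          # T(n) = T(n - 1) + c2*n
    if n >= threshold:
        assert T <= c1 * n * n, f"bound fails at n = {n}"
print("T(n) <= c1*n^2 held for every n checked")
```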

7.2-2

What is the running time of $\text{QUICKSORT}$ when all elements of the array $A$ have the same value?

It is $\Theta(n^2)$, since one of the two partitions is always empty (see Exercise 7.1-2).
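
As an illustration (our own instrumentation, not from the text), the following Python sketch of the textbook $\text{PARTITION}$ and $\text{QUICKSORT}$ counts key comparisons; on all-equal input the count grows as $n(n - 1)/2$, matching the $\Theta(n^2)$ bound:

```python
import sys

def partition(A, p, r, stats):
    """Lomuto-style PARTITION as in CLRS; stats counts key comparisons."""
    x = A[r]                              # pivot
    i = p - 1
    for j in range(p, r):
        stats["cmp"] += 1
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

def quicksort(A, p, r, stats):
    if p < r:
        q = partition(A, p, r, stats)
        quicksort(A, p, q - 1, stats)
        quicksort(A, q + 1, r, stats)

sys.setrecursionlimit(100_000)            # worst-case recursion depth is about n
for n in (100, 200, 400):
    stats = {"cmp": 0}
    quicksort([7] * n, 0, n - 1, stats)   # all elements equal
    print(n, stats["cmp"])                # n(n - 1)/2: 4950, 19900, 79800
```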

7.2-3

Show that the running time of $\text{QUICKSORT}$ is $\Theta(n^2)$ when the array $A$ contains distinct elements and is sorted in decreasing order.

$\text{PARTITION}$ does a "worst-case partitioning" when the elements are in decreasing order. It reduces the size of the subarray under consideration by only $1$ at each step, which we've seen has running time $\Theta(n^2)$.

In particular, given a subarray $A[p..r]$ of distinct elements in decreasing order, $\text{PARTITION}$ produces an empty low partition $A[p..q - 1]$, puts the pivot (originally in $A[r]$) into $A[p]$, and leaves a subarray $A[p + 1..r]$ with only one fewer element than $A[p..r]$. That subarray is again in decreasing order except that its largest element now sits in the last position; partitioning it therefore produces an empty high partition and leaves a decreasing subarray once more. Thus every recursive call removes exactly one element from consideration, and the recurrence for $\text{QUICKSORT}$ becomes $T(n) = T(n - 1) + \Theta(n)$, which has the solution $T(n) = \Theta(n^2)$ (Exercise 7.2-1).
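
Using the `partition` sketch from Exercise 7.2-2 above (our own illustration), the empty low side is easy to see directly:

```python
# Assumes partition() from the 7.2-2 sketch above has been defined.
A = list(range(10, 0, -1))                 # [10, 9, ..., 1]: distinct, decreasing
q = partition(A, 0, len(A) - 1, {"cmp": 0})
print(q)                                   # 0: the pivot lands in A[p], so A[p..q-1] is empty
```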

7.2-4

Banks often record transactions on an account in order of the times of the transactions, but many people like to receive their bank statements with checks listed in order by check numbers. People usually write checks in order by check number, and merchants usually cash them with reasonable dispatch. The problem of converting time-of-transaction ordering to check-number ordering is therefore the problem of sorting almost-sorted input. Argue that the procedure $\text{INSERTION-SORT}$ would tend to beat the procedure $\text{QUICKSORT}$ on this problem.

The more sorted the array is, the less work insertion sort has to do: $\text{INSERTION-SORT}$ runs in $\Theta(n + d)$ time, where $d$ is the number of inversions in the array. In this scenario the number of inversions tends to be small, so insertion sort runs in close to linear time.

On the other hand, if $\text{PARTITION}$ picks a pivot that does not participate in any inversion, it produces an empty partition (such a pivot is the largest element of its subarray, so the high side is empty). Since the number of inversions is small, $\text{QUICKSORT}$ is very likely to produce empty or nearly empty partitions at most levels of the recursion, which drives it toward its worst-case $\Theta(n^2)$ running time. Insertion sort therefore tends to beat quicksort on this input.
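
As a rough experiment (our own, with an arbitrary way of generating almost-sorted input, and reusing the `quicksort`/`partition` sketch from Exercise 7.2-2), the following counts comparisons for both procedures:

```python
import random

def insertion_sort(A, stats):
    """Textbook INSERTION-SORT; stats counts key comparisons."""
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        while i >= 0:
            stats["cmp"] += 1
            if A[i] <= key:
                break
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key

random.seed(1)
n = 2000
A = list(range(n))
for _ in range(20):                        # a handful of adjacent swaps = few inversions
    i = random.randrange(n - 1)
    A[i], A[i + 1] = A[i + 1], A[i]

ins, qs = {"cmp": 0}, {"cmp": 0}
insertion_sort(A.copy(), ins)
quicksort(A.copy(), 0, n - 1, qs)          # from the 7.2-2 sketch above
print(ins["cmp"], qs["cmp"])               # roughly 2 * 10^3 versus 2 * 10^6
```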

7.2-5

Suppose that the splits at every level of quicksort are in proportion $1 - \alpha$ to $\alpha$, where $0 < \alpha \le 1 / 2$ is a constant. Show that the minimum depth of a leaf in the recursion tree is approximately $-\lg n / \lg\alpha$ and the maximum depth is approximately $-\lg n / \lg(1 - \alpha)$. (Don't worry about integer round-off.)

The minimum depth follows a path that always takes the smaller part of the partition—i.e., that multiplies the number of elements by $\alpha$. One iteration reduces the number of elements from $n$ to $\alpha n$, and $i$ iterations reduces the number of elements to $\alpha^i n$. At a leaf, there is just one remaining element, and so at a minimum-depth leaf of depth $m$, we have $\alpha^m n = 1$. Thus, $\alpha^m = 1 / n$. Taking logs, we get $m\lg\alpha = -\lg n$, or $m = -\lg n / \lg\alpha$.

Similarly, maximum depth corresponds to always taking the larger part of the partition, i.e., keeping a fraction $1 - \alpha$ of the elements each time. The maximum depth $M$ is reached when there is one element left, that is, when $(1 - \alpha)^M n = 1$. Thus, $M = -\lg n / \lg (1 - \alpha)$.

All these equations are approximate because we are ignoring floors and ceilings.
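
A quick numerical check of the two formulas (with arbitrary sample values of $n$ and $\alpha$, our own illustration):

```python
import math

alpha, n = 0.3, 1_000_000                  # arbitrary sample values
min_depth = -math.log2(n) / math.log2(alpha)
max_depth = -math.log2(n) / math.log2(1 - alpha)

# Follow the smaller side of every split until one element remains.
depth, size = 0, n
while size > 1:
    size *= alpha
    depth += 1

print(depth, min_depth)                    # e.g. 12 vs. 11.47...: agree up to round-off
print(max_depth)                           # about 38.7 levels along the larger side
```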

7.2-6 $\star$

Argue that for any constant $0 < \alpha \le 1 / 2$, the probability is approximately $1 - 2\alpha$ that on a random input array, $\text{PARTITION}$ produces a split more balanced than $1 - \alpha$ to $\alpha$.

To produce a split less balanced than $1 - \alpha$ to $\alpha$, $\text{PARTITION}$ must pick a pivot that falls among either the smallest $\alpha n$ elements or the largest $\alpha n$ elements. The probability of landing in each group is approximately $\alpha n / n = \alpha$, and since the two events are disjoint, the probability of landing in one of them is $2\alpha$. Thus, the probability of producing a more balanced split is the complement, $1 - 2\alpha$.
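
A short simulation (our own, with arbitrary sample values) supports the estimate: on a random input array the pivot is equally likely to have any rank, so we can draw the rank uniformly and check how often the split is more balanced than $1 - \alpha$ to $\alpha$:

```python
import random

random.seed(0)
n, trials, alpha = 1000, 100_000, 0.25     # arbitrary sample values
better = 0
for _ in range(trials):
    rank = random.randrange(1, n + 1)      # pivot's rank: 1 = smallest, n = largest
    low, high = rank - 1, n - rank         # sizes of the two sides of the split
    if min(low, high) > alpha * n:         # more balanced than 1 - alpha to alpha
        better += 1
print(better / trials, 1 - 2 * alpha)      # both approximately 0.5
```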