# 17.1 Aggregate analysis

## 17.1-1

If the set of stack operations included a $\text{MULTIPUSH}$ operation, which pushes $k$ items onto the stack, would the $O(1)$ bound on the amortized cost of stack operations continue to hold?

No. The total time of a sequence now depends on how many items each operation pushes (and, symmetrically, pops). A single $\text{MULTIPUSH}$ of $k$ items takes $\Theta(k)$ time, so a sequence of $n$ $\text{MULTIPUSH}$ operations, each with $k$ items, takes $\Theta(nk)$ time, giving an amortized cost of $\Theta(k)$ per operation rather than $O(1)$.
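The counting argument can be checked with a small sketch (names like `multipush` are illustrative, not from the text): each element pushed is charged one unit of work, so $n$ operations of $k$ pushes each accumulate $nk$ units.

```python
def multipush(stack, items):
    """Push each item onto the stack; each push costs one unit of work."""
    work = 0
    for x in items:
        stack.append(x)
        work += 1
    return work

n, k = 100, 10
stack = []
# n MULTIPUSH operations, each pushing k items.
total_work = sum(multipush(stack, list(range(k))) for _ in range(n))
print(total_work)      # n * k = 1000 total units of work
print(total_work / n)  # amortized cost per operation = k = 10.0
```

The average over the sequence stays at $k$ no matter how large $n$ grows, which is why the $O(1)$ amortized bound fails.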

## 17.1-2

Show that if a $\text{DECREMENT}$ operation were included in the $k$-bit counter example, $n$ operations could cost as much as $\Theta(nk)$ time.

With $\text{DECREMENT}$ allowed, the argument that bit $i$ flips only once every $2^i$ operations no longer holds. Suppose the counter holds the value $2^{k - 1} - 1$, i.e., $0\underbrace{1 \cdots 1}_{k - 1}$ in binary. An $\text{INCREMENT}$ flips all $k$ bits to produce $1\underbrace{0 \cdots 0}_{k - 1}$, and a $\text{DECREMENT}$ flips all $k$ bits back. Alternating these two operations $n$ times flips $\Theta(k)$ bits per operation, for a total cost of $\Theta(nk)$.
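The worst case above can be exhibited concretely. This is a hypothetical sketch of a $k$-bit counter (bit $0$ is the low-order bit) that counts bit flips per operation:

```python
def increment(A):
    """Binary INCREMENT on bit array A (A[0] = low-order bit); returns bit flips."""
    flips, i = 0, 0
    while i < len(A) and A[i] == 1:  # clear trailing 1s
        A[i] = 0
        flips += 1
        i += 1
    if i < len(A):                   # set the first 0 bit
        A[i] = 1
        flips += 1
    return flips

def decrement(A):
    """Binary DECREMENT, symmetric to INCREMENT; returns bit flips."""
    flips, i = 0, 0
    while i < len(A) and A[i] == 0:  # set trailing 0s
        A[i] = 1
        flips += 1
        i += 1
    if i < len(A):                   # clear the first 1 bit
        A[i] = 0
        flips += 1
    return flips

k, n = 8, 100
A = [1] * (k - 1) + [0]  # value 2^(k-1) - 1, i.e. 0111...1
total = 0
for j in range(n):       # alternate INCREMENT and DECREMENT
    total += increment(A) if j % 2 == 0 else decrement(A)
print(total)             # every operation flips all k bits: n * k = 800
```

Starting at $2^{k-1} - 1$, every operation in the alternating sequence crosses the $0\underbrace{1 \cdots 1}_{k-1} \leftrightarrow 1\underbrace{0 \cdots 0}_{k-1}$ boundary and flips all $k$ bits.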

## 17.1-3

Suppose we perform a sequence of $n$ operations on a data structure in which the $i$th operation costs $i$ if $i$ is an exact power of $2$, and $1$ otherwise. Use aggregate analysis to determine the amortized cost per operation.

Let $c_i =$ cost of $i$th operation.

$$c_i = \begin{cases} i & \text{if } i \text{ is an exact power of } 2, \\ 1 & \text{otherwise}. \end{cases}$$

$$\begin{array}{cc} \text{Operation} & \text{Cost} \\ \hline 1 & 1 \\ 2 & 2 \\ 3 & 1 \\ 4 & 4 \\ 5 & 1 \\ 6 & 1 \\ 7 & 1 \\ 8 & 8 \\ 9 & 1 \\ 10 & 1 \\ \vdots & \vdots \end{array}$$

$n$ operations cost:

$$\sum_{i = 1}^n c_i \le n + \sum_{j = 0}^{\lfloor \lg n \rfloor} 2^j = n + (2^{\lfloor \lg n \rfloor + 1} - 1) \le n + (2n - 1) < 3n.$$

Average cost of operation:

$$\frac{\text{Total cost}}{\text{\# operations}} < 3.$$

By aggregate analysis, the amortized cost per operation is $O(1)$.
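The $3n$ bound is easy to verify numerically. A short sketch (the helper `cost` is our own naming) uses the standard bit trick that $i$ is a power of $2$ exactly when `i & (i - 1) == 0` for $i \ge 1$:

```python
def cost(i):
    """Cost of the i-th operation: i if i is an exact power of 2, else 1."""
    return i if (i & (i - 1)) == 0 else 1  # valid for i >= 1

for n in (1, 10, 100, 1000):
    total = sum(cost(i) for i in range(1, n + 1))
    assert total < 3 * n          # the aggregate bound from above
    print(n, total, total / n)
```

For example, the first $10$ operations cost $1 + 2 + 1 + 4 + 1 + 1 + 1 + 8 + 1 + 1 = 21 < 30$, so the average cost stays below $3$.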