3.1 Asymptotic notation

3.1-1

Let $f(n)$ and $g(n)$ be asymptotically nonnegative functions. Using the basic definition of $\Theta$-notation, prove that $\max(f(n), g(n)) = \Theta(f(n) + g(n))$.

For asymptotically nonnegative functions $f(n)$ and $g(n)$, we know that

\begin{aligned} \exists n_1, n_2: & f(n) \ge 0 & \text{for} \, n > n_1 \\ & g(n) \ge 0 & \text{for} \, n > n_2. \end{aligned}

Let $n_0 = \max(n_1, n_2)$. Then the following inequalities hold for $n > n_0$:

\begin{aligned} f(n) & \le \max(f(n), g(n)) \\ g(n) & \le \max(f(n), g(n)) \\ (f(n) + g(n))/2 & \le \max(f(n), g(n)) \\ \max(f(n), g(n)) & \le f(n) + g(n). \end{aligned}

Combining the last two inequalities gives

$$0 \le \frac{f(n) + g(n)}{2} \le \max{(f(n), g(n))} \le f(n) + g(n).$$

This is precisely the definition of $\Theta(f(n) + g(n))$ with $c_1 = \frac{1}{2}$ and $c_2 = 1$.
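As a quick numerical sanity check (an illustration, not a substitute for the proof), the sketch below evaluates the sandwich inequality at many points; the sample functions `f` and `g` are arbitrary choices of ours, not part of the exercise.

```python
# Numerical sanity check of (f + g)/2 <= max(f, g) <= f + g.
# The sample functions are arbitrary choices, not part of the exercise.

def f(n):
    return 3 * n**2 + 5        # a sample asymptotically nonnegative function

def g(n):
    return n**3 - 4 * n        # nonnegative once n >= 2

for n in range(2, 1000):
    lo = (f(n) + g(n)) / 2     # the c1 = 1/2 lower bound
    hi = f(n) + g(n)           # the c2 = 1 upper bound
    assert lo <= max(f(n), g(n)) <= hi
```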

3.1-2

Show that for any real constants $a$ and $b$, where $b > 0$,

$$(n + a)^b = \Theta(n^b). \tag{3.2}$$

Since $b$ may be any positive real (not necessarily an integer), the binomial expansion does not apply in general, so we bound $n + a$ directly. For $n \ge 2|a|$, we have $|a| \le n / 2$, and therefore

$$0 \le \frac{n}{2} \le n + a \le \frac{3n}{2} \le 2n.$$

Since $b > 0$, raising each part to the $b$-th power preserves the inequalities:

$$0 \le \left(\frac{1}{2}\right)^b n^b \le (n + a)^b \le 2^b n^b.$$

Taking $c_1 = (1 / 2)^b$, $c_2 = 2^b$, and $n_0 = 2|a|$ gives

$$(n + a)^b = \Theta(n^b).$$
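To see the constants in action, here is a small numeric spot check of the bound above; the particular values of `a` and `b` are arbitrary sample choices, with `b` deliberately non-integer.

```python
# Numeric spot check of (1/2)^b * n^b <= (n + a)^b <= 2^b * n^b for n >= 2|a|.
# The constants a and b are arbitrary samples; b is deliberately non-integer.

a, b = -7.0, 2.5
n0 = 2 * abs(a)                  # the n0 = 2|a| threshold from the proof
eps = 1e-12                      # small slack for floating-point rounding

for n in range(int(n0), 10_001):
    ratio = (n + a) ** b / n ** b
    assert (1 / 2) ** b - eps <= ratio <= 2 ** b + eps
```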

3.1-3

Explain why the statement, "The running time of algorithm $A$ is at least $O(n^2)$," is meaningless.

Let $T(n)$ be the running time of algorithm $A$. The statement "the running time of algorithm $A$ is at least $O(n^2)$" can only mean that $T(n) \ge f(n)$ for some function $f(n) = O(n^2)$.

But $O(n^2)$ contains arbitrarily slow-growing functions, including every constant function, so we can rephrase the statement as "the running time of algorithm $A$ is at least some constant." This is true of every algorithm, so the statement conveys no information and is therefore meaningless.

3.1-4

Is $2^{n + 1} = O(2^n)$? Is $2^{2n} = O(2^n)$?

• True. Note that $2^{n + 1} = 2 \times 2^n$. Choose $c = 2$ and $n_0 = 0$; then $0 \le 2^{n + 1} \le c \times 2^n$ for all $n \ge n_0$. By definition, $2^{n + 1} = O(2^n)$.

• False. Note that $2^{2n} = 2^n \times 2^n = 4^n$. For $2^{2n} = O(2^n)$ we would need a constant $c$ with $4^n \le c \times 2^n$, i.e., $2^n \le c$, for all $n \ge n_0$. Since $2^n$ is unbounded, no such constant exists.
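The failure can also be watched numerically: for any fixed candidate $c$, the ratio $4^n / 2^n = 2^n$ eventually exceeds it. The candidate values of `c` below are arbitrary samples.

```python
# For any candidate constant c, 4**n / 2**n = 2**n eventually exceeds c,
# so no c can witness 2**(2*n) = O(2**n). The c values are arbitrary samples.

for c in (10, 1_000, 10**9):
    n = 1
    while 4**n <= c * 2**n:
        n += 1
    print(f"c = {c}: inequality 4^n <= c * 2^n fails at n = {n}")
```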

3.1-5

Prove Theorem 3.1.

The theorem states:

For any two functions $f(n)$ and $g(n)$, we have $f(n) = \Theta(g(n))$ if and only if $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$.

From $f(n) = \Theta(g(n))$, we have that

$$0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0.$$

Taking $c_1$ with the left-hand inequality gives exactly the definition of $f(n) = \Omega(g(n))$, and taking $c_2$ with the right-hand inequality gives the definition of $f(n) = O(g(n))$, so both hold.

From $f(n) = \Omega(g(n))$ and $f(n) = O(g(n))$, we have that

\begin{aligned} & 0 \le c_3g(n) \le f(n) & \text{ for all } n \ge n_1 \\ \text{and } & 0 \le f(n) \le c_4g(n) & \text{ for all } n \ge n_2. \end{aligned}

If we let $n_3 = \max(n_1, n_2)$ and merge the inequalities, we get

$$0 \le c_3 g(n) \le f(n) \le c_4 g(n) \text{ for all } n \ge n_3,$$

which is the definition of $\Theta(g(n))$.
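A concrete instance may help. In the sketch below (an illustration only, with $f$ and the constants chosen by us), $f(n) = 3n^2 + 10n$ satisfies $c_3 g(n) \le f(n)$ with $c_3 = 3$ and $f(n) \le c_4 g(n)$ with $c_4 = 4$ once $n \ge 10$, hence $f(n) = \Theta(n^2)$.

```python
# Concrete instance of Theorem 3.1 (an illustration, not a proof):
# f(n) = 3n^2 + 10n is Omega(n^2) with c3 = 3 and O(n^2) with c4 = 4
# for n >= 10, hence Theta(n^2).

def f(n):
    return 3 * n**2 + 10 * n

def g(n):
    return n**2

c3, c4, n3 = 3, 4, 10
for n in range(n3, 10_000):
    assert 0 <= c3 * g(n) <= f(n) <= c4 * g(n)
```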

3.1-6

Prove that the running time of an algorithm is $\Theta(g(n))$ if and only if its worst-case running time is $O(g(n))$ and its best-case running time is $\Omega(g(n))$.

If $T_w$ is the worst-case running time and $T_b$ is the best-case running time, we know that

\begin{aligned} & 0 \le c_1g(n) \le T_b(n) & \text{ for } n > n_b \\ \text{and } & 0 \le T_w(n) \le c_2g(n) & \text{ for } n > n_w. \end{aligned}

Combining them we get

$$0 \le c_1g(n) \le T_b(n) \le T_w(n) \le c_2g(n) \text{ for } n > \max(n_b, n_w).$$

Since every running time $T(n)$ satisfies $T_b(n) \le T(n) \le T_w(n)$, the inequality above shows that $T(n) = \Theta(g(n))$. Conversely, if the running time is $\Theta(g(n))$, then every input of size $n$ takes at most $c_2 g(n)$ time, so the worst case is $O(g(n))$, and at least $c_1 g(n)$ time, so the best case is $\Omega(g(n))$.
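The two bounds really can be witnessed on different inputs. As an illustration (insertion sort is our example here, not the text's), the comparison count below is $O(n^2)$ in the worst case but only $\Theta(n)$ in the best case, so the running time is not $\Theta(n^2)$.

```python
# Illustration with insertion sort (a standard example, not from the text):
# the worst-case comparison count is n(n-1)/2 = Theta(n^2), but the best
# case is n - 1 = Theta(n), so the running time is O(n^2) yet not Theta(n^2).

def insertion_sort_comparisons(a):
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1            # one key comparison
            if a[j] <= key:
                break                   # key is already in place
            a[j + 1] = a[j]             # shift the larger element right
            j -= 1
        a[j + 1] = key
    return comparisons

n = 1000
print(insertion_sort_comparisons(range(n)))         # sorted input: n - 1
print(insertion_sort_comparisons(range(n, 0, -1)))  # reversed: n(n-1)/2
```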

3.1-7

Prove that $o(g(n)) \cap \omega(g(n))$ is the empty set.

Suppose some $f(n)$ belongs to both sets. Then, by the definitions of $o$ and $\omega$, for any $c > 0$,

\begin{aligned} & \exists n_1 > 0: 0 \le f(n) < cg(n) & \text{ for all } n \ge n_1 \\ \text{and } & \exists n_2 > 0: 0 \le cg(n) < f(n) & \text{ for all } n \ge n_2. \end{aligned}

If we pick $n_0 = \max(n_1, n_2)$, then for all $n \ge n_0$ both bounds hold simultaneously, giving

$$f(n) < cg(n) < f(n).$$

This is a contradiction, so no such $f(n)$ can exist, which means that the intersection is the empty set.

3.1-8

We can extend our notation to the case of two parameters $n$ and $m$ that can go to infinity independently at different rates. For a given function $g(n, m)$, we denote by $O(g(n, m))$ the set of functions:

\begin{aligned} O(g(n, m)) = \{f(n, m): & \text{ there exist positive constants } c, n_0, \text{ and } m_0 \\ & \text{ such that } 0 \le f(n, m) \le cg(n, m) \\ & \text{ for all } n \ge n_0 \text{ or } m \ge m_0.\} \end{aligned}

Give corresponding definitions for $\Omega(g(n, m))$ and $\Theta(g(n, m))$.

\begin{aligned} \Omega(g(n, m)) = \{f(n, m): & \text{ there exist positive constants } c, n_0, \text{ and } m_0 \\ & \text{ such that } 0 \le cg(n, m) \le f(n, m) \\ & \text{ for all } n \ge n_0 \text{ and } m \ge m_0.\} \end{aligned}

\begin{aligned} \Theta(g(n, m)) = \{f(n, m): & \text{ there exist positive constants } c_1, c_2, n_0, \text{ and } m_0 \\ & \text{ such that } 0 \le c_1 g(n, m) \le f(n, m) \le c_2 g(n, m) \\ & \text{ for all } n \ge n_0 \text{ and } m \ge m_0.\} \end{aligned}
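A definition like this can be spot-checked on a finite grid. The helper below is a sketch only (a finite check, not a proof), and `f`, `g`, `c`, `n0`, and `m0` are illustrative choices of ours; it tests the $O$-style inequality for $n \ge n_0$ and $m \ge m_0$ up to a cutoff.

```python
# Finite sanity check of the two-parameter O definition (grid search only,
# not a proof). f, g, c, n0, m0 below are arbitrary sample choices.

def holds_on_grid(f, g, c, n0, m0, limit=200):
    """Check 0 <= f(n, m) <= c * g(n, m) on a finite grid with n >= n0, m >= m0."""
    return all(
        0 <= f(n, m) <= c * g(n, m)
        for n in range(n0, limit)
        for m in range(m0, limit)
    )

# Example: f(n, m) = n + m is in O(n * m) with c = 1 once n, m >= 2,
# since n*m - n - m = (n - 1)(m - 1) - 1 >= 0 when both are at least 2.
print(holds_on_grid(lambda n, m: n + m, lambda n, m: n * m, c=1, n0=2, m0=2))
```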