35.4 Randomization and linear programming

35.4-1

Show that even if we allow a clause to contain both a variable and its negation, randomly setting each variable to 1 with probability $1 / 2$ and to $0$ with probability $1 / 2$ still yields a randomized $8 / 7$-approximation algorithm.

(Omit!)
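
Although the analysis itself is omitted here, the procedure the exercise refers to is just the uniform random assignment below. This is a minimal Python sketch, not CLRS code; the clause representation (a list of nonzero integers, where $-i$ stands for $\neg x_i$) is our own choice. Note that a clause containing both a variable and its negation is satisfied by every assignment, which is exactly the case the exercise allows.

    import random

    def random_assignment(num_vars, clauses):
        # Set each variable to 1 with probability 1/2, independently, and
        # count how many clauses the resulting assignment satisfies.
        # A clause is a list of nonzero ints: literal i means x_i, -i means NOT x_i.
        x = {i: random.randint(0, 1) for i in range(1, num_vars + 1)}
        satisfied = sum(
            1 for clause in clauses
            if any((x[abs(lit)] == 1) == (lit > 0) for lit in clause)
        )
        return x, satisfied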

35.4-2

The MAX-CNF satisfiability problem is like the $\text{MAX-3-CNF}$ satisfiability problem, except that it does not restrict each clause to have exactly $3$ literals. Give a randomized $2$-approximation algorithm for the $\text{MAX-CNF}$ satisfiability problem.

(Omit!)

35.4-3

In the $\text{MAX-CUT}$ problem, we are given an unweighted undirected graph $G = (V, E)$. We define a cut $(S, V - S)$ as in Chapter 23 and the weight of a cut as the number of edges crossing the cut. The goal is to find a cut of maximum weight. Suppose that for each vertex $v$, we randomly and independently place $v$ in $S$ with probability $1 / 2$ and in $V - S$ with probability $1 / 2$. Show that this algorithm is a randomized $2$-approximation algorithm.

$\gdef\Ex{\mathbf{E}}$ $\gdef\Prob{\mathbf{P}}$ $\gdef\opt{\texttt{opt}}$

We first rewrite the algorithm for clarity.

APPROX-MAX-CUT(G)
    S = Ø
    for each vertex v in V
        flip a fair coin
        if heads
            add v to S
        else
            add v to V - S

This algorithm clearly runs in linear time. For each edge $(u, v) \in E$, define $A_{uv}$ to be the event that edge $(u, v)$ crosses the cut $(S, V - S)$, and let $1_{A_{uv}}$ be the indicator random variable for $A_{uv}$.

The event $A_{uv}$ occurs if and only if the vertices $u$ and $v$ are placed in different sets during the main loop of $\text{APPROX-MAX-CUT}$. Since the two placements are independent,

$$ \begin{aligned} \Prob \{A_{uv}\} &= \Prob \{u \in S \wedge v \in V - S\} + \Prob \{u \in V - S \wedge v \in S\} \\ &= \frac{1}{2}\cdot\frac{1}{2} + \frac{1}{2}\cdot\frac{1}{2} \\ &= \frac{1}{2}. \end{aligned} $$

Let $\opt$ denote the weight of a maximum cut in $G$, and let $c = |(S, V - S)|$ be the weight of the cut produced by $\text{APPROX-MAX-CUT}$. Clearly $c = \sum_{(u, v) \in E} 1_{A_{uv}}$. Also note that $\opt \le |E|$ (with equality if and only if $G$ is bipartite). Hence,

$$ \begin{aligned} \Ex[c] &= \Ex\left[\sum_{(u, v) \in E} 1_{A_{uv}}\right]\\ &= \sum_{(u, v)\in E}\Ex[1_{A_{uv}}]\\ &= \sum_{(u, v)\in E}\Prob\{A_{uv}\}\\ &= \frac{1}{2}|E|\\ &\ge \frac{1}{2}\opt \end{aligned} $$

Hence, $\frac{\opt}{\Ex[c]} \le \frac{|E|}{|E| / 2} = 2$, and so $\text{APPROX-MAX-CUT}$ is a randomized $2$-approximation algorithm.
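
To make the pseudocode concrete, here is a minimal Python sketch of the same procedure (the function name and the vertex/edge-list representation are our own, not CLRS's), together with a small empirical check that the average cut weight is about $|E| / 2$:

    import random

    def approx_max_cut(vertices, edges):
        # Place each vertex in S independently with probability 1/2,
        # then return S and the weight of the cut (S, V - S).
        S = {v for v in vertices if random.random() < 0.5}
        weight = sum(1 for (u, v) in edges if (u in S) != (v in S))
        return S, weight

    # Empirical check on K_4: |E| = 6, opt = 4, so E[c] = 3 >= opt / 2.
    V = range(4)
    E = [(u, v) for u in V for v in V if u < v]
    trials = 10000
    print(sum(approx_max_cut(V, E)[1] for _ in range(trials)) / trials)  # ~3.0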

35.4-4

Show that the constraints in line $\text{(35.19)}$ are redundant in the sense that if we remove them from the linear program in lines $\text{(35.17)}–\text{(35.20)}$, any optimal solution to the resulting linear program must satisfy $x(v) \le 1$ for each $v \in V$.

(Omit!)