Geometry of MAX-CUT


The equivalence between a normalized $n$-dimensional real vector $g$ with i.i.d. entries $g_i \sim N(0,1)$ and a point sampled uniformly from the unit sphere $S^{n-1}$ is an important result in probability theory, measure theory, geometry, and theoretical computer science.

Let $g \sim N(0, I_n)$.  Show that $g/||g|| \sim U(S^{n-1})$.  Prove that choosing a random hyperplane through the origin is equivalent to drawing $r \sim N(0, I_n)$ and cutting according to the sign of the projection.

Proof

Let $\vec{u}, \vec{v} \in \mathbb{R}^n$ be unit vectors, $||\vec{u}|| = ||\vec{v}|| = 1$, and let the orthogonal group $G = O(n)$ act on the set of such vectors, $S^{n-1}$. To establish rotational invariance on $S^{n-1}$, we must show that each $o \in O(n)$ is an inner product-preserving isometry, i.e., that $\vec{u} \cdot \vec{v} = o\vec{u} \cdot o\vec{v}$.

$(o\vec{u}) \cdot (o\vec{v}) = (o\vec{u})^T (o\vec{v}) = \vec{u}^T o^T o \vec{v} = \vec{u}^T \vec{v} = \vec{u} \cdot \vec{v}$

Note that $o^T o = I$: because the columns of $o \in O(n)$ are orthonormal, transformation by $o$ is an inner product-preserving isometry.
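This invariance can be checked numerically. The sketch below (my own illustration, not part of the proof) builds a random orthogonal matrix via the QR decomposition of a Gaussian matrix and verifies both $o^T o = I$ and preservation of the inner product:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# A random orthogonal matrix: the Q factor of a Gaussian matrix is in O(n).
o, _ = np.linalg.qr(rng.standard_normal((n, n)))

u = rng.standard_normal(n)
v = rng.standard_normal(n)

# o^T o = I, so transformation by o preserves inner products.
assert np.allclose(o.T @ o, np.eye(n))
assert np.isclose((o @ u) @ (o @ v), u @ v)
```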

Now, pick a random unit vector $\vec{u} \sim U(S^{n-1})$ and define the orthogonal hyperplane $H_{\vec{u}} = \{ \vec{x} \in \mathbb{R}^n \mid \langle \vec{x}, \vec{u} \rangle = 0 \}$. The hyperplane acts as a partition barrier: $\mathrm{sign}(\langle \vec{v}_i, \vec{u} \rangle)$ assigns each vector $\vec{v}_i$ to one side or the other.

Let $\vec{r} \sim N(0, I)$ and write its polar decomposition $\vec{r} = \rho \vec{w}$, where $\rho = ||\vec{r}|| \ge 0$ is the length and $\vec{w} = \vec{r} / ||\vec{r}|| \in S^{n-1}$ is the direction. By the rotational invariance proved above, $\vec{w} \sim U(S^{n-1})$, and $\vec{w}$ is independent of $\rho$.

$\mathrm{sign}(\langle \vec{v}_i, \vec{r} \rangle) = \mathrm{sign}(\rho \langle \vec{v}_i, \vec{w} \rangle) = \mathrm{sign}(\langle \vec{v}_i, \vec{w} \rangle)$

In the case of a tie, assign $\vec{v}_i$ to a partition uniformly at random. However, notice that $\Pr(\langle \vec{v}_i, \vec{r} \rangle = 0) = 0$, since $\langle \vec{v}_i, \vec{r} \rangle$ is a continuous (Gaussian) random variable. (I originally wrote the preceding '.' as a '!' out of excitement over this fact, but changed it due to the factorial ambiguity.)

Thus, choosing a random hyperplane through the origin is equivalent to drawing $\vec{r} \sim N(0, I_n)$ and cutting according to the sign of the projection. $\square$
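The two steps of the proof can be exercised in a few lines. This sketch (my own, for illustration) normalizes a Gaussian draw onto $S^{n-1}$ and confirms that the induced cut is unchanged by the normalization:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

r = rng.standard_normal(n)     # r ~ N(0, I_n)
rho = np.linalg.norm(r)        # length
w = r / rho                    # direction: a point on S^{n-1}

# w is a unit vector, so it lies on the sphere.
assert np.isclose(np.linalg.norm(w), 1.0)

# Scaling by rho > 0 never flips a sign: sign(<v_i, r>) == sign(<v_i, w>).
V = rng.standard_normal((10, n))   # ten test vectors v_i
assert np.array_equal(np.sign(V @ r), np.sign(V @ w))
```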

Remarks

Invariance on the unit $(n-1)$-sphere implies a lack of preference for any vector centered at the origin and pointing to the surface; applying any rotation or reflection to a Gaussian vector does not change its distribution. On a compact homogeneous space like $S^{n-1}$, the Haar measure is the only probability distribution that is invariant under all orthogonal transformations. The proof is a fun argument involving left cosets and is left as an exercise to the reader.

The proof above is a key step in the seminal Goemans-Williamson polynomial-time approximation algorithm for MAX-CUT. The original binary scalar domain $D = \{-1, 1\}$ is relaxed to unit vectors $\vec{v}_i \in S^{n-1}$, where $||\vec{v}_i|| = 1$ and $n$ is the number of nodes in the constraint graph. These vectors form a Gram matrix $Y$ with $Y_{ij} = \vec{v}_i \cdot \vec{v}_j$, subject to the constraints $Y_{ii} = 1$ and $Y \succeq 0$. This is solvable as a semidefinite program, and a Cholesky decomposition of $Y$ recovers the vector solutions to the relaxation. A random hyperplane normal is then drawn from $U(S^{n-1})$, and the vectors are partitioned according to which side of the hyperplane they fall on.
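The rounding step alone is easy to sketch. The snippet below (my own illustration, with made-up stand-in vectors rather than an actual SDP solution) applies the random-hyperplane cut to rows of a matrix $V$ whose rows play the role of the relaxation vectors $\vec{v}_i$:

```python
import numpy as np

def hyperplane_round(V, rng):
    """Round unit row vectors V (each row in S^{n-1}) to a cut.

    Draws r ~ N(0, I_n) and assigns node i to side sign(<v_i, r>),
    exactly as justified by the proof above.
    """
    r = rng.standard_normal(V.shape[1])
    return np.sign(V @ r)

# Toy stand-in for SDP output: random rows projected onto the sphere.
rng = np.random.default_rng(2)
V = rng.standard_normal((6, 6))
V /= np.linalg.norm(V, axis=1, keepdims=True)

cut = hyperplane_round(V, rng)   # entries in {-1, +1}: the two sides
print(cut)
```

In the full algorithm, $V$ would come from the Cholesky factor of the SDP solution $Y$; everything downstream of that is exactly this function.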
