# Randomized Algorithms (Fall 2011)/Expander Mixing Lemma

Given a ${\displaystyle d}$-regular graph ${\displaystyle G}$ on ${\displaystyle n}$ vertices with spectrum ${\displaystyle d=\lambda _{1}\geq \lambda _{2}\geq \cdots \geq \lambda _{n}}$, we denote by ${\displaystyle \lambda _{\max }=\max(|\lambda _{2}|,|\lambda _{n}|)\,}$ the largest absolute value of an eigenvalue other than ${\displaystyle \lambda _{1}=d}$. The value ${\displaystyle (d-\lambda _{\max })}$ is sometimes also referred to as the spectral gap, because it is the gap between the largest and the second largest absolute values of the eigenvalues.

The next lemma is the so-called expander mixing lemma, which states a fundamental fact about expander graphs.

**Lemma (expander mixing lemma)** Let ${\displaystyle G}$ be a ${\displaystyle d}$-regular graph with ${\displaystyle n}$ vertices. Then for all ${\displaystyle S,T\subseteq V}$, ${\displaystyle \left||E(S,T)|-{\frac {d|S||T|}{n}}\right|\leq \lambda _{\max }{\sqrt {|S||T|}}}$

The left-hand side measures the deviation between two quantities: one is ${\displaystyle |E(S,T)|}$, the number of edges between the two sets ${\displaystyle S}$ and ${\displaystyle T}$; the other is the expected number of edges between ${\displaystyle S}$ and ${\displaystyle T}$ in a random graph of edge density ${\displaystyle d/n}$, namely ${\displaystyle d|S||T|/n}$. A small ${\displaystyle \lambda _{\max }}$ (or a large spectral gap) implies that this deviation (or discrepancy, as it is sometimes called) is small, so the graph "looks random" everywhere even though it is deterministic.
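The statement can be checked numerically. The sketch below (an illustrative example, not from the notes) uses the cycle ${\displaystyle C_{n}}$, a ${\displaystyle 2}$-regular graph with adjacency eigenvalues ${\displaystyle 2\cos(2\pi k/n)}$; cycles are poor expanders (${\displaystyle \lambda _{\max }}$ is close to ${\displaystyle d}$), yet the bound still holds for every pair ${\displaystyle S,T}$.

```python
import numpy as np

# Build the adjacency matrix of the cycle C_n (2-regular).
n, d = 13, 2
A = np.zeros((n, n), dtype=int)
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

# Spectrum in descending order: eigs[0] = d; lambda_max is the
# largest absolute value among the remaining eigenvalues.
eigs = np.sort(np.linalg.eigvalsh(A))[::-1]
lam_max = max(abs(eigs[1]), abs(eigs[-1]))

rng = np.random.default_rng(0)
for _ in range(200):
    chi_S = (rng.random(n) < 0.5).astype(int)  # random characteristic vectors
    chi_T = (rng.random(n) < 0.5).astype(int)
    e_st = chi_S @ A @ chi_T                   # |E(S,T)|, counting ordered pairs
    expected = d * chi_S.sum() * chi_T.sum() / n
    assert abs(e_st - expected) <= lam_max * np.sqrt(chi_S.sum() * chi_T.sum()) + 1e-9
print("mixing bound verified on all sampled S, T")
```

Note that ${\displaystyle |E(S,T)|}$ here counts ordered pairs ${\displaystyle (i,j)}$ with ${\displaystyle i\in S,j\in T}$, matching the definition ${\displaystyle \sum _{i\in S}\sum _{j\in T}A(i,j)}$ used in the proof below.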

**Proof.**
Assume that ${\displaystyle A}$ is the adjacency matrix of ${\displaystyle G}$ and that ${\displaystyle v_{1},v_{2},\ldots ,v_{n}}$ is an orthonormal eigenbasis corresponding to ${\displaystyle \lambda _{1}\geq \lambda _{2}\geq \cdots \geq \lambda _{n}}$. Let ${\displaystyle \chi _{S}}$ and ${\displaystyle \chi _{T}}$ be the characteristic vectors of the vertex sets ${\displaystyle S}$ and ${\displaystyle T}$, i.e. ${\displaystyle \chi _{S}(i)={\begin{cases}1&{\text{if }}i\in S,\\0&{\text{otherwise.}}\end{cases}}}$ Then it is easy to verify that ${\displaystyle |E(S,T)|=\sum _{i\in S}\sum _{j\in T}A(i,j)=\sum _{i,j}\chi _{S}(i)A(i,j)\chi _{T}(j)=\chi _{S}^{T}A\chi _{T}}$. Expand ${\displaystyle \chi _{S}}$ and ${\displaystyle \chi _{T}}$ in the orthonormal eigenbasis ${\displaystyle v_{1},v_{2},\ldots ,v_{n}}$ as ${\displaystyle \chi _{S}=\sum _{i=1}^{n}\alpha _{i}v_{i}}$ and ${\displaystyle \chi _{T}=\sum _{i=1}^{n}\beta _{i}v_{i}}$. Then ${\displaystyle |E(S,T)|=\chi _{S}^{T}A\chi _{T}=\left(\sum _{i=1}^{n}\alpha _{i}v_{i}\right)^{T}A\left(\sum _{i=1}^{n}\beta _{i}v_{i}\right)=\left(\sum _{i=1}^{n}\alpha _{i}v_{i}\right)^{T}\left(\sum _{i=1}^{n}\lambda _{i}\beta _{i}v_{i}\right)=\sum _{i=1}^{n}\lambda _{i}\alpha _{i}\beta _{i},}$ where the last equation is due to the orthonormality of ${\displaystyle v_{1},\ldots ,v_{n}}$. Recall that ${\displaystyle \lambda _{1}=d}$ and ${\displaystyle v_{1}={\frac {\boldsymbol {1}}{\sqrt {n}}}=\left(1/{\sqrt {n}},\ldots ,1/{\sqrt {n}}\right)}$. We conclude that ${\displaystyle \alpha _{1}=\langle \chi _{S},{\frac {\boldsymbol {1}}{\sqrt {n}}}\rangle ={\frac {|S|}{\sqrt {n}}}}$ and ${\displaystyle \beta _{1}=\langle \chi _{T},{\frac {\boldsymbol {1}}{\sqrt {n}}}\rangle ={\frac {|T|}{\sqrt {n}}}}$, where ${\displaystyle \langle \cdot ,\cdot \rangle }$ denotes the inner product.
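The two identities used so far — ${\displaystyle |E(S,T)|=\chi _{S}^{T}A\chi _{T}=\sum _{i}\lambda _{i}\alpha _{i}\beta _{i}}$ and ${\displaystyle \alpha _{1}=|S|/{\sqrt {n}}}$ — can be verified numerically; here is a minimal sketch on the complete graph ${\displaystyle K_{4}}$ (an illustrative choice of a ${\displaystyle 3}$-regular graph):

```python
import numpy as np

# K_4 adjacency matrix: all ones off the diagonal (3-regular).
n = 4
A = np.ones((n, n), dtype=int) - np.eye(n, dtype=int)
lam, V = np.linalg.eigh(A)          # columns of V: an orthonormal eigenbasis

chi_S = np.array([1, 1, 0, 0])      # S = {0, 1}
chi_T = np.array([0, 1, 1, 1])      # T = {1, 2, 3}

alpha = V.T @ chi_S                 # expansion coefficients of chi_S
beta = V.T @ chi_T                  # expansion coefficients of chi_T

# |E(S,T)| = chi_S^T A chi_T = sum_i lambda_i alpha_i beta_i
assert np.isclose(chi_S @ A @ chi_T, np.sum(lam * alpha * beta))

# The coefficient along v_1 = 1/sqrt(n) is |S|/sqrt(n).
ones = np.ones(n) / np.sqrt(n)
assert np.isclose(chi_S @ ones, chi_S.sum() / np.sqrt(n))
```

Here ${\displaystyle |E(S,T)|=5}$: vertex ${\displaystyle 0}$ contributes the pairs ${\displaystyle (0,1),(0,2),(0,3)}$ and vertex ${\displaystyle 1}$ contributes ${\displaystyle (1,2),(1,3)}$.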
Therefore, ${\displaystyle |E(S,T)|=\sum _{i=1}^{n}\lambda _{i}\alpha _{i}\beta _{i}=d{\frac {|S||T|}{n}}+\sum _{i=2}^{n}\lambda _{i}\alpha _{i}\beta _{i}.}$ By the definition of ${\displaystyle \lambda _{\max }}$, ${\displaystyle \left||E(S,T)|-d{\frac {|S||T|}{n}}\right|=\left|\sum _{i=2}^{n}\lambda _{i}\alpha _{i}\beta _{i}\right|\leq \sum _{i=2}^{n}\left|\lambda _{i}\alpha _{i}\beta _{i}\right|\leq \lambda _{\max }\sum _{i=2}^{n}|\alpha _{i}||\beta _{i}|.}$ By the Cauchy-Schwarz inequality, ${\displaystyle |\alpha _{1}||\beta _{1}|+|\alpha _{2}||\beta _{2}|+\cdots +|\alpha _{n}||\beta _{n}|\leq {\sqrt {\alpha _{1}^{2}+\alpha _{2}^{2}+\cdots +\alpha _{n}^{2}}}{\sqrt {\beta _{1}^{2}+\beta _{2}^{2}+\cdots +\beta _{n}^{2}}}=\|\alpha \|_{2}\|\beta \|_{2}.}$ Moreover, since the eigenbasis is orthonormal, the expansion coefficients preserve the 2-norm (Parseval's identity), so ${\displaystyle \|\alpha \|_{2}=\|\chi _{S}\|_{2}={\sqrt {|S|}}}$ and ${\displaystyle \|\beta \|_{2}=\|\chi _{T}\|_{2}={\sqrt {|T|}}}$. We conclude that ${\displaystyle \left||E(S,T)|-d{\frac {|S||T|}{n}}\right|\leq \lambda _{\max }\|\alpha \|_{2}\|\beta \|_{2}=\lambda _{\max }{\sqrt {|S||T|}}.}$
${\displaystyle \square }$
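The two final steps — Cauchy-Schwarz, and the fact that an orthonormal change of basis preserves the 2-norm so that ${\displaystyle \|\alpha \|_{2}=\|\chi _{S}\|_{2}={\sqrt {|S|}}}$ — can be sanity-checked in isolation. In this sketch a random orthonormal basis stands in for the eigenbasis (an illustrative assumption; any orthonormal basis works for these two facts):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
# A random orthonormal basis via QR decomposition (columns of Q).
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
chi_S = (rng.random(n) < 0.5).astype(float)
chi_T = (rng.random(n) < 0.5).astype(float)
alpha, beta = Q.T @ chi_S, Q.T @ chi_T

# Parseval: the coefficient vector has the same 2-norm as chi_S,
# and ||chi_S||_2 = sqrt(|S|) since chi_S has |S| entries equal to 1.
assert np.isclose(np.linalg.norm(alpha), np.sqrt(chi_S.sum()))

# Cauchy-Schwarz applied to the vectors (|alpha_i|) and (|beta_i|).
assert np.sum(np.abs(alpha) * np.abs(beta)) <= \
       np.linalg.norm(alpha) * np.linalg.norm(beta) + 1e-12
```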