Randomized Algorithms (Fall 2011)/Expander Mixing Lemma

Given a [math]\displaystyle{ d }[/math]-regular graph [math]\displaystyle{ G }[/math] on [math]\displaystyle{ n }[/math] vertices with spectrum [math]\displaystyle{ d=\lambda_1\ge\lambda_2\ge\cdots\ge\lambda_n }[/math], we denote [math]\displaystyle{ \lambda_\max = \max(|\lambda_2|,|\lambda_n|)\, }[/math], the largest absolute value of an eigenvalue other than [math]\displaystyle{ \lambda_1=d }[/math]. The value [math]\displaystyle{ (d-\lambda_\max) }[/math] is sometimes also referred to as the spectral gap, because it is the gap between the largest and the second largest absolute values of the eigenvalues.
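
For a concrete, purely illustrative example (not part of the original notes), the sketch below computes this spectrum numerically with NumPy for the Petersen graph, a [math]\displaystyle{ 3 }[/math]-regular graph on [math]\displaystyle{ 10 }[/math] vertices; the choice of graph and all names in the code are assumptions made for the illustration.

```python
import numpy as np
from itertools import combinations

# Petersen graph, realized as the Kneser graph K(5,2): vertices are the
# 2-element subsets of {0,...,4}, with an edge between disjoint subsets.
# It is 3-regular on n = 10 vertices.
vertices = list(combinations(range(5), 2))
n, d = len(vertices), 3
A = np.array([[1.0 if set(u).isdisjoint(v) else 0.0 for v in vertices]
              for u in vertices])

# Spectrum d = lambda_1 >= lambda_2 >= ... >= lambda_n of the adjacency matrix.
eigs = np.sort(np.linalg.eigvalsh(A))[::-1]
lambda_max = max(abs(eigs[1]), abs(eigs[-1]))
print("spectrum:", np.round(eigs, 6))                   # 3, 1 (x5), -2 (x4)
print("lambda_max =", lambda_max)                       # 2.0
print("spectral gap d - lambda_max =", d - lambda_max)  # 1.0
```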

The next lemma is the so-called expander mixing lemma, which states a fundamental fact about expander graphs.

Lemma (expander mixing lemma)
Let [math]\displaystyle{ G }[/math] be a [math]\displaystyle{ d }[/math]-regular graph with [math]\displaystyle{ n }[/math] vertices. Then for all [math]\displaystyle{ S, T \subseteq V }[/math],
[math]\displaystyle{ \left||E(S,T)|-\frac{d|S||T|}{n}\right|\le\lambda_\max\sqrt{|S||T|} }[/math]

The left-hand side measures the deviation between two quantities: one is [math]\displaystyle{ |E(S,T)| }[/math], the number of edges between the two sets [math]\displaystyle{ S }[/math] and [math]\displaystyle{ T }[/math]; the other is [math]\displaystyle{ d|S||T|/n }[/math], the expected number of edges between [math]\displaystyle{ S }[/math] and [math]\displaystyle{ T }[/math] in a random graph of edge density [math]\displaystyle{ d/n }[/math]. A small [math]\displaystyle{ \lambda_\max }[/math] (equivalently, a large spectral gap) implies that this deviation (sometimes called the discrepancy) is small, so the graph "looks random" everywhere, even though it is a deterministic object.
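
As an illustrative numerical check (again an assumption-laden sketch, not part of the original notes), the code below verifies the inequality on the Petersen graph for a few randomly chosen pairs of subsets [math]\displaystyle{ S,T }[/math]. Here [math]\displaystyle{ |E(S,T)| }[/math] is taken to be [math]\displaystyle{ \chi_S^T A \chi_T }[/math], i.e. the number of ordered pairs with one endpoint in [math]\displaystyle{ S }[/math] and the other in [math]\displaystyle{ T }[/math] joined by an edge, which is one common convention consistent with the bound.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Same Petersen graph as above: 3-regular, n = 10, lambda_max = 2.
vertices = list(combinations(range(5), 2))
n, d = len(vertices), 3
A = np.array([[1.0 if set(u).isdisjoint(v) else 0.0 for v in vertices]
              for u in vertices])
eigs = np.sort(np.linalg.eigvalsh(A))[::-1]
lambda_max = max(abs(eigs[1]), abs(eigs[-1]))

# Interpret |E(S,T)| as chi_S^T A chi_T: ordered pairs (u, v) with u in S,
# v in T and uv an edge (edges inside the intersection counted twice).
# Compare the deviation from d|S||T|/n against lambda_max * sqrt(|S||T|).
for _ in range(5):
    S = rng.random(n) < 0.5          # random subset, encoded as a 0/1 vector
    T = rng.random(n) < 0.5
    e_st = S.astype(float) @ A @ T.astype(float)
    deviation = abs(e_st - d * S.sum() * T.sum() / n)
    bound = lambda_max * np.sqrt(S.sum() * T.sum())
    print(f"|S|={S.sum()}, |T|={T.sum()}, |E(S,T)|={e_st:.0f}, "
          f"deviation={deviation:.3f} <= bound={bound:.3f}")
    assert deviation <= bound + 1e-9
```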