Advanced Algorithms (Fall 2023)/Problem Set 2

  • Each problem in this assignment is worth 10 points; the full score is 80 points.
  • Every solution must include the complete reasoning; answers may be written in either Chinese or English.
  • We recommend typesetting your solutions with LaTeX, Markdown, or similar tools.

Problem 1 (Adjacency matrix)

Let [math]\displaystyle{ A }[/math] be the adjacency matrix of an undirected connected graph [math]\displaystyle{ G }[/math], and [math]\displaystyle{ \alpha_1 }[/math] be its largest eigenvalue.

  • [Lowerbounding [math]\displaystyle{ \alpha_1 }[/math]] We proved [math]\displaystyle{ \alpha_1 \le d_{\mathrm{max}} }[/math] in class. Show that [math]\displaystyle{ \alpha_1 \ge d_{\mathrm{avg}} }[/math] where [math]\displaystyle{ d_{\mathrm{avg}}:=\frac{2|E|}{|V|} }[/math] is the average degree of the graph.
  • [Monotonicity of the spectrum] Let [math]\displaystyle{ A' }[/math] be the adjacency matrix of any subgraph of [math]\displaystyle{ G }[/math] produced by deleting vertices, and let [math]\displaystyle{ \alpha_1' }[/math] be the largest eigenvalue of [math]\displaystyle{ A' }[/math]. Show that [math]\displaystyle{ \alpha'_1\leq \alpha_1 }[/math].
  • [Coloring number] The chromatic number [math]\displaystyle{ \chi(G) }[/math] of a graph is the smallest number of colors needed to color the vertices of [math]\displaystyle{ G }[/math] so that no two adjacent vertices share the same color. Prove that [math]\displaystyle{ \chi(G) \le \lfloor \alpha_1 \rfloor +1 }[/math].
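For intuition (this is not part of the problem), the bounds above can be checked numerically on a small example. The following is a minimal Python sketch assuming numpy is available; the 5-vertex graph is an arbitrary choice.

import numpy as np

# Adjacency matrix of an arbitrary 5-vertex graph:
# a 4-cycle 0-1-2-3-0 with a pendant vertex 4 attached to vertex 0.
A = np.array([
    [0, 1, 0, 1, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 0],
    [1, 0, 1, 0, 0],
    [1, 0, 0, 0, 0],
], dtype=float)

deg = A.sum(axis=1)
alpha_1 = np.linalg.eigvalsh(A)[-1]            # largest eigenvalue of A
print(deg.mean(), alpha_1, deg.max())          # expect d_avg <= alpha_1 <= d_max

# Monotonicity: delete vertex 4 and recompute the largest eigenvalue.
A_sub = np.delete(np.delete(A, 4, axis=0), 4, axis=1)
print(np.linalg.eigvalsh(A_sub)[-1], alpha_1)  # expect alpha_1' <= alpha_1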

Problem 2 (Graph Laplacian)

  • [Spectrum of special graphs] Find the eigenvalues of the Laplacian matrices of the following graphs:
    1. The complete graph with [math]\displaystyle{ n }[/math] vertices.
    2. The star graph with [math]\displaystyle{ n }[/math] vertices.
  • [Number of connected components] Let [math]\displaystyle{ G }[/math] be an undirected graph, and [math]\displaystyle{ \lambda_1\le \lambda_2 \le \ldots \le \lambda_n }[/math] be the eigenvalues of its normalized Laplacian matrix [math]\displaystyle{ \mathcal{L} }[/math]. Prove that [math]\displaystyle{ \lambda_k = 0 }[/math] if and only if [math]\displaystyle{ G }[/math] has at least [math]\displaystyle{ k }[/math] connected components.
  • [Lowerbounding [math]\displaystyle{ \lambda_2 }[/math]] Let [math]\displaystyle{ G }[/math] be an undirected graph whose Laplacian is [math]\displaystyle{ L }[/math], with second-smallest eigenvalue [math]\displaystyle{ \lambda_2 }[/math]. We know that if [math]\displaystyle{ G }[/math] is connected then [math]\displaystyle{ \lambda_2\gt 0 }[/math]. Prove that [math]\displaystyle{ \lambda_2 \geq 1/O(rn) \geq 1/O(n^2) }[/math] by analyzing the Rayleigh quotient on all test vectors [math]\displaystyle{ x\perp \mathbf{1} }[/math]. Here, [math]\displaystyle{ r }[/math] is the *diameter* of the graph (i.e. the maximum shortest-path distance between pairs of vertices in the graph). Further, show that when [math]\displaystyle{ G }[/math] is simple and [math]\displaystyle{ d }[/math]-regular, we have [math]\displaystyle{ \lambda_2 \geq d/O(n^2) }[/math].
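Again for intuition (not part of the problem), the spectra in the first part and the [math]\displaystyle{ \lambda_2 }[/math] bound can be inspected numerically before proving anything. The Python sketch below assumes numpy and uses the unnormalized Laplacian [math]\displaystyle{ L = D - A }[/math]; the size n = 6 is arbitrary.

import numpy as np

def laplacian(A):
    # Unnormalized Laplacian L = D - A of an undirected graph.
    return np.diag(A.sum(axis=1)) - A

n = 6  # arbitrary small size for the check

# Complete graph K_n: every off-diagonal entry is 1.
A_complete = np.ones((n, n)) - np.eye(n)
print(np.round(np.linalg.eigvalsh(laplacian(A_complete)), 6))  # compare with your hand computation

# Star graph on n vertices: vertex 0 is joined to every other vertex.
A_star = np.zeros((n, n))
A_star[0, 1:] = 1
A_star[1:, 0] = 1
lam = np.linalg.eigvalsh(laplacian(A_star))
print(np.round(lam, 6))             # compare with your hand computation
print(lam[1], 1.0 / (2 * n))        # lambda_2 of the star vs 1/(r*n), where the diameter is r = 2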

Problem 3 (Cheeger's inequality)

Cheeger's inequality for graphs provides a relationship between the conductance of a graph and the second smallest eigenvalue of its normalized Laplacian matrix. In this problem, you will discover a Cheeger-type inequality for the largest eigenvalue [math]\displaystyle{ \lambda_n }[/math].

(i) Prove: Let [math]\displaystyle{ G = (V, E) }[/math] be an undirected graph and [math]\displaystyle{ \lambda_n }[/math] be the largest eigenvalue of its normalized Laplacian [math]\displaystyle{ \mathcal L(G) }[/math]. Then

   [math]\displaystyle{ 2 - \lambda_n = \min_{0\neq x \in \mathbb R^n} \frac{\sum _{ij \in E} (x_i + x_j)^2}{\sum _{i \in V} d_i\cdot x_i^2}.  }[/math]
 Let [math]\displaystyle{ S \subseteq V(G) }[/math] be a bipartite component with partition [math]\displaystyle{ S = (L, R) }[/math] such that all the edges in [math]\displaystyle{ S }[/math] are between [math]\displaystyle{ L }[/math] and [math]\displaystyle{ R }[/math]. Then the vector [math]\displaystyle{ \boldsymbol x \in \{-1, 0, 1\}^n }[/math] is a solution to the above optimization problem with objective value [math]\displaystyle{ 0 }[/math], where
   [math]\displaystyle{ 
    \boldsymbol x_i = \begin{cases}+1 & \text { if } i \in L \\ -1 & \text { if } i \in R \\ 0 & \text { otherwise }\end{cases}
     }[/math]
   Using this association between a vector [math]\displaystyle{ \boldsymbol x \in \{-1, 0, 1\}^n }[/math] and a bipartition of a subset [math]\displaystyle{ S = (L, R) }[/math], we consider the following definition of the *bipartiteness ratio* [math]\displaystyle{ \beta(x) }[/math] of [math]\displaystyle{ S = (L, R) }[/math], where each edge within [math]\displaystyle{ L }[/math] and each edge within [math]\displaystyle{ R }[/math] contributes [math]\displaystyle{ 2 }[/math] to the numerator, while an edge in [math]\displaystyle{ \delta(S) }[/math] contributes [math]\displaystyle{ 1 }[/math].
   [math]\displaystyle{ 
    \beta(x) := \frac{\sum _{ij \in E} |x_i + x_j|}{\sum _{i \in V} d_i|x_i|}.
     }[/math]
   The bipartiteness ratio of a graph [math]\displaystyle{ G }[/math] is defined as [math]\displaystyle{ \beta(G):= \min_{x\in\{-1,0,1\}^n,\,x\neq \boldsymbol 0} \beta(x) }[/math].
   (ii) Show the following analog of Cheeger's inequality for [math]\displaystyle{ 2 - \lambda_n }[/math] and [math]\displaystyle{ \beta(G) }[/math]:
   [math]\displaystyle{ 
    \frac{1}{2}(2 - \lambda_n) \le \beta(G) \le \sqrt{2(2 - \lambda_n)}.
     }[/math]
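As a numerical sanity check of this inequality (not part of the problem), the Python sketch below, assuming numpy is available, computes [math]\displaystyle{ 2 - \lambda_n }[/math] from the normalized Laplacian of an arbitrary 5-cycle and finds [math]\displaystyle{ \beta(G) }[/math] by brute force over all [math]\displaystyle{ x \in \{-1,0,1\}^n\setminus\{\boldsymbol 0\} }[/math].

import itertools
import numpy as np

# Arbitrary example: the 5-cycle (odd, hence not bipartite).
n = 5
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
deg = A.sum(axis=1)
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if A[i, j]]

# Normalized Laplacian  L = I - D^{-1/2} A D^{-1/2}  and its largest eigenvalue.
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
lam_n = np.linalg.eigvalsh(np.eye(n) - D_inv_sqrt @ A @ D_inv_sqrt)[-1]

def beta(x):
    # Bipartiteness ratio of a vector x in {-1,0,1}^n, x not identically zero.
    num = sum(abs(x[i] + x[j]) for i, j in edges)
    den = sum(deg[i] * abs(x[i]) for i in range(n))
    return num / den

beta_G = min(beta(x) for x in itertools.product([-1, 0, 1], repeat=n) if any(x))
# The three printed values should satisfy: first <= second <= third.
print((2 - lam_n) / 2, beta_G, (2 * (2 - lam_n)) ** 0.5)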

Problem 4 (Random walk on graph)

Problem 5 (Hitting time, commute time)

Problem 6 (Electrical network)

Problem 7 (Expander mixing lemma)

Problem 8 (MCMC and coupling)

In this problem, you will analyze the MCMC sampler over independent sets of fixed size. Given a graph [math]\displaystyle{ G=(V,E) }[/math] with maximum degree [math]\displaystyle{ \Delta }[/math], let [math]\displaystyle{ I_k }[/math] be the set of all independent sets in [math]\displaystyle{ G }[/math] of size [math]\displaystyle{ k }[/math]. Consider a random walk [math]\displaystyle{ \{X_i\}_{i\geq 0} }[/math] on [math]\displaystyle{ I_k }[/math] defined by the following process:

  • choose a vertex [math]\displaystyle{ v\in X_t }[/math] uniformly at random and a vertex [math]\displaystyle{ w\in V }[/math] uniformly at random
  • if [math]\displaystyle{ w\notin X_t }[/math] and [math]\displaystyle{ X_t - \{v\} + \{w\} }[/math] is an independent set, set [math]\displaystyle{ X_{t+1} = X_t - \{v\} + \{w\} }[/math]
  • otherwise, set [math]\displaystyle{ X_{t+1} = X_t }[/math]
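For concreteness (not part of the questions below), the following minimal Python sketch implements a single transition of this chain; representing [math]\displaystyle{ G }[/math] as a dictionary mapping each vertex to its neighbour set is an assumption made only for this illustration.

import random

def step(G, X_t):
    # One transition of the chain: G maps each vertex to its neighbour set,
    # X_t is the current independent set of size k (a frozenset of vertices).
    v = random.choice(sorted(X_t))    # v uniform in X_t
    w = random.choice(sorted(G))      # w uniform in V
    if w not in X_t:
        candidate = (X_t - {v}) | {w}
        # X_t was already independent, so it suffices to check w against the rest.
        if all(u not in G[w] for u in candidate if u != w):
            return candidate
    return X_t

# Toy run on the path 0-1-2-3-4-5 with k = 2 (arbitrary example).
G = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 5}, 5: {4}}
X = frozenset({0, 2})
for _ in range(20):
    X = step(G, X)
print(X)   # always an independent set of size 2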

(1) Show that the Markov chain [math]\displaystyle{ \{X_i\}_{i\geq 0} }[/math] is irreducible and aperiodic.

(2) Show that the stationary distribution of [math]\displaystyle{ \{X_i\}_{i\geq 0} }[/math] is the uniform distribution over [math]\displaystyle{ I_k }[/math].

(3) Use a coupling argument to show that if [math]\displaystyle{ k\leq \frac{n}{3\Delta + 3} }[/math], then the [math]\displaystyle{ \epsilon }[/math]-mixing time of [math]\displaystyle{ \{X_i\}_{i\geq 0} }[/math] is polynomial in [math]\displaystyle{ n }[/math] and [math]\displaystyle{ \log(1/\epsilon) }[/math].