Notes of Prasad Tetali’s lecture

Many sparse cuts via higher eigenvalues

Joint with Anand Louis, Santosh Vempala and Prasad Raghavendra (moving to Berkeley).

Topic: relating higher eigenvalues of the Laplacian to isoperimetric constants.

1. Cheeger inequality

The normalized Laplacian is {D^{-1/2}(D-A)D^{-1/2}}, where {A} is the adjacency matrix of the graph and {D} is the diagonal degree matrix. The first eigenvalue is {\lambda_1=0}; let {\lambda_2} be the second one.
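A minimal sketch (my example, not from the lecture): building the normalized Laplacian of a 4-cycle and checking that the first eigenvalue is 0.

```python
import numpy as np

# Normalized Laplacian D^{-1/2}(D - A)D^{-1/2} of the 4-cycle C_4.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)  # adjacency matrix of C_4
D = np.diag(A.sum(axis=1))                 # diagonal degree matrix
D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(D)))
L = D_inv_sqrt @ (D - A) @ D_inv_sqrt      # normalized Laplacian
eigvals = np.sort(np.linalg.eigvalsh(L))
print(eigvals)  # lambda_1 = 0; the next entry is lambda_2
```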

Cheeger’s inequality deals with Riemannian manifolds. Here is its translation for graphs. Given a subset {S} of vertices, let {E(S,\bar{S})} denote the set of edges between {S} and its complement {\bar{S}}. Set

\displaystyle  \begin{array}{rcl}  \Phi(S)=\frac{|E(S,\bar{S})|}{d|S|}. \end{array}

and

\displaystyle  \begin{array}{rcl}  \Phi_G =\min\{\Phi(S)\,;\, S\subset V,\,|S|\leq\frac{1}{2}|V|\}. \end{array}
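These definitions can be computed directly by brute force on a tiny graph; a hedged sketch (exponential in {|V|}, so only illustrative), using the 6-cycle with {d=2}:

```python
import itertools
import numpy as np

# Brute-force Phi_G of a d-regular graph: minimize Phi(S) = |E(S, S-bar)| / (d|S|)
# over all subsets S with 1 <= |S| <= |V|/2. Example: the 6-cycle C_6, d = 2.
n, d = 6, 2
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

def phi(S, A, d):
    n = len(A)
    cut = sum(A[u, v] for u in S for v in range(n) if v not in S)  # |E(S, S-bar)|
    return cut / (d * len(S))

phi_G = min(phi(S, A, d)
            for r in range(1, n // 2 + 1)
            for S in itertools.combinations(range(n), r))
print(phi_G)  # best cut is an arc of 3 vertices: 2 / (2 * 3) = 1/3
```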

There is a weighted variant.

Theorem 1 (Cheeger’s inequality, due to Alon, Alon-Milman, Jerrum-Sinclair) Let {G} be a {d}-regular graph.

\displaystyle  \begin{array}{rcl}  \frac{1}{2}\Phi_G^2 \leq \lambda_2 \leq 2\Phi_G . \end{array}

Proof: Use min-max characterization of eigenvalues. For reverse inequality, insert Cauchy-Schwarz.
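Both sides of the inequality can be checked numerically; a sketch on the 6-cycle (my example, assuming the {d}-regular normalized Laplacian {I - A/d}):

```python
import itertools
import numpy as np

# Check (1/2) Phi_G^2 <= lambda_2 <= 2 Phi_G on the 6-cycle C_6 (d = 2).
n, d = 6, 2
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
L = np.eye(n) - A / d                      # normalized Laplacian of a d-regular graph
lam2 = np.sort(np.linalg.eigvalsh(L))[1]   # second eigenvalue

def phi(S):
    cut = sum(A[u, v] for u in S for v in range(n) if v not in S)
    return cut / (d * len(S))

phi_G = min(phi(S) for r in range(1, n // 2 + 1)
            for S in itertools.combinations(range(n), r))
assert 0.5 * phi_G**2 <= lam2 <= 2 * phi_G  # Cheeger's inequality
print(lam2, phi_G)  # lambda_2 = 0.5, Phi_G = 1/3
```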

2. Higher eigenvalues

Motivated by Small Set Expansion, a problem introduced by Raghavendra and Steurer in connection with the Unique Games Conjecture.

Unclear a priori what will replace edge expansion.

Theorem 2 Easy: For every {k} disjoint subsets of vertices, one of them has edge expansion {\geq \frac{1}{2}\lambda_k}.

Difficult: There exist {\Omega(k)} disjoint subsets with size {\leq \frac{1}{2}|V|} which all have edge expansion {\leq O(\sqrt{\lambda_k \log k})}.

Open: Can one replace {\Omega(k)} with exactly {k}?

Tightness: the {\log k} term cannot be avoided (look at the noisy hypercube).

Proof is cumbersome but algorithm is nice. James Lee, S. Oveis Gharan and L. Trevisan have an alternate treatment (STOC’12).

2.1. Algorithm

Case {k=2}. Let {X=(x_1,\ldots,x_n)} be an eigenvector for {\lambda_2}. Sort vertices according to increasing {x_i}. Stop when …
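Presumably the stopping rule elided above is the standard sweep cut: among all prefixes of the sorted order with {|S|\leq n/2}, output the one minimizing {\Phi(S)}. A hedged sketch (my reconstruction, not verbatim from the lecture):

```python
import numpy as np

# Sweep cut for k = 2: sort vertices by their eigenvector coordinate x_i,
# then return the prefix S (|S| <= n/2) minimizing Phi(S) = cut / (d|S|).
def sweep_cut(A, d, x):
    n = len(x)
    order = np.argsort(x)                  # vertices by increasing x_i
    in_S = np.zeros(n, dtype=bool)
    cut = 0.0
    best_phi, best_S = float("inf"), None
    for t in range(n // 2):
        v = order[t]
        # v's edges into S become internal; its remaining edges cross the cut
        cut += A[v].sum() - 2 * A[v, in_S].sum()
        in_S[v] = True
        p = cut / (d * (t + 1))            # Phi of the current prefix
        if p < best_phi:
            best_phi, best_S = p, order[:t + 1].copy()
    return best_phi, best_S

# Tiny check on the 6-cycle with an explicit lambda_2 eigenvector.
n, d = 6, 2
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
x = np.cos(2 * np.pi * np.arange(n) / n)   # eigenvector of C_6 for lambda_2
phi_val, S = sweep_cut(A, d, x)
print(phi_val, sorted(S.tolist()))         # 1/3 on the arc {2, 3, 4}
```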

General case. Let {X_1,\dots,X_k} be orthonormal eigenvectors. This gives a map from {V} to {{\mathbb R}^k}, sending vertex {i} to the row {V_i}. Pick {k} random Gaussian vectors {g_1,\ldots,g_k} in {{\mathbb R}^k}. For each vertex {i},

– Project {V_i} along Gaussian directions {g_1,\ldots,g_k}.

– Pick {j} which maximizes {\langle V_i,g_j\rangle}.

– Move (project down) the vector {V_i} to the direction {g_j}.

Run Cheeger rounding along each direction {g_j}, and output the cuts of expansion {\leq O(\sqrt{\lambda_k \log k})}.
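The partitioning step above can be sketched as follows (names like `gaussian_partition` are mine, and the random orthonormal matrix stands in for true eigenvectors; the per-direction Cheeger rounding is omitted):

```python
import numpy as np

# Embed vertex i as row V_i of the n x k eigenvector matrix X, draw k
# Gaussian directions g_1, ..., g_k, assign i to the j maximizing <V_i, g_j>,
# and record V_i's coordinate along that direction. Cheeger rounding would
# then be run separately on each direction's coordinates.
def gaussian_partition(X, rng):
    n, k = X.shape
    g = rng.standard_normal((k, k))        # k random Gaussian vectors in R^k
    proj = X @ g.T                         # proj[i, j] = <V_i, g_j>
    labels = np.argmax(proj, axis=1)       # j maximizing <V_i, g_j>
    coords = proj[np.arange(n), labels]    # V_i "projected down" onto g_j
    return labels, coords

# Usage on a random orthonormal embedding (stand-in for true eigenvectors).
rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((10, 3)))
labels, coords = gaussian_partition(X, rng)
print(labels.shape, coords.shape)
```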

2.2. Questions from the audience

If one knows that the eigenvalues have high multiplicity, can one improve on {\log k}? Answer: I don’t know.


About metric2011

metric2011 is a program of Centre Emile Borel, an activity of Institut Henri Poincaré, 11 rue Pierre et Marie Curie, 75005 Paris, France. See http://www.math.ens.fr/metric2011/
