Notes on James Lee’s lecture no. 4

 

1. Embedding finite metric spaces into Hilbert spaces

 

Definition 1 Let {f:X\rightarrow Y} be a map between metric spaces. The Lipschitz norm of {f} is

\displaystyle  \begin{array}{rcl}  \|f\|_{Lip}=\sup_{x\not=y\in X}\frac{d(f(x),f(y))}{d(x,y)}, \end{array}

and the distortion of {f} is {dist(f)=\|f\|_{Lip}\|f^{-1}\|_{Lip}}. The {L_2}-distortion of {X} is

\displaystyle  \begin{array}{rcl}  c_2 (X)=\inf\{dist(f)\,;\, f:X\rightarrow L_2 \}. \end{array}
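
For example (a standard computation, not from the lecture): the {4}-cycle {C_4} with its graph metric has {c_2 (C_4 )=\sqrt{2}}. Placing the four vertices at the corners of a unit square in the plane preserves the edges and contracts the diagonals from {2} to {\sqrt{2}}, and the short diagonals lemma quoted in section 2.3 below shows that no map into Hilbert space does better.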

We are interested in quantitative aspects in terms of {|X|}. We shall see that this also involves nice qualitative facts.

Theorem 2 (Bourgain 1985) For {X} finite, {c_2 (X)\leq O(\log|X|)}.
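
As an aside, {c_2 (X)} of a finite metric space can be computed exactly by a semidefinite program over Gram matrices (an observation going back to Linial, London and Rabinovich, whose theorem appears below). Here is a minimal sketch, assuming the cvxpy package with an SDP-capable solver installed; the function name and setup are ours:

```python
import cvxpy as cp
import numpy as np

def c2(D):
    """Least distortion of an embedding of a finite metric space into L_2.

    D : n x n symmetric numpy array of pairwise distances.
    A map into L_2 is encoded by the Gram matrix Q of the image points;
    Q is feasible iff it is PSD, and the map is non-contracting with
    expansion at most C iff
        d(i,j)^2 <= Q_ii + Q_jj - 2 Q_ij <= C^2 d(i,j)^2.
    Minimizing C^2 over this SDP gives c_2(X)^2.
    """
    n = D.shape[0]
    Q = cp.Variable((n, n), PSD=True)
    C2 = cp.Variable(nonneg=True)          # stands for C^2
    constraints = []
    for i in range(n):
        for j in range(i + 1, n):
            dij2 = float(D[i, j]) ** 2
            squared_image_distance = Q[i, i] + Q[j, j] - 2 * Q[i, j]
            constraints += [squared_image_distance >= dij2,
                            squared_image_distance <= C2 * dij2]
    cp.Problem(cp.Minimize(C2), constraints).solve()
    return float(np.sqrt(C2.value))

# sanity check: the 4-cycle, whose c_2 equals sqrt(2) ~ 1.414
D_C4 = np.array([[0, 1, 2, 1],
                 [1, 0, 1, 2],
                 [2, 1, 0, 1],
                 [1, 2, 1, 0]], dtype=float)
print(c2(D_C4))
```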

 

2. Obstructions to embeddability

 

2.1. Observable diameter

 

Let {X} be a metric measure space (this generalizes finite metric spaces, equipped with the counting measure). Given a map {f:X\rightarrow{\mathbb R}}, the essential diameter of {f(X)} is the minimum diameter of {f(A)} over subsets {A\subset X} carrying at least 90% of the measure.

Definition 3 (Gromov)

\displaystyle  \begin{array}{rcl}  Obsdiam(X,{\mathbb R})=\max_{f:X\rightarrow{\mathbb R}, \, 1 \mathrm{-Lipschitz}}\textrm{essential diameter}(f). \end{array}
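
For a finite space with the counting measure, the essential diameter of a real-valued map is easy to compute, since an optimal 90%-subset can be taken to consist of consecutive values in sorted order. A minimal Python sketch (the function name is ours):

```python
import math

def essential_diameter(values, fraction=0.9):
    """Minimum diameter of f(A) over subsets A carrying at least a
    `fraction` of the counting measure, for f given by its list of values."""
    v = sorted(values)
    n = len(v)
    k = math.ceil(fraction * n)   # size of the retained subset
    # an optimal subset is contiguous in sorted order: slide a window of size k
    return min(v[i + k - 1] - v[i] for i in range(n - k + 1))
```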

 

This notion enters the following spectral considerations. Let {G} be a {d}-regular graph on {n} vertices, with vertex set {V}. The Laplacian is {L_G =dI-A}, where {A} is the adjacency matrix. The second eigenvalue is

\displaystyle  \begin{array}{rcl}  \lambda_2 =\min_{f:V\rightarrow{\mathbb R}\textrm{ non-constant}}\frac{f\cdot L_G f}{\|f-E_v (f)\|^2}. \end{array}
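
A minimal numerical sketch (names ours), using the fact that {\lambda_2} is the second smallest eigenvalue of {L_G}:

```python
import numpy as np

def second_eigenvalue(adjacency):
    """lambda_2 of the Laplacian L_G = d*I - A of a d-regular graph.

    adjacency : symmetric 0/1 numpy array of shape (n, n).
    """
    A = np.asarray(adjacency, dtype=float)
    d = int(A[0].sum())                    # common degree of a regular graph
    L = d * np.eye(A.shape[0]) - A
    eigenvalues = np.linalg.eigvalsh(L)    # ascending; the first one is 0
    return eigenvalues[1]

# tiny check on K_4 (3-regular): L_G = 4I - J has eigenvalues 0, 4, 4, 4
K4 = np.ones((4, 4)) - np.eye(4)
print(second_eigenvalue(K4))               # 4.0
```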

Theorem 4 (Pinsker) There exists an expanding family, i.e. an infinite family {\mathcal{F}} of {3}-regular graphs such that, for some {\delta_0 >0} and all {G\in\mathcal{F}}, {\lambda_2 (G)\geq \delta_0}.

On such a graph, for all {f:V\rightarrow{\mathbb R}},

\displaystyle  \begin{array}{rcl}  \frac{1}{2n}\sum_{x,\,y\in V}|f(x)-f(y)|^2 &=&\|f-E_v (f)\|^2 \\ &\leq& \frac{1}{\lambda_2}f\cdot L_G f \\ &\leq&\frac{1}{\delta_0}\sum_{x\sim y}|f(x)-f(y)|^2 , \end{array}

so

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb E}_{x,y}|f(x)-f(y)|^2 &\leq& const.~ \mathop{\mathbb E}_{x\sim y}|f(x)-f(y)|^2 . \end{array}

If {f} is {1}-Lipschitz, the right hand side is bounded, and a bound on the left hand side controls the observable diameter (by Chebyshev’s inequality); thus

Proposition 5 The observable diameter is bounded on expanding families of graphs.

 

Since balls of radius {r} contain at most {O(d^r)} points, the typical distance in {G} is {\Omega(\log n)} (for fixed {d}); thus, for arbitrary maps {f:V\rightarrow L_2},

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb E}_{x,y}|f(x)-f(y)|^2\geq \frac{1}{\|f^{-1}\|_{Lip}^2}\mathop{\mathbb E}_{x,y}d(x,y)^2 \geq \Omega((\log n)^2)\frac{1}{\|f^{-1}\|_{Lip}^2}, \end{array}

and

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb E}_{x\sim y}|f(x)-f(y)|^2 \leq \|f\|_{Lip}^2 , \end{array}

leading to

Theorem 6 (Linial, London, Rabinovich 1994) For an expanding family of {d}-regular graphs,

\displaystyle  \begin{array}{rcl}  c_2 (G)\geq \Omega(\log |G|). \end{array}
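
Indeed, combining the two displayed estimates with the spectral inequality above (which applies to {L_2}-valued maps coordinatewise) gives, for every {f:V\rightarrow L_2},

\displaystyle  \begin{array}{rcl}  \frac{\Omega((\log n)^2)}{\|f^{-1}\|_{Lip}^2}\leq\mathop{\mathbb E}_{x,y}|f(x)-f(y)|^2 \leq const.~ \mathop{\mathbb E}_{x\sim y}|f(x)-f(y)|^2 \leq const.~\|f\|_{Lip}^2 , \end{array}

so {dist(f)^2 =\|f\|_{Lip}^2 \|f^{-1}\|_{Lip}^2 \geq\Omega((\log n)^2)}.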

 

2.2. Single scale obstruction

 

Expanding families satisfy the following stronger property, with {\tau=D=\log n}.

Definition 7 Say a family of metric spaces satisfies a single scale obstruction with parameter {D} if there exists {\tau>0} such that for all {1}-Lipschitz {f:X\rightarrow L_2},

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb E}_{d(x,y)\in[\tau,2\tau]}|f(x)-f(y)|^2 \leq (\frac{\tau}{D})^2 . \end{array}

 

We shall see that more subtle obstructions exist, which cannot be localized at one scale.

 

2.3. Multiscale obstruction

 

Suppose that {X} does not satisfy a single scale obstruction with parameter {\alpha}. Then for every {k\in {\mathbb Z}} there exists a {1}-Lipschitz map {f_k : X\rightarrow L_2} which, on average, compresses distances at scale {2^k} by a factor of at most {\alpha}. An {\alpha}-ensemble requires this estimate to hold for every pair of points, and not only on average. It is a tool for constructing embeddings.

Definition 8 Let {\alpha>0}. An {\alpha}-ensemble is a family of maps {\phi_k :X\rightarrow L_2}, {k\in {\mathbb Z}}, such that

  1. {\|\phi_k\|_{Lip} \leq 1};
  2. For {d(x,y)\in [2^k ,2^{k+1}]}, {|\phi_k (x) -\phi_k (y)|\geq \frac{2^k}{\alpha}}.
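
For instance (a standard illustration, not from the lecture), the real line admits an {O(1)}-ensemble: wrap each scale around a circle of comparable radius,

\displaystyle  \begin{array}{rcl}  \phi_k (x)=r_k \left(\cos\frac{x}{r_k},\,\sin\frac{x}{r_k}\right),\qquad r_k =\frac{2^{k+1}}{\pi}. \end{array}

Each {\phi_k} is {1}-Lipschitz (it is an arc-length parametrized circle), and for {|x-y|\in[2^k ,2^{k+1}]} one has {|\phi_k (x)-\phi_k (y)|=2r_k |\sin\frac{x-y}{2r_k}|\geq\frac{2\sqrt{2}}{\pi}2^k}, so this is an {\alpha}-ensemble with {\alpha=\frac{\pi}{2\sqrt{2}}=O(1)}.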

 

Example 1 (Laakso; Lang and Plaut) The Laakso spaces are graphs {\mathcal{L}_k} obtained by recursively replacing each edge with a 6-edge hinge: subdivide the edge into a path of four edges, then double the middle vertex into two parallel vertices (the “midpoints” used below).

 

Theorem 9 (Gupta, Krauthgamer, Lee)

  1. {\mathcal{L}_k} admits an {O(1)}-ensemble.
  2. {c_2 (\mathcal{L}_k) \geq \Omega(\sqrt{k})=\Omega(\sqrt{\log|\mathcal{L}_k|})}.

 

Proof: The Lebesgue differentiation theorem states that a Lipschitz curve in {L_2} is close to a line at almost every point and at all sufficiently small scales. This idea applies to geodesics in Laakso spaces. Since between every pair of hinge endpoints in {\mathcal{L}_k} there are at least two geodesics which stay far apart in the middle, compression must occur somewhere near the middle. This is formalized in the short diagonals lemma (as Matousek calls it):

If a 6-edge hinge {\mathcal{L}_1} (with unit edges) is mapped to Hilbert space with distortion {1+\epsilon}, then the two midpoints {x,y} satisfy {|f(x)-f(y)|^2 \leq O(\epsilon)}, although {d(x,y)=2}. So the global distortion is {\geq\Omega(\frac{1}{\sqrt{\epsilon}})}.
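
Concretely, write {a,b} for the endpoints of the doubled middle of the hinge and {x,y} for the two midpoints (these labels are ours). In any Hilbert space,

\displaystyle  \begin{array}{rcl}  |f(a)-f(x)|^2 +|f(x)-f(b)|^2 +|f(b)-f(y)|^2 +|f(y)-f(a)|^2 && \\ \quad -\,|f(a)-f(b)|^2 -|f(x)-f(y)|^2 &=&|f(a)-f(x)+f(b)-f(y)|^2 \geq 0 . \end{array}

If {f} is {1}-Lipschitz with distortion {1+\epsilon}, the four edge terms are each at most {1} while {|f(a)-f(b)|^2 \geq 4/(1+\epsilon)^2}, so {|f(x)-f(y)|^2 \leq 4-4/(1+\epsilon)^2 =O(\epsilon)}.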

In this simple example, differentiation boils down to summing defects (in squared distance) over successive scales. \Box

The next theorem shows that the distortion lower bound for Laakso graphs in Theorem 9 is sharp.

Theorem 10 (Lee 2005) If {X} admits an {\alpha}-ensemble, then

\displaystyle  \begin{array}{rcl}  c_2 (X)\leq O(\sqrt{\alpha\log|X|}). \end{array}

It follows that spaces whose Hilbert distortion is {\Omega(\log|X|)} can admit {\alpha}-ensembles only if {\alpha=\Omega(\log|X|)}, i.e. they satisfy a single scale obstruction.

The next theorem shows that Theorem 10 is sharp.

Theorem 11 (Jaffe, Lee, Moharrami 2009) There exists an {n}-point space {X} such that

  1. {X} admits an {\alpha}-ensemble.
  2. {c_2 (X)\geq \Omega(\sqrt{\alpha\log|X|})}.

 

2.4. Random partitions

 

This is a tool to construct {\alpha}-ensembles.

Let {\mathcal{P}} be a random partition of {X}, i.e. a distribution over partitions of {X}. Denoting by {\mathcal{P}(x)} the unique piece of {\mathcal{P}} containing {x}, say that {\mathcal{P}} is {(\alpha,\Delta)}-padded if

  1. All pieces of {\mathcal{P}} have diameter {\leq \Delta}.
  2. For every {x\in X},\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(B(x,\frac{\Delta}{\alpha})\subset\mathcal{P}(x))\geq \frac{1}{2}. \end{array}

Definition 12 Define the modulus of decomposability of {X} as

\displaystyle  \begin{array}{rcl}  \alpha_X =\sup_{\Delta>0}\inf\{\alpha\,;\, X \textrm{ admits an }(\alpha,\Delta)\textrm{-padded random partition}\}. \end{array}

 

 

Example 2 {\alpha({\mathbb R}^d)=\Theta(d)}.

 

Example 3 For hyperbolic space, {\alpha(H^d)=\Theta(d)}, but if one considers only distances {\geq 1}, {\alpha(H^d)=\Theta(1)}.

 

Example 4 (Klein, Plotkin, Rao) For planar graphs, {\alpha\leq O(1)}.

This is based on the following fact. On a planar surface, take an {(r,r+1)}-annulus, intersect it with a second {(r,r+1)}-annulus, and then with a third one. What remains is bounded (in diameter) independently of {r}. This is proven using the fact that {K_{3,3}} does not embed in the plane. Does this fact appear in Riemannian geometry?

Example 5 (Lee, Sidiropoulos) For surfaces of genus {g}, {\alpha\leq O(\log g)}.

 

2.5. Random partitions for general finite metric spaces

 

Theorem 13 (Bartal, Linial, Saks) If {X} is finite, then {\alpha(X)\leq O(\log|X|)}.

 

Proof: Fix {\Delta>0}. Order the points {x_1 ,\ldots,x_n}. Choose independent exponential random variables {R_1 ,\ldots,R_n} with mean {\frac{\Delta}{4\log n}}. Let {C_i} be {B(x_i ,R_i)} minus the previously defined pieces {C_1 ,\ldots,C_{i-1}}. Together they form the pieces of the random partition {\mathcal{P}}.

Since

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(R_1 >\frac{\Delta}{2})=\frac{1}{n^2}, \end{array}

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(\exists\,i \,;\, R_i >\frac{\Delta}{2})\leq\frac{1}{n}, \end{array}

we can assume that no radius exceeds {\frac{\Delta}{2}}; in particular, every piece has diameter at most {\Delta}.

Lemma 14

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(B(x,t) \textrm{ is not contained in }\mathcal{P}(x))\leq \frac{t}{\Delta}O(\log n). \end{array}

 

Proof of Lemma: One can assume that {t\leq \frac{\Delta}{4\log n}}. Suppose that for some {j}, {B(x_j ,R_j)} meets {B(x,t)}, i.e. {R_j \geq d(x_j ,x)-t}. By the memoryless property of the exponential distribution, conditioned on this event, the overshoot {R_j -(d(x_j ,x)-t)} is still exponential with the same mean, so the bad event ({B(x_j ,R_j)} meets {B(x,t)} without containing it, which requires {R_j <d(x_j ,x)+t}) has probability

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(R_j \in [0,2t])\leq O(\frac{2t}{\Delta/(4\log n)})=\frac{t}{\Delta}\,O(\log n). \end{array}

\Box

Applying the Lemma with {t=\frac{\Delta}{\alpha}} and {\alpha=C\log n}, for a large enough constant {C}, makes the failure probability at most {\frac{1}{2}}. This completes the proof of Theorem 13. \Box
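
A minimal Python sketch of the random partition just described (names are ours; truncating the radii at {\Delta/2} implements the reduction “no radius is too big”):

```python
import math
import random

def padded_partition(points, dist, Delta, seed=None):
    """One sample of the random partition from the proof of Theorem 13.

    points : list of point labels x_1, ..., x_n (in the chosen order)
    dist   : function dist(p, q) returning the distance
    Delta  : target bound on the diameter of the pieces
    Returns a dict mapping each point to the index of its piece.
    """
    rng = random.Random(seed)
    n = len(points)
    mean_radius = Delta / (4 * math.log(max(n, 2)))
    assignment = {}
    for i, center in enumerate(points):
        # exponential radius of mean Delta / (4 log n), truncated at Delta / 2
        R = min(rng.expovariate(1.0 / mean_radius), Delta / 2)
        for p in points:
            # C_i = B(x_i, R_i) minus the pieces C_1, ..., C_{i-1}
            if p not in assignment and dist(center, p) <= R:
                assignment[p] = i
        # every point is captured at the latest by its own ball,
        # since dist(x_i, x_i) = 0 <= R_i
    return assignment
```

Sampling many partitions and recording how often {B(x,\frac{\Delta}{\alpha})\subset\mathcal{P}(x)} gives an empirical check of the padding estimate of Lemma 14.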

In the sequel, we shall construct random partitions tailored to special classes of metric spaces.

 

2.6. Comments

 

One can use Theorem 13, with some extra work, to prove Bourgain’s theorem, but Bourgain’s proof is simpler.

Lang: can one use Bourgain’s theorem to prove Theorem 13? Answer: yes, but with bound {O((\log n)^2)}.

Question: find a direct proof of the fact that the observable diameter of every metric on the {2}-sphere is bounded below in terms of its area. This would strengthen Joseph Hersch’s 1970 upper bound on the spectral gap.

 
