**1. Back to the 0-extension problem**

The word 0-extension is unfortunate, but commonly used. It vaguely refers to the fact that the given distance on the terminals needs to be extended to a distance on the whole vertex set which vanishes on most edges.

**1.1. Analysis of the FHRT algorithm**

Let be an edge, let . The easy case when has already been treated, so we assume that .

Recall our terminology: cuts the edge if . Let be the event that cuts edge and is mapped to terminal . Then

Here, is the order of when the terminals are ordered by their distances to . Next, we sum over all terminals , and get

Symmetrically, we have an estimate in the case where cuts edge . Summing over edges gives an upper bound on the objective function in terms of , which is the LP value.
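The outcome of this analysis can be summarized as follows. This is a reconstruction in my own notation (the symbols $g$, $k$, $c_{uv}$ are mine): $g : V \to T$ denotes the random assignment of vertices to terminals produced by the algorithm, and $k$ the number of terminals.

```latex
% Sketch of the final FHRT guarantee (notation mine).
% For every edge (u,v), the random retraction g : V -> T satisfies
\mathbb{E}\,\big[d\big(g(u),g(v)\big)\big]
  \;\le\; O\!\Big(\tfrac{\log k}{\log\log k}\Big)\, d(u,v),
% and summing over edges with costs c_{uv} bounds the expected objective
% by the same factor times the LP value:
\mathbb{E}\Big[\sum_{(u,v)\in E} c_{uv}\, d\big(g(u),g(v)\big)\Big]
  \;\le\; O\!\Big(\tfrac{\log k}{\log\log k}\Big) \sum_{(u,v)\in E} c_{uv}\, d(u,v).
```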

Note that the greedy algorithm “map to the closest terminal in distance” does not give any bound at all. For instance,
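One standard instance showing the failure of greedy (my notation, not necessarily the example given in the lecture): two terminals at distance $D$, and a short edge straddling the midpoint between them.

```latex
% Hypothetical bad instance for greedy (notation mine).
% Terminals t_1, t_2 with d(t_1,t_2) = D; adjacent vertices u, v on the
% segment between them with d(u,v) = eps, u slightly closer to t_1 and
% v slightly closer to t_2:
d(u,t_1) = \tfrac{D-\varepsilon}{2} < d(u,t_2), \qquad
d(v,t_2) = \tfrac{D-\varepsilon}{2} < d(v,t_1).
% Greedy maps u to t_1 and v to t_2, paying c_{uv} d(t_1,t_2) = c_{uv} D
% for an edge whose LP contribution is only c_{uv} eps; the ratio D/eps
% is unbounded.
```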

**1.2. Link with Lipschitz extension**

We use the FHRT algorithm to give a proof of Lee and Naor’s theorem. Let be a normed space. Let be a weighted graph. Let be a subset of vertices of size . Let be a -Lipschitz map. The algorithm provides us with a distribution on maps such that on . Define

Then

Since this works for arbitrary weighted graphs, it extends to arbitrary metric spaces.
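A plausible rendering of the construction, in notation of my choosing: $g : V \to T$ is the FHRT random retraction, fixing the terminal set $T$ pointwise, with $\mathbb{E}\,d(g(u),g(v)) \le \alpha\, d(u,v)$ for $\alpha = O(\log k/\log\log k)$.

```latex
% Reconstruction (notation mine). Define the extension as a Bochner
% expectation in the normed space Z:
\tilde f(v) \;=\; \mathbb{E}\big[f(g(v))\big].
% Then \tilde f = f on T (since g fixes T), and for all u, v in V:
\|\tilde f(u) - \tilde f(v)\|
  \;\le\; \mathbb{E}\,\|f(g(u)) - f(g(v))\|
  \;\le\; \|f\|_{\mathrm{Lip}}\; \mathbb{E}\,d\big(g(u),g(v)\big)
  \;\le\; \alpha\, \|f\|_{\mathrm{Lip}}\, d(u,v).
```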

**2. Vertex sparsifiers**

**2.1. Multi-commodity flows**

In a network with edge capacities, we want to transmit a nonnegative amount of information from each source to the corresponding destination, for each source-destination pair in a given list.

Definition 1. A weighted collection of paths is a *multi-commodity flow satisfying a -fraction of demands* if

- For each edge,
- For each ,

The *max-flow* of the problem is the maximal feasible value of .

Computing max-flow and finding an optimal multi-commodity flow is an LP.
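Since the notes only assert that this is an LP, here is a minimal sketch of the path-based formulation on a toy instance. The 4-cycle graph, the capacities, the demands and the use of `scipy` are all my choices, not from the notes: one variable per path plus the concurrent fraction $\lambda$, a capacity constraint per edge, and a demand constraint per commodity.

```python
from scipy.optimize import linprog

# Toy instance (mine): a 4-cycle with unit capacities.
edges = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0, (0, 3): 1.0}

def norm(e):
    # Canonical key for an undirected edge.
    return tuple(sorted(e))

# Commodities: ((source, sink), demand, list of simple source-sink paths).
commodities = [
    ((0, 2), 1.0, [[0, 1, 2], [0, 3, 2]]),
    ((1, 3), 1.0, [[1, 2, 3], [1, 0, 3]]),
]
paths = [(i, p) for i, (_, _, ps) in enumerate(commodities) for p in ps]
n = len(paths)  # variables: one flow value f_p per path, then lambda

c = [0.0] * n + [-1.0]  # maximize lambda == minimize -lambda
A_ub, b_ub = [], []
for e, cap in edges.items():
    # Capacity constraint: total flow on paths through e is at most cap.
    row = [0.0] * (n + 1)
    for j, (_, p) in enumerate(paths):
        if norm(e) in {norm((p[k], p[k + 1])) for k in range(len(p) - 1)}:
            row[j] = 1.0
    A_ub.append(row)
    b_ub.append(cap)
for i, (_, dem, _) in enumerate(commodities):
    # Demand constraint: flow for commodity i is at least lambda * demand,
    # written as  -sum f_p + dem * lambda <= 0.
    row = [0.0] * (n + 1)
    for j, (ci, _) in enumerate(paths):
        if ci == i:
            row[j] = -1.0
    row[-1] = dem
    A_ub.append(row)
    b_ub.append(0.0)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
print("max concurrent fraction lambda =", res.x[-1])
```

On this symmetric instance the optimum splits each commodity evenly over its two paths, giving $\lambda = 1$.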

**2.2. Flow sparsifiers**

Definition 2 (Leighton, Moitra). Given a graph with edge capacities and a subset of vertices of , a *vertex flow sparsifier* is a graph on vertex set with edge-capacities such that

for any demands such that , .

The goal is to make the approximate computation of max-flow easier by choosing rather small, with a reasonable loss (this may be useful if one has to repeatedly compute max-flows on the same graph). We want the bounds to be independent of the size of the original graph . Thus is called the *quality* of the sparsifier.

This also provides an approach to SPARSEST CUT, for instance.

**2.3. Results**

[Moitra 2009] introduced vertex cut sparsifiers (see below). He showed that there exist vertex cut sparsifiers of quality .

[Leighton, Moitra 2010] introduced vertex flow sparsifiers (see above). They showed that there exist vertex flow sparsifiers of quality . This was not constructive. They could provide a construction of an -vertex flow sparsifier. They showed that there exists a constant lower bound for the quality of cut sparsifiers, and they proved an lower bound for vertex flow sparsifiers.

Later, the algorithmic bound was improved to by several groups, [Charikar, Leighton, Li, Moitra], [Englert, Gupta, Krauthgamer, Räcke, Talgam, Talwar], [Makarychev, Makarychev].

**2.4. Link to Lipschitz extension**

Below, we shall define vertex cut sparsifiers and metric sparsifiers.

Theorem 3 (Makarychev, Makarychev)

Theorem 4 (Makarychev, Makarychev)

They also obtained a lower bound on (independently, Charikar, Leighton, Li, Moitra proved a lower bound on the quality of cut sparsifiers ), but it turns out not to be as good as the following bound, which follows from old results of [Johnson, Lindenstrauss] and [Grünbaum].

Theorem 5

Theorem 6 (Makarychev, Makarychev)

Recall that Ball asked whether is bounded. If true, this would give much improved upper bounds on the quality of sparsifiers.

Johnson and Lindenstrauss showed that the Lipschitz extension constant is greater than or equal to the projection constant if dim . To the best of my knowledge, this is the only known way of proving lower bounds on . Using this approach, however, one cannot prove a lower bound better than (because the projection constant is at most ).
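In symbols (my reconstruction, writing $\lambda(Z)$ for the projection constant and $n = \dim Z$):

```latex
% Reconstruction (notation mine). For finite-dimensional Z, the
% Johnson--Lindenstrauss argument gives a lower bound on the Lipschitz
% extension constant by the projection constant,
e(Z) \;\ge\; \lambda(Z),
% while Kadec--Snobar bounds the projection constant of any
% n-dimensional space:
\lambda(Z) \;\le\; \sqrt{n},
% so this route cannot yield lower bounds beyond sqrt(dim Z).
```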

**Question**: Does there exist a metric space and a normed space such that

The linear setting (projection constants for linear subspaces of Banach spaces) is better understood. [Kadec, Snobar 1971] showed that every -dimensional subspace has a linear projection with norm , and this is sharp [König].

**2.5. Vertex cut sparsifiers**

Definition 7 (Moitra). Given a graph with edge costs and a subset of vertices of , a *vertex cut sparsifier* is a graph on vertex set with edge-costs such that for all cuts of .

denotes the minimum quality achievable by size vertex cut sparsifiers for arbitrary graphs with costs .
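The defining condition and the quality are presumably the standard ones; the following is my reconstruction, in my own notation.

```latex
% Reconstruction (notation mine): for every subset S of the terminal
% set T, the sparsifier H should satisfy
\mathrm{mincut}_G(S,\,T\setminus S) \;\le\; \mathrm{cut}_H(S)
  \;\le\; q \cdot \mathrm{mincut}_G(S,\,T\setminus S),
% where cut_H(S) is the total cost of the edges of H between S and
% T \ S; the quality of H is the smallest such q.
```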

**2.6. Vertex metric sparsifiers**

Note that

where is the cut metric associated to . is called the *cost* of metric .
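The identity alluded to here is presumably the following (my reconstruction): contracting the two sides of the cut reduces it to an $s$-$t$ min-cut, whose metric LP relaxation is exact by max-flow min-cut.

```latex
% Reconstruction (notation mine): for a cut (S, T \ S) of the terminals,
\mathrm{mincut}_G(S,\,T\setminus S)
  \;=\; \min\Big\{ \sum_{(u,v)\in E} c_{uv}\, d(u,v) \;:\;
        d \text{ metric on } V,\ d|_{T} = \delta_S \Big\},
% where delta_S is the cut metric of S; the right-hand side is the
% minimum cost of a metric extending delta_S.
```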

Definition 8 (Makarychev, Makarychev). Given a graph with edge costs and a subset of vertices of , a *vertex metric sparsifier* is a graph on vertex set with edge-costs such that for every metric on , the minimum cost extension of to satisfies

denotes the minimum quality achievable by size vertex metric sparsifiers for arbitrary graphs with costs .

If one restricts the class of metrics to -embeddable metrics, one recovers .

**2.7. Sketch of proof of Theorem 3**

Let denote the minimum cost extension of . Then is a convex function on the space of metrics on . is a linear function on the space of metrics. Finding such that is equivalent to finding a hyperplane in which separates the convex hull of the graph of from the convex hull of the graph of . So the best is

The maximum is attained at some such that is an average (sum, by homogeneity) of distances and is the sum of . Each maps isometrically to , so maps isometrically to . is controlled by extensions of this embedding to .

If we replace with the space of -embeddable metrics on , the same discussion leads to