Stochastic decompositions of metric spaces
This has applications to many problems: Ramsey decompositions, distributed computing… These decompositions are a basic tool for various parts of mathematics and for the design of algorithms, so it is good to know how to produce them.
A stochastic decomposition is a probability distribution over partitions, i.e. a random partition.
1.1. Padded decompositions
Say a partition $P$ of a metric space $X$ is $\Delta$-bounded if all pieces (also called clusters) have diameter at most $\Delta$. Given a point $x$ and a partition $P$, $P(x)$ denotes the cluster that contains $x$.
Definition 1 Fix a positive function $\varepsilon$ on $X$. Say a random $\Delta$-bounded partition $P$ is $\varepsilon$-padded if, for every point $x\in X$,
$$\mathbb{P}\left(B(x,\varepsilon(x)\Delta)\subset P(x)\right)\geq\frac{1}{2}.$$
$\varepsilon$ will most of the time be a constant.
Example. Let $X=\mathbb{R}$, $\Delta=1$. Consider the partition into integral intervals $[k,k+1)$ and its translates by numbers uniformly chosen in $[0,1]$. This is $\varepsilon$-padded for $\varepsilon=\frac{1}{4}$: the ball $B(x,\varepsilon)$ stays inside the interval of $x$ with probability $1-2\varepsilon\geq\frac{1}{2}$.
Example. Let $X=\mathbb{R}^n$. The product partition (randomly translated unit cubes) is $\varepsilon$-padded only for $\varepsilon$ polynomially small in $n$: the ball must avoid the boundary in every coordinate, and the $n$ coordinates contribute independent factors.
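Both grid examples can be checked numerically. The sketch below (function names are mine) simulates the randomly shifted unit grid in $\mathbb{R}^n$: by translation invariance, the cube of $x$ survives a ball of per-coordinate radius $t$ exactly when every coordinate of $x$ lies at distance more than $t$ from the shifted grid hyperplanes, which happens with probability $(1-2t)^n$.

```python
import random

def grid_padding_probability(n, t, trials=200_000, seed=0):
    """Monte Carlo estimate of the probability that, for the unit-grid
    partition of R^n translated by a uniform random shift, the point x
    is at distance > t from the cell boundary in every coordinate.
    By translation invariance we may take x = 0; its position inside
    its cell is then uniform in [0, 1) in each coordinate."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        ok = True
        for _ in range(n):
            s = rng.random()          # fractional position inside the cell
            if s < t or s > 1 - t:    # within t of a cluster boundary
                ok = False
                break
        if ok:
            hits += 1
    return hits / trials

def exact_probability(n, t):
    """Each coordinate is safe with probability 1 - 2t, independently."""
    return (1 - 2 * t) ** n
```

For $n=1$ the probability equals $\frac{1}{2}$ at $t=\frac{1}{4}$, matching the first example; for larger $n$, keeping it above $\frac{1}{2}$ forces $t$ of order $\frac{1}{n}$ per coordinate, which is the source of the polynomial loss in the product partition.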
Definition 2 A metric space $X$ is called $\varepsilon$-decomposable if, for every $\Delta>0$, there exists a $\Delta$-bounded $\varepsilon$-padded random partition.
In general, every finite metric space is $\varepsilon$-decomposable with $\varepsilon\sim\frac{1}{\log n}$, where $n$ is the number of points.
Example. Planar graphs are $\varepsilon$-decomposable for a universal constant $\varepsilon>0$. More generally, there is a uniform decomposability constant $\varepsilon$ for every minor-excluding family of graphs.
Example. Bounded tree-width graphs are $\varepsilon$-decomposable for a universal constant $\varepsilon>0$.
2. Doubling metrics
2.1. The result
Today, I will prove the following.
Theorem 3 Every bounded $\lambda$-doubling metric space is $\varepsilon$-decomposable with $\varepsilon\sim\frac{1}{\log\lambda}$.
Doubling means finite dimensional.
Definition 4 A metric space is $\lambda$-doubling if for every point $x$ and every $r>0$, the ball $B(x,2r)$ can be covered by $\lambda$ balls of radius $r$. The doubling dimension is $\log_2\lambda$.
Example. $\mathbb{R}^n$ has doubling dimension $\simeq n$.
2.2. Proof of doubling theorem
Fix $\Delta$. Let $N$ be a $\frac{\Delta}{4}$-net of $X$. The partition will be based on balls centered at points of $N$. We must specify radii and an ordering of points. Choose $r$ uniformly at random in $[\frac{\Delta}{4},\frac{\Delta}{2}]$, and choose a random ordering $\pi$ of $N$.
For each $y\in N$, define a cluster as follows:
$$C_y = B(y,r)\setminus\bigcup_{\pi(y')<\pi(y)} B(y',r).$$
The clusters of the random partition are the non-empty $C_y$'s.
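The construction can be sketched for a finite set of points in the plane. The specific choices below (a greedy $\frac{\Delta}{4}$-net, radius uniform in $[\frac{\Delta}{4},\frac{\Delta}{2}]$, a uniformly random ordering) follow the standard ball-carving scheme; the source leaves these constants implicit, and the function name is mine.

```python
import math
import random

def ball_carving_partition(points, Delta, seed=0):
    """Random Delta-bounded partition of a finite planar point set
    (Euclidean metric) by ball carving: pick a Delta/4-net N greedily,
    draw r uniformly in [Delta/4, Delta/2], order N randomly, and let
    each point join the cluster of the first net point within distance r."""
    rng = random.Random(seed)
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])

    # Greedy Delta/4-net: every point is within Delta/4 of a net point,
    # and net points are pairwise more than Delta/4 apart.
    net = []
    for p in points:
        if all(dist(p, y) > Delta / 4 for y in net):
            net.append(p)

    r = rng.uniform(Delta / 4, Delta / 2)
    rng.shuffle(net)  # the random ordering pi of the net

    # C_y = B(y, r) minus the balls of net points earlier in the ordering:
    # each point joins the first net point (in pi) that captures it.
    clusters = [[] for _ in net]
    for p in points:
        for i, y in enumerate(net):
            if dist(p, y) <= r:
                clusters[i].append(p)
                break
    return [c for c in clusters if c], r
```

Every point lies within $\frac{\Delta}{4}\leq r$ of some net point, so every point is assigned; each cluster sits inside a ball of radius $r$, hence has diameter at most $2r\leq\Delta$.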
Fix $x$. Note that if $y\in N$ is such that $B(y,r)$ meets $B(x,\frac{\Delta}{4})$, then $d(x,y)\leq r+\frac{\Delta}{4}\leq\Delta$, i.e. $y\in B$ where $B=N\cap B(x,\Delta)$. So only points in $B$ may play a role in defining the cluster of $x$.
Claim: $|B|\leq\lambda^3$. Indeed, $B(x,\Delta)$ can be covered by $\lambda^3$ balls of radius $\frac{\Delta}{8}$, and each of them contains at most one point of $N$.
One could add up brutal bounds on the effects of all points of $B$, but this would merely give a bound which is polynomial in $\lambda$. To get the logarithmic bound, one uses the randomness of the ordering. Let $y_1,\dots,y_m$ be the points of $B$ ordered by increasing distance from $x$. Let $t\leq\frac{\Delta}{8}$. Say that $y_k$ cuts $B(x,t)$ if
- $B(y_k,r)$ meets $B(x,t)$ but does not contain it;
- $y_k$ is minimal according to the ordering $\pi$ among the $y_j$ such that $B(y_j,r)$ meets $B(x,t)$.
We must estimate the probability that the small ball $B(x,t)$ is cut, i.e. $\mathbb{P}(\exists k,\ y_k\text{ cuts }B(x,t))$; if no $y_k$ cuts it, the first ball (in the ordering) meeting $B(x,t)$ contains it, so $B(x,t)\subset P(x)$. If $y_k$ cuts $B(x,t)$, then $r\in[d(x,y_k)-t,\,d(x,y_k)+t]$, an event of probability at most $\frac{2t}{\Delta/4}=\frac{8t}{\Delta}$; moreover, since $y_1,\dots,y_{k-1}$ are closer to $x$, their balls also meet $B(x,t)$, so $y_k$ must come first among $y_1,\dots,y_k$ in the ordering $\pi$, an event of probability $\frac{1}{k}$ which is independent of $r$. Union bound gives
$$\mathbb{P}(B(x,t)\text{ is cut})\leq\sum_{k=1}^{m}\frac{8t}{\Delta}\cdot\frac{1}{k}\leq\frac{8t}{\Delta}(1+\log m)\leq C\,\frac{t}{\Delta}\,\log\lambda,$$
since $m\leq\lambda^3$. Taking $t=\varepsilon\Delta$ with $\varepsilon\sim\frac{1}{\log\lambda}$ makes this probability at most $\frac{1}{2}$, which proves the theorem.
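As a sanity check on the arithmetic, assuming the standard form of this estimate (cut probability at most $\frac{8t}{\Delta}\cdot\frac{1}{k}$ for the $k$-th net point, with $m\leq\lambda^3$ relevant points; these constants are from the reconstruction of the argument, not pinned down in the source), one can compute the value of $\frac{t}{\Delta}$ at which the union bound reaches $\frac{1}{2}$ and check that it decays like $\frac{1}{\log\lambda}$:

```python
import math

def harmonic(m):
    """Partial harmonic sum H_m = 1 + 1/2 + ... + 1/m."""
    return sum(1.0 / k for k in range(1, m + 1))

def padding_parameter(lam):
    """The ratio eps = t / Delta at which the union bound
    (8 t / Delta) * H_m, with m = lam**3 relevant net points,
    equals exactly 1/2."""
    return 1.0 / (16.0 * harmonic(lam ** 3))

# eps * log(lam**3) stays bounded as lam grows: eps ~ 1 / log(lam).
for lam in (4, 16, 64):
    eps = padding_parameter(lam)
    print(lam, eps, eps * math.log(lam ** 3))
```

Since $H_m\approx\log m$, the computed $\varepsilon$ times $\log(\lambda^3)$ hovers near the constant $\frac{1}{16}$, confirming the $\varepsilon\sim\frac{1}{\log\lambda}$ rate.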
Could one apply Assouad's embedding instead? This would give a weaker bound, polynomial in $\lambda$.
How does the partition of the theorem compare with the product partition of $\mathbb{R}^n$? It does better, but it is not optimal either. Using concentration of measure, one can do even better: for $\mathbb{R}^n$, one can achieve $\varepsilon\sim\frac{1}{\sqrt{n}}$, which is optimal.
Does the theorem extend to unbounded spaces?
Tomorrow, I will explain an application of stochastic decompositions to constructing embeddings into Banach spaces, see my report.