1. Discrete Fourier analysis
Today, I state a theorem. This afternoon, Mossel explains how harmonic analysis can be used in combinatorics. The Parseval identity, though elementary, turns out to be very powerful. Tomorrow, I will give an additional ingredient, hypercontractivity.
1.1. Collective coin flipping
In 1985, Ben Or and Linial discussed a problem coming from theoretical computer science called collective coin flipping. When designing algorithms, the use of random bits helps a lot. One is satisfied if the algorithm gives a correct answer with high probability. Example (Rabin et al.): primality testing.
Even earlier, randomization arose in distributed computing. Synchronizing a large number of computers is not easy. Assume n processors need to agree together on a random bit. Each processor has access to a source of perfect random bits. The crux of the matter is that some processors are adversarial: they try to cause the process to fail. We want a procedure immune to the presence of a small number, or even a constant fraction, of bad processors. For this, we use a Boolean function.
1.2. Boolean functions
Definition 1 A Boolean function is a function f : {0,1}^n → {0,1} (sometimes, we replace {0,1} with {−1,1}).
Here is the procedure proposed by Ben Or and Linial: processor k contributes a random bit x_k. The collective random bit is f(x_1, …, x_n). The adversarial processors can wait and see the bits sent by good processors before sending their own nasty bits. There are variants. You can allow every processor to contribute many bits. You can also prescribe the order in which the bits are contributed. You can consider a protocol where every processor contributes random bits in turn.
Suggestion: Why not select a processor k at random and take its bit x_k ? Answer: We are not allowed to do this, since we are not allowed external random bits. All the randomness available belongs to individual processors.
Here are examples of Boolean functions.
- The OR (resp. AND) function f(x) = x_1 ∨ … ∨ x_n (resp. x_1 ∧ … ∧ x_n) does not produce a uniformly distributed bit.
- Dictatorship f(x) = x_1 does not do the job, since one bad processor can spoil it.
- Parity f(x) = x_1 + … + x_n mod 2 does not do well either, since every single processor can change the output.
- Majority f(x) = 1 if x_1 + … + x_n > n/2, f(x) = 0 otherwise. This can be spoiled by √n bad processors with high probability. Indeed, the fluctuations of x_1 + … + x_n are of size √n.
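As a sanity check, the output biases of these examples can be computed exactly by enumeration (a small sketch added for these notes; the function names are ours):

```python
from itertools import product

def bias(f, n):
    """Exact P(f = 1) under uniform random bits, by enumerating all 2^n inputs."""
    pts = list(product([0, 1], repeat=n))
    return sum(f(x) for x in pts) / len(pts)

n = 5
or_bias   = bias(lambda x: int(any(x)), n)        # OR: almost always 1
and_bias  = bias(lambda x: int(all(x)), n)        # AND: almost always 0
dict_bias = bias(lambda x: x[0], n)               # dictatorship: balanced
par_bias  = bias(lambda x: sum(x) % 2, n)         # parity: balanced
maj_bias  = bias(lambda x: int(sum(x) > n / 2), n)  # majority: balanced (odd n)

print(or_bias, and_bias, dict_bias, par_bias, maj_bias)
```

OR and AND fail the uniformity requirement outright; the other three are balanced but fail against adversaries, as explained above.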
Can one do better ? Yes, this is what Ben Or and Linial observed. Before that, I introduce influence.
1.3. Influence
Definition 2 (Influence) Let f : {0,1}^n → {0,1} be a Boolean function. Assume that E[f] = t (usually, t = 1/2). Let S ⊂ {1, …, n} be a set of processors. The influence of S towards 1 is
I_S^1(f) = P(the processors of S, seeing the bits of the others, can force f = 1) − t.
The influence of S towards 0 is
I_S^0(f) = P(the processors of S can force f = 0) − (1 − t).
The influence of S on f is
I_S(f) = I_S^0(f) + I_S^1(f).
It is between 0 and 1.
In case S has one element, S = {k}, I_S(f) is the probability that bit k is pivotal, i.e. that flipping x_k changes the value of f. Notation: I_k(f). Alternatively,
I_k(f) = P(f(x) ≠ f(x ⊕ e_k)), where x ⊕ e_k denotes x with the k-th coordinate flipped.
Define the total influence as the sum of all the I_k(f)'s: I(f) = I_1(f) + … + I_n(f).
Theorem 3 (Ben Or, Linial) For every Boolean function f with E[f] = 1/2, the total influence satisfies I(f) ≥ 1.
In particular, max_k I_k(f) ≥ 1/n.
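The singleton influences, and hence the total influence bound, can be checked by exact enumeration on small examples (an illustration added for these notes):

```python
from itertools import product

def influence(f, n, k):
    """I_k(f): probability that flipping bit k changes f, by exact enumeration."""
    pts = list(product([0, 1], repeat=n))
    piv = 0
    for x in pts:
        y = list(x)
        y[k] ^= 1
        piv += f(x) != f(y)
    return piv / len(pts)

def total_influence(f, n):
    return sum(influence(f, n, k) for k in range(n))

n = 5
par = lambda x: sum(x) % 2           # every bit is always pivotal: I_k = 1
maj = lambda x: int(sum(x) > n / 2)  # bit pivotal iff the other four split 2-2
tot_par = total_influence(par, n)    # = n
tot_maj = total_influence(maj, n)    # = 5 * C(4,2)/16 = 1.875
print(tot_par, tot_maj)
```

Both values are at least 1, as the theorem requires for balanced functions; parity is extremal in the opposite direction, with the largest possible total influence.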
They realized that this is related to discrete isoperimetry (total influence is also known as the edge boundary, or edge expansion, of the set {f = 1} in the discrete cube), and follows from an inequality of Loomis and Whitney. They stressed the following questions:
- What can one say about the maximum influence max_k I_k(f) ?
- Or more generally, about I_S(f) for larger sets S, especially when |S| grows with n ?
Ben Or and Linial found Boolean functions that do better than majority for collective coin flipping.
The first example is recursive majority on triples. Let us assume that n = 3^m. Divide the population recursively into 3. Construct the corresponding ternary tree. Take majorities of triples at one level to assign values at the next level, then take majorities again, and so on, up to the root. (Compare with the American electoral system, with 2 levels. The Soviet Union and other communist countries used to have similar systems with up to 6 levels.) For this Boolean function f, for all k, I_k(f) = 2^{-m} = n^{-log_3 2} ≈ n^{-0.63}. So about n^{0.63} bad processors are needed to spoil it, much more than the √n that suffice against majority.
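The value I_k(f) = 2^{-m} can be verified by brute force for m = 2, i.e. n = 9 (a check added for these notes; `recmaj` is our naming):

```python
from itertools import product

def recmaj(bits):
    """Recursive majority of triples; len(bits) must be a power of 3."""
    bits = list(bits)
    while len(bits) > 1:
        # replace each consecutive triple by its majority
        bits = [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]
    return bits[0]

n = 9  # m = 2 levels
pts = list(product([0, 1], repeat=n))
piv = 0
for x in pts:
    y = list(x)
    y[0] ^= 1
    piv += recmaj(x) != recmaj(y)
leaf_influence = piv / len(pts)
print(leaf_influence)  # (1/2)^2 = 0.25 = 9^{-log_3 2}
```

A leaf is pivotal iff at every level its two siblings disagree, an event of probability 1/2 per level, which explains the product 2^{-m}.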
Later on, they found a better example: tribes. Divide the population into n/b equal tribes of size b. Let f(x) = 1 iff for some tribe, all the votes are 1. In other words, f is the OR of the ANDs over the tribes. If b is chosen so that (n/b) 2^{-b} ≈ ln 2, i.e. b ≈ log_2 n − log_2 log_2 n, then E[f] ≈ 1/2. Indeed,
P(f = 0) = (1 − 2^{-b})^{n/b} ≈ exp(−(n/b) 2^{-b}),
so P(f = 0) ≈ e^{−ln 2} = 1/2.
In this example, bit k is pivotal iff the rest of its tribe votes 1 and no other tribe is unanimous, so for all k,
I_k(f) = 2^{−(b−1)} (1 − 2^{−b})^{n/b − 1} ≈ (ln 2) b / n,
so I_k(f) = Θ(log n / n), and the total influence is Θ(log n).
Note the asymmetry: to force outcome 0, one needs to put an adversarial voter in each of the n/b tribes, whereas to force outcome 1, one merely needs to fill one single tribe (of size b ≈ log_2 n) with adversaries. Recursive majority is more symmetric.
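The closed forms for E[f] and I_k(f) can be checked exactly on a toy instance, two tribes of size 2 (a sketch added for these notes; real tribes have size b ≈ log_2 n):

```python
from itertools import product

def tribes(bits, b):
    """OR of the ANDs over consecutive tribes of size b."""
    return int(any(all(bits[i:i + b]) for i in range(0, len(bits), b)))

n, b = 4, 2
pts = list(product([0, 1], repeat=n))
p1 = sum(tribes(x, b) for x in pts) / len(pts)   # 1 - (1 - 2^-b)^(n/b) = 7/16
piv = 0
for x in pts:
    y = list(x)
    y[0] ^= 1
    piv += tribes(x, b) != tribes(y, b)
i0 = piv / len(pts)   # 2^{-(b-1)} (1 - 2^{-b})^{n/b - 1} = (1/2)(3/4) = 3/8
print(p1, i0)
```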
1.4. The KKL Theorem
Ben Or and Linial conjectured that one cannot do much better than tribes, i.e. for every balanced Boolean function, there is a processor whose influence is at least of order log n / n.
Theorem 4 (Kahn, Kalai, Linial) There is a universal constant c > 0 such that for every Boolean function f with E[f] = 1/2,
max_k I_k(f) ≥ c log n / n.
Proof given tomorrow. It relies on some Fourier analysis, plus hypercontractivity (Bonami, Beckner, Gross). We tried many other ideas which turned out to be useless.
Corollary 5 The presence of ω(n/log n) adversaries will fool every single Boolean function: iterating Theorem 4 yields a coalition of that size whose influence is close to 1.
Are there better protocols than a single Boolean function ? The answer is yes.
The first example is passing the baton. Let the first processor select at random a second processor, which selects at random a third processor, and so on. Use the random bit proposed by the winner, the last remaining processor. This takes n − 1 rounds and about n log n random bits. What will be the strategy of the adversaries ? Achieve that the winner is an adversary. For this, all they can do, when selected, is to select a good processor. This again yields a threshold of order n/log n.
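The baton game is easy to simulate (a sketch added for these notes; the adversary strategy coded here is the one from the text, and the coalition size is an arbitrary choice of ours):

```python
import random

def baton_winner(n, bad, rng):
    """One run of passing the baton.  Processor 0 starts; each holder passes to
    a processor that has not yet held the baton; the last holder is the winner.
    Good holders pass uniformly at random; holders in the coalition `bad` pass
    to a good processor when possible, hoping a bad one is left for last."""
    remaining = set(range(1, n))
    holder = 0
    while remaining:
        good_left = sorted(remaining - bad)
        pool = good_left if (holder in bad and good_left) else sorted(remaining)
        holder = rng.choice(pool)
        remaining.remove(holder)
    return holder

rng = random.Random(0)
n, bad = 20, set(range(5))   # 5 adversaries among 20 processors
wins = sum(baton_winner(n, bad, rng) in bad for _ in range(2000))
print(wins / 2000)           # estimate of P(an adversary ends up as the winner)
```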
N. Alon and M. Naor found a protocol with a smaller number of rounds.
Today’s record (U. Feige): the number of rounds is log* n + O(1). Here is a simplified version of this protocol. Set up two rooms (more rooms are required to get log* n). Let each processor go at random into one of the rooms. Keep the room which has the smallest number of processors. Start again. There is no clear strategy for the adversaries.
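The two-room version can be simulated with honest players only, to see how fast the surviving set shrinks (a sketch added for these notes; the tie-breaking rule is an arbitrary choice of ours):

```python
import random

def lightest_room(players, rng):
    """One round: every player goes into one of two rooms at random;
    the less crowded non-empty room survives."""
    rooms = ([], [])
    for p in players:
        rooms[rng.randrange(2)].append(p)
    nonempty = [r for r in rooms if r]
    return min(nonempty, key=len)

rng = random.Random(42)
players = list(range(1000))
rounds = 0
while len(players) > 1:
    players = lightest_room(players, rng)
    rounds += 1
print(rounds)  # the surviving set at least halves in a typical round
```

With honest players the population roughly halves each round, so about log_2 n rounds remain; Feige's actual protocol uses an unbalanced room choice to get down to log* n rounds.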
2. What is discrete harmonic analysis good for ?
Here is a list of questions to which the methods of discrete harmonic analysis are relevant.
2.1. Extremal combinatorics
Here are two sample results.
Sperner’s theorem (1928): An antichain of subsets of an n-point set has at most C(n, ⌊n/2⌋) elements. Antichain means that no set of the family contains another. Equality holds for the family of all subsets of size ⌊n/2⌋.
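For n = 4 the theorem can be verified exhaustively (a brute-force check added for these notes):

```python
from itertools import combinations

n = 4
subsets = [frozenset(s) for r in range(n + 1) for s in combinations(range(n), r)]

def is_antichain(fam):
    # frozenset's `<` is proper containment
    return not any(a < b or b < a for a, b in combinations(fam, 2))

middle = [s for s in subsets if len(s) == n // 2]       # the 2-element subsets
ok_middle = is_antichain(middle) and len(middle) == 6   # C(4,2) = 6
# no family of 7 subsets of a 4-set is an antichain:
no_bigger = not any(is_antichain(f) for f in combinations(subsets, 7))
print(ok_middle, no_bigger)
```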
Erdős–Ko–Rado theorem (1961): If F is a collection of k-sets in an n-set, n ≥ 2k, and if, for every A, B ∈ F, the intersection A ∩ B is non-empty, then |F| ≤ C(n−1, k−1). Equality holds for the family of all k-sets containing a given element.
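Again, the smallest interesting case, n = 5 and k = 2, can be checked exhaustively (a sketch added for these notes):

```python
from itertools import combinations

n, k = 5, 2
ksets = [frozenset(s) for s in combinations(range(n), k)]

def intersecting(fam):
    return all(a & b for a, b in combinations(fam, 2))

star = [s for s in ksets if 0 in s]        # all 2-sets containing element 0
star_ok = intersecting(star) and len(star) == 4   # C(n-1, k-1) = C(4,1) = 4
# no intersecting family of 2-sets of a 5-set has 5 members:
no_bigger = not any(intersecting(f) for f in combinations(ksets, 5))
print(star_ok, no_bigger)
```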
- Question: What happens in case equality almost holds ? E.g. if we assume |F| ≥ (1 − ε) C(n−1, k−1), ε a small constant ?
- Question: How many large antichains are there ?
- Question: What if we replace pairwise non disjoint with 3-wise non disjoint (every three members of F share a common point) ?
These questions seem to be substantially harder.
2.2. Random graphs
The subject was started by Erdős and Rényi in the 1960’s. Here is a model which slightly differs from theirs, but is easier to analyze.
Given an n-vertex set, choose every edge at random, independently, with probability p; this gives the random graph G(n, p).
Given a finite graph H, for which values of p will G(n, p) contain H as a subgraph, with high probability ? It clearly works for p close to 1; where is the transition ?
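For H a triangle, the transition happens around p ~ 1/n, which a small simulation makes visible (an illustration added for these notes; the constants 0.2 and 5 are arbitrary choices on either side of the threshold):

```python
import random
from itertools import combinations

def has_triangle(n, p, rng):
    """Sample G(n, p) and report whether it contains a triangle."""
    adj = [[False] * n for _ in range(n)]
    for i, j in combinations(range(n), 2):
        if rng.random() < p:
            adj[i][j] = adj[j][i] = True
    return any(adj[i][j] and adj[j][k] and adj[i][k]
               for i, j, k in combinations(range(n), 3))

rng = random.Random(0)
n, trials = 60, 30
low  = sum(has_triangle(n, 0.2 / n, rng) for _ in range(trials)) / trials
high = sum(has_triangle(n, 5.0 / n, rng) for _ in range(trials)) / trials
print(low, high)  # low is near 0, high is near 1: transition around p ~ 1/n
```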
2.3. Percolation and related models
Start with an n × n piece of the planar grid (about 2n² edges). Consider the random subgraph obtained when every edge is kept with probability p. A transition arises at p = 1/2.
- For p < 1/2, the subgraph has only small connected components.
- For p = 1/2, with positive probability, the left side and the right side are connected.
- For p > 1/2, this occurs with probability tending to 1.
Every graph property (connectedness, existence of a left-right crossing, …) is 1 or 0 depending on the assignments associated to the edges (open or closed), i.e. it is a Boolean function.
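The left-right crossing event is easy to simulate on a finite grid, and the sharpness of the transition shows up already for a 20 × 20 grid (a sketch added for these notes; grid size, trial count and the probabilities 0.3 and 0.7 are arbitrary choices):

```python
import random
from collections import deque

def crosses(k, p, rng):
    """Bond percolation on a k-by-k grid: keep each edge with probability p;
    return True if an open path joins the left column to the right column."""
    open_right = [[rng.random() < p for _ in range(k - 1)] for _ in range(k)]
    open_down  = [[rng.random() < p for _ in range(k)] for _ in range(k - 1)]
    seen = [[False] * k for _ in range(k)]
    dq = deque()
    for r in range(k):            # start BFS from the whole left column
        seen[r][0] = True
        dq.append((r, 0))
    while dq:
        r, c = dq.popleft()
        if c == k - 1:
            return True
        nbrs = []
        if c + 1 < k and open_right[r][c]:      nbrs.append((r, c + 1))
        if c - 1 >= 0 and open_right[r][c - 1]: nbrs.append((r, c - 1))
        if r + 1 < k and open_down[r][c]:       nbrs.append((r + 1, c))
        if r - 1 >= 0 and open_down[r - 1][c]:  nbrs.append((r - 1, c))
        for nr, nc in nbrs:
            if not seen[nr][nc]:
                seen[nr][nc] = True
                dq.append((nr, nc))
    return False

rng = random.Random(1)
k, trials = 20, 60
below = sum(crosses(k, 0.3, rng) for _ in range(trials)) / trials
above = sum(crosses(k, 0.7, rng) for _ in range(trials)) / trials
print(below, above)  # below is small, above is near 1: transition near p = 1/2
```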
2.4. Computational complexity
Given a Boolean function f, how complicated is it to compute it ? I explain the Boolean circuit model (one among many). A circuit is a directed graph split into levels. At the first level, there are the n Boolean variables. At the higher levels, there are gates; each of them takes several inputs and produces one output. The gates are chosen from the list OR, AND, NEGATION (one can get rid of NEGATION by admitting variables and their negations at the first level). Every Boolean function is an AND of ORs or an OR of ANDs, but this can lead to huge circuits, so we prefer the more general class of circuits.
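A minimal circuit evaluator makes the model concrete (a toy representation invented for these notes, not part of the lecture); as an example, majority of 3 is written as an OR of ANDs, a depth-2 circuit:

```python
from itertools import product

def eval_circuit(gates, x):
    """Evaluate a levelled circuit.  `x` are the input bits; each gate is a pair
    (op, input_indices) where the indices refer to earlier wires
    (inputs first, then gates in order).  The last gate is the output."""
    vals = list(x)
    for op, ins in gates:
        if op == "AND":
            vals.append(int(all(vals[i] for i in ins)))
        elif op == "OR":
            vals.append(int(any(vals[i] for i in ins)))
        else:  # "NOT"
            vals.append(1 - vals[ins[0]])
    return vals[-1]

# Majority of 3 as an OR of ANDs: x0x1 + x0x2 + x1x2 (wires 3, 4, 5, output 6).
maj3 = [("AND", (0, 1)), ("AND", (0, 2)), ("AND", (1, 2)), ("OR", (3, 4, 5))]
agree = all(eval_circuit(maj3, x) == int(sum(x) >= 2)
            for x in product([0, 1], repeat=3))
print(agree)  # True
```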
P/NP can be formulated as follows. The existence of a Hamiltonian cycle in an n-vertex graph is a Boolean function of the edge indicators. Can it be computed by circuits of polynomial size in n ?
Bounding circuit complexity from below has proven to be very hard. Existing results apply only to very limited submodels. For instance, for bounded-depth circuits, discrete harmonic analysis methods are relevant.