Notes of Gabor Pete’s talk

Noise sensitivity in critical percolation

Recall Garban’s talk: an algorithm to decide whether a crossing exists is to start an exploration from the left corner. If there is a left-right crossing, the exploration cannot reach the upper side.

Introduce a dynamic: flip each edge with small probability, once every second. In a short stretch of time, the crossing switches between left-right and up-down several times. So there seem to be plenty of pivotal edges, i.e. edges that, when flipped, change the crossing event.

1. Pivotal edges

From {SLE} technology, we know that

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(2 \textrm{ arms to distance } n)=n^{-1/4 +o(1)}, \end{array}

so the expected length of the interface is {\sim n^2 \cdot n^{-1/4}=n^{7/4}}.

1.1. How many pivotals are there?

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb E}|Piv_n| \sim n^2 \alpha_4 (n)=n^{3/4 +o(1)}. \end{array}

Let {\epsilon_n} be the probability to switch a given edge. If {\epsilon_n \mathop{\mathbb E}|Piv_n|} tends to {0} (low noise), then by Markov’s inequality, with high probability we don’t hit any pivotal; moreover {\mathop{\mathbb E}|Piv_n|^2 \leq C\, (\mathop{\mathbb E}|Piv_n| )^2}. This implies asymptotically full correlation between successive configurations.

If {\epsilon_n \mathop{\mathbb E}|Piv_n|} tends to infinity, we do hit many pivotals, but this does not imply asymptotic independence.

1.2. The Fourier spectrum

Let {f_n} denote the indicator function of the crossing event. Let {N_{\epsilon}} denote the noise operator,

\displaystyle  \begin{array}{rcl}  N_{\epsilon}f(\omega)=\mathop{\mathbb E}(f(\omega^\epsilon)\mid\omega). \end{array}

{N_{\epsilon}} is diagonal in the Fourier-Walsh basis,

\displaystyle  \begin{array}{rcl}  N_{\epsilon}\chi_S =(1-\epsilon)^{|S|}\chi_S . \end{array}
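This diagonalization is easy to verify exactly on a toy example. The sketch below (my own illustration, not from the talk) uses majority on three bits, taking {\omega^\epsilon} to mean each bit independently resampled with probability {\epsilon}, which is the convention matching the eigenvalue {(1-\epsilon)^{|S|}}:

```python
from itertools import product

n, eps = 3, 0.3
cube = list(product([-1, 1], repeat=n))

def maj(w):                     # majority of three ±1 bits
    return 1 if sum(w) > 0 else -1

def chi(S, w):                  # Fourier-Walsh character chi_S
    p = 1
    for i in S:
        p *= w[i]
    return p

def fourier(f, S):              # \hat f_S = E[f(w) chi_S(w)]
    return sum(f(w) * chi(S, w) for w in cube) / len(cube)

def noise(f, w):                # exact N_eps f(w): each bit independently
    total = 0.0                 # resampled with probability eps
    for v in cube:
        p = 1.0
        for wi, vi in zip(w, v):
            p *= (1 - eps / 2) if wi == vi else eps / 2
        total += p * f(v)
    return total

# N_eps is diagonal: the S-coefficient of N_eps f is (1-eps)^{|S|} \hat f_S
for S in [(), (0,), (0, 1), (0, 1, 2)]:
    lhs = fourier(lambda w: noise(maj, w), S)
    rhs = (1 - eps) ** len(S) * fourier(maj, S)
    assert abs(lhs - rhs) < 1e-12
```

The check is exact (a sum over all {2^n} noise outcomes), not a Monte Carlo estimate, so the identity holds to floating-point precision.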

Define the spectral sample {S_f} as the random subset of {[n]} whose distribution is given by the squared Fourier-Walsh coefficients {\hat{f}_S^2}. Notation:

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(S_f =S)=\hat{f}_S^2 . \end{array}

If for some sequence {k_n} with {\epsilon_n k_n\rightarrow\infty}, {\mathop{\mathbb P}(0<|S_{f_n}|<tk_n)} tends to {0} for every {t>0}, then we have asymptotic independence, since the correlation is

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb E}(fN_{\epsilon}f)-\mathop{\mathbb E}(f)^2 =\sum_{S\not=\emptyset}(1-\epsilon)^{|S|}\hat{f}_S^2 . \end{array}

1.3. Pivotals versus spectral sample

An easy calculation gives

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(i\in Piv_f)=\mathop{\mathbb P}(i\in S_f). \end{array}


\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(i\textrm{ and }j\in Piv_f)=\mathop{\mathbb P}(i\textrm{ and }j\in S_f), \end{array}

(but this does not hold for more points).
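Both identities, and the failure for three points, can be confirmed by brute force on a small function. The sketch below (my own check, using three-bit majority) computes {\mathop{\mathbb P}(T\subseteq Piv_f)} directly and {\mathop{\mathbb P}(T\subseteq S_f)} from the squared Fourier coefficients:

```python
from itertools import product, combinations
from math import prod

n = 3
cube = list(product([-1, 1], repeat=n))
subsets = [S for k in range(n + 1) for S in combinations(range(n), k)]

def maj(w):
    return 1 if sum(w) > 0 else -1

def flip(w, i):
    v = list(w)
    v[i] = -v[i]
    return tuple(v)

def fhat(S):                    # Fourier-Walsh coefficient \hat f_S
    return sum(maj(w) * prod(w[i] for i in S) for w in cube) / len(cube)

def p_piv(T):                   # P(every i in T is pivotal)
    return sum(all(maj(w) != maj(flip(w, i)) for i in T) for w in cube) / len(cube)

def p_spec(T):                  # P(T ⊆ S_f) = sum of \hat f_S^2 over S ⊇ T
    return sum(fhat(S) ** 2 for S in subsets if set(T) <= set(S))

# one- and two-point marginals agree ...
for T in [(0,), (1,), (2,), (0, 1), (0, 2), (1, 2)]:
    assert abs(p_piv(T) - p_spec(T)) < 1e-12

# ... but three-point ones differ: no configuration makes all three bits
# pivotal for majority, while \hat f_{\{0,1,2\}}^2 = 1/4
assert p_piv((0, 1, 2)) == 0.0
assert abs(p_spec((0, 1, 2)) - 0.25) < 1e-12
```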

For {f=f_n =} percolation, {\mathop{\mathbb E}|Piv_n|^2 \leq C\, (\mathop{\mathbb E}|Piv_n| )^2}, and by the two identities above the same second-moment bound holds for {S_{f_n}}; hence, by the Paley-Zygmund inequality, there exists {c>0} such that

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(|S_{f_n}|>c\,\mathop{\mathbb E}|S_{f_n}|)\geq c. \end{array}

1.4. Other events

Dictator has {\mathop{\mathbb P}(S_f =\{x_1\})=1}.

Majority has {\mathop{\mathbb P}(S_f =\{x_i\})\approx\frac{1}{n}} (most of the mass on singletons, very noise stable) and {\mathop{\mathbb E}|S_f| \sim\sqrt{n}} is huge (sharp threshold).

Parity is the most sensitive to noise.

1.5. BKS

Schramm tried to prove the existence of scaling limits using noise sensitivity.

Theorem 1 (Benjamini, Kalai, Schramm) A sequence of monotone Boolean functions is noise sensitive iff it is asymptotically uncorrelated with all weighted majorities.

Corollary 2 In particular, left-right percolation crossing is noise sensitive.

1.6. Schramm and Steif

Theorem 3 If {f} can be computed by a randomized algorithm with revealment {\delta}, then

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(|S_f|=k)=\sum_{|S|=k}\hat{f}_S^2 \leq \delta k\mathop{\mathbb E}(f^2). \end{array}

This does not give a good bound for percolation. The best revealment is at least {n^{-1/2}}, whereas the conjectured {\epsilon_n} is {n^{-3/4}}.
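On a toy case the inequality can still be checked exactly. The sketch below (my own illustration, not from the talk) uses three-bit majority with the natural algorithm that queries bits in uniformly random order and stops once the majority is determined; its revealment works out to {5/6}, and the bound holds at every level {k}:

```python
from itertools import product, permutations, combinations
from math import prod

n = 3
cube = list(product([-1, 1], repeat=n))
orders = list(permutations(range(n)))

def maj(w):
    return 1 if sum(w) > 0 else -1

def queried(w, order):
    # query bits in the given order; stop once the unseen bits can no
    # longer change the sign of the sum
    seen = []
    for i in order:
        seen.append(i)
        if abs(sum(w[j] for j in seen)) > n - len(seen):
            break
    return set(seen)

# revealment = max over bits i of P(bit i is queried), over a uniform
# random input and a uniform random query order
delta = max(
    sum(i in queried(w, o) for w in cube for o in orders)
    / (len(cube) * len(orders))
    for i in range(n)
)

def fhat(S):
    return sum(maj(w) * prod(w[i] for i in S) for w in cube) / len(cube)

# Schramm-Steif: spectral mass at level k is at most delta * k * E[f^2]
for k in range(1, n + 1):
    mass_k = sum(fhat(S) ** 2 for S in combinations(range(n), k))
    assert mass_k <= delta * k + 1e-12      # E[f^2] = 1 for majority
```

At level {k=1} the check is fairly tight: the level-1 mass is {3/4} against the bound {5/6}.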

2. Basic properties of the spectral sample

Gil Kalai proposed to study the whole distribution of {S_f}. We do not know how to sample from it classically (a quantum algorithm exists, Bernstein-Vazirani).

Now we know conformal invariance (Smirnov, Tsirelson, Schramm-Smirnov). Can we use it to describe {S_f}, {f=} percolation?

2.1. General facts

Lemma 4 (Linial, Mansour, Nisan) (Random restriction Lemma).

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(S_f \cap A =S)=\mathop{\mathbb E}\left(\mathop{\mathbb E}(f\chi_S \mid \mathcal{F}_{A^c})^2\right), \qquad S\subseteq A, \end{array}

where {\mathcal{F}_{A^c}} is generated by the bits in {A^c}. This gives control on the spectral sample within a subset.
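The random restriction lemma can again be verified by brute force on a small example. The sketch below (my own check, with three-bit majority and {A=\{0,1\}}) computes both sides for every {S\subseteq A}, conditioning on the bits outside {A}:

```python
from itertools import product, combinations
from math import prod

n = 3
cube = list(product([-1, 1], repeat=n))
A = (0, 1)
Ac = tuple(i for i in range(n) if i not in A)
subsets = [S for k in range(n + 1) for S in combinations(range(n), k)]

def maj(w):
    return 1 if sum(w) > 0 else -1

def fhat(S):
    return sum(maj(w) * prod(w[i] for i in S) for w in cube) / len(cube)

def lhs(S):
    # P(S_f ∩ A = S): total squared Fourier mass of the T with T ∩ A = S
    return sum(fhat(T) ** 2 for T in subsets if set(T) & set(A) == set(S))

def rhs(S):
    # E( E(f·chi_S | bits in A^c)^2 ): fix the bits outside A, average
    # f·chi_S over the bits in A, square, then average over the fixed bits
    total = 0.0
    for outer in product([-1, 1], repeat=len(Ac)):
        matching = [w for w in cube
                    if all(w[i] == o for i, o in zip(Ac, outer))]
        inner = sum(maj(w) * prod(w[i] for i in S)
                    for w in matching) / len(matching)
        total += inner ** 2
    return total / 2 ** len(Ac)

for k in range(len(A) + 1):
    for S in combinations(A, k):
        assert abs(lhs(S) - rhs(S)) < 1e-12
```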

2.2. Separation

Lemma 5 (Kesten 1987, Damron, Sapozhnikov 2009) (Strong separation Lemma). If {B} is far from the sides of the square, then, conditioned on the {4} interfaces reaching {\partial B}, the interfaces are well-separated around {\partial B} with uniformly positive conditional probability.

Corollary 6

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(S_{f_n}\cap B\not=\emptyset)\sim\alpha_4 (n). \end{array}

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb E}(|S_{f_n}\cap B|\mid S_{f_n}\cap B\not=\emptyset)\sim\alpha_2 (n). \end{array}

2.3. Self similarity

The number of {r}-boxes hit by {S_{f_n}} is {\sim\mathop{\mathbb E}|S_{f_{n/r}}|}.

{S_{f_n}} is very different from a uniform random set {U_n} of the same density. Typically, for {r\gg n^{5/8}}, {U_n} intersects all {r}-boxes.

2.4. Tightness

We would like to estimate how much {|S_{f_n}\cap B|} differs from its expectation. This works for the zero set of random walks on the line, but there is less independence for {S_{f_n}}. For instance, we do not know how close to independent the events “{S_{f_n}} hits {B}” and “{S_{f_n}} hits {B'}” are. But we do understand “{S_{f_n}} hits {B}” versus “{S_{f_n}\subset A}”.

Theorem 7 (Garban, Pete, Schramm) On the triangular lattice,

\displaystyle  \begin{array}{rcl}  \mathop{\mathbb P}(0<|S_{f_n}|<\lambda\mathop{\mathbb E}|S_{f_n}|)\sim\lambda^{2/3}. \end{array}

So the scaling limit of {S_{f_n}} is a conformally invariant Cantor set with Hausdorff dimension {\frac{3}{4}}.

We also prove that the scaling limit of dynamical percolation exists as a Markov process. The above theorem implies that this process is ergodic.

2.5. Questions

  1. What about other Boolean functions? Do typical random restrictions of large Boolean functions look generic? Compare with Szemerédi’s Regularity Lemma (1975) and Chatterjee-Ledoux 2009 for submatrices of Hermitian matrices.
  2. Self similarity of {Piv_{n}} and {S_{f_n}} implies that the entropy of these random sets {X_n} should be at most {\mathop{\mathbb E}|X_n|}, so no {\log} factors as in the uniform case. This looks like fractal percolation on a {b}-ary tree.

2.6. Connection with Influence-Entropy conjecture

Influence-Entropy conjecture (Friedgut-Kalai 1996).

\displaystyle  \begin{array}{rcl}  \sum_S \hat{f}_S^2 \log\frac{1}{\hat{f}_S^2}\leq C\,\sum_S |S|\hat{f}_S^2 . \end{array}

Considered very hard (much stronger than KKL). Does it hold for {S_{f_n}}?


About metric2011

metric2011 is a program of Centre Emile Borel, an activity of Institut Henri Poincaré, 11 rue Pierre et Marie Curie, 75005 Paris, France.
