**Dictatorship testing and hardness of approximation**

We shall explain a few *sharp* hardness of approximation results. This involves several components: PCP, Parallel Repetition, discrete analysis. Then all these components must be merged together, and the merging is usually delicate.

Several points of view are needed. For instance, we shall frequently shift from the probabilistic checking of proofs to instances of combinatorial optimization problems.

**1. Linear equations**

In 1996, Johan Håstad gave a sharp hardness of approximation result for ELIN. Specifically, he proved that for every $\epsilon>0$, $(1-\epsilon,\frac{1}{2}+\epsilon)$-GAP ELIN is NP-hard.

**1.1. ELIN**

An instance $E$ of ELIN is a (weighted) set of linear equations over Boolean variables $x_1,\dots,x_n\in\{0,1\}$, each of the form $x_{i_1}+x_{i_2}+x_{i_3}=b \pmod 2$, $b\in\{0,1\}$.

Say that

- $E$ is a YES instance if there exists an assignment of variables that satisfies at least a fraction $c$ of equations.
- $E$ is a NO instance if no assignment of variables satisfies more than a fraction $s$ of equations.

The $(c,s)$-GAP version of ELIN consists in deciding whether a given instance, which is known to be YES or NO, is YES or is NO.
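As a concrete illustration, here is a minimal Python sketch of an ELIN instance; the representation and function names are my own, not from the lecture. The gap problem only asks to distinguish a high best fraction from a low one, but for tiny instances we can compute it exactly by brute force.

```python
import itertools

def satisfied_fraction(equations, assignment):
    """Fraction of equations x_i + x_j + x_k = b (mod 2) satisfied by the assignment."""
    ok = sum(1 for (i, j, k), b in equations
             if (assignment[i] + assignment[j] + assignment[k]) % 2 == b)
    return ok / len(equations)

def best_fraction(equations, n):
    """Best fraction over all 2^n assignments (brute force, tiny n only)."""
    return max(satisfied_fraction(equations, a)
               for a in itertools.product((0, 1), repeat=n))

# Toy instance on 3 variables: two contradictory equations.
eqs = [((0, 1, 2), 0), ((0, 1, 2), 1)]
print(best_fraction(eqs, 3))  # 0.5: every assignment satisfies exactly one equation
```

Of course, brute force is exponential; the whole point of the hardness result is that even distinguishing best fraction $\ge 1-\epsilon$ from $\le\frac12+\epsilon$ is NP-hard.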

**1.2. Reduction**

A reduction from GAP BLAH’ to GAP BLAH means a polytime algorithm which computes, from an instance $E'$ of problem BLAH’, an instance $E$ of problem BLAH, in such a way that if $E'$ is YES, so is $E$, and if $E'$ is NO, so is $E$.

Usual decision problems are special cases of gap problems: $(1,1)$-GAP problems. The PCP theorem states that a certain gap problem is NP-hard, i.e. admits reductions from arbitrary NP decision problems.

**1.3. Testing**

Let us switch points of view, and use the testing interpretation. The reduction generates a distribution over equations. Pick an equation at random and evaluate it. An assignment $x$ is viewed as an attempt to prove that $E$ is YES. If $E$ is YES, then there exists an assignment $x$ such that the random equation is satisfied with probability $\ge c$. This is the *completeness* property. If $E$ is NO, then for every assignment $x$, the random equation is satisfied with probability $\le s$. This is the *soundness* property.

In this language, we see Håstad’s theorem as the conclusion of a long story. Every mathematical proof can be written formally (Frege) and checked in polynomial time (Gödel). Cook–Levin: we can even verify proofs by local tests. Once the proof is written according to suitable rules, it is sufficient to check the consistency of each $k$-tuple of bits. PCP says a bounded number of queries suffices to assert correctness of the proof up to a small probability of error. We are dealing with the ultimate result: 3 bits are sufficient, and with the optimal probability gap of $1-\epsilon$ versus $\frac{1}{2}+\epsilon$.

**1.4. Boolean functions on $\{-1,1\}^n$**

**Example 1.** Majority is $f(x)=\operatorname{sign}(x_1+\cdots+x_n)$, $n$ odd. The $i$-th dictatorship is $f(x)=x_i$. Products of dictatorships constitute the Walsh basis of characters $\chi_S(x)=\prod_{i\in S}x_i$, $S\subseteq\{1,\dots,n\}$. A $k$-junta is a Boolean function that depends on at most $k$ variables.

Note that characters are multiplicative: $\chi_S(xy)=\chi_S(x)\chi_S(y)$, where $xy$ denotes the coordinatewise product.
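To make the notation concrete, here is a small Python check of multiplicativity on the cube (a sketch; the function name `chi` and the chosen $n$, $S$ are mine):

```python
import itertools

def chi(S, x):
    """Character chi_S(x) = prod_{i in S} x_i on the cube {-1,1}^n."""
    p = 1
    for i in S:
        p *= x[i]
    return p

n, S = 4, (0, 2)
for x in itertools.product((-1, 1), repeat=n):
    for y in itertools.product((-1, 1), repeat=n):
        xy = tuple(a * b for a, b in zip(x, y))
        assert chi(S, xy) == chi(S, x) * chi(S, y)
print("chi_S(xy) = chi_S(x) chi_S(y) on all", (2 ** n) ** 2, "pairs")
```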

**1.5. Testing characters**

A $k$-bit test detecting a class $\mathcal{C}$ of Boolean functions consists of a probability distribution on $k$-tuples $(x^1,\dots,x^k)$ of points of $\{-1,1\}^n$ and a decision rule $D:\{-1,1\}^k\to\{\text{accept},\text{reject}\}$. We accept $f$ if $D(f(x^1),\dots,f(x^k))=\text{accept}$.

Its *completeness* is $c$ if $\mathbb{P}(\text{accept } f)\ge c$ whenever $f\in\mathcal{C}$. Its *soundness* is $s$ if for all $f$ far enough from $\mathcal{C}$ (in terms of Hamming distance), $\mathbb{P}(\text{accept } f)\le s$. (Far enough will be specified in examples.)

The *character test* (aka *linearity test*) uses multiplicativity. It is a $3$-bit test detecting the class of characters. The distribution is obtained as follows: pick $x$ and $y$ uniformly at random in $\{-1,1\}^n$, and let $z=xy$. The decision is: accept iff $f(z)=f(x)f(y)$. In other words, $f$ is accepted whatever the input iff $f$ is multiplicative, i.e. iff $f$ is a character.
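The test is easy to run empirically. Here is a Monte-Carlo sketch (function names and parameters are mine; the acceptance probabilities it reports are empirical estimates):

```python
import random

def blr_accept_prob(f, n, trials=20000, seed=0):
    """Empirical acceptance probability of the 3-query character test:
    pick x, y uniform in {-1,1}^n, accept iff f(xy) = f(x) f(y)."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        x = tuple(rng.choice((-1, 1)) for _ in range(n))
        y = tuple(rng.choice((-1, 1)) for _ in range(n))
        z = tuple(a * b for a, b in zip(x, y))
        ok += f(z) == f(x) * f(y)
    return ok / trials

n = 5
character = lambda x: x[0] * x[2]             # chi_S for S = {0,2}
majority = lambda x: 1 if sum(x) > 0 else -1  # n odd; not a character
print(blr_accept_prob(character, n))  # 1.0: characters always pass
print(blr_accept_prob(majority, n))   # strictly between 1/2 and 1
```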

The test is complete by construction. Let us show that if the Hamming distance of $f$ to every character exceeds $\delta\,2^n$, then the test accepts $f$ with probability less than $1-\delta$. Bellare, Goldreich and Sudan [BGS] had a cumbersome proof, limited to small $\delta$. Håstad [H] gave the nice and general proof which follows. Note this is a sort of zero-knowledge proof, since even when the test accepts with high probability, one does not know which character $f$ is close to.

*Proof:* View $f$ as a real valued function and write it as a linear combination of characters (Fourier–Walsh transform),

$$f=\sum_{S\subseteq\{1,\dots,n\}}\hat f(S)\,\chi_S.$$

This can be proven as follows. Given real valued functions $g$, $h$ on $\{-1,1\}^n$, define the inner product

$$\langle g,h\rangle=2^{-n}\sum_{x\in\{-1,1\}^n}g(x)h(x)=\mathbb{E}[g(x)h(x)].$$

Since the coordinates $x_i$ are independent and have expectation $0$,

$$\langle\chi_S,\chi_T\rangle=\mathbb{E}\Big[\prod_{i\in S\triangle T}x_i\Big]=0$$

unless $S\triangle T=\emptyset$, i.e. $S=T$. So characters form an orthonormal basis. Furthermore, $\hat f(S)=\langle f,\chi_S\rangle$. If $f$ is Boolean,

$$\sum_S\hat f(S)^2=\langle f,f\rangle=1.$$

The probability that the test accepts is given by

$$\mathbb{P}(\text{accept})=\mathbb{E}\left[\frac{1+f(x)f(y)f(xy)}{2}\right]=\frac{1}{2}+\frac{1}{2}\sum_S\hat f(S)^3,$$

since, expanding each factor in the Walsh basis, $\mathbb{E}[\chi_S(x)\chi_T(y)\chi_U(xy)]=\mathbb{E}[\chi_{S\triangle U}(x)]\,\mathbb{E}[\chi_{T\triangle U}(y)]$ vanishes unless $S=T=U$.

To sum up, if the probability of acceptance is $1-\delta$, then

$$\sum_S\hat f(S)^3=1-2\delta.$$

Since $\sum_S\hat f(S)^2=1$, we have $\sum_S\hat f(S)^3\le\max_S\hat f(S)$, so there must be some $S$ such that $\hat f(S)\ge 1-2\delta$. The Hamming distance of $f$ and $\chi_S$ is

$$\#\{x:f(x)\ne\chi_S(x)\}=2^n\,\frac{1-\hat f(S)}{2}\le\delta\,2^n.$$
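The identities used in the proof (orthonormality, Parseval, and the acceptance formula) can be checked by brute force on a small cube. The following Python sketch (names are mine) does so for Majority on 3 bits, whose Fourier coefficients are $\frac12$ on singletons and $-\frac12$ on the full set:

```python
import itertools
from math import prod

n = 3
cube = list(itertools.product((-1, 1), repeat=n))
subsets = [S for r in range(n + 1) for S in itertools.combinations(range(n), r)]

def fourier(f):
    """hat f(S) = 2^{-n} sum_x f(x) chi_S(x), computed by brute force."""
    return {S: sum(f(x) * prod(x[i] for i in S) for x in cube) / len(cube)
            for S in subsets}

def accept_prob(f):
    """Exact acceptance probability of the character test on f."""
    return sum((1 + f(x) * f(y) * f(tuple(a * b for a, b in zip(x, y)))) / 2
               for x in cube for y in cube) / len(cube) ** 2

maj = lambda x: 1 if sum(x) > 0 else -1
co = fourier(maj)
assert abs(sum(c * c for c in co.values()) - 1) < 1e-9      # Parseval
assert abs(accept_prob(maj)
           - (0.5 + 0.5 * sum(c ** 3 for c in co.values()))) < 1e-9
print(accept_prob(maj))  # 0.625 = 1/2 + 1/2 * (3*(1/2)^3 + (-1/2)^3)
```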

Observe that, due to concentration, most functions $f$ have all Fourier coefficients close to $0$, i.e. acceptance probability close to $\frac{1}{2}$. Given a constant $\epsilon>0$, when $n$ is large, the proportion of $f$ such that $\mathbb{P}(\text{accept})\ge\frac{1}{2}+\epsilon$ decays exponentially. So we have really tested something.
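This concentration can be observed numerically. The sketch below (my own code, relying on the acceptance formula proven above) samples random Boolean functions on 6 variables and computes their exact acceptance probabilities:

```python
import itertools
import random
from math import prod

n = 6
cube = list(itertools.product((-1, 1), repeat=n))
subsets = [S for r in range(n + 1) for S in itertools.combinations(range(n), r)]

def accept_prob(table):
    """Acceptance probability 1/2 + 1/2 sum_S hat f(S)^3 for a truth table
    listed in the order of `cube`."""
    coeffs = [sum(v * prod(x[i] for i in S) for v, x in zip(table, cube))
              / len(cube) for S in subsets]
    return 0.5 + 0.5 * sum(c ** 3 for c in coeffs)

rng = random.Random(0)
probs = [accept_prob([rng.choice((-1, 1)) for _ in cube]) for _ in range(100)]
print(min(probs), max(probs))  # both close to 1/2: random f barely passes
```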

**1.6. From testing characters to testing dictatorships**

Assume we are allowed any bounded number of queries. Iterate the character test. If $f$ passes all tests, then $f$ is very close to some character. How can one distinguish a dictatorship from a character $\chi_S$ with $|S|\ge 2$? Non-dictatorship characters are much more noise sensitive: the probability that $\chi_S(x)$ changes if one uniformly random coordinate is changed is $\frac{|S|}{n}$, at least twice larger for non-dictatorships than for dictatorships. Do we really need the preliminary character test? Maybe not.

Here is a first attempt at a dictatorship test. Let $\epsilon$ be a noise parameter. Pick $x$ and $y$ uniformly at random, independently. Let $\mu$ be an $\epsilon$-noise on $n$ variables, i.e. each coordinate $\mu_i$ independently takes value $-1$ with probability $\epsilon$ and $+1$ with probability $1-\epsilon$. Accept iff $f(xy\mu)=f(x)f(y)$. Dictatorships pass this test with probability $1-\epsilon$ (completeness is $1-\epsilon$). Unfortunately, the constant function $1$ passes the test with probability $1$, so soundness is $1$ even for functions at distance $2^{n-1}$ from all dictatorships.

Observe that dictatorships satisfy $f(-x)=-f(x)$, whereas the constant function $1$ does not. So here is Håstad’s *dictatorship test*. Pick $x$ and $y$ uniformly at random, independently. Let $\mu$ be an $\epsilon$-noise, and $b$ uniform in $\{-1,1\}$. Accept iff $f(x)f(y)f(bxy\mu)=b$. Again, completeness is $1-\epsilon$. For soundness, we shall show today that if the probability of acceptance is $\frac{1}{2}+\delta$, then there exists $S$ such that $\hat f(S)(1-2\epsilon)^{|S|}\ge 2\delta$.

*Proof:* If $\frac{1}{2}+\delta$ is the probability of acceptance,

$$\frac{1}{2}+\delta=\mathbb{E}\left[\frac{1+b\,f(x)f(y)f(bxy\mu)}{2}\right]=\frac{1}{2}+\frac{1}{2}\sum_{|S|\ \text{odd}}\hat f(S)^3(1-2\epsilon)^{|S|},$$

since $\mathbb{E}[b^{|S|+1}]$ vanishes for $|S|$ even and $\mathbb{E}[\chi_S(\mu)]=(1-2\epsilon)^{|S|}$. As $\sum_S\hat f(S)^2=1$, some odd $S$ satisfies $\hat f(S)(1-2\epsilon)^{|S|}\ge 2\delta$.

**References**

[BGS] Bellare, Mihir; Goldreich, Oded; Sudan, Madhu; *Free bits, PCPs, and nonapproximability – towards tight results.* SIAM J. Comput. **27** (1998), no. 3, 804–915.

[H] Håstad, Johan; *Some optimal inapproximability results.* J. ACM **48** (2001), no. 4, 798–859.