**Geometry and combinatorics**

The theory of expander graphs gives us an idea of the impact combinatorics may have on the rest of mathematics in the future.

Most existing combinatorics deals with one-dimensional objects. Understanding higher-dimensional situations is important; in particular, one would like to understand random simplicial complexes.

**1. Permutations**

In a hike, the most dangerous moment is the very beginning: you might take the wrong trail. So let us spend some time at the starting point of combinatorics, permutations.

A permutation matrix is nothing but an $n\times n$ array of 0’s and 1’s, with exactly one 1 in each row and each column.

This suggests the following $d$-dimensional generalization: consider $[n]^{d+1}$-arrays of 0’s and 1’s, with exactly one 1 in each line (lines in all $d+1$ directions).

How many such things are there? For $d=2$, this is related to counting Latin squares. A Latin square is an $n\times n$ array of integers in $\{1,\dots,n\}$ such that every $i\in\{1,\dots,n\}$ appears exactly once in each row and each column. View a Latin square as a topographical map: the entry at $(i,j)$ equals the height $k$ at which the nonzero entry of the $n\times n\times n$ array sits in the shaft above $(i,j)$.
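This topographic correspondence is easy to check in code. A minimal sketch in Python (the function names are mine, not from the talk):

```python
def latin_to_cube(L):
    """Convert an n x n Latin square (entries 1..n) into an n x n x n
    0/1 array: the 1 in shaft (i, j) sits at height L[i][j]."""
    n = len(L)
    A = [[[0] * n for _ in range(n)] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            A[i][j][L[i][j] - 1] = 1
    return A

def is_2d_permutation(A):
    """Every axis-parallel line must contain exactly one 1."""
    n = len(A)
    for a in range(n):
        for b in range(n):
            if sum(A[a][b][k] for k in range(n)) != 1:  # shafts
                return False
            if sum(A[a][k][b] for k in range(n)) != 1:  # lines within a layer
                return False
            if sum(A[k][a][b] for k in range(n)) != 1:  # lines across layers
                return False
    return True

# A Latin square of order 3: cyclic shifts of (1, 2, 3).
L = [[1, 2, 3],
     [2, 3, 1],
     [3, 1, 2]]
print(is_2d_permutation(latin_to_cube(L)))  # True
```

Conversely, reading off the height of the unique 1 in each shaft recovers the Latin square.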

In van Lint and Wilson’s book, one finds the following asymptotic formula for the number $L_n$ of $n\times n$ Latin squares:

$$L_n = \left((1+o(1))\frac{n}{e^2}\right)^{n^2}.$$

This was an illumination to me. It suggests the following asymptotic formula for the number of generalized permutations.

**Conjecture**: *The number of $d$-dimensional permutations is $\left((1+o(1))\frac{n}{e^d}\right)^{n^d}$.*

At present we can only prove an upper bound.

Theorem 1 (Linial, Z. Luria) *The number of $d$-dimensional permutations is at most $\left((1+o(1))\frac{n}{e^d}\right)^{n^d}$.*

This follows from the theory of the permanent.

**2. Permanent**

Definition 2 *The permanent of an $n\times n$ matrix $A$ is the sum of all $n!$ terms of the determinant, without signs:*

$$\mathrm{per}(A)=\sum_{\sigma\in S_n}\prod_{i=1}^{n} a_{i\sigma(i)}.$$

It is important, though naive looking. If $A$ is the bipartite adjacency matrix of a bipartite graph $G$, then $\mathrm{per}(A)$ is the number of perfect matchings of $G$.
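Both facts can be illustrated with a brute-force sketch in Python, only feasible for small matrices (the example graph is mine):

```python
from itertools import permutations
from math import prod

def permanent(A):
    """per(A) = sum over permutations sigma of prod_i A[i][sigma(i)]."""
    n = len(A)
    return sum(prod(A[i][s[i]] for i in range(n)) for s in permutations(range(n)))

# Bipartite adjacency matrix of a 6-cycle: left vertex i is joined to
# right vertices i and i+1 (mod 3).
A = [[1, 1, 0],
     [0, 1, 1],
     [1, 0, 1]]
print(permanent(A))  # 2 -- a 6-cycle has exactly two perfect matchings
```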

There are two major questions concerning bounds on permanents.

**2.1. Van der Waerden’s conjecture**

Van der Waerden conjectured in the 1920’s, and Falikman and Egorychev proved around 1980, the following lower bound.

Definition 3 *An $n\times n$ matrix is doubly stochastic if it is nonnegative and all its rows and columns add up to 1.*

Easy observation: if $A$ is doubly stochastic, its permanent is positive. By compactness, there is a positive minimum. Van der Waerden conjectured that the minimum is attained by the matrix all of whose entries equal $1/n$.

Theorem 4 (Falikman, Egorychev) *If $A$ is doubly stochastic, then $\mathrm{per}(A)\ge \dfrac{n!}{n^n}$.*
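The value $n!/n^n$ is exactly the permanent of the conjectured minimizer, which is easy to confirm numerically for small $n$ (a quick sketch; the brute-force permanent is only practical for tiny matrices):

```python
from itertools import permutations
from math import prod, factorial

def permanent(A):
    n = len(A)
    return sum(prod(A[i][s[i]] for i in range(n)) for s in permutations(range(n)))

# The minimizer in van der Waerden's conjecture: all entries equal to 1/n.
n = 4
J = [[1 / n] * n for _ in range(n)]
print(permanent(J))           # 0.09375
print(factorial(n) / n**n)    # 0.09375, i.e. n!/n^n
```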

**2.2. Minc’s conjecture**

In the 60’s, Minc conjectured and later Bregman proved the following upper bound.

Theorem 5 (Bregman) *If $A$ is a 0/1 matrix with $r_i$ 1’s in the $i$-th row, then $\mathrm{per}(A)\le \prod_{i=1}^{n}(r_i!)^{1/r_i}$.*
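For tiny matrices the inequality can even be verified exhaustively. A sketch in Python (a small floating-point tolerance is added because equality holds, e.g., for the all-ones matrix):

```python
from itertools import permutations, product
from math import prod, factorial

def permanent(A):
    n = len(A)
    return sum(prod(A[i][s[i]] for i in range(n)) for s in permutations(range(n)))

def bregman_bound(A):
    """prod_i (r_i!)^(1/r_i), where r_i is the number of 1's in row i."""
    return prod(factorial(sum(row)) ** (1 / sum(row)) for row in A if sum(row) > 0)

# Check per(A) <= Bregman bound over all 3x3 0/1 matrices with no zero row.
ok = all(
    permanent(M) <= bregman_bound(M) + 1e-9
    for M in product(product((0, 1), repeat=3), repeat=3)
    if all(any(row) for row in M)
)
print(ok)  # True
```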

Our upper bound on the number of Latin squares (the case $d=2$) is a simple consequence of these two theorems.

Consider boxes, say $n\times n\times k$ 0/1 arrays, and try to extend them into Latin cubes. The next layer is a permutation matrix, with 1’s forbidden in positions whose shaft already contains a 1 in the box. The number of possible next layers is the permanent of the matrix obtained from the all-ones matrix by removing the forbidden entries.
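This layer-by-layer picture can be turned into a (very slow) exact counter for small $n$. In the sketch below, each recursive level enumerates precisely the generalized diagonals of the “allowed” matrix, whose number is its permanent:

```python
from itertools import permutations

def count_latin_squares(n):
    """Build the n x n x n 0/1 array layer by layer: each new layer is a
    permutation matrix avoiding shafts already used by earlier layers."""
    def extend(used, layers_left):
        # used[i][j] == 1 iff shaft (i, j) already contains a 1
        if layers_left == 0:
            return 1
        allowed = [[1 - used[i][j] for j in range(n)] for i in range(n)]
        total = 0
        for s in permutations(range(n)):
            if all(allowed[i][s[i]] for i in range(n)):
                new_used = [row[:] for row in used]
                for i in range(n):
                    new_used[i][s[i]] = 1
                total += extend(new_used, layers_left - 1)
        return total
    return extend([[0] * n for _ in range(n)], n)

print(count_latin_squares(3))  # 12
```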

**2.3. Difficulties in higher dimensions**

The above connection between permanents and perfect matchings does not generalize. There exists an $n\times n\times n$ 0/1 array such that every line has exactly two 1’s but the array does not contain a 2-dimensional permutation. Therefore I suspect that a different proof of the lower bound on the number of Latin squares needs to be found.

Here is another fact which becomes difficult in higher dimensions. Doubly stochastic matrices form a convex polytope $B_n$ (the Birkhoff polytope), whose vertices are described by the Birkhoff-von Neumann theorem.

Theorem 6 (Birkhoff-von Neumann) *The vertices of $B_n$ are exactly the permutation matrices.*
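Equivalently, every doubly stochastic matrix is a convex combination of permutation matrices, and the standard greedy argument is constructive: repeatedly find a permutation supported on the positive entries and subtract it with the largest possible weight. A brute-force sketch in Python (exponential in $n$, for illustration only):

```python
from itertools import permutations

def birkhoff_decomposition(A, tol=1e-12):
    """Write a doubly stochastic matrix as a convex combination of
    permutation matrices, greedily (brute force over S_n)."""
    n = len(A)
    A = [row[:] for row in A]  # work on a copy
    pieces = []
    while True:
        # a permutation supported on the remaining positive entries
        s = next((s for s in permutations(range(n))
                  if all(A[i][s[i]] > tol for i in range(n))), None)
        if s is None:
            break
        w = min(A[i][s[i]] for i in range(n))
        pieces.append((w, s))
        for i in range(n):
            A[i][s[i]] -= w
    return pieces

A = [[0.5, 0.5, 0.0],
     [0.25, 0.25, 0.5],
     [0.25, 0.25, 0.5]]
pieces = birkhoff_decomposition(A)
print(sum(w for w, _ in pieces))  # 1.0 -- the weights form a convex combination
```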

Triply stochastic (and, more generally, $d$-stochastic) arrays are defined in a similar way, but the corresponding polytopes have many more vertices.

**3. Proof of Bregman’s theorem**

Next I explain how we could generalize Bregman’s theorem. We were influenced by proofs due to Schrijver and Radhakrishnan.

**3.1. Entropy**

My main point is to illustrate the method of entropy. Entropy measures how much information one gets by sampling a random source. Given a probability distribution $p=(p_1,\dots,p_N)$ on some $N$-point set, its entropy is

$$H(p)=\sum_{i=1}^{N} p_i\log_2\frac{1}{p_i}.$$

Then $H(p)\le \log_2 N$, with equality iff $p$ is uniform. The entropy of a random variable is the entropy of its distribution.

Let $X=(X_1,\dots,X_n)$ be a vector-valued random variable. The following (easy) chain rule is essential:

$$H(X)=\sum_{i=1}^{n} H(X_i \mid X_1,\dots,X_{i-1}).$$
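Both facts are easy to check numerically. A small sketch in Python (the joint distribution is an arbitrary toy example of mine):

```python
from math import log2

def H(p):
    """Shannon entropy (base 2) of a probability distribution."""
    return sum(q * log2(1 / q) for q in p if q > 0)

# H is maximized by the uniform distribution: H(p) <= log2(N).
print(H([0.5, 0.25, 0.25]))  # 1.5, below log2(3) ~ 1.585

# Chain rule H(X, Y) = H(X) + H(Y | X) on a tiny joint distribution.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.5}  # P(X=x, Y=y)
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
h_joint = H(list(joint.values()))
h_x = H(list(px.values()))
h_y_given_x = sum(
    px[x] * H([joint.get((x, y), 0) / px[x] for y in (0, 1)]) for x in (0, 1)
)
print(abs(h_joint - (h_x + h_y_given_x)) < 1e-12)  # True
```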

**3.2. Generalized diagonals**

In a 0/1 matrix, a *generalized diagonal* is a set of $n$ entries equal to 1, one from every row and one from every column. Thus $\mathrm{per}(A)$ is the number of generalized diagonals of $A$.

We must count the elements of a large set $S$. If $X$ is a uniform random variable on $S$, then $H(X)=\log_2|S|$. Here $S$ is the set of all generalized diagonals of $A$, so $\log_2\mathrm{per}(A)=H(X)$.

Let us draw a generalized diagonal uniformly at random, yielding $X=(X_1,\dots,X_n)$, where $X_i$ is the column of the chosen 1 in the $i$-th row. Let $r_i$ denote the number of 1’s in the $i$-th row.

Then, by the chain rule,

$$\log_2\mathrm{per}(A)=H(X)=\sum_{i=1}^{n} H(X_i\mid X_1,\dots,X_{i-1})\le\sum_{i=1}^{n}\mathbb{E}[\log_2 N_i],$$

where $N_i$ is the number of viable 1’s in the $i$-th row, or, equivalently, the number of 1’s in the $i$-th row whose columns are not already used by another chosen 1 (“in the shadow”). Let us average the inequality

over all possible orderings $\sigma$ of the rows. Recall that $N_i$ is the number of non-shaded 1’s in row $i$ relative to the ordering $\sigma$. In other words, it is the rank of our “special runner” — the chosen 1 of row $i$ — in a random ordering of the $r_i$ runners, and this rank is uniformly distributed on $\{1,\dots,r_i\}$. Therefore the expectation is

$$\mathbb{E}_\sigma[\log_2 N_i]=\frac{1}{r_i}\sum_{k=1}^{r_i}\log_2 k=\frac{\log_2(r_i!)}{r_i},$$

and summing over $i$ yields $\mathrm{per}(A)\le\prod_{i=1}^{n}(r_i!)^{1/r_i}$; this proves Bregman’s theorem.
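The rank-averaging step for a single row can be checked by exact enumeration. A sketch in Python (`special` marks the runner corresponding to the chosen 1; its rank is counted from the back of the ordering):

```python
from itertools import permutations
from math import log2, factorial

r = 4          # number of 1's in the row
special = 0    # the runner corresponding to the chosen 1
ranks = []
for order in permutations(range(r)):
    pos = order.index(special)
    ranks.append(r - pos)  # N: runners placed at or after the special one

avg = sum(log2(N) for N in ranks) / factorial(r)   # E[log2 N] over orderings
exact = log2(factorial(r)) / r                     # = (1/r) * sum_k log2 k
print(abs(avg - exact) < 1e-12)  # True
```

Since the rank takes each value in $\{1,\dots,r\}$ equally often, the average of $\log_2 N$ is exactly $\log_2(r!)/r$, as used above.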

In higher dimensions, the argument proceeds in more or less the same way.