Notes on Jean-Bernard Lasserre’s lecture no. 4

1. A new approach to nonnegativity

The SDP relaxations we have found up to now give lower bounds on the optimum. The new approach will give upper bounds.

1.1. Certifying nonnegativity via a known measure

Let {C(K)_d} be the convex cone of polynomials of degree {\leq d} which are {\geq 0} on {K}.

Previously, {K} was given by a set of inequalities {g=\{g_j\}}. Recall that the quadratic module is {Q(g)=\{\sum_j\sigma_j g_j \,;\,\sigma_j} a sum of squares{\}} (with the convention {g_0 =1}), and that {Q_d (g)} denotes its truncation at degree {d}. Since {Q_d (g)\subset C(K)_d}, replacing {C(K)_d} with {Q_d (g)} leads to a lower bound on the optimum. If {M-|X|^2 \in Q(g)} for some {M}, then every polynomial positive on {K} belongs to {Q(g)} (Putinar’s Positivstellensatz); in particular {C(K)_d} is the closure of {\bigcup_k Q_k (g)\cap{\mathbb R}[X]_d}, and the optima {f^*_d} computed over {Q_d (g)} converge to the optimum. When the optimal solution is unique, the optimal solutions of the SDPs converge as well, and this can be read off directly from the moment matrices.
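To make this recap concrete, here is a minimal univariate sketch of the moment-side relaxation that produces such lower bounds; it is not taken from the lecture, and it assumes Python with cvxpy and an SDP-capable solver such as SCS. The toy problem, minimizing {f(x)=x} on {K=[-1,1]} described by {g(x)=1-x^2\geq 0}, is chosen only for illustration: the order-{k} relaxation minimizes {L_y(f)} over pseudo-moment sequences {y} with {y_0=1}, {M_k(y)\succeq 0} and {M_{k-1}(gy)\succeq 0}.

```python
import cvxpy as cp

k = 2                                       # relaxation order
y = cp.Variable(2 * k + 1)                  # pseudo-moments y_0, ..., y_{2k}
Mk = cp.Variable((k + 1, k + 1), PSD=True)  # moment matrix M_k(y)
Loc = cp.Variable((k, k), PSD=True)         # localizing matrix M_{k-1}((1 - x^2) y)
constraints = [y[0] == 1]                   # y represents a probability measure
for i in range(k + 1):
    for j in range(k + 1):
        constraints.append(Mk[i, j] == y[i + j])
for i in range(k):
    for j in range(k):
        constraints.append(Loc[i, j] == y[i + j] - y[i + j + 2])
prob = cp.Problem(cp.Minimize(y[1]), constraints)  # L_y(f) = y_1 for f(x) = x
prob.solve()                                # any SDP-capable solver, e.g. SCS
print(prob.value)                           # lower bound on min_{[-1,1]} x; here -1, up to solver accuracy
```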

Now, we assume that a positive measure with support exactly equal to {K} is given. This models inverse problems, i.e. situations in which {K} is known only via measurements.

Then a continuous function {f} on {K} is {\geq 0} iff the signed measure {f\,d\mu} is nonnegative, and this needs to be checked on open subsets of {K} only.

Theorem 1 (Lasserre, 2010) Let {K} be compact, let {\mu} be a positive measure with support {K}, let {f} be a continuous function on {K}. Let {z_{\alpha}=\int_{K}X^{\alpha}f\,d\mu}. Then

  1. {f\geq 0} on {K} iff \displaystyle  \begin{array}{rcl}  \forall \textrm{ polynomials }h,\quad\int_{K}h^2 f\,d\mu \geq 0, \end{array} and this is equivalent to all moment matrices {M_d (z)} being positive semidefinite.
  2. If, in addition, {f} is concave, then one may replace {K} with its convex hull.

Proof: We use the following easy criterion when {K} is compact.

Lemma 2 Let {y=(y_{\alpha})}. Then {y} is the moment sequence of a positive measure on {[-1,1]^n} iff {y} is bounded and all moment matrices {M_d (y)} are positive semidefinite.

As before, we know that as soon as the rank of {M_d (y)} stabilizes, one does not need to consider higher-degree moments.

In the non-compact case, we use

Lemma 3 (Carleman) Let {y=(y_{\alpha})}. If all moment matrices {M_d (y)} are positive semidefinite, and for all {i},

\displaystyle  \begin{array}{rcl}  \sum_{k=1}^{\infty}L_y (x_i^{2k})^{-1/2k}=+\infty, \end{array}

then {y} is the moment sequence of a unique positive measure on {{\mathbb R}^n}.

\Box

In the space {{\mathbb R}[X]_d} of polynomials of degree {\leq d}, the set

\displaystyle  \begin{array}{rcl}  \Delta_k :=\{f\in{\mathbb R}[X]_d \,;\, M_k (fy) \textrm{ positive semidefinite}\} \end{array}

is a spectrahedron, i.e. the intersection of a linear subspace with the cone of positive semidefinite matrices. The spectrahedra {\Delta_k} decrease as {k} grows, and their intersection equals {C(K)_d}. They can be used to certify that {f} is not {\geq 0} on {K}: if some {M_k (fy)} fails to be positive semidefinite, then {f} takes negative values on {K}. In other words,

\displaystyle  \begin{array}{rcl}  \overline{\bigcup_{k=0}^{\infty}P_{k}^{d}(g)}=C(K)_d =\bigcap_{k=0}^{\infty}\Delta_k \end{array}
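As a small numerical illustration of this certificate (my own example, not from the lecture, assuming numpy), one can compute the signed moments {z_k=\int_K x^k f\,d\mu} in closed form for {K=[-1,1]} with Lebesgue measure {\mu} and test the Hankel matrices {M_d (z)=M_d (fy)} for positive semidefiniteness. A negative eigenvalue at some order certifies that {f} takes negative values on {K}; passing the test up to a fixed order is, of course, only consistent with nonnegativity.

```python
import numpy as np

def mom(k):
    """k-th moment of Lebesgue measure on K = [-1, 1]: int_K x^k dx."""
    return 0.0 if k % 2 else 2.0 / (k + 1)

def signed_moments(f_coeffs, n_mom):
    """z_k = int_K x^k f(x) dmu for the polynomial f(x) = sum_j f_coeffs[j] x^j."""
    return [sum(c * mom(k + j) for j, c in enumerate(f_coeffs)) for k in range(n_mom)]

def all_psd(z, d_max):
    """Check that the Hankel matrices M_d(z), d = 0..d_max, are positive semidefinite."""
    for d in range(d_max + 1):
        M = np.array([[z[i + j] for j in range(d + 1)] for i in range(d + 1)])
        if np.linalg.eigvalsh(M).min() < -1e-9:
            return False
    return True

d_max = 4
z_nonneg = signed_moments([1, 0, -1], 2 * d_max + 1)   # f(x) = 1 - x^2, nonnegative on K
z_signed = signed_moments([-0.5, 1], 2 * d_max + 1)    # f(x) = x - 1/2, changes sign on K
print(all_psd(z_nonneg, d_max))   # True: consistent with f >= 0 on K
print(all_psd(z_signed, d_max))   # False: certifies that f is not >= 0 on K
```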

1.2. A hierarchy of upper bounds

Theorem 4 Let {K} be compact, let {\mu} be a positive measure with support {K} and moments {y}, and let {f} be a continuous function on {K}. Consider the following two hierarchies of SDPs, indexed by {k}.

  1. minimize (over sums of squares {\sigma} of degree {\leq 2k}) {\int_{K}f \sigma\,d\mu} subject to the constraint

    \displaystyle  \begin{array}{rcl}  \int_{K}\sigma\,d\mu=1. \end{array}

  2. maximize (over real numbers {\lambda}) {\lambda} subject to the constraint

    \displaystyle  \begin{array}{rcl}  M_k ((f-\lambda)y) \textrm{ positive semidefinite}. \end{array}

Then the optima of 1. (resp. 2.) decrease and converge to {\inf_{K}f} as {k} tends to infinity.

Interpretation: formulation 1 amounts to optimizing {\int_{K}f\,d\nu} over measures {\nu} whose density w.r.t. {\mu} is a sum of squares of low degree. The speed of convergence depends on how well Dirac measures at minimizers can be approximated by such densities.
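In the univariate case, formulation 2 reduces to a generalized eigenvalue computation: {M_k((f-\lambda)y)=M_k(fy)-\lambda M_k(y)}, and since {M_k(y)} is positive definite here, the largest feasible {\lambda} is the smallest eigenvalue of the pencil {(M_k(fy),M_k(y))}. Below is a minimal sketch (my own example, assuming scipy) for {K=[-1,1]}, {\mu} the Lebesgue measure and {f(x)=x}, whose minimum on {K} is {-1}.

```python
import numpy as np
from scipy.linalg import eigh

def mom(k):
    """k-th moment of Lebesgue measure on [-1, 1]."""
    return 0.0 if k % 2 else 2.0 / (k + 1)

def moment_matrix(seq, k):
    """Hankel matrix with entries seq(i + j), 0 <= i, j <= k."""
    return np.array([[seq(i + j) for j in range(k + 1)] for i in range(k + 1)])

for k in range(1, 7):
    My = moment_matrix(mom, k)                       # M_k(y)
    Mfy = moment_matrix(lambda j: mom(j + 1), k)     # M_k(f y) for f(x) = x
    # max { lambda : M_k(f y) - lambda M_k(y) PSD } = smallest generalized eigenvalue
    upper_bound = eigh(Mfy, My, eigvals_only=True)[0]
    print(k, upper_bound)                            # decreases toward inf_K f = -1
```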

1.3. An example

The above method applies whenever {K} comes with a measure with computable moments.

Say a symmetric matrix {A} is copositive if {x^{\top}Ax\geq 0} for all vectors {x\in{\mathbb R}_+^n}, i.e. for all vectors with nonnegative entries. It is known that every {0/1} optimization problem can be expressed as a minimization over the cone {COP} of copositive matrices. The moments of the Gaussian measure on {COP} are known. The SDP upper bounds decrease fast in the first steps, but then the decrease slows down.
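As a simpler, hedged illustration of the same idea (my own choice of set and measure, not the lecture’s), one can test copositivity of {A} by applying formulation 2 to {f(x)=x^{\top}Ax} on the simplex {\{x\geq 0,\ \sum_i x_i\leq 1\}} with Lebesgue measure, whose moments {\int x^{\alpha}dx=\prod_i \alpha_i!/(n+|\alpha|)!} are classical Dirichlet integrals. The minimum of {f} over this simplex is {0} exactly when {A} is copositive, so a negative upper bound at some order certifies that {A} is not copositive.

```python
import itertools
import numpy as np
from math import factorial
from scipy.linalg import eigh

def simplex_moment(alpha):
    """int x^alpha dx over {x >= 0, sum_i x_i <= 1} in R^n: prod alpha_i! / (n + |alpha|)!."""
    n = len(alpha)
    num = 1
    for a in alpha:
        num *= factorial(a)
    return num / factorial(n + sum(alpha))

def monomials(n, k):
    """All exponent vectors alpha in N^n with |alpha| <= k."""
    return [a for a in itertools.product(range(k + 1), repeat=n) if sum(a) <= k]

def upper_bound(A, k):
    """Order-k upper bound on min {x^T A x : x in the simplex}: the smallest
    generalized eigenvalue of the pencil (M_k(f y), M_k(y)) with f(x) = x^T A x."""
    n = A.shape[0]
    basis = monomials(n, k)
    m = len(basis)
    My, Mfy = np.zeros((m, m)), np.zeros((m, m))
    for i, a in enumerate(basis):
        for j, b in enumerate(basis):
            ab = [p + q for p, q in zip(a, b)]
            My[i, j] = simplex_moment(ab)
            Mfy[i, j] = sum(A[p, q] * simplex_moment(
                                [e + (t == p) + (t == q) for t, e in enumerate(ab)])
                            for p in range(n) for q in range(n))
    return eigh(Mfy, My, eigvals_only=True)[0]

A_cop = np.array([[1.0, -0.8], [-0.8, 1.0]])  # copositive (it is even PSD)
A_not = np.array([[1.0, -2.0], [-2.0, 1.0]])  # not copositive
for k in range(1, 4):
    # bounds for A_cop stay >= 0 and tend to 0; bounds for A_not turn negative
    # from order 2 on, which certifies non-copositivity
    print(k, upper_bound(A_cop, k), upper_bound(A_not, k))
```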

1.4. Quadrature formulae

Quadrature formulae for a given measure {\mu} are finite combinations of Dirac measures which give the same integral as {\mu} for every polynomial of degree {\leq d}. They can be obtained from moment matrices as follows: in one variable, replace the last row of the moment matrix by {(1~x~x^2~\cdots)}. The quadrature points are the roots of the determinant of the resulting matrix, a polynomial of degree {d} in one variable. An alternative way to obtain an upper bound on the optimum is to compute these roots, evaluate {f} at them, and pick the smallest value encountered.
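In the univariate case this construction is easy to carry out numerically; here is a sketch (my own illustration, assuming numpy) for the Lebesgue measure on {[-1,1]}: build the Hankel moment matrix, expand the determinant along the replaced last row to obtain the coefficients of the degree-{d} polynomial, and take its roots. For this measure the points coincide with the Gauss-Legendre quadrature nodes.

```python
import numpy as np

def mom(k):
    """k-th moment of Lebesgue measure on [-1, 1]."""
    return 0.0 if k % 2 else 2.0 / (k + 1)

def quadrature_nodes(d):
    """Roots of the determinant of the (d+1)x(d+1) moment matrix whose last row
    is replaced by (1, x, ..., x^d), via Laplace expansion along that row."""
    H = np.array([[mom(i + j) for j in range(d + 1)] for i in range(d + 1)])
    coeffs = []  # coefficient of x^j in the determinant, j = 0, ..., d
    for j in range(d + 1):
        minor = np.delete(np.delete(H, d, axis=0), j, axis=1)
        coeffs.append((-1) ** (d + j) * np.linalg.det(minor))
    return np.sort(np.roots(coeffs[::-1]))  # np.roots expects highest degree first

d = 4
print(quadrature_nodes(d))
print(np.sort(np.polynomial.legendre.leggauss(d)[0]))  # Gauss-Legendre nodes, for comparison
```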

For instance, for MAX CUT, this yields an alternative to the Goemans-Williamson rounding procedure. The weak point, of course, is the root extraction.
