## Notes of Jean Lasserre’s lecture nr 2

1. Back to bounds for Positivstellensätze

In Schmüdgen’s Positivstellensatz,

Theorem 1 (Schmüdgen 1991) Let ${K}$ be a compact basic semi-algebraic set, defined by ${g_i \geq 0}$. Then a polynomial ${f}$ is positive on ${K}$ iff ${f\in P(g_1 ,\cdots, g_m)}$.

there are bounds (due to Marie-Françoise Roy, possibly unpublished), but they are very large.

Putinar’s theorem

Theorem 2 (Putinar, Jacobi, Prestel 1993) Let ${K}$ be a basic semi-algebraic set. Assume that there exists a constant ${M}$ such that ${M-|X|^2 \in Q(g_1 ,\cdots, g_m)}$. Then a polynomial ${f}$ is positive on ${K}$ iff ${f\in Q(g_1 ,\cdots, g_m)}$.

is an improvement, since the number of terms in the expression is ${m}$ instead of ${2^m}$. Apart from that, the a priori bounds are even worse than for Schmüdgen’s theorem.
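For concreteness, here is a toy Putinar certificate (a worked example added for illustration, not from the lecture). On ${K=[-1,1]}$, defined by ${g_1 (x)=1-x^2}$, the polynomial ${f(x)=x+2}$ is positive, and

$\displaystyle \begin{array}{rcl} x+2=\frac{1}{2}\left((x+1)^2 +2\right)+\frac{1}{2}\,(1-x^2), \end{array}$

so ${f\in Q(g_1)}$ with ${\sigma_0 =\frac{1}{2}((x+1)^2 +2)}$ (a sum of squares) and ${\sigma_1 =\frac{1}{2}}$.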

Linial: are there examples showing that the bounds have to be so bad? Answer: not that I know of. My personal view is that bad instances (very small coefficients…) lead to terrible computation times, as is common in symbolic computation.

1.1. An alternative representation

Again due to Krivine.

Theorem 3 (Krivine) Let ${K}$ be a basic semi-algebraic set, defined by ${g_i \geq 0}$. Assume that ${\{0,1,g_1 ,\ldots,g_m\}}$ generates the algebra ${{\mathbb R}[X]}$. Assume that ${K}$ is compact, and that all ${g_i \leq 1}$ on ${K}$. If ${f>0}$ on ${K}$, then there exist nonnegative constants ${c_{\alpha\beta}}$ such that

$\displaystyle \begin{array}{rcl} f=\sum_{\alpha,\,\beta\in{\mathbb N}^m}c_{\alpha\beta}g^{\alpha}(1-g)^{\beta}. \end{array}$

If one fixes an a priori bound on the degrees ${\alpha}$, ${\beta}$, finding the decomposition amounts to solving an LP (note that existing LP software is efficient on instances involving millions of variables). However, the computation is ill-conditioned due to binomial coefficients in the expansion of ${(1-g)^{\beta}}$.
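To make the LP concrete, here is a minimal sketch with toy data (not from the lecture; it assumes `scipy` is available). Take ${K=[0,1]}$ with ${g_1 (x)=x}$ and ${g_2 (x)=1-x}$, so the products ${g^{\alpha}(1-g)^{\beta}}$ are spanned by ${x^a (1-x)^b}$, and decompose ${f(x)=2+x-x^2}$, which is positive on ${K}$:

```python
# Toy instance of Krivine's LP decomposition (illustrative data):
# find c_{ab} >= 0 with f = sum_{a,b <= D} c_{ab} x^a (1-x)^b
# for f(x) = 2 + x - x^2 on K = [0,1].
from itertools import product
from scipy.optimize import linprog

D = 2                       # a priori bound on the exponents a, b
MAXDEG = 2 * D              # x^a (1-x)^b has degree a + b <= 2D

def polymul(p, q):
    """Multiply polynomials given as coefficient lists (constant term first)."""
    r = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

def polypow(p, n):
    r = [1.0]
    for _ in range(n):
        r = polymul(r, p)
    return r

pairs = list(product(range(D + 1), repeat=2))
columns = []
for a, b in pairs:          # coefficients of x^a (1-x)^b, padded to degree MAXDEG
    col = polymul(polypow([0.0, 1.0], a), polypow([1.0, -1.0], b))
    columns.append(col + [0.0] * (MAXDEG + 1 - len(col)))

# Match f = 2 + x - x^2 coefficientwise: one equality per degree 0..MAXDEG.
f_coeffs = [2.0, 1.0, -1.0] + [0.0] * (MAXDEG - 2)
A_eq = [[columns[k][deg] for k in range(len(pairs))] for deg in range(MAXDEG + 1)]

# Pure feasibility LP: zero objective, nonnegative unknowns c_{ab}.
res = linprog(c=[0.0] * len(pairs), A_eq=A_eq, b_eq=f_coeffs,
              bounds=[(0, None)] * len(pairs))
print(res.success)          # a certificate exists, e.g. f = 2 + x(1-x)
```

Here the LP is feasible, e.g. ${c_{00}=2}$, ${c_{11}=1}$, recovering ${f=2+x(1-x)}$; the ill-conditioning mentioned above shows up for larger degree bounds, where the binomial coefficients in ${(1-x)^b}$ grow quickly.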

Remark 1 Theorem 3 does not apply if ${f\geq 0}$ instead of ${f>0}$.

Indeed, let the ${g_i}$ be linear (so that ${K}$ is a polyhedron) but ${f}$ nonlinear, achieving its absolute minimum ${f(x^*)=0}$ at an interior point ${x^*}$ of some face with only one active constraint, ${\{g_1 =0\}}$, but not vanishing identically on that face, i.e. there exists a point ${x}$ where ${f(x)>0}$ and ${g_1 (x)=0}$. Assume a decomposition ${f=\sum c_{\alpha\beta}g^{\alpha}(1-g)^{\beta}}$ as in Theorem 3 exists. Since every term is nonnegative on ${K}$ and the sum vanishes at ${x^*}$, every term vanishes at ${x^*}$, hence has positive degree in ${g_1}$. But then the sum vanishes at ${x}$ as well, a contradiction.

2. Dual side: the moment problem

Problem: Given a set ${K\subset {\mathbb R}^n}$ and numbers ${y_{\alpha}}$, ${\alpha\in {\mathbb N}^n}$, does there exist a positive measure ${\mu}$ supported on ${K}$ such that

$\displaystyle \begin{array}{rcl} \forall \alpha\in {\mathbb N}^n ,\quad \int_{K}X^{\alpha}\,d\mu = y_{\alpha}\,? \end{array}$

Positive answers when ${n=1}$: Hamburger (${K={\mathbb R}}$), Stieltjes (${K={\mathbb R}_+}$).

Here is some notation.

Definition 4 The linear functional ${L_y :{\mathbb R}[X]\rightarrow{\mathbb R}}$ is defined by

$\displaystyle \begin{array}{rcl} L_y (f)=\sum_{\alpha\in{\mathbb N}^n}f_{\alpha}y_{\alpha}. \end{array}$

The moment matrix has entries

$\displaystyle \begin{array}{rcl} M_{d}(y)_{\alpha\beta}=y_{\alpha+\beta}, \quad |\alpha|,\,|\beta|\leq d. \end{array}$

Given a polynomial ${\theta=\sum_{\gamma}\theta_{\gamma}X^{\gamma}}$, the localizing matrix is

$\displaystyle \begin{array}{rcl} M_{d}(\theta y)_{\alpha\beta}=L_y (\theta X^{\alpha+\beta}) \end{array}$

The moment matrix encodes

$\displaystyle \begin{array}{rcl} \forall f \textrm{ of degree }\leq d,\, L_y (f^2)\geq 0 \Leftrightarrow M_d (y)\textrm{ is positive semidefinite}. \end{array}$

The localizing matrix encodes

$\displaystyle \begin{array}{rcl} \forall f \textrm{ of degree }\leq d,\, L_y (f^2 \theta)\geq 0 \Leftrightarrow M_d (\theta y) \textrm{ is positive semidefinite}. \end{array}$
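A quick univariate illustration of these equivalences (a sketch with illustrative data, using `numpy`): take ${y_k =\int_0^1 x^k\,dx=1/(k+1)}$, the moments of Lebesgue measure on ${[0,1]}$, and ${\theta(x)=x}$. Since ${y}$ comes from a positive measure, both matrices come out positive semidefinite:

```python
# Moment and localizing matrices for Lebesgue measure on [0,1] (toy data).
import numpy as np

d = 2
y = [1.0 / (k + 1) for k in range(2 * d + 2)]   # y_k = int_0^1 x^k dx

# M_d(y)_{ab} = y_{a+b}, rows/columns indexed by the monomials 1, x, ..., x^d
M = np.array([[y[a + b] for b in range(d + 1)] for a in range(d + 1)])

# Localizing matrix for theta(x) = x: M_d(theta y)_{ab} = L_y(x * x^{a+b}) = y_{a+b+1}
M_loc = np.array([[y[a + b + 1] for b in range(d + 1)] for a in range(d + 1)])

print(np.linalg.eigvalsh(M).min() > 0, np.linalg.eigvalsh(M_loc).min() > 0)
# prints: True True
```

Here ${M_2 (y)}$ is the ${3\times 3}$ Hilbert matrix, which is positive definite but famously ill-conditioned, a small-scale hint of the numerical issues mentioned earlier.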

2.1. Schmüdgen dual conditions

Let ${K}$ be a basic semi-algebraic set, defined by ${g_i \geq 0}$. For ${J\subset [m]}$, let ${g_J =\prod_{j\in J}g_j}$.

Theorem 5 Assume ${K}$ is compact. Then a sequence ${y=\{y_{\alpha}\}}$ has a representing measure if and only if

$\displaystyle \begin{array}{rcl} \forall J\subset [m],\,\forall f\in{\mathbb R}[X],\quad L_y (f^2 g_J)\geq 0. \end{array}$

If one restricts to degree ${\leq d}$ polynomials, this amounts to checking that the matrices ${M_d (g_J y)}$ are positive semidefinite for all ${J\subset[m]}$. It is a set of ${2^m}$ linear matrix inequalities, i.e. an SDP.

2.2. Putinar dual conditions

Let ${K}$ be a basic semi-algebraic set, defined by ${g_i \geq 0}$.

Theorem 6 Assume ${K}$ is compact. Assume that there exists a constant ${M}$ such that ${M-|X|^2 \in Q(g_1 ,\cdots, g_m)}$. Then a sequence ${y=\{y_{\alpha}\}}$ has a representing measure if and only if

$\displaystyle \begin{array}{rcl} \forall i\in [m],\,\forall f\in{\mathbb R}[X],\quad L_y (f^2 g_i)\geq 0. \end{array}$

If one restricts to degree ${\leq d}$ polynomials, this amounts to checking that the matrices ${M_d (g_i y)}$ are positive semidefinite for all ${i\in[m]}$. It is a set of ${m}$ linear matrix inequalities, i.e. an SDP.

2.3. Krivine dual conditions

Let ${K}$ be a basic semi-algebraic set, defined by ${g_i \geq 0}$. For ${\alpha\in{\mathbb N}^m}$, let ${g^{\alpha}=\prod_{j\in [m]}g_j^{\alpha_j}}$.

Theorem 7 Assume ${K}$ is compact. Assume that the ${g_i}$'s and ${1}$ generate ${{\mathbb R}[X]}$. Assume that ${0\leq g_j \leq 1}$ on ${K}$. Then a sequence ${y=\{y_{\alpha}\}}$ has a representing measure if and only if

$\displaystyle \begin{array}{rcl} \forall \alpha,\,\beta\in {\mathbb N}^m,\quad L_y (g^{\alpha}(1-g)^{\beta})\geq 0. \end{array}$

If one restricts to degree ${\leq d}$ polynomials, this amounts to checking that a finite set of linear inequalities has a nonnegative solution, i.e. solving an LP.

3. SDP relaxations

Now we have the tools to solve the global optimization problem on a basic semi-algebraic set ${K}$ defined by ${g_i \geq 0}$.

Recall the dual formulation: to minimize ${f=\sum_{\alpha}f_{\alpha}X^{\alpha}}$ on ${K}$, minimize ${\sum_{\alpha}f_{\alpha}y_{\alpha}}$ under constraints ${y_{\alpha}=\int X^{\alpha}\,d\mu}$ for some probability measure ${\mu}$ supported on ${K}$. Note that the unknown is ${y}$. Also, ideally, the measure ${\mu}$ is a Dirac measure.
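To see why a Dirac measure is the ideal ${y}$, here is a small sketch (toy data added for illustration, using `numpy`): for ${f(x)=(x-1/3)^2}$ on ${K=[0,1]}$, the moment sequence of ${\delta_{1/3}}$ is feasible and attains ${L_y (f)=f(1/3)=0}$, the global minimum:

```python
# Moments of the Dirac measure at the minimizer x* = 1/3 of f(x) = (x - 1/3)^2.
import numpy as np

xstar = 1.0 / 3.0
d = 2
y = [xstar ** k for k in range(2 * d + 1)]        # y_k = int x^k d(delta_{x*})

f = [1.0 / 9.0, -2.0 / 3.0, 1.0]                  # coefficients of (x - 1/3)^2
L_y_f = sum(c * y[k] for k, c in enumerate(f))    # L_y(f) = sum_k f_k y_k

# The moment matrix of a Dirac measure is the rank-one PSD matrix v v^T,
# with v = (1, x*, ..., x*^d).
M = np.array([[y[a + b] for b in range(d + 1)] for a in range(d + 1)])

print(abs(L_y_f) < 1e-12, np.linalg.eigvalsh(M).min() >= -1e-12)
# prints: True True
```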

Substitute the Putinar dual conditions. Minimize ${L_y (f)}$ under constraints

• ${\forall d}$, ${M_d (y)}$ is positive semidefinite.
• ${\forall d}$, ${\forall i}$, ${M_d (g_i y)}$ is positive semidefinite.
• ${L_y (1)=1}$.

Our relaxation ${Q_d}$ consists in bounding the degree ${d}$. Let ${v_i =\lceil \textrm{degree}(g_i)/2\rceil}$, i.e. degree${(g_i)=2v_i}$ or ${2v_i -1}$.

Minimize ${L_y (f)}$ under constraints

• ${M_d (y)}$ is positive semidefinite.
• ${\forall i}$, ${M_{d-v_i} (g_i y)}$ is positive semidefinite.
• ${L_y (1)=1}$.

The dual SDP ${Q^{*}_{d}}$ is

Maximize ${\lambda}$ under constraints

• ${f-\lambda=\sigma_0 +\sum_{k=1}^{m}\sigma_k g_k}$
• each ${\sigma_k}$ is a sum of squares.
• degree${(\sigma_0)\leq 2d}$, degree${(\sigma_k)\leq 2(d-v_k)}$.
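As a sanity check, here is a toy feasible point of the dual (a worked example added for illustration, not from the lecture): take ${n=1}$, ${K=[0,1]}$ with ${g_1 (x)=x}$, ${g_2 (x)=1-x}$, and ${f(x)=x(1-x)}$, whose minimum on ${K}$ is ${0}$. The identity

$\displaystyle \begin{array}{rcl} x(1-x)-0=0+(1-x)^2 \,g_1 (x)+x^2 \,g_2 (x) \end{array}$

shows that ${\lambda=0=\min_K f}$ is attained, with ${\sigma_0 =0}$, ${\sigma_1 =(1-x)^2}$, ${\sigma_2 =x^2}$, all sums of squares.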

Example 1 The Goemans-Williamson relaxation for MAX CUT.

Let ${K\subset{\mathbb R}^n}$ be defined by ${g_i (x)=x_{i}^2 -1}$ and ${g_{n+i} (x)=1-x_{i}^2}$ (so ${K=\{-1,1\}^n}$), and let ${f(x)=x^{\top}Qx}$ where ${Q}$ is the adjacency matrix of the given (weighted) graph. Since ${g_{n+i}=-g_i}$, the primal SDP constraints that ${M_{d}(g_i y)}$ and ${M_{d}(g_{n+i} y)=-M_{d}(g_i y)}$ be positive semidefinite imply that ${M_{d}(g_i y)=0}$. For ${d=1}$, we recover the Goemans-Williamson relaxation. The dual ${Q^{*}_{1}}$ amounts to maximizing ${\lambda}$ subject to

$\displaystyle \begin{array}{rcl} x^{\top}Qx-\lambda=\sigma_0 (x)+\sum_{j}\lambda_j (x_j^2 -1). \end{array}$

Here, ${\sigma_0}$ is a sum of squares of linear forms. Instead of rounding, one could study the dual problem and show that it is infeasible beyond the ratio 0.878… This is pure algebra, but does not seem easy.

Linial: in 1995, I tried to prove Bourgain’s embedding in this way (search for ${\ell_2}$ embeddings is an SDP), but did not succeed.

Kindler: can one stop with a finite ${d}$? Answer: yes; at ${d=2^n}$, the SDP stabilizes.