## Notes of Giovanna Citti’s lecture

Total variation flow on the Heisenberg group

We study graphs of functions, viewed as subsets of ${G\times{\mathbb R}}$, with prescribed boundary data. We prove existence and uniqueness of minimal graphs, and long-time existence of the total variation flow.

1. Motivation

From image processing, using cortical models dating back to the 1980s. In 2003, Sarti and I introduced a model of image completion, which Petitot endorsed. A missing part of an image is viewed as a hole in the graph in ${G\times{\mathbb R}}$, to be filled by a minimal surface.

2. Technique

Riemannian approximation. Approximate the subRiemannian structure by blowing up Riemannian metrics ${g_\epsilon}$. Then study the total variation flow (the gradient flow of the area functional). The domain is assumed to be convex.

Using derivatives along right-invariant vector fields, the equations look better. Use the parabolic maximum principle:

– maximum is achieved on the boundary.

– along the boundary, construct linear barrier functions.

2.2. ${C^{1,\alpha}}$ estimate

Suitably weighted Poincaré inequality is uniform in ${\epsilon}$. Moser iteration leads to uniform ${C^{1,\alpha}}$ estimate.

2.3. ${C^{\infty}}$ interior estimate

Use fundamental solution of the parabolic (non divergence form) operator

$\displaystyle L=\partial_t -\sum_{i,\,j\leq r} a_{ij}X_j X_i -\sum_{i,\,j>r}a_{ij}\,\epsilon^2 X_j X_i .$

$\displaystyle L'=\partial_t -\sum_{i,\,j\leq r} a_{ij}X_j X_i -\sum_{i,\,j>r}a_{ij}(\partial_{s_j} +\epsilon X_j)(\partial_{s_i}+\epsilon X_i) .$

As ${\epsilon\rightarrow 0}$, ${L'}$ converges smoothly, so one can use standard estimates. Of course, on functions which do not depend on the extra variables ${s_j}$, ${L}$ and ${L'}$ coincide. Whence uniform estimates.
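To see why ${L}$ and ${L'}$ coincide on such functions: if ${u}$ does not depend on the ${s_i}$, then each ${\partial_{s_i}u=0}$, and ${X_iu}$ is again independent of the ${s_i}$ (the ${X_i}$ involve only the group variables), so

$\displaystyle (\partial_{s_j}+\epsilon X_j)(\partial_{s_i}+\epsilon X_i)u = \epsilon X_j(\epsilon X_i u)= \epsilon^2 X_j X_i u,$

which is exactly the corresponding term of ${L}$; hence ${L'u=Lu}$.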

2.4. Schauder estimates at the boundary

The problem reduces to a linear one. The linearized operator has good estimates away from the characteristic set of the boundary of the domain.

Theorem 1 (Capogna-Citti-Manfredini) For a smooth and convex domain and ${C^{2,\alpha}}$ boundary data, there exists a unique minimal graph with the given boundary data. It is Lipschitz, smooth in the interior, and ${C^{2,\alpha}}$ up to non-characteristic boundary points. It is the limit of the total variation flow.

What about characteristic points? Up to now, everything worked for all step-2 groups. From now on, we stick to the 3-dimensional Heisenberg group. Jerison showed that the Schauder estimate at the boundary does not hold. His counterexample is ${\{x_3\geq M(x^2+y^2)\}}$ with ${M\ll 0}$, making the domain very concave. The horizontal normal is undefined there. However, the approximating Riemannian normals at the characteristic point converge. Indeed, the Riemannian approximation performs a blow-up. It follows that renormalizations of the Euclidean Poisson kernels converge to the Heisenberg Poisson kernel at the characteristic point.

The issue of characteristic points is relevant for the application to visual completion. But worse: in vision, ${G}$ is not the Heisenberg group but the roto-translation group, for which we do not even have uniqueness of minimal graphs.

Regularity at characteristic points is governed by the Korányi gauge ${\rho=((x^2+y^2)^2+16t^2)^{1/4}}$; the fundamental solution of the sublaplacian is a constant multiple of ${\rho^{-2}}$. Regularity holds if the domain contains a level set of the fundamental solution touching the boundary at the characteristic point.
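As a sanity check on the constant 16 in the gauge, one can verify symbolically that ${\rho^{-2}}$ is annihilated by the sublaplacian away from the origin. The normalization of the horizontal frame below (${X=\partial_x+\frac{y}{2}\partial_t}$, ${Y=\partial_y-\frac{x}{2}\partial_t}$) is an assumption chosen to match the 16; other conventions rescale the constant.

```python
# Check that u = rho^{-2}, with rho = ((x^2+y^2)^2 + 16 t^2)^{1/4} the
# Koranyi gauge, satisfies (X^2 + Y^2) u = 0 away from the origin,
# for the assumed horizontal frame X = d/dx + (y/2) d/dt, Y = d/dy - (x/2) d/dt.
import sympy as sp

x, y, t = sp.symbols('x y t', real=True)

# Left-invariant horizontal vector fields (assumed normalization)
X = lambda f: sp.diff(f, x) + sp.Rational(1, 2) * y * sp.diff(f, t)
Y = lambda f: sp.diff(f, y) - sp.Rational(1, 2) * x * sp.diff(f, t)

# Candidate fundamental solution: a multiple of rho^{-2}
u = ((x**2 + y**2)**2 + 16 * t**2) ** sp.Rational(-1, 2)

# Sublaplacian applied to u; evaluate at sample points away from the origin
Lu = X(X(u)) + Y(Y(u))
assert abs(float(Lu.subs({x: 1, y: 2, t: 3}))) < 1e-9
assert abs(float(Lu.subs({x: -2, y: 1, t: sp.Rational(1, 2)}))) < 1e-9
print("rho^{-2} is annihilated by the sublaplacian at the sample points")
```

With the other common convention ${X=\partial_x+2y\partial_t}$, ${Y=\partial_y-2x\partial_t}$, the same computation singles out the gauge ${((x^2+y^2)^2+t^2)^{1/4}}$ instead.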