Total variation flow on the Heisenberg group
We study graphs of functions, viewed as subsets of the Heisenberg group, with prescribed boundary data. We prove existence and uniqueness of minimal graphs, and long-time existence of the total variation flow.
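To fix ideas, here is the standard setup for t-graphs in $\mathbb{H}^1$ (the normalizations below are one common choice, left implicit in the notes). With horizontal frame $X_1 = \partial_x - \frac{y}{2}\partial_t$, $X_2 = \partial_y + \frac{x}{2}\partial_t$, the sub-Riemannian area of the graph $t = u(x,y)$, $(x,y) \in \Omega$, is
\[
A(u) \;=\; \int_\Omega \big|\nabla u + F^*\big|\, dx\, dy, \qquad F^*(x,y) = \Big(\tfrac{y}{2},\, -\tfrac{x}{2}\Big),
\]
which is one-homogeneous in $\nabla u$; this total-variation structure is what names the flow.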
Motivation from image processing, via cortical models dating back to the 1980s. In 2003, Sarti and I introduced a model of image completion, which Petitot endorsed. A missing part of an image is viewed as a hole in the graph in the roto-translation group, to be filled by a minimal surface.
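Schematically, in the Citti-Sarti model (standard formulation, not spelled out in the notes) the image $I$ is lifted through the orientation of its level lines:
\[
(x,y) \;\longmapsto\; (x,y,\theta(x,y)), \qquad \theta(x,y) = \text{orientation of the level line of } I \text{ at } (x,y),
\]
so the image becomes a surface in $\mathbb{R}^2 \times S^1$, and an occluded region becomes a hole in that surface.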
Riemannian approximation. Approximate the sub-Riemannian structure with blowing-up Riemannian metrics $g_\epsilon$. Then study the total variation flow (gradient flow of the area functional). The domain is assumed to be convex.
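Concretely (same conventions as above), $g_\epsilon$ can be taken as the metric making $X_1, X_2, \epsilon\partial_t$ orthonormal, so that $|\partial_t|_{g_\epsilon} = 1/\epsilon \to \infty$. Up to normalization, the $g_\epsilon$-area of a t-graph is
\[
A_\epsilon(u) \;=\; \int_\Omega \sqrt{\epsilon^2 + |\nabla u + F^*|^2}\; dx\, dy,
\]
whose $L^2$ gradient flow is the regularized total variation flow
\[
\partial_\tau u \;=\; \operatorname{div}\!\left( \frac{\nabla u + F^*}{\sqrt{\epsilon^2 + |\nabla u + F^*|^2}} \right)
\]
($\tau$ is the evolution time, distinct from the vertical coordinate $t$); the game is to prove estimates for this flow that survive the limit $\epsilon \to 0$.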
2.1. Gradient estimates
Using derivatives along right-invariant vector fields, the equations look better. Use the parabolic maximum principle:
– the maximum is achieved on the boundary;
– along the boundary, construct linear barrier functions (see the sketch below).
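A sketch of the barrier step (the talk did not write the barriers out). A direct computation shows that affine functions are exact stationary solutions of the regularized equation, so on a convex domain one can place, at each boundary point $x_0$, planes above and below the boundary datum $\varphi$:
\[
w^{\pm}(x) \;=\; \varphi(x_0) \,\pm\, L\,\langle \nu(x_0),\, x - x_0\rangle, \qquad w^{-} \le u \le w^{+} \ \text{ on } \partial\Omega,
\]
with $\nu(x_0)$ the inward normal and $L$ a slope controlled by the data but not by $\epsilon$. Comparison then gives $|\nabla u| \le L$ on $\partial\Omega$, and the maximum principle propagates the bound inside.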
A suitably weighted Poincaré inequality holds uniformly in $\epsilon$. Moser iteration then leads to uniform estimates.
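Schematically (the precise weight was not written in the notes; the model unweighted inequality, with the weight to be inserted in both averages, is) for metric balls $B_r$:
\[
\Big( \fint_{B_r} |v - v_{B_r}|^{2}\, dx \Big)^{1/2} \;\le\; C\, r\, \Big( \fint_{B_r} |\nabla_\epsilon v|^{2}\, dx \Big)^{1/2}, \qquad C \text{ independent of } \epsilon,
\]
where $\nabla_\epsilon v = (X_1 v, X_2 v, \epsilon\partial_t v)$ is the approximating gradient. Iterating Caccioppoli plus Sobolev-Poincaré à la Moser yields $\sup_{B_{r/2}} |v| \le C \big( \fint_{B_r} |v|^2 \big)^{1/2}$ with constants independent of $\epsilon$.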
2.3. Interior estimates
Use the fundamental solution of the parabolic (non-divergence form) operator.
Add extra variables and use the lifted operator.
As $\epsilon \to 0$, the lifted structure converges smoothly, so one can use standard estimates. Of course, on functions which do not depend on the extra variables, the lifted and unlifted operators coincide. Whence uniform estimates.
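One way to realize this lifting (notation mine; the talk gave no formulas): the frame $X_1, X_2, X_3^\epsilon = \epsilon\partial_t$ degenerates as $\epsilon \to 0$, but after adding a dummy variable $s$ one can set
\[
\tilde X_1 = X_1, \qquad \tilde X_2 = X_2, \qquad \tilde X_3^{\epsilon} = \epsilon\,\partial_t + \partial_s .
\]
As $\epsilon \to 0$ the lifted frame converges smoothly to the Hörmander family $(X_1, X_2, \partial_s)$, so its heat kernel satisfies Gaussian bounds uniform in $\epsilon$; and $\tilde X_3^\epsilon f = X_3^\epsilon f$ whenever $f$ does not depend on $s$, so the estimates descend to the original operator.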
2.4. Schauder estimates at the boundary
Problem reduces to a linear one. The linearized operator has good estimates away from the characteristic set of the boundary of the domain.
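Schematically (notation mine), the linearization of the regularized operator at the solution $u$ has the form
\[
\mathcal{L}_\epsilon v \;=\; \partial_\tau v - \sum_{i,j} a_{ij}\big(\nabla_\epsilon u\big)\, X_i^\epsilon X_j^\epsilon v, \qquad a_{ij}(\xi) = \frac{1}{W}\Big(\delta_{ij} - \frac{\xi_i \xi_j}{W^2}\Big), \quad W = \sqrt{\epsilon^2 + |\xi|^2},
\]
and the gradient bounds make the $a_{ij}$ uniformly elliptic with respect to the frame, so Schauder estimates uniform in $\epsilon$ are available near non-characteristic boundary points.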
Theorem 1 (Capogna-Citti-Manfredini). For a smooth and convex domain, and given boundary data, there exists a unique minimal graph attaining the boundary data. It is Lipschitz, and smooth in the interior and up to non-characteristic boundary points. It is the limit of the total variation flow.
What about characteristic points? Up to now, everything worked for all 2-step groups. From now on, we stick to the 3-dimensional Heisenberg group. Jerison showed that the Schauder estimate at the boundary does not hold; his counterexample makes the domain very concave. At a characteristic point the horizontal normal is undefined. However, the approximating Riemannian normals at the characteristic point converge. Indeed, the Riemannian approximation performs a blow-up. It follows that renormalizations of the Euclidean Poisson kernels converge to the Heisenberg Poisson kernel at the characteristic point.
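The blow-up statement can be made precise (computation mine, with the conventions fixed above): with the parabolic dilations $\delta_r(x,y,t) = (rx, ry, r^2 t)$ of $\mathbb{H}^1$ one checks
\[
g_\epsilon \;=\; \epsilon^{2}\, \delta_{1/\epsilon}^{*}\, g_1 ,
\]
so up to a scale factor every $g_\epsilon$ is an anisotropic dilate of the fixed metric $g_1$: sending $\epsilon \to 0$ amounts to zooming in at the group scale. This is why the rescaled Riemannian normals, and the suitably renormalized Poisson kernels, converge at the characteristic point.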
The issue of characteristic points is relevant for the application to visual completion. But worse: in vision, the relevant group is not the Heisenberg group but the roto-translation group, for which we do not even have uniqueness of minimal graphs.
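For reference, in the Citti-Sarti model the roto-translation group is $\mathbb{R}^2 \times S^1$ with horizontal frame
\[
X_1 = \cos\theta\,\partial_x + \sin\theta\,\partial_y, \qquad X_2 = \partial_\theta, \qquad [X_1, X_2] = \sin\theta\,\partial_x - \cos\theta\,\partial_y,
\]
a step-2 structure like $\mathbb{H}^1$, but the group is not nilpotent, and for its minimal graphs even uniqueness is not available.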
Regularity at characteristic points is governed by the fundamental solution. Regularity holds if the domain contains a level set of the fundamental solution touching the boundary at the characteristic point.
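In $\mathbb{H}^1$ the fundamental solution of the sub-Laplacian $\Delta_0 = X_1^2 + X_2^2$ is explicit (Folland; the constant and the factor 16 depend on the normalization of the frame):
\[
\Gamma(x,y,t) \;=\; \frac{C}{N(x,y,t)^{2}}, \qquad N(x,y,t) \;=\; \big( (x^2+y^2)^2 + 16\, t^2 \big)^{1/4},
\]
so its level sets are Korányi balls, and the criterion reads: some Korányi ball sits inside the domain and touches the boundary exactly at the characteristic point, an interior-ball condition at the group scale.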