Volume 37, pp. 87-104, 2010.

Convergence rates for regularization with sparsity constraints

Ronny Ramlau and Elena Resmerita

Abstract

Tikhonov regularization with $p$-powers of weighted $\ell_p$ norms as penalties, with $p\in (1,2)$, has been employed recently in the reconstruction of sparse solutions of ill-posed inverse problems. This paper establishes convergence rates for such a regularization with respect to the norm of the weighted spaces, under the assumption that the solutions satisfy a certain smoothness (source) condition. The meaning of the latter is analyzed in some detail. Moreover, converse results are established: linear convergence rates for the residual, together with convergence of the approximations to the solution, can be achieved only if the solution satisfies a source condition. Further insights for the particular case of a convolution equation are provided by analyzing the equation both theoretically and numerically.
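For orientation, a minimal sketch of the variational problem in question (the notation $F$, $y^\delta$, $\alpha$, $w_k$, $\phi_k$ is assumed here for illustration and is not quoted from the paper): a regularized solution is obtained as a minimizer of a Tikhonov functional whose penalty is the $p$-th power of a weighted $\ell_p$ norm of the coefficients with respect to a given basis $(\phi_k)$,
\[
  J_\alpha(x) \;=\; \|F(x)-y^\delta\|^2 \;+\; \alpha \sum_{k} w_k\,|\langle x,\phi_k\rangle|^p,
  \qquad p\in(1,2),\quad w_k \ge w_0 > 0,
\]
where $y^\delta$ denotes the noisy data and $\alpha>0$ the regularization parameter; the convergence rates discussed in the paper are stated with respect to the norm induced by this weighted penalty.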


Key words

ill-posed problem, regularization, Bregman distance, sparsity

AMS subject classifications

47A52, 65J20

Links to the cited ETNA articles

[19] Vol. 30 (2008), pp. 54-74, Ronny Ramlau: Regularization properties of Tikhonov regularization with sparsity constraints

ETNA articles which cite this article

Vol. 39 (2012), pp. 437-463 Dirk A. Lorenz, Peter Maass, and Pham Q. Muoi: Gradient descent for Tikhonov functionals with sparsity constraints: Theory and numerical comparison of step size rules
Vol. 39 (2012), pp. 476-507 Ronny Ramlau and Clemens A. Zarzer: On the minimization of a Tikhonov functional with a non-convex sparsity constraint
