Conference Agenda

Overview and details of the sessions of this conference. Select a date or location to show only the sessions held on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
Session
MS02 1: Advances in regularization for some classes of nonlinear inverse problems
Time:
Tuesday, 05/Sept/2023:
1:30pm - 3:30pm

Session Chair: Bernd Hofmann
Session Chair: Robert Plato
Location: VG1.102


Presentations

Deautoconvolution in the two-dimensional case

Yu Deng1, Bernd Hofmann1, Frank Werner2

1Chemnitz University of Technology, Germany; 2University of Würzburg

There is extensive mathematical literature on the inverse problem of deautoconvolution for a function with support in the unit interval $[0,1] \subset \mathbb{R}$, but little is known about the multidimensional situation. We try to fill this gap with analytical and numerical studies on the reconstruction of a real function of two real variables over the unit square from observations of its autoconvolution on $[0,2]^2 \subset \mathbb{R}^2$ (full data case) or on $[0,1]^2$ (limited data case). In an $L^2$-setting, twofoldness and uniqueness assertions are presented for the deautoconvolution problem in 2D. Moreover, its ill-posedness is characterized and illustrated. Extensive numerical case studies give an overview of the behaviour of stable approximate solutions to the two-dimensional deautoconvolution problem obtained by Tikhonov-type regularization with different penalties and the iteratively regularized Gauss-Newton method.
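To illustrate the forward operator and the sign ambiguity ("twofoldness") mentioned in the abstract, here is a minimal numerical sketch of discrete 2D autoconvolution on a midpoint grid. This is not the authors' code; the discretization and names are illustrative only.

```python
import numpy as np
from scipy.signal import convolve2d

def autoconv2d(x, h):
    # Discrete 2D autoconvolution; the factor h**2 approximates the
    # double integral defining the autoconvolution (x * x)(s, t)
    return convolve2d(x, x) * h**2

n = 32
h = 1.0 / n
grid = (np.arange(n) + 0.5) * h          # midpoints of [0, 1]
S, T = np.meshgrid(grid, grid, indexing="ij")
x = np.exp(S + T)                        # smooth test function on [0, 1]^2

y_full = autoconv2d(x, h)                # full data on [0, 2]^2, size (2n-1) x (2n-1)
y_limited = y_full[:n, :n]               # limited data: restriction to [0, 1]^2
```

Since $(-x) \ast (-x) = x \ast x$, the data determine $x$ at most up to sign, which is one source of the twofoldness assertion in the abstract.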


Efficient minimization of variational functionals via semismooth* Newton methods

Simon Hubmer1, Ronny Ramlau1,2

1Johann Radon Institute Linz, Austria; 2Johannes Kepler University Linz, Austria

In this talk, we consider the efficient numerical minimization of variational functionals as they appear for example in $L_{p}$ or TV regularization of nonlinear inverse problems. For this, we consider so-called semismooth* Newton methods, which are a class of optimization methods for non-differentiable and set-valued mappings. Based on the concept of (limiting) normal cones, which are a purely geometrical generalization of derivatives, these methods can be shown to converge locally superlinearly under suitable assumptions. Furthermore, we show how they can be applied to efficiently minimize variational functionals with general convex and in some special cases even non-convex penalty terms.
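The semismooth* method of the talk is built on limiting normal cones and is beyond a short sketch, but the flavour of Newton-type methods for non-differentiable optimality systems can be seen in the classical semismooth Newton method applied to $\ell_1$-penalized least squares. Everything below is an illustrative sketch under standard textbook assumptions, not the authors' algorithm; all names are hypothetical.

```python
import numpy as np

def soft(z, t):
    # Componentwise soft-thresholding, the prox of t * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def semismooth_newton_l1(A, b, alpha, tau=0.5, iters=50):
    """Semismooth Newton on the prox fixed-point equation
       F(x) = x - soft(x - tau * A^T (A x - b), tau * alpha) = 0,
    whose roots are minimizers of 0.5 * ||A x - b||^2 + alpha * ||x||_1."""
    n = A.shape[1]
    x = A.T @ b                       # simple warm start
    AtA = A.T @ A
    for _ in range(iters):
        z = x - tau * (AtA @ x - A.T @ b)
        F = x - soft(z, tau * alpha)
        if np.linalg.norm(F) < 1e-12:
            break
        # Newton derivative of soft(.): a 0/1 diagonal on the active set,
        # a simple instance of a generalized (set-valued) derivative
        d = (np.abs(z) > tau * alpha).astype(float)
        J = np.eye(n) - d[:, None] * (np.eye(n) - tau * AtA)
        x = x - np.linalg.solve(J, F)
    return x
```

For well-conditioned small problems the active set settles after a few steps and the iteration then solves the reduced smooth problem exactly, which mirrors the local superlinear convergence mentioned in the abstract.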


Convergence of Nesterov acceleration for linear ill-posed problems

Stefan Kindermann

Johannes Kepler University Linz, Austria

We show that Nesterov acceleration is an optimal-order iterative regularization method for linear ill-posed problems provided that a parameter is chosen according to the smoothness of the solution. The central result is a representation of the iteration residual polynomials via Gegenbauer polynomials. This also explains the observed semi-saturation effect of Nesterov iteration.
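A minimal sketch of the iteration in question: Nesterov-accelerated Landweber with a momentum parameter $\alpha$ and a discrepancy-principle stopping rule. This is an illustrative toy implementation, not the speaker's code; the parameter names are assumptions.

```python
import numpy as np

def nesterov_landweber(A, y, delta, alpha=3.0, tau=1.1, max_iter=10000):
    """Nesterov-accelerated Landweber iteration for A x = y, stopped by
    the discrepancy principle ||A x_k - y|| <= tau * delta.
    alpha is the acceleration parameter (alpha >= 3 in the classical scheme)."""
    omega = 1.0 / np.linalg.norm(A, 2) ** 2   # step size <= 1 / ||A||^2
    x_prev = x = np.zeros(A.shape[1])
    for k in range(max_iter):
        # momentum step (inactive at k = 0)
        z = x + (k - 1.0) / (k + alpha - 1.0) * (x - x_prev) if k >= 1 else x
        x_prev, x = x, z - omega * (A.T @ (A @ z - y))
        if np.linalg.norm(A @ x - y) <= tau * delta:
            break
    return x, k + 1
```

Compared with plain Landweber, the momentum term typically reduces the number of iterations needed to satisfy the discrepancy principle by roughly a square-root factor.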


Analysis of the discrepancy principle for Tikhonov regularization under low order source conditions

Chantal Klinkhammer, Robert Plato

Universität Siegen, Germany

We study the application of Tikhonov regularization to ill-posed nonlinear operator equations. The objective of this work is to prove low order convergence rates for the discrepancy principle under low order source conditions of logarithmic type. We work within the framework of Hilbert scales and extend existing studies on this subject to the oversmoothing case. The latter means that the exact solution of the treated operator equation does not belong to the domain of definition of the penalty term. As a consequence, the Tikhonov functional fails to take a finite value at the exact solution.
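In the linear, non-oversmoothing model case, the discrepancy principle amounts to choosing the regularization parameter $\alpha$ so that the residual matches $\tau\delta$. The following sketch does this by bisection on $\log\alpha$, exploiting that the residual is monotonically increasing in $\alpha$. It is only a model-case illustration with hypothetical names; the talk's setting is nonlinear and posed in Hilbert scales.

```python
import numpy as np

def tikhonov(A, y, alpha):
    # Minimizer of ||A x - y||^2 + alpha * ||x||^2 via the normal equations
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

def discrepancy_alpha(A, y, delta, tau=1.5, lo=1e-12, hi=1e4, iters=80):
    """Bisection on log(alpha): the residual ||A x_alpha - y|| increases
    monotonically in alpha, so we bracket the level tau * delta."""
    res = lambda a: np.linalg.norm(A @ tikhonov(A, y, a) - y)
    for _ in range(iters):
        mid = np.sqrt(lo * hi)        # geometric midpoint
        if res(mid) > tau * delta:
            hi = mid
        else:
            lo = mid
    return np.sqrt(lo * hi)
```

In the oversmoothing case treated in the talk, this picture changes: the Tikhonov functional is infinite at the exact solution, so the analysis cannot compare functional values there and proceeds differently.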


Conference: AIP 2023