Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session
MS25 2: Hyperparameter estimation in imaging inverse problems: recent advances on optimisation-based, learning and statistical approaches
Time:
Monday, 04/Sept/2023:
4:00pm - 6:00pm

Session Chair: Luca Calatroni
Session Chair: Monica Pragliola
Location: VG0.110


Presentations

Learning a sparsity-promoting regularizer for linear inverse problems

Luca Ratti1, Giovanni S. Alberti2, Ernesto De Vito2, Tapio Helin3, Matti Lassas4, Matteo Santacesaria2

1Università degli Studi di Bologna, Italy; 2Università degli Studi di Genova, Italy; 3Lappeenranta-Lahti University of Technology, Finland; 4University of Helsinki, Finland

Variational regularization is a well-established technique to tackle instability in inverse problems, and it requires solving a minimization problem in which a mismatch functional is endowed with a suitable regularization term. The choice of such a functional is crucial, and it usually relies on theoretical considerations as well as a priori information on the desired solution. A promising approach to this task is provided by data-driven strategies based on statistical learning: supposing that the exact solution and the measurements are distributed according to a joint probability distribution, which is partially known thanks to a training sample, we can take advantage of this statistical model to design regularization operators. In this talk, I will present a hybrid approach, which first assumes that the desired regularizer belongs to a class of operators (suitably described by a set of parameters) and then learns the optimal one within the class. In the context of linear inverse problems, I will first briefly recap the main results obtained for the family of generalized Tikhonov regularizers: a characterization of the optimal regularizer, and two learning-based techniques to approximate it, with guaranteed error estimates. Then, I will focus on a class of sparsity-promoting regularizers, which essentially leads to the task of learning a sparsifying transform for the data at hand. In this case, too, it is possible to deduce theoretical error bounds between the optimal regularizer and its supervised-learning approximation as the size of the training sample grows.
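The generalized Tikhonov setting admits a compact numerical illustration. The toy sketch below (not the speakers' code; the dimensions, the Gaussian signal model, and the noise level are all assumptions made here for illustration) estimates the signal covariance from a training sample and uses its inverse as a learned quadratic regularizer:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, N = 20, 15, 500               # signal dim, measurement dim, training-set size

# Assumed ground-truth signal distribution (Gaussian, for illustration only)
L = rng.standard_normal((n, n)) / np.sqrt(n)
Sigma = L @ L.T + 0.1 * np.eye(n)   # true signal covariance
A = rng.standard_normal((m, n)) / np.sqrt(m)   # linear forward operator
sigma_noise = 0.05

# Training sample of exact solutions -> empirical covariance
X_train = rng.multivariate_normal(np.zeros(n), Sigma, size=N)
Sigma_hat = X_train.T @ X_train / N

# Learned generalized Tikhonov regularizer: R(x) = x^T Sigma_hat^{-1} x.
# Reconstruction: argmin_x ||A x - y||^2 + sigma^2 * R(x)
def reconstruct(y):
    H = A.T @ A + sigma_noise**2 * np.linalg.inv(Sigma_hat)
    return np.linalg.solve(H, A.T @ y)

x_true = rng.multivariate_normal(np.zeros(n), Sigma)
y = A @ x_true + sigma_noise * rng.standard_normal(m)
x_hat = reconstruct(y)
rel = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(rel)
```

In this Gaussian toy case, replacing the empirical covariance with the true one would recover the posterior-mean estimator, which is the sense in which the learned quadratic regularizer approximates the optimal one as the training sample grows.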


Noise Estimation via Tractable Diffusion

Martin Zach1, Thomas Pock1, Erich Kobler2, Antonin Chambolle3

1Graz University of Technology, Austria; 2Universitätsklinikum Bonn, Germany; 3Université Paris-Dauphine-PSL, France

Diffusion models have recently received significant interest in the imaging sciences. After achieving impressive results in image generation, focus has shifted towards finding ways to exploit the encoded prior knowledge in classical inverse problems. In this talk, we highlight another intriguing viewpoint: instead of focusing on image reconstruction, we propose to use tractable diffusion models which also allow estimating the noise in an image. In particular, we utilize a fields-of-experts-type model with Gaussian mixture experts that admits an analytic expression for a normalized density under diffusion, and can be trained with empirical Bayes. The normalized model can be used for noise estimation of a given image by maximizing it with respect to the diffusion time, and simultaneously gives a Bayesian least-squares estimator for the clean image. We show results on denoising problems and propose possible applications to more involved inverse problems.
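As a rough illustration of why a tractable diffused density yields a noise estimate, consider a one-dimensional Gaussian mixture prior: convolving it with N(0, t) only inflates each component variance by t, so the likelihood of noisy data has a closed form in the diffusion time t and can simply be maximized over t. The sketch below is a toy stand-in (the mixture, sample size, and grid search are assumptions, not the fields-of-experts model of the talk):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D Gaussian mixture "prior" on pixel values (weights, means, variances)
w  = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
v  = np.array([0.05, 0.05])

def log_likelihood(x, t):
    """Log-density of the GMM convolved with N(0, t): variances become v + t."""
    var = v + t
    comp = w / np.sqrt(2 * np.pi * var) * np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
    return np.log(comp.sum(axis=1)).sum()

# Noisy observations: clean samples from the mixture plus Gaussian noise
sigma_true = 0.3
clean = rng.choice(mu, size=2000) + np.sqrt(v[0]) * rng.standard_normal(2000)
noisy = clean + sigma_true * rng.standard_normal(2000)

# Estimate the noise variance by maximizing the likelihood over diffusion time t
ts = np.linspace(1e-3, 0.5, 200)
t_hat = ts[np.argmax([log_likelihood(noisy, t) for t in ts])]
print(t_hat, sigma_true**2)   # estimated vs. true noise variance
```

The maximizing diffusion time matches the noise variance because diffusing the prior by t is exactly the density of clean samples corrupted by N(0, t) noise.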


Speckle noise removal via learned variational models

Salvatore Cuomo, Mariapia De Rosa, Stefano Izzo, Monica Pragliola, Francesco Piccialli

University of Naples Federico II, Italy

In this talk, we address the image denoising problem in the presence of the speckle degradation typically arising in ultrasound images. Variational methods and Convolutional Neural Networks (CNNs) are well-established tools for specific noise types, such as Gaussian and Poisson noise. Nonetheless, the advances achieved by these two classes of strategies are limited when tackling the despeckling problem. In fact, variational methods for speckle removal typically amount to minimizing a non-convex functional, with the related issues from the convergence viewpoint; on the other hand, the lack of large datasets of noise-free ultrasound images has prevented the extension of state-of-the-art CNN denoisers to the case of speckle degradation. Here, we aim at combining classical variational methods with the predictive properties of CNNs by considering a weighted total variation regularized model; the local weights are obtained as the output of a statistically inspired neural network that is trained on a small and composite dataset of natural and synthetic images. The resulting non-convex variational model, which is minimized by means of the Alternating Direction Method of Multipliers (ADMM), is proven to converge to a stationary point. Numerical tests show the effectiveness of our approach for the denoising of natural and satellite images.
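A minimal sketch of the weighted-TV ingredient, under assumptions made here for brevity: a 1-D signal, a quadratic fidelity in place of the non-convex speckle fidelity, and hand-fixed local weights where the talk uses weights predicted by a neural network:

```python
import numpy as np

def weighted_tv_admm(y, w, rho=1.0, n_iter=200):
    """Denoise the 1-D signal y by solving
         min_x 0.5*||x - y||^2 + sum_i w_i * |(Dx)_i|
    with ADMM, where D is the forward-difference operator and the
    weights w act locally on each difference."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)           # (n-1) x n difference matrix
    H = np.eye(n) + rho * D.T @ D
    x, z, u = y.copy(), np.zeros(n - 1), np.zeros(n - 1)
    for _ in range(n_iter):
        x = np.linalg.solve(H, y + rho * D.T @ (z - u))      # x-update
        Dx = D @ x
        z = np.sign(Dx + u) * np.maximum(np.abs(Dx + u) - w / rho, 0.0)  # soft-threshold
        u += Dx - z                                           # dual update
    return x

# Piecewise-constant test signal with noise; uniform weights as a placeholder
rng = np.random.default_rng(2)
clean = np.repeat([0.0, 1.0, -0.5], 50)
noisy = clean + 0.2 * rng.standard_normal(clean.size)
denoised = weighted_tv_admm(noisy, w=np.full(clean.size - 1, 0.5))
print(np.linalg.norm(denoised - clean) < np.linalg.norm(noisy - clean))
```

Spatially varying weights let the regularization relax near true edges and strengthen in flat regions, which is the role the learned network plays in the full model.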



Bayesian sparse optimization for dictionary learning

Alberto Bocchinfuso, Daniela Calvetti, Erkki Somersalo

Case Western Reserve University, United States of America

Dictionary learning methods have been used in recent years to solve inverse problems without invoking the forward model within traditional optimization algorithms. When the dictionaries are large and a sparse representation of the data in terms of the atoms is desired, computationally efficient sparse optimization algorithms are needed. Furthermore, reduced dictionaries can represent the data only up to a model reduction error, and Bayesian methods for estimating modeling errors turn out to be useful in this context. In this talk, the ideas of using Bayesian hierarchical models and modeling error methods are discussed.


Conference: AIP 2023