Conference Agenda

Overview and details of the sessions of this conference.

 
 
Session Overview
Session
MS24 2: Learned Regularization for Solving Inverse Problems
Time: Thursday, 07/Sept/2023, 4:00pm - 6:00pm

Session Chair: Johannes Hertrich
Session Chair: Sebastian Neumayer
Location: VG1.101


Presentations

Learning Sparsifying Regularisers

Sebastian Neumayer

EPFL Lausanne, Switzerland

Inverse problems can be solved, for example, with variational models. First, we discuss a convex regularizer based on a one-hidden-layer neural network with (almost) free-form activation functions. Our numerical experiments have shown that this simple architecture already achieves state-of-the-art performance in the convex regime. This is very different from the non-convex case, where more complex models usually result in better performance. Inspired by this observation, we discuss an extension of our approach within the convex non-convex framework: the regularizer itself may be non-convex, but the overall objective must remain convex. This preserves the favorable optimization properties while significantly boosting performance. Our numerical results show that this convex-energy-based approach outperforms the popular BM3D denoiser on the BSD68 test set across various noise levels.
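The variational setup underlying the talk can be illustrated with a toy 1-D denoiser: minimize a convex energy consisting of a quadratic data fidelity plus a regularizer applied to filter responses. The Huber penalty on finite differences below is a hand-crafted, convex stand-in for the learned filters and free-form activations; all names and parameters are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

def huber(t, delta=0.05):
    # smooth convex penalty, a stand-in for a learned profile function
    return np.where(np.abs(t) <= delta, 0.5 * t**2 / delta, np.abs(t) - 0.5 * delta)

def huber_grad(t, delta=0.05):
    return np.clip(t / delta, -1.0, 1.0)

def variational_denoise(y, lam=0.2, step=0.05, iters=500, delta=0.05):
    """Minimize E(x) = 0.5*||x - y||^2 + lam * sum_i huber((Dx)_i)
    by gradient descent, with D the 1-D forward-difference operator."""
    x = y.copy()
    for _ in range(iters):
        d = np.diff(x)                       # Dx
        h = huber_grad(d, delta)             # huber'(Dx)
        g = np.zeros_like(x)
        g[:-1] -= h                          # accumulate D^T huber'(Dx)
        g[1:] += h
        x = x - step * ((x - y) + lam * g)   # gradient of E
    return x
```

Because the energy is convex, gradient descent with a small enough step converges to the global minimizer; the convex non-convex extension would replace the Huber profile by a non-convex one while keeping the total energy convex.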


Shared Prior Learning of Energy-Based Models for Image Reconstruction

Thomas Pinetz1, Erich Kobler1, Thomas Pock2, Alexander Effland1

1University of Bonn, Germany; 2Technical University of Graz, Austria

In this talk, we propose a novel learning-based framework for image reconstruction particularly designed for training without ground truth data, which has three major building blocks: energy-based learning, a patch-based Wasserstein loss functional, and shared prior learning. In energy-based learning, the parameters of an energy functional composed of a learned data fidelity term and a data-driven regularizer are computed in a mean-field optimal control problem. In the absence of ground truth data, we change the loss functional to a patch-based Wasserstein functional, in which local statistics of the output images are compared to uncorrupted reference patches. Finally, in shared prior learning, both aforementioned optimal control problems are optimized simultaneously with shared learned parameters of the regularizer to further enhance unsupervised image reconstruction. We derive several time discretization schemes of the gradient flow and verify their consistency in terms of Mosco convergence. In numerous numerical experiments, we demonstrate that the proposed method generates state-of-the-art results for various image reconstruction applications, even when no ground-truth images are available for training.
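The patch-based Wasserstein idea can be sketched with a sliced 1-D approximation: patches of the output image and uncorrupted reference patches are projected onto random directions, and the resulting 1-D marginals are compared via their quantiles. This is a minimal illustration of comparing local patch statistics, assuming a sliced Wasserstein-2 surrogate; it is not the authors' loss functional, and all names and parameters are made up.

```python
import numpy as np

def extract_patches(img, p=4, stride=2):
    # collect all p x p patches of a 2-D image as flattened rows
    H, W = img.shape
    return np.stack([img[i:i + p, j:j + p].ravel()
                     for i in range(0, H - p + 1, stride)
                     for j in range(0, W - p + 1, stride)])

def patch_wasserstein_loss(out_img, ref_img, n_proj=32, p=4, seed=0):
    """Sliced Wasserstein-2 distance between the patch distributions of two
    images: compare quantile curves of random 1-D projections of the patches."""
    A, B = extract_patches(out_img, p), extract_patches(ref_img, p)
    rng = np.random.default_rng(seed)
    dirs = rng.standard_normal((p * p, n_proj))
    dirs /= np.linalg.norm(dirs, axis=0)     # unit projection directions
    qs = np.linspace(0.0, 1.0, 64)
    qa = np.quantile(A @ dirs, qs, axis=0)   # per-direction quantile curves
    qb = np.quantile(B @ dirs, qs, axis=0)
    return float(np.mean((qa - qb) ** 2))
```

Note that this loss needs no pixel-wise correspondence between output and reference, which is what makes patch-statistics losses usable without ground truth images.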



Gradient Step and Proximal Denoisers for Convergent Plug-and-Play Image Restoration

Samuel Hurault, Arthur Leclaire, Nicolas Papadakis

University of Bordeaux, France

Plug-and-Play (PnP) methods constitute a class of iterative algorithms for imaging problems in which regularization is performed by an off-the-shelf denoiser. Specifically, given an image dataset, training a function (e.g. a neural network) to remove Gaussian noise amounts to approximating the gradient or the proximal operator of the log-prior of the training dataset. Therefore, any off-the-shelf denoiser can serve as an implicit prior and be inserted into an optimization scheme to restore images. The PnP and Regularization by Denoising (RED) frameworks formalize this approach, and various convergence analyses have been proposed in the literature. However, most existing results require either unverifiable or suboptimal hypotheses on the denoiser, or assume restrictive conditions on the parameters of the inverse problem. We will introduce the Gradient Step and Proximal denoisers, and their variants, recently proposed to turn RED and PnP algorithms into genuine (nonconvex) proximal splitting algorithms. The resulting algorithms are shown to converge towards stationary points of an explicit functional and to deliver state-of-the-art image restoration, both quantitatively and qualitatively.
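A minimal numerical sketch of the gradient-step idea: the denoiser is written explicitly as D = Id - grad g for some potential g, so the PnP iteration x <- D(x - tau * grad f(x)) is a gradient-type scheme on an explicit functional f + g. In practice g is parametrized by a deep network; the quadratic smoothing potential and the 1-D inpainting fidelity below are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def gradient_step_denoiser(x, s=0.2):
    """D = Id - grad g with g(x) = (s/2) * sum_i (x_{i+1} - x_i)^2.
    A learned potential would replace this hand-crafted smoothness prior."""
    lap = np.zeros_like(x)                    # grad g = s * L x, L a 1-D Laplacian
    lap[1:-1] = 2 * x[1:-1] - x[:-2] - x[2:]
    lap[0] = x[0] - x[1]                      # Neumann boundary conditions
    lap[-1] = x[-1] - x[-2]
    return x - s * lap

def pnp_restore(y, mask, tau=0.9, iters=200):
    """PnP iteration x <- D(x - tau * grad f(x)) for the inpainting
    data fidelity f(x) = 0.5 * ||mask * x - y||^2."""
    x = y.copy()
    for _ in range(iters):
        x = gradient_step_denoiser(x - tau * mask * (mask * x - y))
    return x
```

Because the denoiser has the explicit form Id - grad g, each iterate decreases (up to step-size conditions) the explicit objective f + g, which is the structure exploited in the convergence analysis the talk describes.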


 
Conference: AIP 2023