Conference Agenda

Overview and details of the sessions of this conference. Select a date or location to show only the sessions held on that day or in that room. Select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
Session
MS25 1: Hyperparameter estimation in imaging inverse problems: recent advances on optimisation-based, learning and statistical approaches
Time:
Monday, 04/Sept/2023:
1:30pm - 3:30pm

Session Chair: Luca Calatroni
Session Chair: Monica Pragliola
Location: VG0.111


Presentations

Automatic Differentiation of Fixed-Point Algorithms for Structured Non-smooth Optimization

Peter Ochs

Saarland University, Germany

A large class of practical non-smooth optimization problems can be written as the minimization of a sum of smooth and partly smooth functions. We consider structured non-smooth optimization problems that also depend on a parameter vector, and we study the problem of differentiating their solution mapping with respect to the parameter, which has far-reaching applications in sensitivity analysis and parameter learning. We show that, under partial smoothness and other mild assumptions, Automatic Differentiation (AD) of the sequence generated by proximal splitting algorithms converges to the derivative of the solution mapping. For a variant of automatic differentiation, which we call Fixed-Point Automatic Differentiation (FPAD), we remedy the memory overhead of reverse-mode AD and, moreover, obtain faster convergence theoretically. We numerically illustrate the convergence and convergence rates of AD and FPAD on Lasso and Group Lasso problems, and demonstrate FPAD on a prototypical image denoising problem by learning the regularization term.
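The basic object of the abstract, automatic differentiation of the sequence produced by a proximal splitting method, can be illustrated on the Lasso. The sketch below is ours, not the talk's code: it runs ISTA and propagates a tangent vector through each iteration in forward mode; once the support is identified, the tangent converges to the derivative of the solution map with respect to the regularization parameter.

```python
import numpy as np

def soft(z, thr):
    """Soft-thresholding, the proximal operator of thr*||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)

def ista_forward_ad(A, b, lam, n_iter=2000):
    """Run ISTA on the Lasso  min_x 0.5*||Ax - b||^2 + lam*||x||_1  and
    propagate the tangent d x_k / d lam alongside the iterates
    (forward-mode AD of the fixed-point sequence)."""
    t = 1.0 / np.linalg.norm(A, 2) ** 2      # step size 1/L
    x = np.zeros(A.shape[1])
    dx = np.zeros_like(x)                    # tangent: d x_k / d lam
    for _ in range(n_iter):
        z = x - t * A.T @ (A @ x - b)        # gradient step
        dz = dx - t * A.T @ (A @ dx)         # its linearization
        active = np.abs(z) > t * lam         # support after thresholding
        x = soft(z, t * lam)
        # chain rule: dS/dz = 1 on the active set, dS/dlam = -t*sign(z)
        dx = np.where(active, dz - t * np.sign(z), 0.0)
    return x, dx
```

A finite-difference check of `dx` against `(x(lam+eps) - x(lam-eps)) / (2*eps)` confirms that the propagated tangent matches the derivative of the solution mapping on generic random data.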


Learning data-driven priors for image reconstruction: From bilevel optimisation to neural network-based unrolled schemes

Kostas Papafitsoros1, Andreas Kofler2, Fabian Altekrüger3,4, Fatima Antarou Ba4, Christoph Kolbitsch2, Evangelos Papoutsellis5, David Schote2, Clemens Sirotenko6, Felix Zimmermann2

1Queen Mary University of London, United Kingdom; 2Physikalisch-Technische Bundesanstalt, Germany; 3Humboldt-Universität zu Berlin; 4Technische Universität Berlin, Germany; 5Finden Ltd, Rutherford Appleton Laboratory, United Kingdom; 6Weierstrass Institute for Applied Analysis and Stochastics, Germany

Combining classical model-based variational methods for image reconstruction with deep learning techniques has attracted significant attention in recent years. The aim is to combine the interpretability and the reconstruction guarantees of a model-based method with the flexibility and the state-of-the-art reconstruction performance that deep neural networks are capable of achieving. We introduce a novel, general image reconstruction approach that achieves such a combination, motivated by recent developments in deeply learned algorithm unrolling and data-driven regularisation, as well as by bilevel optimisation schemes for regularisation parameter estimation. We consider a network consisting of two parts: the first part uses a highly expressive deep convolutional neural network (CNN) to estimate a spatially varying (and, for dynamic problems, temporally varying) regularisation parameter for a classical variational problem (e.g. Total Variation). The resulting parameter map is fed to the second sub-network, which unrolls a finite number of iterations of a method that solves the variational problem (e.g. PDHG). The overall network is then trained end-to-end in a supervised fashion. This results in an entirely interpretable algorithm, since the “black-box” nature of the CNN acts entirely on the regularisation parameter and not on the image itself. We prove consistency of the unrolled scheme by showing that, as the number of unrolled iterations tends to infinity, the unrolled energy functional used for the supervised learning $\Gamma$-converges to the corresponding functional that incorporates the exact solution map of the TV-minimisation problem. We also provide a series of numerical examples that show the applicability of our approach: dynamic MRI reconstruction, quantitative MRI reconstruction, low-dose CT and dynamic image denoising.
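As a toy illustration of the second sub-network alone (not the authors' architecture), the sketch below unrolls a fixed number of gradient-descent iterations on a smoothed-TV energy with a spatially varying regularisation map `lam_map`. Here `lam_map` is a hand-made constant array standing in for the CNN output, smoothed TV with plain gradient descent replaces PDHG, and all names are ours.

```python
import numpy as np

def dx(u):  # forward difference, Neumann boundary (last column zero)
    g = np.zeros_like(u); g[:, :-1] = u[:, 1:] - u[:, :-1]; return g

def dy(u):
    g = np.zeros_like(u); g[:-1, :] = u[1:, :] - u[:-1, :]; return g

def dxT(p):  # adjoint of dx
    p = p.copy(); p[:, -1] = 0.0
    r = np.zeros_like(p); r[:, 1:] = p[:, :-1]
    return r - p

def dyT(p):  # adjoint of dy
    p = p.copy(); p[-1, :] = 0.0
    r = np.zeros_like(p); r[1:, :] = p[:-1, :]
    return r - p

def energy(u, f, lam_map, eps=0.1):
    """Smoothed-TV energy: 0.5*||u - f||^2 + sum(lam_map*sqrt(|grad u|^2 + eps^2))."""
    gx, gy = dx(u), dy(u)
    return 0.5 * np.sum((u - f) ** 2) + np.sum(lam_map * np.sqrt(gx**2 + gy**2 + eps**2))

def unrolled_tv(f, lam_map, n_iter=50, tau=0.05, eps=0.1):
    """Unroll n_iter gradient-descent steps on the smoothed-TV energy,
    with a spatially varying regularisation map lam_map."""
    u = f.copy()
    for _ in range(n_iter):
        gx, gy = dx(u), dy(u)
        nrm = np.sqrt(gx**2 + gy**2 + eps**2)
        grad_E = (u - f) + dxT(lam_map * gx / nrm) + dyT(lam_map * gy / nrm)
        u = u - tau * grad_E
    return u
```

In the full scheme, `lam_map` would be produced by the CNN and the whole pipeline trained end-to-end; here the unrolled solver simply decreases the variational energy for any fixed map.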


Learned proximal operators in accelerated unfolded methods with pseudodifferential operators

Andrea Sebastiani1, Tatiana Alessandra Bubba2, Luca Ratti1, Subhadip Mukherjee2

1University of Bologna; 2University of Bath

In recent years, hybrid reconstruction frameworks have been proposed that unfold iterative methods and learn a suitable pseudodifferential correction on the part of the problem that provably cannot be handled by model-based methods. In particular, the inner hyperparameters of the method are estimated using supervised learning techniques.

In this talk, I will present a variant of this approach in which an accelerated iterative algorithm is unfolded and the proximal operator is replaced by a learned operator, as in the Plug-and-Play (PnP) framework. Numerical experiments on limited-angle CT show promising results.
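The structure of such an unfolded accelerated scheme can be sketched as follows. This is our own illustrative simplification, not the speaker's method: a FISTA-style iteration in which the proximal step is replaced by a denoiser, here a plain Gaussian blur standing in for the learned operator.

```python
import numpy as np

def blur_denoiser(u, sigma=1.0):
    """Stand-in for a learned proximal operator: separable Gaussian blur."""
    k = np.exp(-0.5 * (np.arange(-3, 4) / sigma) ** 2)
    k /= k.sum()
    u = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, u)
    u = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, u)
    return u

def unfolded_accelerated(A, At, y, x0, tau, n_iter=20, denoiser=blur_denoiser):
    """FISTA-style accelerated iteration with the proximal step replaced
    by a denoiser, in the spirit of Plug-and-Play unfolding."""
    x, z, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_new = denoiser(z - tau * At(A(z) - y))       # gradient step + "prox"
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov extrapolation
        x, t = x_new, t_new
    return x
```

For a CT problem, `A` and `At` would be the (limited-angle) Radon transform and its adjoint, and the denoiser a trained network with learned hyperparameters; the loop itself is unchanged.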


Masked and Unmasked Principles for Automatic Parameter Selection in Variational Image Restoration for Poisson Noise Corruption

Francesca Bevilacqua1, Alessandro Lanza1, Monica Pragliola2, Fiorella Sgallari1

1University of Bologna, Italy; 2University of Naples Federico II, Italy

Due to the statistical nature of electromagnetic waves, Poisson noise is a widespread cause of data degradation in many inverse imaging problems. It arises whenever the acquired data are formed by counting the number of photons emitted by a source and hitting the image domain. Poisson noise removal is a crucial issue in astronomical and medical imaging, where the scenarios are characterized by a low photon count. In the former case this is related to the acquisition set-up, while in the latter it is desirable to irradiate the patient with lower electromagnetic doses in order to keep them safer. However, the weaker the light intensity, the stronger the Poisson noise degradation in the acquired images and the harder the reconstruction problem.

An effective model-based approach for reconstructing images corrupted by Poisson noise is the use of variational methods. Despite their successful results, their performance strongly depends on the selection of the regularization parameter that balances the regularization term against the data fidelity term. One of the most widely used approaches for choosing the parameter is the discrepancy principle proposed in [1], which relies on imposing that the data term equal its approximate expected value. It works well for mid- and high-photon-count scenarios but leads to poor results for low-count Poisson noise. The talk will address novel parameter selection strategies that outperform the state-of-the-art discrepancy principle of [1], especially in the low-count regime. The approaches are based, respectively, on decreasing the approximation error of [1] by means of a suitable Monte Carlo simulation [2], on applying a so-called Poisson whiteness principle [3], and on cleverly masking the data used for the parameter selection [4]. Extensive experiments are presented that demonstrate the effectiveness of the three novel methods.
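To make the discrepancy principle of [1] concrete, here is a toy sketch (our own simplification, not the models of [2-4]): a pixelwise Poisson likelihood with a quadratic penalty toward a crude prior mean `m` has a closed-form minimizer, and the parameter is selected by bisection so that the discrepancy $2\sum_i [y_i \ln(y_i/u_i) + u_i - y_i]$ matches its approximate expected value, the number $n$ of pixels.

```python
import numpy as np

def poisson_map(y, m, lam):
    """Pixelwise minimizer of  u - y*log(u) + (lam/2)*(u - m)^2,
    i.e. the nonnegative root of  lam*u^2 + (1 - lam*m)*u - y = 0."""
    c = 1.0 - lam * m
    return (-c + np.sqrt(c * c + 4.0 * lam * y)) / (2.0 * lam)

def discrepancy(y, u):
    """Twice the KL divergence between counts y and estimate u (0*log 0 := 0)."""
    t = np.where(y > 0, y * np.log(np.where(y > 0, y / np.maximum(u, 1e-12), 1.0)), 0.0)
    return 2.0 * np.sum(t + u - y)

def select_lambda(y, m, tol=1e-4, max_iter=200):
    """Bisection on lam until the discrepancy matches its expected value ~ n."""
    n = y.size
    lo, hi = 1e-8, 1.0
    while discrepancy(y, poisson_map(y, m, hi)) < n:   # grow the bracket
        hi *= 10.0
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if discrepancy(y, poisson_map(y, m, mid)) < n:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol * hi:
            break
    return 0.5 * (lo + hi)
```

The same bisection idea applies when the closed-form pixelwise solve is replaced by an inner iterative solver for a genuine variational model (e.g. TV-regularised Kullback-Leibler minimization), which is the setting of [1-4].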

[1] M. Bertero, P. Boccacci, G. Talenti, R. Zanella, L. Zanni. A discrepancy principle for Poisson data, Inverse Problems 26(10): 105004, 2010.

[2] F. Bevilacqua, A. Lanza, M. Pragliola, F. Sgallari. Nearly exact discrepancy principle for low-count Poisson image restoration, Journal of Imaging 8(1): 1-35, 2022.

[3] F. Bevilacqua, A. Lanza, M. Pragliola, F. Sgallari. Whiteness-based parameter selection for Poisson data in variational image processing, Applied Mathematical Modelling 117: 197-218, 2023.

[4] F. Bevilacqua, A. Lanza, M. Pragliola, F. Sgallari. Masked unbiased principles for parameter selection in variational image restoration under Poisson noise, Inverse Problems 39(3): 034002, 2023.



Conference: AIP 2023