Conference Agenda

Session Overview
Session
MS05 3: Numerical meets statistical methods in inverse problems
Time:
Thursday, 07/Sept/2023:
4:00pm - 6:00pm

Session Chair: Martin Hanke
Session Chair: Markus Reiß
Session Chair: Frank Werner
Location: VG2.102


Presentations

Bayesian hypothesis testing in statistical inverse problems

Remo Kretschmann, Frank Werner

Institute of Mathematics, University of Würzburg, Germany

In many inverse problems, one is not primarily interested in the whole solution $u^\dagger$, but in specific features of it that can be described by a family of linear functionals of $u^\dagger$. We perform statistical inference for such features by means of hypothesis testing.

This problem has previously been treated by multiscale methods based upon unbiased estimates of those functionals [1]. Constructing hypothesis tests using unbiased estimators, however, has two severe drawbacks: Firstly, unbiased estimators only exist for sufficiently smooth linear functionals, and secondly, they suffer from a huge variance due to the ill-posedness of the problem, so that the corresponding tests have bad detection properties. We overcome both of these issues by considering the problem from a Bayesian point of view, assigning a prior distribution to $u^\dagger$, and using the resulting posterior distribution to define Bayesian maximum a posteriori (MAP) tests.

The existence of a hypothesis test with maximal power among a class of tests with prescribed level has recently been shown for all linear functionals of interest under certain a priori assumptions on $u^\dagger$ [2]. We study Bayesian MAP tests based upon Gaussian priors both analytically and numerically for linear inverse problems and compare them with unregularized as well as optimal regularized hypothesis tests.
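As an illustration of the construction (a toy sketch, not code from the talk: the operator $A$, the prior covariance, and the functional $\varphi$ below are arbitrary choices), a discretized Gaussian-prior MAP-type test for a single linear functional might look as follows:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discretized linear model Y = A u + sigma * xi, xi ~ N(0, I).
# A, C (prior covariance), and phi are illustrative choices only.
n = 100
A = np.array([[1.0 / (1.0 + abs(i - j)) for j in range(n)] for i in range(n)])
sigma = 0.1
C = np.diag(1.0 / (1.0 + np.arange(n)) ** 2)     # Gaussian prior N(0, C)
phi = np.zeros(n)
phi[:10] = 1.0                                    # feature <phi, u>

# Posterior mean of u is m(Y) = C A^T (A C A^T + sigma^2 I)^{-1} Y,
# so the posterior mean of the feature is linear in the data Y.
G = C @ A.T @ np.linalg.inv(A @ C @ A.T + sigma**2 * np.eye(n))

def map_test_statistic(Y):
    """Posterior mean of <phi, u>, standardized by its std when Y is pure noise."""
    w = G.T @ phi                                 # <phi, m(Y)> = <w, Y>
    return (w @ Y) / (sigma * np.linalg.norm(w))

# Reject H0: <phi, u^dagger> = 0 at level alpha = 0.05 (Gaussian quantile 1.96).
threshold = 1.96

u_true = np.zeros(n)                              # H0 holds in this toy run
Y = A @ u_true + sigma * rng.standard_normal(n)
print(abs(map_test_statistic(Y)) > threshold)
```

For $u^\dagger = 0$ the statistic is standard normal, so the threshold $z_{1-\alpha/2} \approx 1.96$ gives level $\alpha = 0.05$; the bias that the posterior mean incurs for nonzero $u^\dagger$ is exactly what the analysis of MAP tests has to control.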

[1] K. Proksch, F. Werner, A. Munk. Multiscale scanning in inverse problems. Ann. Statist. 46(6B): 3569--3602. 2018. https://doi.org/10.1214/17-AOS1669

[2] R. Kretschmann, D. Wachsmuth, F. Werner. Optimal regularized hypothesis testing in statistical inverse problems. Preprint, 2022. https://doi.org/10.48550/arXiv.2212.12897



Predictive risk regularization for Gaussian and Poisson inverse problems

Federico Benvenuto

Università degli Studi di Genova, Italy

In this talk, we present two methods for choosing the regularization parameter in statistical inverse problems based on predictive risk estimation, for Gaussian and Poisson noise.

In the Gaussian case, the criterion for choosing the regularization parameter in Tikhonov regularization is motivated by stability issues with small sample sizes and minimizes a lower bound of the predictive risk. It is applicable both when the data norm and the noise variance are known, in which case it minimizes a function of the signal-to-noise ratio, and when they are unknown, using an iterative algorithm that alternates between a minimization step, which selects the regularization parameter, and an estimation step, which estimates the signal-to-noise ratio.

In the Poisson case, we introduce a novel estimator of the predictive risk with Poisson data, when the loss function is the Kullback–Leibler divergence, in order to define a choice rule for the regularization parameter of the expectation maximization (EM) algorithm. We present a Poisson counterpart of Stein's lemma for Gaussian variables, and from this result we derive the proposed estimator, which is asymptotically unbiased as the number of measured counts increases, when the EM algorithm for Poisson data is considered. In both cases we present numerical tests with synthetic data.
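For the Gaussian case, the flavor of such a parameter-choice rule can be sketched with the classical unbiased predictive risk estimator (UPRE) for Tikhonov regularization; note that the talk's criterion minimizes a lower bound of the predictive risk rather than this standard estimator, and the operator, signal, and noise level below are arbitrary toy choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ill-posed problem: a Gaussian smoothing operator A, Gaussian noise.
n = 80
A = np.array([[np.exp(-0.1 * (i - j) ** 2) for j in range(n)] for i in range(n)]) / n
u_true = np.sin(np.linspace(0.0, 3.0 * np.pi, n))
sigma = 0.01
Y = A @ u_true + sigma * rng.standard_normal(n)

def predictive_risk_estimate(lam):
    """UPRE: unbiased estimate of E || A u_lam - A u_true ||^2 for Tikhonov."""
    B = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T)
    H = A @ B                                     # influence (hat) matrix
    residual = Y - H @ Y
    return residual @ residual + 2.0 * sigma**2 * np.trace(H) - n * sigma**2

# Choose the regularization parameter by minimizing the risk estimate on a grid.
lams = np.logspace(-8, 0, 40)
risks = [predictive_risk_estimate(l) for l in lams]
lam_star = lams[int(np.argmin(risks))]
u_hat = np.linalg.solve(A.T @ A + lam_star * np.eye(n), A.T @ Y)
print(lam_star)
```

UPRE requires the noise variance $\sigma^2$ to be known; the iterative scheme described in the abstract is precisely a way to proceed when it (and the data norm) must be estimated alongside the parameter.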


Reconstruction of active forces in actomyosin droplets

Anne Wald, Emily Klass

Georg-August-University Göttingen, Germany

Many processes in cells are driven by the interaction of multiple proteins, for example cell contraction, division, or migration. Two important types of proteins are actin filaments and myosin motors. Myosin is able to bind to and move along actin filaments with its two ends, leading to the formation of a dynamic actomyosin network in which stresses are generated and patterns may form. Droplets containing an actomyosin network serve as a strongly simplified model for a cell and are used to study elementary mechanisms. We are interested in determining the parameters that characterize this active matter, i.e., the active forces that cause the dynamics of the actomyosin network, represented by the flow inside the actomyosin droplet, as well as the local viscosity. This leads to a (deterministic) parameter identification problem for the Stokes equation, where the viscosity inside the droplet can be estimated by means of statistical approaches.


Learning Linear Operators

Nicole Mücke

TU Braunschweig, Germany

We consider the problem of learning a linear operator $\theta$ between two Hilbert spaces from empirical observations, which we interpret as least squares regression in infinite dimensions. We show that this goal can be reformulated as an inverse problem for $\theta$ with the undesirable feature that its forward operator is generally non-compact (even if $\theta$ is assumed to be compact or of $p$-Schatten class). However, we prove that, in terms of spectral properties and regularisation theory, this inverse problem is equivalent to the known compact inverse problem associated with scalar response regression. Our framework allows for the elegant derivation of dimension-free rates for generic learning algorithms under Hölder-type source conditions. The proofs rely on the combination of techniques from kernel regression with recent results on concentration of measure for sub-exponential Hilbertian random variables. The obtained rates hold for a variety of practically relevant scenarios in functional regression as well as nonlinear regression with operator-valued kernels and match those of classical kernel regression with scalar response.
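In finite dimensions, the least-squares-regression view of operator learning reduces to ridge regression on matrices; the following sketch (dimensions, noise level, and the regularization parameter are arbitrary choices, not from the talk) estimates an operator from noisy input–output pairs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite-dimensional stand-in: learn Theta from pairs (x_i, Theta x_i + noise).
d, p, n_samples = 30, 20, 500
Theta = rng.standard_normal((p, d)) / np.sqrt(d)   # unknown operator

X = rng.standard_normal((d, n_samples))            # input samples (columns)
noise = 0.1 * rng.standard_normal((p, n_samples))
Y = Theta @ X + noise                              # noisy outputs

# Ridge (Tikhonov / spectrally regularized) least squares estimator:
#   Theta_hat = Y X^T (X X^T + lam * n * I)^{-1}
lam = 1e-2
Theta_hat = Y @ X.T @ np.linalg.inv(X @ X.T + lam * n_samples * np.eye(d))

rel_err = np.linalg.norm(Theta_hat - Theta) / np.linalg.norm(Theta)
print(rel_err)
```

Loosely speaking, in the infinite-dimensional setting the empirical covariance $X X^\top / n$ becomes a covariance operator and the matrix inverse becomes a spectral regularization of it; the non-compactness discussed in the abstract concerns the forward operator of this inverse problem, not the estimator itself.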



 
Conference: AIP 2023