Conference Agenda

Overview and details of the sessions of this conference.

 
 
Session Overview
Session: MS39: Statistical inverse problems: regularization, learning and guarantees
Time: Monday, 04/Sept/2023, 1:30pm - 3:30pm

Session Chair: Kim Knudsen
Session Chair: Abhishake Abhishake
Location: VG2.105


Presentations

On the Regularized Functional Regression

Sergei Pereverzyev

The Johann Radon Institute for Computational and Applied Mathematics (RICAM), Austria

Functional Regression (FR) involves data consisting of a sample of functions drawn from some population. Most work in FR is based on a variant of the functional linear model, first introduced by Ramsay and Dalzell in 1991. A more general form, polynomial functional regression, was introduced only recently by Yao and Müller (2010), with quadratic functional regression as its most prominent case. A crucial issue in constructing FR models is the need to combine information both across and within observed functions, which Ramsay and Silverman (1997) called replication and regularization, respectively. In this talk we present a general approach to the analysis of regularized polynomial functional regression of arbitrary order and indicate the possibility of using a technique that has recently been developed in the context of supervised learning. Moreover, we describe how multiple-penalty regularization can be used in the context of FR and demonstrate the advantage of such use. Finally, we briefly discuss the application of FR in stenosis detection.
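As a brief sketch of the models in question (the notation below is standard in the FR literature and is assumed here, not taken from the talk), the functional linear model of Ramsay and Dalzell and the quadratic model of Yao and Müller read

  Y = \alpha + \int_0^1 \beta(t)\, X(t)\, dt + \varepsilon,

  Y = \alpha + \int_0^1 \beta(t)\, X(t)\, dt + \int_0^1 \int_0^1 \gamma(s,t)\, X(s)\, X(t)\, ds\, dt + \varepsilon,

where X is the observed random function, \beta and \gamma are coefficient functions to be estimated, and \varepsilon is noise. Regularized estimation penalizes the coefficients, e.g. by \lambda_1 \|\beta\|^2 + \lambda_2 \|\gamma\|^2 in the multiple-penalty case, with each \lambda_i > 0 chosen by a parameter-choice rule.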

Joint research with S. Pereverzyev Jr. (Uni. Med. Innsbruck), A. Pilipenko (IMATH, Kiev) and V.Yu. Semenov (DELTA SPE, Kiev), supported by the consortium of the Horizon 2020 project AMMODIT and the Austrian Science Fund (FWF).



Inverse learning in Hilbert scales

Abhishake Abhishake

LUT University, Lappeenranta, Finland

We study linear ill-posed inverse problems with noisy data in the statistical learning setting. Approximate reconstructions from random noisy data are sought with general regularization schemes in Hilbert scales. We discuss the rates of convergence for the regularized solution under prior assumptions and a certain link condition, and we express the error in terms of certain distance functions. For regression functions with smoothness given in terms of source conditions, the error bound can then be established explicitly.
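As a minimal sketch of this setting (standard notation, assumed rather than quoted from the abstract): one observes

  y_i = (A f^\dagger)(x_i) + \varepsilon_i, \qquad i = 1, \dots, n,

where A is a compact (hence ill-posed) linear operator and the x_i are random design points. A Hilbert scale (H_s)_{s \in \mathbb{R}} is generated by an unbounded self-adjoint operator L via \|f\|_s = \|L^s f\|; a typical link condition ties A to the scale,

  m \|L^{-a} f\| \le \|A f\| \le M \|L^{-a} f\|,

while a source condition f^\dagger = L^{-b} w, \|w\| \le R, encodes the smoothness of the true solution. Regularization in the scale then takes the form, e.g. for Tikhonov,

  f_\lambda = \arg\min_f \; \frac{1}{n} \sum_{i=1}^n \big( (A f)(x_i) - y_i \big)^2 + \lambda \|L^s f\|^2,

and the convergence rates are expressed in terms of a, b, and the sample size n.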


Stability and Generalization for Stochastic Gradient Methods

Yiming Ying

SUNY Albany, United States of America

Stochastic gradient methods (SGMs) have become the workhorse of machine learning (ML) due to their incremental nature and computationally cheap updates. In this talk, I will first discuss the close interaction between statistical generalization and computational optimization for SGMs in the framework of statistical learning theory (SLT). The core concept in this study is algorithmic stability, which characterizes how the output of an ML algorithm changes upon a small perturbation of the training data. Our theoretical studies have led to new insights into the generalization of overparameterized neural networks trained by SGD. Then, I will describe how this interaction framework can be used to derive lower bounds for the convergence of existing methods in the task of maximizing the AUC score, which further inspires a new direction for designing efficient AUC optimization algorithms.
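As a sketch of the central definition (this is the standard notion of uniform stability; the bound quoted is from Hardt, Recht and Singer (2016) and is given only for orientation): a randomized algorithm A is \epsilon-uniformly stable if, for all datasets S, S' of size n differing in a single example,

  \sup_z \; \mathbb{E}\big[ \ell(A(S); z) - \ell(A(S'); z) \big] \le \epsilon,

and uniform stability controls the expected generalization gap,

  \big| \mathbb{E}\big[ R(A(S)) - R_S(A(S)) \big] \big| \le \epsilon.

For SGD on convex, L-Lipschitz, smooth losses run for T steps with step sizes \eta_t, one obtains \epsilon \lesssim (L^2/n) \sum_{t=1}^T \eta_t, linking the optimization schedule directly to generalization.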


Causality and Consistency in Bayesian Inference Paradigms

Klaus Mosegaard

University of Copenhagen, Denmark

Bayesian inference paradigms are regarded as powerful tools for the solution of inverse problems. However, Bayesian formulations suffer from a number of difficulties that are often overlooked.

A well-known, but mostly neglected, difficulty is connected to the use of conditional probability densities. Borel, and later Kolmogorov (1933/1956), found that the traditional definition of conditional probability densities is incomplete: in different parameterizations it leads to different conditional probability measures. This inconsistency is generally neglected in the scientific literature and therefore threatens the objectivity of Bayesian inversion, Bayes factor computations, and trans-dimensional inversion. We will show that this problem is much more serious than usually assumed.
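A standard worked example of this inconsistency (the classical Borel–Kolmogorov paradox, given here for illustration and not taken from the talk): let a point be uniformly distributed on the sphere, with latitude \phi and longitude \lambda, so that

  p(\phi, \lambda) = \frac{\cos\phi}{4\pi}, \qquad \phi \in [-\tfrac{\pi}{2}, \tfrac{\pi}{2}], \ \lambda \in [-\pi, \pi).

Conditioning on the equator (\phi = 0) gives the uniform density p(\lambda \mid \phi = 0) = 1/(2\pi), whereas conditioning on a meridian (\lambda = \lambda_0) gives p(\phi \mid \lambda = \lambda_0) = \tfrac{1}{2}\cos\phi, which is not uniform along the circle. Both condition the same uniform measure on a great circle, yet the resulting conditional distributions differ with the parameterization used to take the limit.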

Additional inconsistencies in Bayesian inference are found in so-called hierarchical methods, where hyper-parameters are used as variables to control the uncertainties. We will see that these methods violate causality, and we will analyze how this challenges the validity of Bayesian computations.


 
Conference: AIP 2023