GPSD 2025: Conference Agenda
Overview and details of the sessions of this conference. Please select a date or location to show only the sessions on that day or at that location, and select a single session for a detailed view (with abstracts and downloads where available).
Location indicates the building first and then the room number.
Click on "Floor plan" for orientation in the buildings and on the campus.
Session Overview
S 1 (6): Machine Learning
Session Topics: 1. Machine Learning
Presentations
3:50 pm - 4:15 pm
Statistical guarantees for stochastic Metropolis-Hastings
Universität Hamburg, Germany; Karlsruhe Institute of Technology, Germany
Uncertainty quantification is a key issue when considering the application of deep neural network methods in science and engineering. To this end, numerous Bayesian neural network approaches have been introduced. The main challenge is to construct an algorithm that is applicable to the large sample sizes and parameter dimensions of modern applications on the one hand, and that admits statistical guarantees on the other. A stochastic Metropolis-Hastings step saves computational cost by calculating the acceptance probabilities only on random (mini-)batches, but it reduces the effective sample size, leading to less accurate estimates. We demonstrate that this drawback can be fixed with a simple correction term. Focusing on deep neural network regression, we prove a PAC-Bayes oracle inequality which yields optimal contraction rates, and we analyze the diameter and show high coverage probability of the resulting credible sets. The method is illustrated with a simulation example.
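For orientation, a minimal sketch of a generic stochastic Metropolis-Hastings step in Python follows, assuming a product-form log-likelihood, a symmetric proposal, and a mini-batch estimate of the acceptance log-ratio rescaled by N over the batch size. The correction term of the talk is not reproduced here, and all names (stochastic_mh_step, log_lik_i, etc.) are illustrative placeholders, not the authors' implementation.

```python
import numpy as np

def stochastic_mh_step(theta, log_lik_i, log_prior, propose, data, batch_size, rng):
    """One generic stochastic MH step: the log-likelihood ratio is estimated
    on a random mini-batch and rescaled to the full sample size. The talk's
    correction term is NOT included; this is only an illustration."""
    n = len(data)
    theta_prop = propose(theta, rng)                     # symmetric proposal assumed
    batch = data[rng.choice(n, size=batch_size, replace=False)]
    log_ratio = (n / batch_size) * np.sum(
        log_lik_i(theta_prop, batch) - log_lik_i(theta, batch)
    )
    log_ratio += log_prior(theta_prop) - log_prior(theta)
    if np.log(rng.uniform()) < log_ratio:                # accept with prob. min(1, e^ratio)
        return theta_prop
    return theta

# Toy usage: posterior of a Gaussian mean with known variance (illustrative only).
rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=10_000)
theta = 0.0
for _ in range(5_000):
    theta = stochastic_mh_step(
        theta,
        log_lik_i=lambda t, x: -0.5 * (x - t) ** 2,      # N(t, 1) log-density, up to a constant
        log_prior=lambda t: -0.5 * t ** 2,               # N(0, 1) prior
        propose=lambda t, rng: t + 0.1 * rng.standard_normal(),
        data=data,
        batch_size=128,
        rng=rng,
    )
```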
4:15 pm - 4:40 pm
Lévy Langevin Monte Carlo for heavy-tailed target distributions
TU Dresden, Germany
We extend the Monte Carlo method of Oechsler (2024) to the setting of a target distribution with heavy tails: we choose a regularly varying distribution and prove the convergence of a solution of a stochastic differential equation to this target. The stochastic differential equation is hereby driven by a general Lévy process, unlike in the case of a classical Langevin diffusion. Apart from the possibility to sample from non-smooth targets, the method is justified by the fact that exponential convergence to the invariant distribution holds, which in general cannot be guaranteed by the classical Langevin diffusion in the presence of heavy tails. An advantage over other Langevin Monte Carlo methods is the easy implementation, using only a compound Poisson process as the noise term and a numerically manageable drift term.
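As a rough illustration of the simulation recipe described in the last sentence, here is an Euler-type scheme for an SDE driven by a compound Poisson process, sketched in Python. The drift, jump rate, and jump distribution below are purely illustrative placeholders; the drift/jump pair matched to a given heavy-tailed target is derived in the talk and is not reproduced here.

```python
import numpy as np

def levy_langevin_path(x0, drift, jump_rate, jump_sampler, h, n_steps, rng):
    """Euler scheme for dX_t = drift(X_t) dt + dJ_t, where J is a compound
    Poisson process. Illustrative only; not the authors' specific scheme."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        n_jumps = rng.poisson(jump_rate * h)              # jumps arriving in (t_k, t_k + h]
        jumps = jump_sampler(n_jumps, rng).sum() if n_jumps else 0.0
        x[k + 1] = x[k] + h * drift(x[k]) + jumps
    return x

# Placeholder choices: mean-reverting drift and Gaussian jump sizes.
rng = np.random.default_rng(0)
path = levy_langevin_path(
    x0=0.0,
    drift=lambda x: -x,
    jump_rate=1.0,
    jump_sampler=lambda n, rng: rng.standard_normal(n),
    h=0.01,
    n_steps=10_000,
    rng=rng,
)
```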