In this talk we discuss the problem of hyperparameter tuning in the context of learning from different domains, also known as domain adaptation. The domain adaptation scenario arises when one studies two input-output relationships governed by probabilistic laws with respect to different probability measures, and uses the data drawn from one of the measures to minimize the expected prediction risk with respect to the other.
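To make the scenario concrete, here is a minimal sketch of a synthetic covariate-shift setting in which the Radon-Nikodym derivative of the target measure with respect to the source measure is known in closed form; the particular distributions, predictor, and loss are illustrative assumptions, not taken from the talk:

```python
import math
import random

random.seed(0)

def normal_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Illustrative setup: source measure N(0,1), target measure N(1,1),
# squared loss of the fixed predictor f(x) = 0 against labels y = x.
source = [random.gauss(0.0, 1.0) for _ in range(100_000)]

def loss(x):
    return x ** 2  # squared error of f(x) = 0 against y = x

# Radon-Nikodym derivative dP_target/dP_source, here available exactly:
# N(1,1)/N(0,1) density ratio = exp(x - 0.5).
def density_ratio(x):
    return normal_pdf(x, 1.0, 1.0) / normal_pdf(x, 0.0, 1.0)

# Plain empirical risk on source data estimates the SOURCE risk (= 1),
# while reweighting by the density ratio estimates the TARGET risk (= 2).
naive = sum(loss(x) for x in source) / len(source)
weighted = sum(density_ratio(x) * loss(x) for x in source) / len(source)
print(naive, weighted)
```

With the ratio known, reweighting the source losses turns the empirical source risk into an unbiased estimate of the target risk. In practice the ratio must itself be estimated from data, which is where regularization enters.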
The problem of domain adaptation has been tackled by many approaches, and most domain adaptation algorithms depend on so-called hyperparameters that affect their performance and need to be tuned. Usually, the variation in an algorithm's performance can be attributed to just a few hyperparameters, such as the regularization parameter in kernel ridge regression, or the batch size and the number of iterations in stochastic gradient descent training.
In spite of its importance, the question of selecting these parameters has received little attention in the context of domain adaptation. In this talk we shed light on this issue. In particular, we discuss how a regularized Radon-Nikodym differentiation can be employed in hyperparameter tuning. The theoretical results will be illustrated by an application to stenosis detection in different types of arteries.
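When the two measures are known only through samples, the Radon-Nikodym derivative itself must be estimated, and a regularization parameter controls that estimation; the estimated ratio can then reweight a source validation set to tune a learning algorithm for the target domain. The following NumPy sketch is in the spirit of unconstrained least-squares importance fitting; the Gaussian kernel, its width, the synthetic data, and the candidate grid are illustrative assumptions, not the algorithm of [1]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D covariate-shift data (not the arterial data of the talk):
# source inputs ~ N(0,1) with labels, target inputs ~ N(1,0.5) without labels.
xs = rng.normal(0.0, 1.0, 400)
ys = np.sin(2 * xs) + 0.1 * rng.normal(size=400)
xt = rng.normal(1.0, 0.5, 400)

def gauss_kernel(a, b, width=0.5):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * width ** 2))

# Regularized density-ratio estimate r(x) = sum_l alpha_l k(x, c_l):
# minimize the least-squares fit to dP_target/dP_source with an
# l2 penalty, giving alpha = (H + lam*I)^{-1} h in closed form.
centers = xs[:100]
H = gauss_kernel(xs, centers).T @ gauss_kernel(xs, centers) / len(xs)
h = gauss_kernel(xt, centers).mean(axis=0)
alpha = np.linalg.solve(H + 0.1 * np.eye(len(centers)), h)

# Split the labelled source data; weight the validation part by the
# estimated ratio so the weighted error approximates the TARGET risk.
xtr, ytr, xva, yva = xs[:300], ys[:300], xs[300:], ys[300:]
wva = np.maximum(gauss_kernel(xva, centers) @ alpha, 0)

def krr(lam):
    # Kernel ridge regression on the source training split.
    coef = np.linalg.solve(gauss_kernel(xtr, xtr) + lam * np.eye(len(xtr)), ytr)
    return lambda xnew: gauss_kernel(xnew, xtr) @ coef

# Pick the ridge parameter minimizing the importance-weighted validation error.
grid = (1e-3, 1e-2, 1e-1, 1.0, 10.0)
best = min((np.average((krr(lam)(xva) - yva) ** 2, weights=wva + 1e-12), lam)
           for lam in grid)
print("selected ridge parameter:", best[1])
```

The regularization parameter of the ratio estimate (here fixed at 0.1) governs the bias-variance trade-off of the Radon-Nikodym estimation itself, which is the issue the talk addresses theoretically.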
The talk is based on the recent joint work [1] performed within the COMET module project S3AI funded by the Austrian Research Promotion Agency (FFG).
[1] E.R. Gizewski, L. Mayer, B.A. Moser, D.H. Nguyen, S. Pereverzyev Jr., S.V. Pereverzyev, N. Shepeleva, W. Zellinger. On a regularization of unsupervised domain adaptation in RKHS. Appl. Comput. Harmon. Anal., 57:201-227, 2022.