Home

I am a PhD student at the University of Tübingen and the International Max Planck Research School for Intelligent Systems, supervised by Robert Bamler. I am interested in flexible and efficient inference methods for deep probabilistic models, as well as in applications of probabilistic machine learning in science. My PhD is funded by the Cluster of Excellence Machine Learning.

Most recently, we found that the SVHN dataset suffers from a distribution mismatch between its training and test sets that affects probabilistic generative models. We will present the corresponding paper at the NeurIPS 2023 Workshop on Distribution Shifts. A new canonical split can be downloaded from the project page.

Lately, I have been investigating connections between (Differentiable) Annealed Importance Sampling and (Differentiable) Markov Chain Monte Carlo methods. Our latest paper, Resampling Gradients Vanish in Differentiable Sequential Monte Carlo Samplers, explains how a maximal Effective Sample Size influences the gradients that arise from resampling (spoiler: you do not need them). The poster we presented at ICLR 2023 can be found here.
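For readers unfamiliar with the Effective Sample Size (ESS), the sketch below shows how it is commonly computed from importance weights and used to trigger resampling in a generic Sequential Monte Carlo sampler. This is a minimal illustration under standard assumptions, not the implementation from the paper; all function and variable names are hypothetical.

```python
# Minimal, illustrative sketch of ESS-based resampling in a generic SMC sampler.
# Not the paper's implementation; names and the threshold are illustrative only.
import numpy as np


def effective_sample_size(log_weights):
    """ESS = (sum w)^2 / sum(w^2), computed stably from log-weights."""
    log_w = log_weights - np.max(log_weights)  # stabilize before exponentiating
    w = np.exp(log_w)
    return w.sum() ** 2 / (w ** 2).sum()


def maybe_resample(rng, particles, log_weights, threshold=0.5):
    """Multinomial resampling when the ESS drops below a fraction of the particle count.

    When the weights are uniform, the ESS is maximal (equal to the number of
    particles) and no resampling step is triggered.
    """
    n = len(particles)
    if effective_sample_size(log_weights) < threshold * n:
        probs = np.exp(log_weights - np.max(log_weights))
        probs /= probs.sum()
        idx = rng.choice(n, size=n, p=probs)
        return particles[idx], np.zeros(n)  # weights reset to uniform (in log-space)
    return particles, log_weights


rng = np.random.default_rng(0)
particles = rng.normal(size=(100, 2))          # 100 particles in 2D
log_weights = rng.normal(scale=0.1, size=100)  # nearly uniform weights -> high ESS
particles, log_weights = maybe_resample(rng, particles, log_weights)
```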

News