Home
I am a PhD student at the University of Tübingen and the International Max Planck Research School for Intelligent Systems, supervised by Robert Bamler. I am interested in flexible and efficient inference methods for deep probabilistic models, as well as in applications of probabilistic machine learning in science. My PhD is funded by the Cluster of Excellence Machine Learning.
Most recently, we found that the SVHN dataset suffers from a distribution mismatch between the training set and test set that impacts probabilistic generative models. We will present the corresponding paper at the NeurIPS 2023 Workshop on Distribution Shifts. A new canonical split can be downloaded from the project page.
Lately, I have been investigating connections between (differentiable) Annealed Importance Sampling and (differentiable) Markov chain Monte Carlo methods. Our latest paper, Resampling Gradients Vanish in Differentiable Sequential Monte Carlo Samplers, explains how a maximal effective sample size influences the gradients that arise from resampling (spoiler: you do not need them). A poster that we presented at ICLR 2023 can be found here.
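To make the effective sample size concrete: the (Kish) ESS measures how evenly the importance weights are spread over the particles, and SMC samplers typically resample when it drops too low. The sketch below is only an illustration of these standard quantities, not code from the paper; the function names are my own.

```python
import random

def effective_sample_size(weights):
    """Kish effective sample size of (possibly unnormalized) importance weights."""
    s = sum(weights)
    return s * s / sum(w * w for w in weights)

def resample(particles, weights, rng):
    """Multinomial resampling: draw N particles with probability proportional to weight."""
    return rng.choices(particles, weights=weights, k=len(particles))

# Equal weights: ESS equals the number of particles (maximal), so resampling
# would only duplicate particles uniformly at random.
print(effective_sample_size([1.0, 1.0, 1.0, 1.0]))  # 4.0

# One dominant weight: ESS collapses toward 1, the regime where SMC resamples.
print(effective_sample_size([100.0, 1.0, 1.0, 1.0]))

rng = random.Random(0)
print(resample(["a", "b", "c", "d"], [100.0, 1.0, 1.0, 1.0], rng))
```

When the ESS is maximal, all weights are equal, so resampling carries no information about the weights; this is the setting in which the resampling step contributes no useful gradient signal.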
News
- November 2023: I am attending NeurIPS 2023 in person. Talk to me in New Orleans!
- October 2023: Our paper The SVHN Dataset Is Deceptive for Probabilistic Generative Models Due to a Distribution Mismatch has been accepted at the NeurIPS 2023 Workshop on Distribution Shifts! Check out the project page!
- May 2023: I am attending ICLR 2023 in person. Come and talk to me in Kigali, Rwanda!
- April 2023: Our paper Resampling Gradients Vanish in Differentiable Sequential Monte Carlo Samplers has been accepted as a Tiny Paper at ICLR 2023!
- July 2022: I am starting a PhD in the group of Robert Bamler!