Home

I am a PhD student at the University of Tübingen and the International Max Planck Research School for Intelligent Systems, supervised by Robert Bamler. My PhD is funded by the Cluster of Excellence Machine Learning. I am interested in deep (probabilistic) models, flexible and efficient inference methods for large (language) models, and data compression.
If you are a Bachelor’s or Master’s student and interested in writing a thesis in our group, please reach out!

Recently, I have been working on Differentiable Annealed Importance Sampling (DAIS). Our latest ICML 2024 paper shows that DAIS minimizes the symmetrized Kullback-Leibler divergence between the initial and target distributions, and investigates a useful inference method that exploits the learned initial distribution. In our ICLR 2023 Tiny Paper, we equip DAIS with a resampling step and explain how capping the effective sample size influences the gradients arising from resampling (spoiler: you do not need them).
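For context (my notation, not necessarily the paper's): the symmetrized Kullback-Leibler divergence between an initial distribution $q$ and a target distribution $p$ is simply the sum of the two directed divergences,

```latex
J(q, p) = D_{\mathrm{KL}}(q \,\|\, p) + D_{\mathrm{KL}}(p \,\|\, q)
```

so minimizing it balances mode-seeking ($q \to p$) and mass-covering ($p \to q$) behavior.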

Our paper at the NeurIPS 2023 Workshop on Distribution Shifts finds that the SVHN dataset suffers from a distribution mismatch between its training and test sets, which impacts probabilistic generative models. A new canonical split can be downloaded from the project page.

News