Projects

All projects can be found on my GitHub page.

GNN-MCM

The code for our work Balancing Molecular Information and Empirical Data in the Prediction of Physico-Chemical Properties. We propose a hybrid method that combines molecular descriptors with representation learning for the (exemplary) task of predicting activity coefficients.

DAIS0

The code for our work Differentiable Annealed Importance Sampling Minimizes The Symmetrized Kullback-Leibler Divergence Between Initial and Target Distribution. We investigate the initial distribution of differentiable annealed importance sampling and find that it minimizes the symmetrized Kullback-Leibler divergence between the initial and target distributions. Motivated by this insight, we use the initial distribution for variational inference.
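The symmetrized Kullback-Leibler divergence mentioned above is simply the sum of the forward and reverse KL divergences. As a self-contained illustration (not taken from the repository), here is a sketch for one-dimensional Gaussians, where both directions have a closed form:

```python
import math

def kl_gauss(m1, s1, m2, s2):
    """KL(N(m1, s1^2) || N(m2, s2^2)) for 1-D Gaussians (closed form)."""
    return math.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

def symmetrized_kl(m1, s1, m2, s2):
    """Symmetrized KL: KL(p || q) + KL(q || p)."""
    return kl_gauss(m1, s1, m2, s2) + kl_gauss(m2, s2, m1, s1)
```

Unlike either direction alone, this quantity is symmetric in its arguments and vanishes exactly when the two distributions coincide.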

SVHN-Remix

We find that the SVHN dataset (Netzer et al., 2011) suffers from a distribution shift between the training set and the test set. We analyze the distribution shift and its implications and provide a new split of the dataset.

DSMCS

The code for our work Resampling Gradients Vanish in Differentiable Sequential Monte Carlo Samplers, where we propose a differentiable sequential Monte Carlo sampler and show that the resampling operation need not be differentiated when the effective sample size is maximal.
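The effective sample size (ESS) referenced above is the standard diagnostic for weight degeneracy in sequential Monte Carlo. As a generic sketch (not code from the repository), it can be computed from log importance weights as follows; it equals the number of particles N exactly when the weights are uniform, the case in which resampling leaves the particle set unchanged in distribution:

```python
import numpy as np

def effective_sample_size(log_weights):
    """ESS = 1 / sum(w_i^2), where w_i are the normalized importance weights.

    Weights are normalized in log space (max-subtraction) for numerical
    stability before exponentiating.
    """
    lw = np.asarray(log_weights, dtype=float)
    w = np.exp(lw - np.max(lw))
    w /= w.sum()
    return 1.0 / np.sum(w**2)
```

ESS ranges from 1 (all mass on a single particle) to N (uniform weights), and resampling is typically triggered only when it drops below a threshold such as N/2.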

Minima-Slides

A modern, minimalist LaTeX Beamer slide template for professional and private use.