We do research at the intersection of statistical machine learning, computational statistics, and applied probability, developing new algorithms and theoretical insights. The group is led by O. Deniz Akyildiz in the Department of Mathematics at Imperial College London.
This page shares news and activities specific to the research group.
News
- December 2025: The group showcased three works at NeurIPS 2025 (two main-track papers and one workshop paper):
  - Paula presented Sampling by averaging: A multiscale approach to score estimation, tackling sampling from complex distributions via multiscale fast-slow systems.
  - James presented Learning Latent Variable Models via Jarzynski-adjusted Langevin Algorithm, introducing a new Monte Carlo method for latent variable models.
  - Deniz presented A Gradient Flow approach to Solving Inverse Problems with Latent Diffusion Models (lead author Tim Wang) at the Frontiers in Probabilistic Inference: Sampling Meets Learning workshop, proposing a framework for solving inverse problems with latent diffusion models.
- December 2025: Joanna presented the poster Scalable Learning of Energy-Based Priors via Interacting Particle Systems at the Amortized ProbML workshop.
- September 2025: Matthew's paper On diffusion posterior sampling via sequential Monte Carlo for zero-shot scaffolding of protein motifs was published in Transactions on Machine Learning Research (TMLR).
- July 2025: Paula received the Best Student Paper award for Proximal Interacting Particle Langevin Algorithms at the Conference on Uncertainty in Artificial Intelligence (UAI). The award was announced on X.