Group Members
O. Deniz Akyildiz
Assistant Professor — Head of Group
Deniz is an assistant professor at Imperial College London who founded the group in 2023. His interests span sampling, optimization, and stochastic filtering. [website]
Paula Cordero-Encinar
PhD student
Paula joined the group in 2024. She works on sampling, generative models, and theoretical foundations. She is co-supervised by Andrew Duncan and is part of the StatML CDT. [website]
James Cuin
PhD student
James joined the group in 2025. He works on statistical inference and generative models. He is co-supervised by Yanbo Tang and is part of the StatML CDT.
Joanna Marks
PhD student
Joanna joined the group in 2025. She works on generative models and optimal transport. Her second supervisor is Riccardo Passeggeri and she is part of the StatML CDT.
Paul F. Valsecchi Oliva
PhD student
Paul joined the group in 2023. He works on theoretical foundations of learning algorithms. His second supervisor is Andrew Duncan and he is part of the StatML CDT.
Tim Wang
PhD student
Tim joined the group in 2024. He works on generative models and interacting particle algorithms.
Matthew Young
PhD student
Matthew joined the group in 2025. He works on sampling and generative models. [website]
Jackie Zhang
PhD student
Jackie joined the group in 2025. He works on generative models and interacting particle systems.
Robin Mury
Visiting MSc student
Robin is an MSc student at ETH Zurich, visiting our group for the first half of 2026.
News from the Group
- January 2026: Congratulations to Tim Wang, Rafael Athanasiades, and Adam Rozzio on having their papers accepted at AISTATS 2026! The group will present two papers at AISTATS. Tim's paper, Training Latent Diffusion Models with Interacting Particle Algorithms, introduces an interacting particle method for training latent diffusion models. Adam (a summer intern in the group in 2024) and Rafael (who completed a miniproject in the group in 2025) co-authored Momentum SVGD-EM for Accelerated Maximum Marginal Likelihood Estimation, which introduces a new, accelerated method for maximum marginal likelihood estimation in latent variable models.
- December 2025: The group showcased three works at NeurIPS 2025 (two main-track papers and one workshop paper):
- Paula presented Sampling by averaging: A multiscale approach to score estimation, tackling sampling from complex distributions via multiscale fast-slow systems.
- James presented Learning Latent Variable Models via Jarzynski-adjusted Langevin Algorithm, introducing a new Monte Carlo method for latent variable models.
- Deniz presented A Gradient Flow approach to Solving Inverse Problems with Latent Diffusion Models (primary author Tim Wang) at the Frontiers in Probabilistic Inference: Sampling Meets Learning workshop, proposing a framework for solving inverse problems with latent diffusion models.
- December 2025: Joanna presented the poster Scalable Learning of Energy-Based Priors via Interacting Particle Systems at the Amortized ProbML workshop. See more details.
- September 2025: Matthew's paper On diffusion posterior sampling via sequential Monte Carlo for zero-shot scaffolding of protein motifs was published in Transactions on Machine Learning Research (TMLR).
- July 2025: Paula received the Best Student Paper award for Proximal Interacting Particle Langevin Algorithms at the Conference on Uncertainty in Artificial Intelligence (UAI). Here is the announcement on X.
Past (Imperial)
StatML first year (mini) project supervision
Peter Hyland
Rafael Athanasiades
Summer/Visiting students and Interns
Tom Rossa — (from ENSAE, 07/2024–09/2024)
Xinyue Lou — (from King's College London, 07/2024–09/2024)
Adam Rozzio — (from ENS Paris-Saclay, 04/2024–07/2024)
O. Fabian Gonzalez H. — (from Universidad Carlos III de Madrid, 10/2023–04/2024)
Tim Wang — Generative modelling with adaptive Langevin dynamics, Summer 2023 (Imperial - BSc).
Masters
Matthew Young — Generative Protein Design (MRes)
Tim Wang — Flow Matching for Inverse Problems (MSc at Oxford, co-supervisor: George Deligiannidis)
Ben Dowling — Diffusion Models for Optimisation (MSc)
Tony Dam — Gaussian Matrix Factorisation via Gibbs Sampling (MSci, 2024)
Joe Marks — Generative Modelling for Data Augmentation in Factor Investing (MSc, 2023)
Rina Maletta — Matrix Variate Gaussian Matrix Factorisation (MSc, 2023)
Sally Marshall — A Temporal Proximal Point Method for Matrix Factorisation of Time Series (MSci thesis, 2023)
Peter Christofides Paton — Investigating Generalisation in Score-Based Generative Models (MEng thesis, 2023)
Undergraduate
Carlos Cardoso Correia Perello — Convergence of Adaptive Importance Samplers for Unbounded Parametric Families (BSc, 2023)
Peiyi Zhou — Sampling from log-concave distributions through Markov Chain Monte Carlo methods (BSc, 2024)
Warwick/Cambridge
During my postdoctoral years at the University of Warwick (2019–2021) and The Alan Turing Institute (2021–2022, visiting the University of Cambridge), I had the pleasure of closely supervising several PhD students. Below, I list students whose theses consist predominantly of work done in close collaboration with me.
Benjamin Boys, University of Cambridge, graduated 2025 (main supervisor: Mark Girolami)
Alex Glyn-Davies, University of Cambridge, graduated 2025 (main supervisor: Mark Girolami)
Ayman Boustati, University of Warwick, graduated 2021 (main supervisor: Theodoros Damoulas)