Yorgos Deligiannidis
I am Quantitative Research Director at QRT.
Before joining QRT I was an academic, most recently a professor of statistics at Oxford. My academic research focuses on the analysis of random processes and objects, especially those arising from algorithms used in computational statistics and machine learning. I have worked extensively on the theory and methodology of sampling methods, especially Markov chain Monte Carlo. I have also worked on random walks on lattices and groups.
Recent talks
- Tutorial on Diffusions, given to the Fundamentals of AI CDT (Oxford), October 2025 (slides available).
- COLT 2025, Linear Convergence of Diffusion Models Under the Manifold Hypothesis, July 2025.
- CRISM 2.0, Warwick, Theory for denoising diffusion models (Keynote), May 2025.
- Cambridge, Statslab, Uniform quantitative stability for Sinkhorn, April 2024.
- Warwick Stats Seminar, Uniform quantitative stability for Sinkhorn, June 2023.
- Athens Probability Colloquium, March 2023.
News
| Jan 29, 2026 | Two papers accepted at AISTATS 2026. |
| Sep 19, 2025 | Three papers accepted at NeurIPS 2025. |
| Jul 02, 2025 | New preprint out with Alex Goyal and Nikolas Kantas (Imperial) on the convergence of Gibbs samplers beyond the log-concave setting. In particular, we establish bounds on the conductance of the systematic-scan and random-scan Gibbs samplers when the target distribution satisfies a Poincaré or log-Sobolev inequality and possesses sufficiently regular conditional distributions. You can find it here. |
| Jul 01, 2025 | At COLT 2025 to present our work with Peter Potapchik and Iskander Azangulov on the Linear Convergence of Diffusion Models Under the Manifold Hypothesis. |
| Oct 17, 2024 | Our preprint Linear Convergence of Diffusion Models Under the Manifold Hypothesis is out. In the context of ImageNet, our work suggests that ~100 steps (scaling with the intrinsic dimension) suffice for effective sampling, whereas existing state-of-the-art bounds recommend ~150k steps (scaling with the ambient dimension). |