George Deligiannidis

Associate Professor of Statistics
Department of Statistics, University of Oxford

  • Hugh Price Fellow in Statistics, Jesus College

Contact info

Email: deligian 'at' stats.ox.ac.uk
Telephone: 01865 282855

Brief Bio

I studied mathematics (MMath) at the University of Warwick, then went on to a joint MSc in Financial Mathematics at Heriot-Watt University and the University of Edinburgh. I received my PhD from the University of Nottingham, where I studied with Sergey Utev and Huiling Le. After my PhD I worked at the Department of Mathematics of the University of Leicester as a Teaching Assistant and Teaching Fellow between 2009 and 2012. I then moved to the Department of Statistics of the University of Oxford as a Departmental Lecturer, where I stayed until 2016, when I moved to King's College London as a Lecturer in Statistics. I moved back to Oxford as Associate Professor of Statistics in late 2017.

Research

Research Interests

I work at the intersection of probability and statistics, analysing random processes and objects, especially those arising from algorithms used in computational statistics and machine learning. I have worked extensively on the theory and methodology of sampling methods, especially Markov chain Monte Carlo, and I have also worked on random walks on lattices and groups.
At the moment I am particularly interested in the interplay between sampling, optimization and machine learning.
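To give a flavour of the sampling methods mentioned above, here is a minimal random-walk Metropolis sketch. Everything in it (function names, step size, the standard-normal toy target) is an illustrative assumption, not code from any of the papers below.

```python
import numpy as np

def rw_metropolis(log_target, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis targeting exp(log_target); illustrative only."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_steps)
    n_accept = 0
    for i in range(n_steps):
        proposal = x + step * rng.standard_normal()  # symmetric proposal
        # Accept with probability min(1, pi(proposal) / pi(x))
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x, n_accept = proposal, n_accept + 1
        samples[i] = x
    return samples, n_accept / n_steps

# Toy run: target is a standard normal, so log pi(x) = -x^2/2 + const
samples, acc_rate = rw_metropolis(lambda x: -0.5 * x**2, x0=0.0, n_steps=20000)
```

The symmetric proposal means the Hastings correction cancels, leaving the simple accept/reject ratio above.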

News

I recently received a New Investigator Award from EPSRC (details here) to study the statistical properties of denoising diffusion models. I will be hiring a postdoc for three years to work on this project. If interested, you can apply here.
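As a rough illustration of the object studied in this project, the snippet below simulates the forward (noising) process of a variance-preserving denoising diffusion model in closed form. The linear beta schedule and all names are illustrative assumptions, not taken from any specific paper.

```python
import numpy as np

# Variance-preserving forward process: q(x_t | x_0) is Gaussian with
# mean sqrt(alpha_bar_t) * x_0 and variance (1 - alpha_bar_t).
# The linear beta schedule below is a common illustrative choice.
T = 1000
betas = np.linspace(1e-4, 2e-2, T)
alpha_bar = np.cumprod(1.0 - betas)

def noise(x0, t, rng):
    """Sample x_t given x_0 in closed form, without simulating every step."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

rng = np.random.default_rng(0)
x0 = rng.standard_normal(10_000)  # toy "data" distribution
xT = noise(x0, T - 1, rng)        # by time T the sample is close to N(0, 1)
```

Since alpha_bar decays to nearly zero by time T, the terminal distribution is approximately standard Gaussian, which is what the reverse (denoising) process is started from.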
Publications

  1. Linear Convergence Bounds for Diffusion Models via Stochastic Localization. Joe Benton, Valentin De Bortoli, Arnaud Doucet, George Deligiannidis (2023). Accepted at ICLR 2023 (Spotlight).
    arXiv
  2. From Denoising Diffusions to Denoising Markov Models. Joe Benton, Yuyang Shi, Valentin De Bortoli, George Deligiannidis, Arnaud Doucet (2023). Accepted at JRSSB as a discussion paper.
    arXiv
  3. Error Bounds for Flow Matching Methods. Joe Benton, George Deligiannidis, Arnaud Doucet (2023). Accepted at TMLR.
    arXiv
  4. On the Expected Size of Conformal Prediction Sets. Guneet S. Dhillon, George Deligiannidis, Tom Rainforth. Accepted at AISTATS 2024.
    arXiv
  5. Generalization Bounds with Data-dependent Fractal Dimensions. Benjamin Dupuis, George Deligiannidis, Umut Şimşekli (2023). ICML 2023.
    arXiv Conference
  6. A Multi-Resolution Framework for U-Nets with Applications to Hierarchical VAEs. Fabian Falck, Christopher Williams, Dominic Danks, George Deligiannidis, Christopher Yau, Chris C Holmes, Arnaud Doucet, Matthew Willetts. NeurIPS 2022 (Oral).
    Conference
  7. A Continuous Time Framework for Discrete Denoising Models. Andrew Campbell, Joe Benton, Valentin De Bortoli, Tom Rainforth, George Deligiannidis, Arnaud Doucet. NeurIPS 2022 (Oral).
    arXiv Conference
  8. Eugenio Clerico, Amitis Shidani, George Deligiannidis, Arnaud Doucet (2022). Chained Generalisation Bounds. COLT 2022.
    arXiv Proceedings
  9. Yuyang Shi, Valentin De Bortoli, George Deligiannidis, Arnaud Doucet (2022). Conditional Simulation Using Diffusion Schrödinger Bridges. UAI 2022.
    arXiv
  10. Oscar Clivio, Fabian Falck, Brieuc Lehmann, George Deligiannidis, Chris Holmes. Neural Score Matching for High-Dimensional Causal Inference. AISTATS 2022.
    arXiv
  11. Eugenio Clerico, George Deligiannidis, Arnaud Doucet (2021). Conditional Gaussian PAC-Bayes. AISTATS 2022.
    Conference arXiv
  12. Eugenio Clerico, George Deligiannidis, Arnaud Doucet (2021). Wide stochastic networks: Gaussian limit and PAC-Bayesian training arXiv:2106.09798.
    arXiv
  13. Alexander Camuto, George Deligiannidis, Murat A. Erdogdu, Mert Gürbüzbalaban, Umut Şimşekli, Lingjiong Zhu (2021). Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms. Accepted at NeurIPS ’21 (Spotlight).
    arXiv
  14. Adrien Corenflos, James Thornton, Arnaud Doucet, George Deligiannidis (2020). Differentiable Particle Filtering via Entropy- Regularized Optimal Transport. Accepted for long presentation at ICML 2021. arXiv:2102.07850.
    arXiv
  15. Simsekli, U., Sener, O., Deligiannidis, G., & Erdogdu, M. A. (2020). Hausdorff Dimension, Stochastic Differential Equations, and Generalization in Neural Networks. NeurIPS ’20 (Spotlight Presentation).
    arXiv
  16. Hayou, S., Clerico, E., He, B., Deligiannidis, G., Doucet, A., & Rousseau, J. (2020). Stable ResNet. AISTATS 2021.
    arXiv Conference
  17. Cornish, R., Caterini, A. L., Deligiannidis, G., & Doucet, A. (2019). Localised Generative Flows. ICML ’20.
    arXiv
  18. Cornish, R., Caterini, A. L., Deligiannidis, G., & Doucet, A. (2019). Relaxing bijectivity constraints with continuously indexed normalising flows. ArXiv Preprint ArXiv:1909.13833.
  19. M. El Khribch, G. Deligiannidis, D. Paulin (2021). On Mixing Times of Metropolized Algorithm With Optimization Step (MAO): A New Framework.
    arXiv
  20. George Deligiannidis, Valentin De Bortoli, Arnaud Doucet (2021). Quantitative Uniform Stability of the Iterative Proportional Fitting Procedure. Accepted at Annals of Applied Probability.
    arXiv URL
  21. Syed, S., Bouchard-Côté, A., Deligiannidis, G., & Doucet, A. (2021). Non-Reversible Parallel Tempering: an Embarrassingly Parallel MCMC Scheme. Journal of the Royal Statistical Society, Series B.
    A cool application can be found here, where it was used to generate the first high-resolution images of the M87 black hole.

    arXiv URL
  22. Deligiannidis, G., Paulin, D., & Doucet, A. (2018). Randomized Hamiltonian Monte Carlo as Scaling Limit of the Bouncy Particle Sampler and Dimension-Free Convergence Rates. Annals of Applied Probability.
    arXiv URL
  23. Deligiannidis, G., Doucet, A., & Rubenthaler, S. (2020). Ensemble Rejection Sampling.
    arXiv
  24. Deligiannidis, G., Bouchard-Côté, A., & Doucet, A. (2019). Exponential Ergodicity of the Bouncy Particle Sampler. Annals of Statistics, 47(3), 1268–1287.
    arXiv URL
  25. Heng, J., Bishop, A. N., Deligiannidis, G., & Doucet, A. (2019). Controlled Sequential Monte Carlo. Annals of Statistics, 48(5), 2904–2929.
    arXiv
  26. Doucet, A., Deligiannidis, G., Middleton, L., & Jacob, P. (2019). Unbiased smoothing using Particle Independent Metropolis-Hastings. AISTATS 2019.
  27. Schmon, S. M., Doucet, A., & Deligiannidis, G. (2019). Bernoulli Race Particle Filters. AISTATS 2019.
    arXiv
  28. Cornish, R., Vanetti, P., Bouchard-Côté, A., Deligiannidis, G., & Doucet, A. (2019). Scalable Metropolis-Hastings for Exact Bayesian Inference with Large Datasets. ICML 2019.
    arXiv
  29. Schmon, S. M., Deligiannidis, G., Doucet, A., & Pitt, M. K. (2020). Large Sample Asymptotics of the Pseudo-Marginal Method. Biometrika.
    arXiv URL
  30. Deligiannidis, G., Doucet, A., & Pitt, M. K. (2018). The Correlated Pseudo-Marginal Method. JRSSB, 80(5), 839–870.
    URL
  31. Deligiannidis, G., & Lee, A. (2018). Which ergodic averages have finite asymptotic variance? Annals of Applied Probability, 28(4), 2309–2334.
    arXiv URL
  32. Middleton, L., Deligiannidis, G., Doucet, A., & Jacob, P. E. (2018). Unbiased Markov chain Monte Carlo for intractable target distributions. To Appear in Electronic Journal of Statistics.
    arXiv
  33. Vanetti, P., Bouchard-Côté, A., Deligiannidis, G., & Doucet, A. (2017). Piecewise-Deterministic Markov Chain Monte Carlo. arXiv:1707.05296.
  34. Doucet, A., Pitt, M. K., Deligiannidis, G., & Kohn, R. (2015). Efficient implementation of Markov chain Monte Carlo when using an unbiased likelihood estimator. Biometrika, 102(2), 295–313.
    arXiv URL
  35. Fahim Faizi, Pedro J Buigues, George Deligiannidis & Edina Rosta (2020). Simulated tempering with irreversible Gibbs sampling techniques.
    arXiv
  36. Deligiannidis, G., Maurer, S., & Tretyakov, M. V. (2020). Random walk algorithm for the Dirichlet problem for parabolic integro-differential equation. BIT, Numerical Mathematics.
    Journal arXiv
  37. Faizi, F., Deligiannidis, G., & Rosta, E. (2020). Efficient Irreversible Monte Carlo samplers. J. Chem. Theory Comput., 16:4, 2124–2138.
    arXiv Journal
  38. Deligiannidis, G., Gouëzel, S., & Kosloff, Z. (2021). Boundary of the range of a random walk and the Følner property. Electronic Journal of Probability, 26, 1–39.
    arXiv URL
  39. Deligiannidis, G., & Kosloff, Z. (2017). Relative Complexity of Random Walks in Random Scenery in the absence of a weak invariance principle for the local times. Annals of Probability, 45(4), 2505–2532. Retrieved from https://projecteuclid.org/euclid.aop/1502438433
    arXiv URL
  40. Deligiannidis, G., & Utev, S. (2016). Optimal bounds for the variance of self-intersection local times. International Journal of Stochastic Analysis, 2016. Retrieved from https://www.hindawi.com/journals/ijsa/2016/5370627/abs/
    URL
  41. Deligiannidis, G., Peligrad, M., & Utev, S. (2014). Asymptotic variance of stationary reversible and normal Markov processes. Electronic Journal of Probability, 20, 1–26. Retrieved from https://projecteuclid.org/euclid.ejp/1465067126
    URL
  42. Deligiannidis, G., & Utev, S. (2013). Variance of partial sums of stationary sequences. Annals of Probability, 41(5), 3606–3616. Retrieved from https://projecteuclid.org/euclid.aop/1378991850
    URL
  43. Deligiannidis, G., & Utev, S. A. (2011). Asymptotic variance of the self-intersections of stable random walks using Darboux-Wiener theory. Siberian Mathematical Journal, 52(4), 639–650. Retrieved from https://link.springer.com/article/10.1134/S0037446611040082
    URL
  44. Deligiannidis, G., & Utev, S. (2010). An asymptotic variance of the self-intersections of random walks.
    arXiv
  45. Deligiannidis, G. (2009). Some results associated with random walks. PhD thesis, University of Nottingham. Retrieved from http://eprints.nottingham.ac.uk/13104/1/537670.pdf
    URL
  46. Deligiannidis, G., Le, H., & Utev, S. (2009). Optimal Stopping for processes with independent increments, and applications. Journal of Applied Probability, 46(4), 1130–1145. Retrieved from https://www.cambridge.org/core/journals/journal-of-applied-probability/article/optimal-stopping-for-processes-with-independent-increments-and-applications/B7E6499862E770195C01A7710E0E243F
    URL

Selected Talks

  • Athens Probability Colloquium, March 2023
  • PRAIRIE Colloquium, December 2022
  • ESSEC Statistics Seminar, May 2022
  • AUEB (ΟΠΑ) Statistics Seminar, October 2021
  • Imperial Statistics Seminar, October 2021
  • Session on PDMPs and hypo-coercivity, MCQMC, Oxford, 2020
  • Mathematical Conversations, Institute for Advanced Study, Princeton, 2020
  • Bayescomp, Florida 2020
  • European Meeting of Statisticians, Palermo 2019
  • Probability Seminar, Warwick, February 2019.
  • SIAM Conference on High Dimensional Inference and Monte Carlo Techniques, Warwick 2019
  • Statistics Seminar, Bristol, October 2018.
  • Bayesian Computation for High-Dimensional Statistical Models, IMS, Singapore, September 2018.
  • Opening talk of Applied Mathematics Session, 3rd UK India Frontiers of Science Meeting, Royal Society, Chicheley Hall, May 2018.
  • BayesComp, Session on PDMPs, Barcelona, March 2018.
  • Machine Learning Seminar, Department of Information Technology, Uppsala, January 2018.
  • Department of Statistics, Warwick, June 2017.
  • Department of Statistics, Harvard University, May 2017.
  • Stochastic Analysis Seminar, Imperial College, December 2016.
  • Midlands Probability Seminar, University of Warwick, May 2016.
  • Mathematical Finance Seminar, Mathematical Institute, Oxford, March 2016.
  • Christmas Workshop on Sequential Monte Carlo, Imperial College, December 2015.
  • Probability Colloquium, Maxwell Institute, Edinburgh, September 2015.
  • Probability and Statistics Seminar, University of Bristol, March 2015.
  • Limit theorems in probability and dynamics, CIRM, Marseilles, July 2014.

Teaching

Advanced Simulation SC5 (2019-2020)

Lectures

  • Monday, 16:00-17:00, LG01, Department of Statistics.
  • Tuesday, 11:00-12:00, LG01, Department of Statistics.

Classes

Undergraduates:
  • Tuesday, weeks 2, 5, 7, TT1, 13:30-15:00
  • Thursday, weeks 2, 5, 7, TT1, 10:00-11:30
MSc:
  • Tuesday, weeks 3, 5, 7, 8, 16:00-17:00

Problem Sheets

Solutions to R exercises

Notes

These may be updated as we go.

Slides

These will be updated as we go; however, only minor changes will be made. Feel free to use them to prepare ahead of the lectures.

Disclaimer

This course is based on material developed by previous instructors, including Arnaud Doucet, Pierre E. Jacob, Rémi Bardenet, George Deligiannidis, Lawrence M. Murray, Tigran Nagapetyan, Patrick Rebeschini and Paul Vanetti.

Modern Statistical Theory (StatML CDT)

Notes

SB21 Foundations of Statistical Inference

I will update these from time to time, but not very regularly. For the most up-to-date notes and problem sheets, please visit the Canvas website.

Notes

MSc in Statistical Science

Offering external projects for the MSc in Statistical Science

If you are reading this page, you may be an industrial partner, or an academic or postdoc from another department of the University, interested in co-supervising a project for students on the MSc in Statistical Science offered by the Department of Statistics.

We are very happy to receive projects proposed by members of other departments or by industrial partners; such projects have been very successful in the past.

Here are a few key facts to keep in mind:

  • Projects run for three months, from June to early September. The deadline for submitting the finished project is usually the second Monday of September.
  • Topics can vary greatly, but the project must demonstrate a good command of the advanced statistical techniques acquired during the degree.
  • Projects proposed externally, i.e. by industry or other departments, need to be signed off and co-supervised by an internal supervisor.
  • The deadline for submitting external projects is Friday of week 4 of Hilary term, i.e. for 2022 the 11th of February.
  • The project will then be advertised to potential internal supervisors, who will be in contact with you to discuss and prepare the formal proposal.
  • Unfortunately not all projects will be picked up, but unallocated projects may be resubmitted for consideration the following year.
  • The templates can be found here in pdf and in docx; they show the information that is usually required.
  • For projects with industry partners, please also indicate whether a confidentiality agreement is needed. These are handled by Research Services and often take some time, so the sooner we know about this the better.

The timeline is roughly as follows:

  • Projects are approved and released by weeks 10-11 of Hilary term, roughly mid-to-late March.
  • Students submit their top ten choices in order of preference in week 0 of Trinity term.
  • Projects are allocated by weeks 2-3 of Trinity term.
  • Students start work on the project after the end of their exams, around weeks 6-7 of Trinity term.

If you are interested in submitting an external MSc project, please send me an email by the 15th of January 2022 with a single document (.pdf, .doc or .txt) including your contact details and a brief description that I can advertise to academics in the Department of Statistics. I will then put you in touch with anyone interested, and you can compile and submit the proposal in collaboration with them. As this often takes a bit of time, it is crucial that you send me the proposal by the 15th of January.