Bio


Prior to joining the University of Oxford, I was a Lecturer in the Computer Science department at Yale University and a Postdoctoral Associate at the Yale Institute for Network Science, hosted by Sekhar Tatikonda. I have a Ph.D. in Operations Research and Financial Engineering from Princeton University, where I worked in Applied Probability under the supervision of Ramon van Handel.

Here is my Curriculum Vitae.

Research


Interests

My research interests lie at the intersection of Probability, Statistics, and Computer Science.
I investigate fundamental principles in high-dimensional probability, statistics, and optimization to design computationally efficient and statistically optimal algorithms for machine learning.


Activities

I am a Fellow of the Alan Turing Institute in London. On June 11th 2018 I co-organized the one-day workshop The Interplay between Statistics and Optimization in Learning. I am now co-organizing the two-day workshop Statistics and Optimization, which will take place on January 13th and 14th 2020.

I am a member of the European Laboratory for Learning and Intelligent Systems (ELLIS). I am part of the Oxford management team for the Imperial-Oxford StatML Centre for Doctoral Training.

Papers

  • Implicit regularization for optimal sparse recovery (with T. Vaškevičius and V. Kanade). NeurIPS, 2019. [arXiv]
  • Optimal statistical rates for decentralised non-parametric regression with linear speed-up (with D. Richards). NeurIPS, 2019. [arXiv]
  • Decentralized cooperative stochastic multi-armed bandits (with D. Martínez-Rubio and V. Kanade). NeurIPS, 2019. [arXiv]
  • Graph-dependent implicit regularisation for distributed stochastic subgradient descent (with D. Richards). [arXiv]
  • Locality in network optimization (with S. Tatikonda), IEEE Transactions on Control of Network Systems, vol. 6, no. 2, pp. 487–500, 2019. [journal] [arXiv]
  • A new approach to Laplacian solvers and flow problems (with S. Tatikonda), Journal of Machine Learning Research, 20(36):1–37, 2019. [journal] [arXiv]
  • Accelerated consensus via Min-Sum Splitting (with S. Tatikonda), NIPS, 2017. [proceedings] [arXiv] [poster]
  • Decay of correlation in network flow problems (with S. Tatikonda), 50th Annual Conference on Information Sciences and Systems (CISS), 2016. [proceedings] [pdf]
  • Fast mixing for discrete point processes (with A. Karbasi), 28th Annual Conference on Learning Theory (COLT), 2015. [proceedings] [arXiv] [poster]
  • Can local particle filters beat the curse of dimensionality? (with R. van Handel), Ann. Appl. Probab. 25, no. 5, 2809–2866, 2015. [journal] [arXiv]
  • Phase transitions in nonlinear filtering (with R. van Handel), Electron. J. Probab. 20, no. 7, 1–46, 2015. [journal] [arXiv]
  • Comparison theorems for Gibbs measures (with R. van Handel), J. Stat. Phys. 157, 234–281, 2014. [journal] [arXiv]
  • Nonlinear filtering in high dimension, Ph.D. thesis, Princeton University, 2014. [pdf]

Talks

  • On the Interplay between Statistics, Computation and Communication in Decentralised Learning, Decision and Control Systems, KTH, October 2019. Invited talk.
  • Implicit Regularization for Optimal Sparse Recovery, Probability and Mathematical Statistics seminar, Department of Mathematics, KTH, October 2019. Invited talk.
  • Implicit Regularization for Optimal Sparse Recovery, London Machine Learning Meetup, September 2019. Invited talk.
  • Implicit Regularization for Optimal Sparse Recovery, Theory, Algorithms and Computations of Modern Learning Systems workshop, DALI/ELLIS, September 2019. Invited talk.
  • On the Interplay between Statistics, Computation and Communication in Decentralised Learning, Optimization and Statistical Learning workshop (OSL 2019), Les Houches School of Physics. Invited talk. [slides]
  • On the Interplay between Statistics, Computation and Communication in Decentralised Learning, School of Mathematics, University of Bristol, March 2019. Invited talk.
  • On the Interplay between Statistics, Computation and Communication in Decentralised Learning, Algorithms & Computationally Intensive Inference Seminar, University of Warwick, February 2019. Invited talk.
  • Multi-Agent Learning: Implicit Regularization and Order-Optimal Gossip, Theory and Algorithms in Data Science, The Alan Turing Institute, August 2018. Invited talk.
  • Multi-Agent Learning: Implicit Regularization and Order-Optimal Gossip, Statistical Scalability Programme, Isaac Newton Institute, June 2018. Invited talk.
  • Multi-Agent Learning: Implicit Regularization and Order-Optimal Gossip, Statistics Seminar Series, Department of Decision Sciences, Bocconi University, May 2018. Invited talk.
  • Distributed and Decentralised Learning: Generalisation and Order-Optimal Gossip, Amazon Berlin, April 2018.
  • Locality and Message Passing in Network Optimization, Workshop on Optimization vs Sampling, The Alan Turing Institute, February 2018. Invited talk.
  • Accelerated Consensus via Min-Sum Splitting, Statistics Seminar, University of Cambridge, November 2017. Invited talk.
  • Accelerating message-passing using global information, OxWaSP Workshop, University of Warwick, October 2017.
  • Accelerating message-passing using global information, StatMathAppli 2017, Statistics Mathematics and Applications, Fréjus, September 2017.
  • Accelerated Min-Sum for consensus, Large-Scale and Distributed Optimization, LCCC Workshop, Lund University, June 2017. Invited talk.
  • Message-passing in convex optimization, WINRS conference, Brown University, March 2017. Invited talk.
  • Min-Sum and network flows, Workshop on Optimization and Inference for Physical Flows on Networks, Banff International Research Station, March 2017. Invited talk.
  • Locality and message-passing in network optimization, DISMA, Politecnico di Torino, January 2017. Invited talk.
  • Locality and message-passing in network optimization, LIDS Seminar Series, MIT, November 2016. Invited talk.
  • Locality and message-passing in network optimization, Probability Seminar, Division of Applied Mathematics, Brown University, November 2016. Invited talk.
  • Message-passing in network optimization, YINS Seminar Series, Yale University, November 2016.
  • Tractable Bayesian computation in high-dimensional graphical models, Mathematical Sciences Department, IBM Thomas J. Watson Research Center, June 2016. Invited talk.
  • From sampling to learning submodular functions, 2016 New England Statistics Symposium (NESS), Yale University, April 2016. Invited talk.
  • Scale-free sequential Monte Carlo, Seminar on particle methods in Statistics, Statistics Department, Harvard University, April 2016. Invited talk.
  • Decay of correlation in network flow problems, 50th Annual Conference on Information Sciences and Systems (CISS 2016), Princeton University, March 2016.
  • Locality in network optimization, INFORMS, Philadelphia, November 2015.
  • Local algorithms in high-dimensional models, Statistics Department, University of Oxford, September 2015. Invited talk.
  • Killed random walks and graph Laplacians: local sensitivity in network flow problems, Yale Probabilistic Networks Group seminar, Statistics Department, Yale University, September 2015.
  • Decay of correlation in graphical models; algorithmic perspectives, School of Computer and Communication Sciences, École Polytechnique Fédérale de Lausanne, August 2015.
  • Fast mixing for discrete point processes, 28th Annual Conference on Learning Theory (COLT 2015), Université Pierre et Marie Curie, July 2015. [poster] [video]
  • Filtering compressed signal dynamics in high dimension, 45th Annual John H. Barrett Memorial Lectures, University of Tennessee, May 2015. Invited talk.
  • On the role of the Hessian of submodular functions, Yale Probabilistic Networks Group seminar, Statistics Department, Yale University, April 2015.
  • Submodular functions, from optimization to probability, Probability Theory and Combinatorial Optimization, The Fuqua School of Business, Duke University, March 2015.
  • Estimating conditional distributions in high dimension, Applied Mathematics seminar, Yale University, October 2014.
  • Nonlinear filtering in high dimension, Yale Probabilistic Networks Group seminar, Statistics Department, Yale University, September 2014.
  • Particle filters and curse of dimensionality, Monte Carlo Inference for Complex Statistical Models workshop, Isaac Newton Institute for Mathematical Sciences, University of Cambridge, April 2014. Invited talk. [slides] [video]
  • Particle filters and curse of dimensionality, Cambridge Machine Learning Group, University of Cambridge, February 2014.
  • New phenomena in nonlinear filtering, Yale Probabilistic Networks Group seminar, Statistics Department, Yale University, February 2014.
  • Filtering in high dimension, Cornell Probability Summer School, Cornell University, July 2013.

Teaching


In Fall 2018 I designed and taught Algorithmic Foundations of Learning, for which I received the 2019 MPLS Teaching Award. I will teach this course again in Fall 2019.

In Spring 2018 I taught Advanced Simulation Methods.

During the 2017/2018 academic year I organized a reading group on optimization for Machine Learning. Notes can be found here.

While at Yale, in Fall 2016 I served as the Head Instructor for CS50 — Introduction to Computing and Programming — taught jointly with Harvard University. The course was covered by the Yale Daily News. Here is the introductory class on Machine Learning and Python, and here is its VR version.

From 2015 to 2017 I supervised a group of senior students on research projects in Machine Learning, developing algorithms for natural language processing, sparse regression, and distributed optimization.

I was a member of the Yale Postdoctoral Association, whose goal is to facilitate and promote teaching experiences for postdocs in the sciences. For three years in a row, from 2015 to 2017, I organized the Julia Robinson Mathematics Festival at Yale, a celebration of ideas and problems in mathematics that enables junior high and high school students to explore fun math in a non-competitive setting.

In 2013 I received the Excellence in Teaching Award from the Princeton Engineering Council while serving as head teaching assistant for ORF 309 (Probability and Stochastic Systems) at Princeton University.

I was also a fellow of the McGraw Center for Teaching and Learning at Princeton University.

Contact Information

patrick.rebeschini AT stats.ox.ac.uk

Department of Statistics
University of Oxford
24-29 St Giles'
Oxford, OX1 3LB
United Kingdom

LinkedIn profile