Prior to joining the University of Oxford, I was a Lecturer in the Computer Science department at Yale University and a Postdoctoral Associate at the Yale Institute for Network Science, hosted by Sekhar Tatikonda.
I have a Ph.D. in Operations Research and Financial Engineering from Princeton University, where I worked in Applied Probability under the supervision of Ramon van Handel.

Here is my Curriculum Vitae.

My research interests lie at the intersection of **Applied Probability**, **Statistics**, and **Computer Science**.
I investigate fundamental principles for scalable inference, learning, and optimization in high-dimensional models, and the design and analysis of algorithms for distributed Machine Learning, with applications to graphical models and Monte Carlo methods.

I am an alumnus of the Yale Institute for Network Science, and an alumnus of the Princeton Statistical Laboratory.

- **Accelerated Consensus via Min-Sum Splitting** (with S. Tatikonda), NIPS (2017). [proceedings] [arXiv] [poster]
- **A new approach to Laplacian solvers and flow problems** (with S. Tatikonda). [pdf] [arXiv]
- **Scale-free network optimization: foundations and algorithms** (with S. Tatikonda). [pdf] [arXiv]
- **Locality in network optimization** (with S. Tatikonda). [pdf] [arXiv]
- **Decay of correlation in network flow problems** (with S. Tatikonda), 50th Annual Conference on Information Sciences and Systems (CISS) (2016). [proceedings] [pdf]
- **Fast mixing for discrete point processes** (with A. Karbasi), 28th Annual Conference on Learning Theory (COLT) (2015). [proceedings] [arXiv] [poster]
- **Can local particle filters beat the curse of dimensionality?** (with R. van Handel), *Ann. Appl. Probab.* **25**, No. 5, 2809–2866 (2015). [journal] [arXiv]
- **Phase transitions in nonlinear filtering** (with R. van Handel), *Electron. J. Probab.* **20**, No. 7, 1–46 (2015). [journal] [arXiv]
- **Comparison theorems for Gibbs measures** (with R. van Handel), *J. Stat. Phys.* **157**, 234–281 (2014). [journal] [arXiv]
- **Nonlinear filtering in high dimension**, Ph.D. thesis, Princeton University (2014). [pdf]

- **Accelerated Consensus via Min-Sum Splitting**, Statistics Seminar, University of Cambridge, November 2017. Invited talk.
- **Accelerating message-passing using global information**, OxWaSP Workshop, University of Warwick, October 2017.
- **Accelerating message-passing using global information**, StatMathAppli 2017, Statistics Mathematics and Applications, Fréjus, September 2017.
- **Accelerated Min-Sum for consensus**, Large-Scale and Distributed Optimization, LCCC Workshop, Lund University, June 2017. Invited talk.
- **Message-passing in convex optimization**, WINRS conference, Brown University, March 2017. Invited talk.
- **Min-Sum and network flows**, Workshop on Optimization and Inference for Physical Flows on Networks, Banff International Research Station, March 2017. Invited talk.
- **Locality and message-passing in network optimization**, DISMA, Politecnico di Torino, January 2017. Invited talk.
- **Locality and message-passing in network optimization**, LIDS Seminar Series, MIT, November 2016. Invited talk.
- **Locality and message-passing in network optimization**, Probability Seminar, Division of Applied Mathematics, Brown University, November 2016. Invited talk.
- **Message-passing in network optimization**, YINS Seminar Series, Yale University, November 2016.
- **Tractable Bayesian computation in high-dimensional graphical models**, Mathematical Sciences Department, IBM Thomas J. Watson Research Center, June 2016. Invited talk.
- **From sampling to learning submodular functions**, 2016 New England Statistics Symposium (NESS), Yale University, April 2016. Invited talk.
- **Scale-free sequential Monte Carlo**, Seminar on particle methods in Statistics, Statistics Department, Harvard University, April 2016. Invited talk.
- **Decay of correlation in network flow problems**, 50th Annual Conference on Information Sciences and Systems (CISS 2016), Princeton University, March 2016.
- **Locality in network optimization**, INFORMS, Philadelphia, November 2015.
- **Local algorithms in high-dimensional models**, Statistics Department, University of Oxford, September 2015. Invited talk.
- **Killed random walks and graph Laplacians: local sensitivity in network flow problems**, Yale Probabilistic Networks Group seminar, Statistics Department, Yale University, September 2015.
- **Decay of correlation in graphical models: algorithmic perspectives**, School of Computer and Communication Sciences, École Polytechnique Fédérale de Lausanne, August 2015.
- **Fast mixing for discrete point processes**, 28th Annual Conference on Learning Theory (COLT 2015), Université Pierre et Marie Curie, July 2015. [poster] [video]
- **Filtering compressed signal dynamics in high dimension**, 45th Annual John H. Barrett Memorial Lectures, University of Tennessee, May 2015. Invited talk.
- **On the role of the Hessian of submodular functions**, Yale Probabilistic Networks Group seminar, Statistics Department, Yale University, April 2015.
- **Submodular functions, from optimization to probability**, Probability Theory and Combinatorial Optimization, The Fuqua School of Business, Duke University, March 2015.
- **Estimating conditional distributions in high dimension**, Applied Mathematics seminar, Yale University, October 2014.
- **Nonlinear filtering in high dimension**, Yale Probabilistic Networks Group seminar, Statistics Department, Yale University, September 2014.
- **Particle filters and curse of dimensionality**, Monte Carlo Inference for Complex Statistical Models workshop, Isaac Newton Institute for Mathematical Sciences, University of Cambridge, April 2014. Invited talk. [slides] [video]
- **Particle filters and curse of dimensionality**, Cambridge Machine Learning Group, University of Cambridge, February 2014.
- **New phenomena in nonlinear filtering**, Yale Probabilistic Networks Group seminar, Statistics Department, Yale University, February 2014.
- **Filtering in high dimension**, Cornell Probability Summer School, Cornell University, July 2013.

In Spring 2018 I will be teaching Advanced Simulation Methods (Part C/MSc). Here is the course website.

In Fall 2017 I am running a reading group on optimization for machine learning. The notes that are being produced can be found here.

While at Yale, in Fall 2016 I served as the Head Instructor for CS50 — Introduction to Computing and Programming — taught jointly with Harvard University. The course was covered by the Yale Daily News. Here is the intro class in Machine Learning and Python, or its VR version.

From 2015 to 2017 I supervised a group of senior students on research projects in Machine Learning, investigating the development of algorithms for natural language processing, sparse regression, and distributed optimization.

I was a member of the Yale Postdoctoral Association, with the goal of facilitating and promoting teaching experiences for postdocs in the sciences. On April 2, 2017, we hosted the third edition of the Julia Robinson Mathematics Festival at Yale, a celebration of ideas and problems in mathematics that enables junior high and high school students to explore fun math in a non-competitive setting.

While at Princeton, I repeatedly served as a teaching assistant for ORF 309 (Probability and Stochastic Systems), taught by Prof. Erhan Çınlar. ORF 309 is considered one of the most challenging classes offered at Princeton University, and it is taken by approximately 150 students, 80% of whom are undergraduates. In Fall 2012 I was appointed head teaching assistant for the class, and I received the 2013 Excellence in Teaching Award from the Princeton Engineering Council.

I was also a fellow of the McGraw Center for Teaching and Learning at Princeton University.