Professor Yee Whye Teh

Professor of Statistical Machine Learning

Biographical Sketch

Prior to joining Oxford, I was a Lecturer and then Reader in Computational Statistics and Machine Learning at the Gatsby Computational Neuroscience Unit, UCL, from 2007 to 2012. I obtained my PhD in Computer Science at the University of Toronto in 2003. This was followed by two years as a postdoctoral fellow at the University of California, Berkeley, and then by a period as Lee Kuan Yew Postdoctoral Fellow at the National University of Singapore.

Research Interests

My research interests lie in the general areas of machine learning, Bayesian statistics and computational statistics. Although my group works on a variety of topics ranging from the theoretical through the methodological to applications, I am personally particularly interested in three (overlapping) themes: Bayesian nonparametrics and probabilistic learning, large-scale machine learning, and deep learning.

These themes are motivated by the phenomenal growth in the quantity, diversity and heterogeneity of data now available. The analysis of such data is crucial to opening doors to new scientific frontiers and to future economic growth. In the longer term, the development of general methods that can deal with such data is an important testing ground for artificial general intelligence systems.

Publications

Dupont, E., Doucet, A. and Teh, Y. (2019) “Augmented neural ODEs”, in Advances in Neural Information Processing Systems 32 (NeurIPS 2019). Conference on Neural Information Processing Systems.
Ge, S., Wang, S., Teh, Y., Wang, L. and Elliott, L. (2019) “Random tessellation forests”, in Advances in Neural Information Processing Systems 32 (NeurIPS 2019). Conference on Neural Information Processing Systems.
Rao, D., Visin, F., Rusu, A., Teh, Y., Pascanu, R. and Hadsell, R. (2019) “Continual unsupervised representation learning”, in Advances in Neural Information Processing Systems 32 (NeurIPS 2019). Conference on Neural Information Processing Systems.
Mathieu, E., Le Lan, C., Maddison, C., Tomioka, R. and Teh, Y. (2019) “Continuous hierarchical representations with Poincaré variational auto-encoders”, in Advances in Neural Information Processing Systems 32 (NeurIPS 2019). Conference on Neural Information Processing Systems.
Kosiorek, A., Sabour, S., Teh, Y., Hinton, G. and Stafford-Tolley, J. (2019) “Stacked capsule autoencoders”, in Advances in Neural Information Processing Systems 32 (NeurIPS 2019). Conference on Neural Information Processing Systems, p. 15512.
Foster, A., Jankowiak, M., Bingham, E., Horsfall, P., Teh, Y., Rainforth, T. and Goodman, N. (2019) “Variational Bayesian optimal experimental design”, in Advances in Neural Information Processing Systems 32 (NeurIPS 2019). Conference on Neural Information Processing Systems.
Foster, A., Jankowiak, M., O’Meara, M., Teh, Y. and Rainforth, T. (2019) “A unified stochastic gradient approach to designing Bayesian-optimal experiments”.
Le, T., Kosiorek, A., Siddharth, N., Teh, Y. and Wood, F. (2019) “Revisiting reweighted wake-sleep for models with stochastic control flow”, in Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI 2019). Association for Uncertainty in Artificial Intelligence.