Professor Yee Whye Teh

Professor of Statistical Machine Learning

Biographical Sketch

Prior to joining Oxford, I was a Lecturer and then Reader in Computational Statistics and Machine Learning at the Gatsby Computational Neuroscience Unit, UCL, from 2007 to 2012. I obtained my PhD in Computer Science from the University of Toronto in 2003. This was followed by two years as a postdoctoral fellow at the University of California, Berkeley, and then as Lee Kuan Yew Postdoctoral Fellow at the National University of Singapore.

Research Interests

My research interests lie in the general areas of machine learning, Bayesian statistics and computational statistics. Although my group works on a variety of topics ranging from theory through methodology to applications, I am personally most interested in three overlapping themes: Bayesian nonparametrics and probabilistic learning, large-scale machine learning, and deep learning.

These themes are motivated by the phenomenal growth in the quantity, diversity and heterogeneity of data now available. The analysis of such data is crucial to opening new scientific frontiers and to future economic growth. In the longer term, the development of general methods that can deal with such data is an important testing ground for artificial general intelligence systems.

Publications

Schwarz, J., Luketina, J., Czarnecki, W., Grabska-Barwinska, A., Teh, Y., Pascanu, R. and Hadsell, R. (2018) “Progress & compress: A scalable framework for continual learning”, in 35th International Conference on Machine Learning, ICML 2018, pp. 7199–7208.
Garnelo, M., Rosenbaum, D., Maddison, C., Ramalho, T., Saxton, D., Shanahan, M., Teh, Y., Rezende, D. and Eslami, S. (2018) “Conditional neural processes”, in 35th International Conference on Machine Learning, ICML 2018, pp. 2738–2747.
Kim, H. and Teh, Y. (2018) “Scaling up the Automatic Statistician: Scalable structure discovery using Gaussian processes”, in International Conference on Artificial Intelligence and Statistics, AISTATS 2018, pp. 575–584.
Rowland, M., Bellemare, M., Dabney, W., Munos, R. and Teh, Y. (2018) “An analysis of categorical distributional reinforcement learning”, in International Conference on Artificial Intelligence and Statistics, AISTATS 2018, pp. 29–37.
Rainforth, T., Kosiorek, A., Le, T., Maddison, C., Igl, M., Wood, F. and Teh, Y. (2018) “Tighter variational bounds are not necessarily better”, in 35th International Conference on Machine Learning, ICML 2018.
Webb, S., Goliński, A., Zinkov, R., Siddharth, N., Rainforth, T., Teh, Y. and Wood, F. (2018) “Faithful inversion of generative models for effective amortized inference”, in Advances in Neural Information Processing Systems.
Maddison, C., Lawson, D., Tucker, G., Heess, N., Norouzi, M., Mnih, A., Doucet, A. and Teh, Y. (2017) “Filtering variational objectives”, in Advances in Neural Information Processing Systems.
Perrone, V., Jenkins, P., Spano, D. and Teh, Y. (2017) “Poisson random fields for dynamic feature models”, Journal of Machine Learning Research, 18.
Hasenclever, L., Webb, S., Lienart, T., Vollmer, S., Lakshminarayanan, B., Blundell, C. and Teh, Y. (2017) “Distributed Bayesian learning with stochastic natural gradient expectation propagation and the posterior server”, Journal of Machine Learning Research, 18(106), pp. 1–37.