CVPR 2012 Tutorial: All you want to know about Gaussian Processes

Location and Dates

Conference on Computer Vision and Pattern Recognition (CVPR) 2012 - Providence, Rhode Island, USA
Saturday June 16, 2012


Course description

We expect this tutorial to provide the theoretical background for a good understanding of Gaussian processes, as well as to illustrate applications where Gaussian processes have been shown to work well, in some cases outperforming the state of the art. We expect a broad audience, from experts in the field to undergraduate students interested in broadening their understanding of machine learning tools and Bayesian statistics.

The tutorial will cover the following topics. We will begin with an introduction to Gaussian processes, starting from parametric models and generalized linear models, and demonstrate how increasing the number of basis functions leads to non-parametric regression models. We will then give an overview of how Gaussian process prediction is formed through conditioning in a joint Gaussian density, show how covariance parameters can be learned, and explain the role of the log determinant in the likelihood. Gaussian processes exhibit a natural trade-off between data fit and regularization, and we will explain where this comes from. We will then move beyond the Gaussian noise model and show how to handle non-Gaussian likelihood models, including likelihoods for classification. Finally, we will show how to make Gaussian process models computationally efficient. The usefulness of these processes will be demonstrated in a wide variety of vision-related applications, including pose estimation and object recognition.

The second part of the tutorial will focus on how to use Gaussian processes to perform manifold modelling. In particular, we will review the Gaussian process latent variable model (GPLVM) as well as its variants that incorporate prior knowledge in the form of dynamics, labels, topologies and physical constraints. We will finish by discussing mechanistic models with incorporated latent forces, and show a wide range of applications of these latent variable models, including character animation, articulated tracking, and deformable surface estimation. In summary, the tutorial is composed of the following subjects:
  • Parametric models and Bayesian treatments.
  • Introduction to Gaussian processes through non-parametric models.
  • Non-parametric modelling of functions using Gaussian processes.
  • Extending beyond Gaussian noise models: classification and robust regression.
  • Applications of Gaussian processes: object recognition, pose estimation.
  • Manifold modelling with Gaussian processes: the GPLVM.
  • Incorporating prior knowledge: dynamics, labels, topology, physical constraints.
  • Mechanistic models using Gaussian processes.
  • Applications of the GPLVM: recognition, animation, tracking of humans and deformable surfaces.
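To make the regression part of the syllabus concrete, the core of Gaussian process prediction, conditioning a joint Gaussian density on the observed outputs, and the log marginal likelihood with its log-determinant complexity penalty can both be written in a few lines. The sketch below is illustrative only (the kernel choice, function names and hyperparameter values are ours, not taken from the tutorial materials):

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

def gp_predict(X_train, y_train, X_test, noise=0.1, lengthscale=1.0, variance=1.0):
    """GP regression: condition the joint Gaussian on the training data.

    Returns the predictive mean, predictive variances, and the log marginal
    likelihood, whose log-determinant term penalizes model complexity.
    """
    K = rbf_kernel(X_train, X_train, lengthscale, variance) \
        + noise ** 2 * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test, lengthscale, variance)
    K_ss = rbf_kernel(X_test, X_test, lengthscale, variance)

    L = np.linalg.cholesky(K)                       # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))

    mean = K_s.T @ alpha                            # conditional (posterior) mean
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v                            # conditional covariance

    # log p(y|X) = -1/2 y^T K^{-1} y - 1/2 log|K| - n/2 log(2 pi);
    # the data-fit and log-determinant terms embody the trade-off
    # between fitting the data and regularization.
    log_marginal = (-0.5 * y_train @ alpha
                    - np.sum(np.log(np.diag(L)))    # 1/2 log|K| via Cholesky
                    - 0.5 * len(X_train) * np.log(2 * np.pi))
    return mean, np.diag(cov), log_marginal
```

Covariance parameters (here `lengthscale`, `variance`, `noise`) are typically learned by maximizing `log_marginal` with a gradient-based optimizer; with a small noise level the posterior mean interpolates the training observations closely.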

Provisional Schedule

Session 1: (8:30 - 10:00)
Session 2: (10:30 - 12:30)
  • 10:30 - 11:30 (Raquel) - PCA as a Latent Variable Model [Slides]
  • 11:30 - 12:30 (Neil) - Multitask and multi-output learning with Gaussian processes [Slides]
Session 3: (13:30 - 15:00)
  • 13:30 - 15:00 (Raquel) - Dimensionality reduction and GPs in Latent Variable Models [Slides]
Session 4: (15:30 - 17:00)
  • 15:30 - 16:00 (Raquel) - Continuation of Dimensionality reduction and GPs in Latent Variable Models [Slides]
  • 16:30 - 17:00 (Neil) - Remaining challenges: learning the dimensionality of the latent space, non-Gaussian likelihoods, approximations

Course Materials

We plan to distribute the slides that compose the tutorial, as well as code illustrating all aspects of it, including some of the applications and demos.

Organizer Biographies

Raquel Urtasun is an Assistant Professor at TTI-Chicago, a philanthropically endowed academic institute located on the campus of the University of Chicago. She was a visiting professor at ETH Zurich during the spring semester of 2010. Previously, she was a postdoctoral research scientist at UC Berkeley and ICSI, and before that a postdoctoral associate at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT. She completed her PhD at the Computer Vision Laboratory at EPFL, Switzerland, in 2006, working with Pascal Fua and with David Fleet of the University of Toronto. She has been an area chair of NIPS 2010, 2011 and 2012 and of UAI 2012, as well as NIPS Workshop co-Chair in 2012. She has also served on the committees of numerous international computer vision and machine learning conferences (e.g., CVPR, ICCV, ECCV, ICML, NIPS). Her major interests are statistical learning and computer vision, with a particular interest in non-parametric Bayesian statistics, latent variable models, structured prediction and their application to scene understanding.

Neil Lawrence received his bachelor's degree in Mechanical Engineering from the University of Southampton in 1994. Following a period as a field engineer on oil rigs in the North Sea, he returned to academia to complete his PhD in 2000 at the Computer Laboratory at the University of Cambridge. He spent a year at Microsoft Research in Cambridge before leaving to take up a Lectureship at the University of Sheffield, where he was subsequently appointed Senior Lecturer in 2005. In January 2007 he took up a post as Senior Research Fellow at the School of Computer Science at the University of Manchester, where he worked in the Machine Learning and Optimisation research group. In August 2010 he returned to Sheffield to take up a collaborative Chair in Neuroscience and Computer Science. Neil's main research interest is machine learning through probabilistic models. He focuses on both the algorithmic side of these models and their application, with a particular focus on computational biology, though he happily dabbles in other areas such as speech, vision and graphics. His main publication area from a methodological perspective is Gaussian processes, and he is known for two particular formalisms based on Gaussian process models: latent force models and the Gaussian process latent variable model. Neil is an Associate Editor in Chief for IEEE Transactions on Pattern Analysis and Machine Intelligence and an Action Editor for the Journal of Machine Learning Research. He was the founding editor of the JMLR Workshop and Conference Proceedings and is currently its series editor. He is Program Chair for AISTATS 2012, has served on the program committees of several international conferences, and was an area chair for the NIPS conference in 2005 and 2006. He was general chair of AISTATS in 2010 (bringing the conference to Europe for the first time) and NIPS Workshop Chair, also in 2010.
