
Deep Feedforward Networks, Bayesian Gaussian Process Latent Variable Model

Deep Feedforward Networks [Slides]

Navid Shervani-Tabar

This talk reviews deep feedforward networks. We discuss the general setup and the main design decisions: choosing the optimizer, the cost function, and the form of the output units. We cover the basics of gradient-based learning, then turn to further design choices, such as the activation functions used to compute the hidden-layer values and the architecture of the network. We present the back-propagation algorithm and its generalizations; a minimal sketch follows the references below. This is the first in a series of lectures on deep learning.

References:

Goodfellow, Ian, Bengio, Yoshua, and Courville, Aaron. Deep Learning. Chapter 6. MIT Press, 2016.
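As a companion to these topics, here is a minimal sketch, not taken from the slides: a one-hidden-layer feedforward network on a toy XOR problem, showing the design decisions named above (a tanh hidden activation, a sigmoid output unit, a cross-entropy cost) together with hand-derived back-propagation and a plain gradient-descent update. All data and hyperparameters are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy data (XOR), chosen only to make the sketch runnable.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

rng = np.random.default_rng(0)
n_hidden = 8
W1 = rng.normal(scale=0.5, size=(2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)
lr = 0.5  # learning rate (illustrative choice)

for step in range(5000):
    # Forward pass: tanh hidden layer, sigmoid output unit
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

    # Back-propagation of the cross-entropy cost. For a sigmoid output
    # unit with cross-entropy, dL/d(output pre-activation) simplifies
    # to (p - y), averaged over the batch here.
    delta2 = (p - y) / len(X)
    dW2, db2 = h.T @ delta2, delta2.sum(axis=0)
    delta1 = (delta2 @ W2.T) * (1.0 - h**2)   # tanh'(z) = 1 - tanh(z)^2
    dW1, db1 = X.T @ delta1, delta1.sum(axis=0)

    # Gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(p.round(3))  # approaches [0, 1, 1, 0]
```

The sigmoid-plus-cross-entropy pairing used here is the standard choice discussed in Goodfellow et al., Chapter 6: the cost undoes the exponential saturation of the sigmoid, so the output-layer gradient stays well-scaled.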

The Bayesian Gaussian Process Latent Variable Model [Slides]

Steven Atkinson

This talk reviews the Bayesian Gaussian process latent variable model (GP-LVM), a nonlinear, probabilistic, generative model for unsupervised learning that can be conceptualized as a nonlinear extension of probabilistic PCA (a minimal sketch of the latter follows the references below).

References:

Tipping, Michael E., and Bishop, Christopher M. "Probabilistic principal component analysis." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 61.3 (1999): 611-622.

Titsias, Michalis K., and Lawrence, Neil D. "Bayesian Gaussian Process Latent Variable Model." International Conference on Artificial Intelligence and Statistics (2010): 844-851.
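Since the Bayesian GP-LVM is introduced here as a nonlinear extension of probabilistic PCA, the following minimal sketch (NumPy only, on synthetic data; all dimensions and values are illustrative assumptions) shows the closed-form maximum-likelihood solution of probabilistic PCA from Tipping and Bishop (1999). The GP-LVM replaces the linear map W below with a Gaussian process, and the Bayesian variant of Titsias and Lawrence (2010) additionally integrates out the latent variables variationally.

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, q = 500, 10, 2                      # samples, observed dim, latent dim

# Synthetic data: a linear map of 2-D latents plus isotropic noise.
W_true = rng.normal(size=(D, q))
Z = rng.normal(size=(N, q))
Y = Z @ W_true.T + 0.1 * rng.normal(size=(N, D))

# ML estimates come from the eigendecomposition of the sample covariance.
Yc = Y - Y.mean(axis=0)
S = Yc.T @ Yc / N
eigvals, eigvecs = np.linalg.eigh(S)      # returned in ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

# Noise variance: average of the D - q discarded eigenvalues.
sigma2 = eigvals[q:].mean()
# Principal subspace: W_ML = U_q (Lambda_q - sigma2 I)^{1/2}.
W = eigvecs[:, :q] * np.sqrt(eigvals[:q] - sigma2)

# Posterior mean of the latents: E[z|y] = M^{-1} W^T (y - mean),
# with M = W^T W + sigma2 I.
M = W.T @ W + sigma2 * np.eye(q)
Z_post = Yc @ W @ np.linalg.inv(M)

print("estimated noise variance:", sigma2)
```

Recovering sigma2 as the average of the discarded eigenvalues is what separates probabilistic PCA from plain PCA: it turns the projection into a proper density model, which is the property the GP-LVM generalizes nonlinearly.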
