Copyright © 2017 University of Notre Dame

CSE Laboratory 372 Fitzpatrick Hall, Notre Dame, IN 46556, USA

Phone 574-631-2429   nzabaras@nd.edu


Recent Posts

Our new website launched today.

March 14, 2017


Featured Posts

Deep Feedforward Networks, Bayesian Gaussian Process Latent Variable Model

Deep Feedforward Networks [Slides]
Navid Shervani-Tabar
 
We review deep feedforward networks. We discuss the general setup and the main design decisions: choosing the optimizer, the cost function, and the form of the output units. We review the basics of gradient-based learning, then confront further design decisions, e.g., choosing the activation functions used to compute the hidden-layer values and the architecture of the network. We present the back-propagation algorithm and its generalizations. This is the first in a series of lectures on deep learning.
 
References:
Goodfellow, Ian, Yoshua Bengio, and Aaron Courville. Deep Learning. Chapter 6. MIT Press, 2016.
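The design decisions listed in the abstract (activation function, cost function, output unit, optimizer) can be illustrated with a minimal sketch. This is my own illustration, not code from the lecture: a one-hidden-layer network with a tanh hidden layer, a linear output unit, a mean-squared-error cost, and plain gradient descent with back-propagation, fit to a toy regression problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) plus noise.
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X) + 0.1 * rng.normal(size=X.shape)

# Design decisions: tanh hidden activation, linear output unit,
# mean-squared-error cost, plain gradient-descent optimizer.
H = 16                                   # hidden-layer width (arbitrary choice)
W1 = rng.normal(0.0, 0.5, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (H, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)             # hidden-layer values
    pred = h @ W2 + b2                   # linear output unit
    err = pred - y
    loss = float(np.mean(err ** 2))      # MSE cost
    if step == 0:
        loss0 = loss                     # record the initial cost
    # Backward pass: back-propagate the MSE gradient.
    g_pred = 2.0 * err / len(X)
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)   # tanh'(a) = 1 - tanh(a)^2
    gW1 = X.T @ g_h; gb1 = g_h.sum(0)
    # Gradient-descent parameter update.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
```

The backward pass applies the chain rule layer by layer, exactly the structure that the back-propagation algorithm organizes for general network architectures.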
 
 
The Bayesian Gaussian Process Latent Variable Model [Slides]

Steven Atkinson

 

This talk reviews the Bayesian Gaussian process latent variable model, a nonlinear, probabilistic, generative model for unsupervised learning that can be viewed as a nonlinear extension of probabilistic PCA.

 

References:

Tipping, Michael E., and Bishop, Christopher M. "Probabilistic principal component analysis." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 61.3 (1999): 611-622.

Titsias, Michalis K., and Lawrence, Neil D. "Bayesian Gaussian Process Latent Variable Model." International Conference on Artificial Intelligence and Statistics (2010): 844-851.
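The linear starting point of the talk, probabilistic PCA, has the closed-form maximum-likelihood solution of Tipping and Bishop, which the Bayesian GP-LVM then generalizes nonlinearly. A minimal sketch of that closed form (my own illustration, not the talk's code; the data dimensions are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample data from the PPCA generative model: x = W z + noise, z ~ N(0, I_q).
D, q, N = 5, 2, 2000                      # observed dim, latent dim, samples
W_true = rng.normal(size=(D, q))
Z = rng.normal(size=(N, q))
X = Z @ W_true.T + 0.1 * rng.normal(size=(N, D))

# Maximum-likelihood PPCA (Tipping & Bishop): sigma^2 is the mean of the
# discarded eigenvalues of the sample covariance, and
# W_ML = U_q (Lambda_q - sigma^2 I)^{1/2}, up to rotation.
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / N                         # sample covariance
evals, evecs = np.linalg.eigh(S)          # eigh returns ascending order
evals, evecs = evals[::-1], evecs[:, ::-1]
sigma2 = evals[q:].mean()                 # estimated noise variance
W_ml = evecs[:, :q] * np.sqrt(evals[:q] - sigma2)

# The fitted model covariance reproduces the sample covariance's
# top-q structure, with the residual directions explained as noise.
C = W_ml @ W_ml.T + sigma2 * np.eye(D)
```

The GP-LVM replaces this linear map from latents to observations with a Gaussian process, and the Bayesian variant of Titsias and Lawrence additionally integrates out the latent variables variationally.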
