
Deep Feedforward Networks [Slides]
Navid Shervani-Tabar
We review deep feedforward networks. The general setup and key design decisions are discussed: choosing the optimizer, the cost function, and the form of the output units. We review the basics of gradient-base...

August 24, 2017

We review and discuss the structure and implementation of basic neural networks using PyTorch. Polynomial fitting, classification, and mixture density networks are discussed, along with coding details for replicating results from the literature.
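As a flavor of the kind of exercise covered, here is a minimal sketch of polynomial fitting in PyTorch: a linear layer learns the coefficients of a cubic from a design matrix of powers of x. The target function (sin), degree, and optimizer settings are illustrative choices, not details from the talk.

```python
import torch

# Minimal sketch: fit a cubic polynomial to y = sin(x) by gradient descent.
torch.manual_seed(0)
x = torch.linspace(-3.14159, 3.14159, 200)
y = torch.sin(x)

# Design matrix [x, x^2, x^3]; the linear layer's weights are the polynomial coefficients.
X = torch.stack([x, x**2, x**3], dim=1)
model = torch.nn.Linear(3, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=0.05)
loss_fn = torch.nn.MSELoss()

for step in range(2000):
    optimizer.zero_grad()
    pred = model(X).squeeze(1)
    loss = loss_fn(pred, y)
    loss.backward()
    optimizer.step()

print(f"final MSE: {loss.item():.4f}")
```

Adam is used here rather than plain SGD because the raw powers of x are poorly scaled, and its per-parameter step sizes sidestep the conditioning issue without normalizing the features.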



August 18, 2017

We review the motivation for and implementation of sparse Gaussian processes. Special attention is given to the variational method of Titsias (2009), which addresses many of the shortcomings of the previous state of the art and serves as a foundation for many current...

Introduction to Expectation Propagation [Slides]

Souvik Chakraborty

Expectation propagation (EP) is an approximate Bayesian inference algorithm that constructs tractable approximations to complex probability distributions. EP is an extension of assumed density filte...




Our new website launched today.

March 14, 2017



Copyright © 2020 University of Notre Dame

372 Fitzpatrick Hall, Notre Dame, IN 46556, USA

Phone 574-631-2429
