Introduction to Expectation Propagation [Slides]
Expectation propagation (EP) is an approximate Bayesian inference algorithm that constructs tractable approximations to complex probability distributions. EP extends assumed density filtering (ADF), which in turn generalizes the Kalman filter to non-Gaussian models. This seminar reviews EP and the variants that have been developed over the last decade or so.
Minka, T. P. (2001). Expectation propagation for approximate Bayesian inference. Uncertainty in Artificial Intelligence, 17, 362–369.
Minka, T. P. (2001). A Family of Algorithms for Approximate Bayesian Inference. PhD Thesis, Massachusetts Institute of Technology.
Minka, T. P. (2005). Power EP. Technical Report MSR-TR-2004-149, Microsoft Research, Cambridge.
Qi, Y. and Guo, Y. (2012). Message passing with relaxed moment matching. arXiv preprint. arXiv:1204.4166
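The core operation EP inherits from ADF is moment matching: replace an intractable "tilted" distribution (prior times one non-Gaussian factor) with the Gaussian that has the same mean and variance. As a minimal illustration (not Minka's algorithm, just the projection step it repeats), here is a one-dimensional sketch that matches moments by brute-force numerical integration; the sigmoid likelihood factor is a made-up example:

```python
import numpy as np

def moment_match(prior_mean, prior_var, log_factor, half_width=10.0, n=2001):
    """One ADF/EP-style moment-matching (projection) step.

    Approximates the tilted distribution p(x) ∝ N(x; m, v) * f(x) by the
    Gaussian with the same first and second moments, computed here by
    quadrature on a uniform grid (the grid spacing cancels in the ratios).
    """
    s = np.sqrt(prior_var)
    x = np.linspace(prior_mean - half_width * s, prior_mean + half_width * s, n)
    log_prior = -0.5 * (x - prior_mean) ** 2 / prior_var
    w = np.exp(log_prior + log_factor(x))        # unnormalized tilted density
    Z = w.sum()
    mean = (x * w).sum() / Z                     # matched mean
    var = ((x - mean) ** 2 * w).sum() / Z        # matched variance
    return mean, var

# Example: N(0, 1) prior times a sigmoid factor f(x) = 1 / (1 + exp(-5x)),
# written in log space for numerical stability.
mean, var = moment_match(0.0, 1.0, lambda x: -np.log1p(np.exp(-5.0 * x)))
```

The sigmoid factor suppresses the left tail, so the matched Gaussian shifts right and tightens relative to the prior. A full EP pass would repeat this projection for each factor, dividing out the old site approximation before moment matching.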
Uncertainty Quantification of Model-Form Error with RANS Simulations
While Reynolds-Averaged Navier-Stokes (RANS) simulations are the workhorse of the practical CFD community, the need to model unresolved turbulent scales can introduce significant error into the simulation. We review a Bayesian approach to quantifying this model-form error in order to further improve the accuracy of RANS simulations.
Xiao, H., et al. (2016). Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach. Journal of Computational Physics, 324, 115–136.
Variational Auto-Encoders [Slides]
Generative models allow us to draw new samples from an underlying (high-dimensional) data distribution instead of running expensive simulations. This seminar reviews the variational auto-encoder (VAE), one of the most successful generative models, which scales variational Bayes to deep neural networks using the reparameterization trick.
Kingma, D. P. and Welling, M. (2013). Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114.
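The reparameterization trick mentioned above rewrites a sample z ~ N(mu, sigma^2) as a deterministic function of the variational parameters plus external noise, z = mu + sigma * eps with eps ~ N(0, I), so gradients can flow through the sampling step. A minimal numpy sketch (the particular values of mu and log-variance are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var, rng):
    """Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).

    The randomness (eps) is sampled outside the model, so z is a
    deterministic, differentiable function of (mu, log_var).
    """
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

# Monte Carlo check: the samples should recover the target mean/variance.
mu = np.array([1.0, -2.0])
log_var = np.array([0.0, np.log(0.25)])
z = np.array([reparameterize(mu, log_var, rng) for _ in range(200_000)])
```

In an actual VAE, mu and log_var are outputs of the encoder network, and the same trick yields a low-variance pathwise gradient estimator of the evidence lower bound.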