Expectation Propagation, Model Uncertainty in RANS Simulations, Variational Auto-Encoders

Introduction to Expectation Propagation [Slides]

Souvik Chakraborty

Expectation propagation (EP) is an approximate Bayesian inference algorithm that constructs tractable approximations to complex probability distributions. EP is an extension of assumed density filtering, which itself generalizes the Kalman filter to non-Gaussian models. This seminar reviews EP and the variants that have been developed over the last decade or so.
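The moment-matching step at the heart of EP can be sketched on a toy one-dimensional problem: a Gaussian prior with probit likelihood factors, each approximated by a Gaussian "site" that is refit, one at a time, against the cavity distribution. This is an illustrative sketch (the model, data, and quadrature settings are made up, not taken from the references):

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tilted_moments(m_cav, v_cav, y, lo=-10.0, hi=10.0, n=4000):
    """Mean and variance of the tilted distribution
    N(x; m_cav, v_cav) * Phi(y * x), by simple quadrature."""
    h = (hi - lo) / n
    Z = mean = second = 0.0
    for k in range(n + 1):
        x = lo + k * h
        w = math.exp(-(x - m_cav) ** 2 / (2.0 * v_cav)) * phi(y * x)
        Z += w
        mean += w * x
        second += w * x * x
    mean /= Z
    second /= Z
    return mean, second - mean * mean

# Toy model: prior N(0, 1), likelihood factors t_i(x) = Phi(y_i * x).
ys = [1.0, 1.0, -1.0, 1.0]          # made-up "observations"
tau0, nu0 = 1.0, 0.0                # prior in natural parameters
tau_s = [0.0] * len(ys)             # site precisions
nu_s = [0.0] * len(ys)              # site precision-adjusted means

for sweep in range(20):
    for i, y in enumerate(ys):
        tau = tau0 + sum(tau_s)     # current global approximation
        nu = nu0 + sum(nu_s)
        tau_c = tau - tau_s[i]      # cavity: remove site i
        nu_c = nu - nu_s[i]
        m_c, v_c = nu_c / tau_c, 1.0 / tau_c
        m_t, v_t = tilted_moments(m_c, v_c, y)   # moment matching
        tau_s[i] = 1.0 / v_t - tau_c             # refit site i so that
        nu_s[i] = m_t / v_t - nu_c               # cavity * site matches

tau = tau0 + sum(tau_s)
nu = nu0 + sum(nu_s)
print("EP posterior mean %.4f, variance %.4f" % (nu / tau, 1.0 / tau))
```

For probit factors the tilted moments actually have a closed form; the quadrature above just keeps the sketch generic to any factor shape.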

References:

Minka, T. P. (2001). Expectation propagation for approximate Bayesian inference. Uncertainty in Artificial Intelligence, 17, 362–369.

Minka, T. P. (2001). A Family of Algorithms for Approximate Bayesian Inference. PhD Thesis, Massachusetts Institute of Technology.

Minka, T. P. (2005). Power EP. Technical Report MSR-TR-2004-149, Microsoft Research, Cambridge.

Qi, Y. and Guo, Y. (2012). Message passing with relaxed moment matching. arXiv preprint arXiv:1204.4166.

Uncertainty Quantification of Model-Form Error with RANS Simulations

Nick Geneva

While Reynolds-Averaged Navier-Stokes (RANS) simulations are the workhorse of practical CFD, the closure models used for the unresolved turbulent scales can introduce significant error into the simulation. We review a Bayesian approach to quantifying this model-form error and thereby improving the accuracy of RANS predictions.
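One standard way to formalize a Bayesian treatment of model-form error is the Kennedy-and-O'Hagan-style additive discrepancy: high-fidelity data are modeled as the cheap model's prediction plus a Gaussian-process error term, whose posterior gives both a correction and an uncertainty band. The sketch below is purely illustrative (the "RANS output" and "truth" are stand-in 1D functions, and the kernel settings are made up), not necessarily the specific approach reviewed in the seminar:

```python
import numpy as np

rng = np.random.default_rng(0)

def eta(x):
    """Stand-in for a cheap RANS quantity of interest."""
    return np.sin(x)

def truth(x):
    """Stand-in for high-fidelity data (e.g. DNS or experiment)."""
    return np.sin(x) + 0.3 * x

def rbf(a, b, amp=0.5, ls=1.0):
    """Squared-exponential covariance between point sets a and b."""
    return amp ** 2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

# Noisy high-fidelity observations; the residual d - eta is attributed
# to the model-form discrepancy delta(x) plus observation noise.
x_obs = np.linspace(0.0, 3.0, 8)
sigma = 0.05
d = truth(x_obs) + sigma * rng.standard_normal(x_obs.size)
r = d - eta(x_obs)

# GP posterior for delta at prediction points (standard conditioning).
x_star = np.linspace(0.0, 3.0, 50)
K = rbf(x_obs, x_obs) + sigma ** 2 * np.eye(x_obs.size)
Ks = rbf(x_star, x_obs)
delta_mean = Ks @ np.linalg.solve(K, r)
delta_cov = rbf(x_star, x_star) - Ks @ np.linalg.solve(K, Ks.T)
delta_sd = np.sqrt(np.clip(np.diag(delta_cov), 0.0, None))

# Corrected prediction with a pointwise uncertainty band around it.
corrected = eta(x_star) + delta_mean
print("max posterior sd of discrepancy: %.3f" % delta_sd.max())
```

The same conditioning carries over when eta is a genuine RANS solver and delta lives on physically meaningful quantities (e.g. perturbations to the modeled Reynolds stresses) rather than directly on the output.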

References:

Variational Auto-Encoders [Slides]

Yinhao Zhu

Generative models allow us to draw new samples from an underlying (high-dimensional) data distribution instead of running expensive simulations. This seminar reviews the variational auto-encoder (VAE), one of the most successful generative models, which scales variational Bayes to deep neural networks via the reparameterization trick.
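The reparameterization trick itself fits in a few lines: writing z ~ N(mu, sigma^2) as z = mu + sigma * eps with eps ~ N(0, 1) moves the randomness out of the distribution's parameters, so Monte Carlo estimates of an expectation become differentiable in (mu, sigma). A minimal sketch on a toy objective (not a full VAE; in a VAE, mu and sigma would be encoder outputs and f would be the ELBO integrand):

```python
import random

random.seed(0)

# Toy objective f(z) = z^2, so analytically d/dmu E[f(z)] = 2 * mu.
mu, sigma = 1.5, 0.8

def grad_mu_reparam(n=200_000):
    """Monte Carlo estimate of d/dmu E_{z~N(mu,sigma^2)}[z^2] via the
    reparameterization z = mu + sigma * eps, eps ~ N(0, 1)."""
    g = 0.0
    for _ in range(n):
        eps = random.gauss(0.0, 1.0)
        z = mu + sigma * eps
        g += 2.0 * z * 1.0   # chain rule: df/dz * dz/dmu, with dz/dmu = 1
    return g / n

print(grad_mu_reparam())     # close to 2 * mu = 3.0
```

Without the trick, the gradient would have to pass through the sampling operation itself, which is what forces high-variance score-function (REINFORCE) estimators; the reparameterized estimator is what lets VAEs train with ordinary backpropagation.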

References:

Kingma, Diederik P., and Max Welling. "Auto-encoding variational Bayes." arXiv preprint arXiv:1312.6114 (2013).
