Title: Conditional independence and the Gaussian distribution
Authors: Haavard Rue - KAUST (Saudi Arabia) [presenting]
Abstract: An important focus of the HSSS was the ability to build statistical models from smaller building blocks, either through directed acyclic graphs (via WinBUGS, which was introduced around that time) or through Markov random fields (MRFs) and conditional independence. Thanks to the then recently (re-)invented Gibbs sampler and related algorithms, Bayesian inference could be conducted using MCMC. We will discuss the Gaussian case and argue why conditional independence is such an important concept there, and why it matters from a computational point of view: it allows us to factorise very large sparse matrices, which is also useful for approximate Bayesian inference.
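The computational point in the abstract can be illustrated with a minimal sketch (not from the talk itself; the chain-graph model, the parameter `rho`, and the size `n` are assumptions for illustration): in a Gaussian Markov random field, conditional independence of non-neighbouring variables makes the precision matrix sparse, and the Cholesky factor inherits that sparsity, so factorising even a very large precision matrix is cheap.

```python
import numpy as np

# A Gaussian Markov random field on a chain: x_i is conditionally
# independent of x_j given the rest whenever |i - j| > 1, so the
# precision matrix Q is tridiagonal (sparse).
n = 1000
rho = 0.9  # assumed AR(1)-style dependence parameter, |rho| < 1

Q = np.zeros((n, n))
np.fill_diagonal(Q, 1.0 + rho**2)
Q[0, 0] = Q[-1, -1] = 1.0
for i in range(n - 1):
    Q[i, i + 1] = Q[i + 1, i] = -rho

# Cholesky factorisation Q = L L^T: for a banded Q the factor L keeps
# the same lower bandwidth, so only the diagonal and first sub-diagonal
# of L are nonzero -- 2n - 1 entries instead of n(n+1)/2.
L = np.linalg.cholesky(Q)
nnz = np.count_nonzero(np.abs(L) > 1e-12)
print(nnz)
```

In practice one would store `Q` in a sparse format and use a sparse Cholesky routine (e.g. from `scipy.sparse` or CHOLMOD); the dense array here only keeps the sketch self-contained.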