What?
Poll: What characterizes “Bayesian Analysis”?
- MAP estimators (maximum a posteriori)
- posterior means
- Bayes rule
- models where some unknown quantities are treated as random
- none of the above
All these popular answers are misleading and/or very incomplete:
- MAP estimators (maximum a posteriori)
- MAP is seldom used by expert Bayesians (the posterior mode can be misleading in high dimensions; see the sketch after this list)
- posterior means
- the posterior mean is often undefined (e.g. Bayesian analysis over combinatorial objects such as graphs)
- Bayes rule
- applying Bayes rule exactly is intractable in most practical situations (in practice we rely on MCMC or variational methods)
- models where some unknown quantities are treated as random
- true for Bayesian models, but also for many non-Bayesian models, e.g., random effects models
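The claim above about the posterior mode can be checked with a quick simulation. The following is a minimal sketch (Python, not part of the original notes), assuming a standard normal posterior in \(d\) dimensions: the mode sits at the origin, yet typical draws concentrate at distance about \(\sqrt{d}\) from it.

```python
import numpy as np

# Illustrative sketch (assumed toy posterior, not from the notes): for a standard normal
# "posterior" in d dimensions, the mode is at the origin, but essentially all of the
# posterior mass lies at distance about sqrt(d) from it.
rng = np.random.default_rng(1)

for d in [1, 10, 100, 1000]:
    samples = rng.standard_normal(size=(10_000, d))  # draws from N(0, I_d)
    dists = np.linalg.norm(samples, axis=1)          # distance of each draw from the mode
    print(f"d = {d:4d}   mean distance to mode = {dists.mean():7.2f}   sqrt(d) = {np.sqrt(d):7.2f}")
```

As \(d\) grows, the region around the mode contains a vanishing fraction of the posterior mass, which is why reporting the mode alone can be misleading.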
So… what is Bayesian Analysis?
Preview of key definitions
Bayesian Analysis: statistical discipline centered around the use of Bayes estimators
Bayes estimators: for data \(Y\), unobserved \(X\), loss \(L\), and possible actions \(A\), the Bayes estimator is defined as:
\[\operatorname{arg\,min}\{ \mathbb{E}[L(a, X) | Y] : a \in A \}\]
Note: you are not expected to understand this equation at this point!
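To make the definition concrete, here is a minimal sketch (Python; the Beta-Binomial model and squared loss are illustrative assumptions, not the course's example) that approximates \(\mathbb{E}[L(a, X) | Y]\) by Monte Carlo and minimizes it over a grid of candidate actions.

```python
import numpy as np

# Minimal sketch (assumed toy model): approximating the Bayes estimator
# arg min { E[L(a, X) | Y] : a in A } by Monte Carlo.
# Assumed model: X ~ Beta(1, 1) prior, Y | X ~ Binomial(n, X), squared loss L(a, x) = (a - x)^2.
rng = np.random.default_rng(0)

n, y = 20, 14                                    # observed data: 14 successes in 20 trials
post = rng.beta(1 + y, 1 + n - y, size=100_000)  # posterior here is Beta(1 + y, 1 + n - y)

actions = np.linspace(0.0, 1.0, 1001)            # candidate actions a in A = [0, 1]
expected_loss = [np.mean((a - post) ** 2) for a in actions]  # Monte Carlo estimate of E[L(a, X) | Y]

print("arg min of expected loss:", actions[np.argmin(expected_loss)])
print("posterior mean:          ", post.mean())  # under squared loss, the Bayes estimator is the posterior mean
```

The two printed numbers agree (up to Monte Carlo and grid error), illustrating the special case from the list below: under squared loss, the Bayes estimator is the posterior mean.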
This course
The primary objective of this course is to understand Bayes estimators:
- Why they are so powerful.
- Their limitations (model misspecification, computational challenges).
- Important special cases (posterior means, credible intervals, MAP).
- How to use them in practice:
- how to build models,
- how to approximate conditional expectations (see the sketch after this list).
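As a preview of the last point, here is a minimal random-walk Metropolis sketch (Python; it reuses the same hypothetical Beta-Binomial setup as above and is not the course's implementation) that approximates a conditional expectation by averaging MCMC samples.

```python
import numpy as np

# Minimal random-walk Metropolis sketch (illustrative stand-in): approximate E[X | Y]
# for the assumed model X ~ Uniform(0, 1), Y | X ~ Binomial(n, X), pretending the
# posterior is not available in closed form.
rng = np.random.default_rng(2)
n, y = 20, 14

def log_unnorm_post(x):
    # log of the unnormalized posterior density: flat prior times Binomial likelihood
    if x <= 0.0 or x >= 1.0:
        return -np.inf
    return y * np.log(x) + (n - y) * np.log(1.0 - x)

x, samples = 0.5, []
for _ in range(50_000):
    proposal = x + 0.1 * rng.standard_normal()   # symmetric random-walk proposal
    if np.log(rng.uniform()) < log_unnorm_post(proposal) - log_unnorm_post(x):
        x = proposal                             # accept; otherwise keep the current state
    samples.append(x)

print("MCMC estimate of E[X | Y]:", np.mean(samples[5_000:]))  # discard a short burn-in
print("exact posterior mean:     ", (1 + y) / (2 + n))         # mean of Beta(1 + y, 1 + n - y)
```

The course will develop both ingredients: building the model that defines the conditional expectation, and constructing algorithms such as this one to approximate it.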