# Chapter 4 - Markov Chain Monte Carlo

This chapter is about Markov Chain Monte Carlo methods for sampling from a probability distribution.

Examples covered: 4.1–4.10, 4.12, 4.17
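To make the idea concrete before the examples, here is a minimal sketch of a random-walk Metropolis sampler, one of the simplest MCMC methods for sampling from a distribution known only up to a normalizing constant. The function name `metropolis` and the parameters `step` and `seed` are illustrative choices, not anything defined by this chapter; the target here is a standard normal via its unnormalized log-density.

```python
import math
import random

def metropolis(log_density, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler targeting exp(log_density) in 1-D.

    log_density need only be correct up to an additive constant,
    i.e. the target density up to normalization.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        # Propose a symmetric Gaussian step around the current state.
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal, log p(x) = -x^2/2 up to a constant.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Because the proposal is symmetric, the Hastings correction cancels and the acceptance ratio reduces to the ratio of target densities; for long enough chains, the sample mean and variance should approach 0 and 1 for this target.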