- Title Pages
- Dedication
- Preface
- Section 1 Basics of Probability
- Chapter 1 Introduction to Probability
- Chapter 2 Joint, Marginal, and Conditional Probability
- Section 2 Bayes’ Theorem and Bayesian Inference
- Chapter 3 Bayes’ Theorem
- Chapter 4 Bayesian Inference
- Chapter 5 The Author Problem: Bayesian Inference with Two Hypotheses
- Chapter 6 The Birthday Problem: Bayesian Inference with Multiple Discrete Hypotheses
- Chapter 7 The Portrait Problem: Bayesian Inference with Joint Likelihood
- Section 3 Probability Functions
- Chapter 8 Probability Mass Functions
- Chapter 9 Probability Density Functions
- Section 4 Bayesian Conjugates
- Chapter 10 The White House Problem: The Beta-Binomial Conjugate
- Chapter 11 The Shark Attack Problem: The Gamma-Poisson Conjugate
- Chapter 12 The Maple Syrup Problem: The Normal-Normal Conjugate
- Section 5 Markov Chain Monte Carlo
- Chapter 13 The Shark Attack Problem Revisited: MCMC with the Metropolis Algorithm
- Chapter 14 MCMC Diagnostic Approaches
- Chapter 15 The White House Problem Revisited: MCMC with the Metropolis–Hastings Algorithm
- Chapter 16 The Maple Syrup Problem Revisited: MCMC with Gibbs Sampling
- Section 6 Applications
- Chapter 17 The Survivor Problem: Simple Linear Regression with MCMC
- Chapter 18 The Survivor Problem Continued: Introduction to Bayesian Model Selection
- Chapter 19 The Lorax Problem: Introduction to Bayesian Networks
- Chapter 20 The Once-ler Problem: Introduction to Decision Trees
- Appendix 1 The Beta-Binomial Conjugate Solution
- Appendix 2 The Gamma-Poisson Conjugate Solution
- Appendix 3 The Normal-Normal Conjugate Solution
- Appendix 4 Conjugate Solutions for Simple Linear Regression
- Appendix 5 The Standardization of Regression Data
- Bibliography
- Hyperlinks Accessed August 2017
- Name Index
- Subject Index

# The White House Problem Revisited: MCMC with the Metropolis–Hastings Algorithm

- Chapter: Chapter 15 The White House Problem Revisited: MCMC with the Metropolis–Hastings Algorithm
- Source: Bayesian Statistics for Beginners
- Authors: Therese M. Donovan and Ruth M. Mickey
- Publisher: Oxford University Press

The “White House Problem” of Chapter 10 is revisited in this chapter. Markov Chain Monte Carlo (MCMC) is used to build the posterior distribution of the unknown parameter *p*, the probability that a famous person could gain access to the White House without invitation. The chapter highlights the Metropolis–Hastings algorithm in MCMC analysis, describing the process step by step. The posterior distribution generated in Chapter 10 using the beta-binomial conjugate is compared with the MCMC posterior distribution to show how successful the MCMC method can be. By the end of this chapter, the reader will have a firm understanding of the following concepts: Monte Carlo, Markov chain, Metropolis–Hastings algorithm, Metropolis–Hastings random walk, and Metropolis–Hastings independence sampler.

*Keywords:*
Monte Carlo, Markov chain, Metropolis–Hastings algorithm, Metropolis–Hastings random walk, Metropolis–Hastings independence sampler, Keith Hastings
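The sampler described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the book's own code: the data (number of "successes" *y* in *n* trials) and the Beta(1, 1) prior are assumed values, since the actual counts from Chapter 10 are not given in this excerpt. A symmetric random-walk proposal is used, so the Hastings correction ratio equals 1, and the MCMC posterior mean can be checked against the conjugate beta-binomial answer.

```python
import math
import random

# Hypothetical data in the spirit of the White House Problem:
# y "successes" out of n trials (illustrative values, not the
# book's actual counts), with a Beta(a, b) prior on p.
y, n = 6, 10
a_prior, b_prior = 1.0, 1.0  # flat Beta(1, 1) prior

def log_posterior(p):
    """Unnormalized log posterior: binomial likelihood times beta prior."""
    if p <= 0.0 or p >= 1.0:
        return float("-inf")  # p must lie strictly in (0, 1)
    return ((y + a_prior - 1) * math.log(p)
            + (n - y + b_prior - 1) * math.log(1 - p))

random.seed(42)
p_current = 0.5
samples = []
for _ in range(50_000):
    # Metropolis-Hastings random walk: the uniform proposal is
    # symmetric, so the Hastings ratio q(old|new)/q(new|old) is 1
    # and acceptance depends only on the posterior ratio.
    p_proposed = p_current + random.uniform(-0.1, 0.1)
    log_ratio = log_posterior(p_proposed) - log_posterior(p_current)
    if math.log(random.random()) < log_ratio:
        p_current = p_proposed  # accept the proposed value
    samples.append(p_current)   # otherwise keep the current value

burned = samples[10_000:]  # discard burn-in
mcmc_mean = sum(burned) / len(burned)
# Conjugate beta-binomial posterior mean, for comparison:
conj_mean = (a_prior + y) / (a_prior + b_prior + n)
print(round(mcmc_mean, 3), round(conj_mean, 3))
```

With a non-symmetric proposal (for example, the independence sampler named in the keywords, which proposes from a fixed distribution regardless of the current value), the acceptance step would also include the Hastings correction ratio of proposal densities.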

