Abstract: This chapter presents a powerful generic method, called Markov chain Monte Carlo (MCMC), for approximately generating samples from an arbitrary distribution. The prominent MCMC algorithms are the Metropolis–Hastings and Gibbs samplers, the latter being particularly useful in Bayesian analysis. MCMC sampling is also the main ingredient in the popular simulated annealing technique for discrete and continuous optimization. The main idea behind the Metropolis–Hastings algorithm is to simulate a Markov chain whose stationary distribution coincides with the target distribution. Estimates obtained from MCMC samples often have much greater variances than those obtained from independent sampling of the target distribution. Hit-and-run is unique in that it takes only polynomial time to escape from a corner. The distinguishing feature of the Gibbs sampler is that the underlying Markov chain is constructed, in a deterministic or random fashion, from a sequence of conditional distributions.
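The Metropolis–Hastings idea described in the abstract — simulating a Markov chain whose stationary distribution is the target — can be illustrated with a minimal sketch. This is not the chapter's own code; it is a generic random-walk Metropolis sampler (a symmetric-proposal special case of Metropolis–Hastings), here targeting a standard normal via its unnormalized log-density. All names (`metropolis_hastings`, `log_target`, `step`) are illustrative assumptions.

```python
import math
import random


def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler (symmetric Gaussian proposal).

    log_target: unnormalized log-density of the target distribution.
    Returns the list of chain states, one per iteration.
    """
    rng = random.Random(seed)
    x = x0
    log_px = log_target(x)
    samples = []
    for _ in range(n_samples):
        # Propose a move from a symmetric Gaussian centered at the current state.
        prop = x + rng.gauss(0.0, step)
        log_pprop = log_target(prop)
        # Accept with probability min(1, pi(prop)/pi(x)); the proposal is
        # symmetric, so the Hastings correction term cancels.
        if math.log(rng.random()) < log_pprop - log_px:
            x, log_px = prop, log_pprop
        samples.append(x)
    return samples


if __name__ == "__main__":
    # Target: standard normal, known only up to a constant (log pi(x) = -x^2/2).
    chain = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=50_000)
    kept = chain[5_000:]  # discard burn-in
    mean = sum(kept) / len(kept)
    var = sum((v - mean) ** 2 for v in kept) / len(kept)
    print(f"mean={mean:.3f} var={var:.3f}")
```

After burn-in, the empirical mean and variance of the chain should be close to the target's 0 and 1; the chain's autocorrelation is also why, as the abstract notes, MCMC estimates tend to have larger variances than estimates from independent samples.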
Publication Year: 2016
Publication Date: 2016-11-21
Language: en
Type: other
Indexed In: ['crossref']
Access and Citation
Cited By Count: 3