In statistics, Markov chain Monte Carlo (MCMC) methods comprise a class of algorithms for sampling from a probability distribution. By constructing a Markov chain that has the desired distribution as its equilibrium distribution, one can obtain a sample of the desired distribution by recording states from the chain. The more steps that are included, the more closely the distribution of the sample matches the actual desired distribution. Various algorithms exist for constructing such chains, including the Metropolis–Hastings algorithm and Gibbs sampling.
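To make the Metropolis–Hastings idea concrete, here is a minimal sketch of a random-walk sampler (the function and parameter names are illustrative and do not necessarily match the implementation in Metropolis–Hastings.py):

```python
import numpy as np

def metropolis_hastings(log_target, n_samples, x0=0.0, proposal_std=1.0, rng=None):
    """Sample from an unnormalized log density via a Gaussian random walk."""
    rng = np.random.default_rng() if rng is None else rng
    samples = np.empty(n_samples)
    x = x0
    log_p = log_target(x)
    for i in range(n_samples):
        # Propose a symmetric random-walk step, so the Hastings
        # correction for the proposal density cancels out.
        x_new = x + proposal_std * rng.standard_normal()
        log_p_new = log_target(x_new)
        # Accept with probability min(1, p(x_new) / p(x)),
        # computed in log space for numerical stability.
        if np.log(rng.uniform()) < log_p_new - log_p:
            x, log_p = x_new, log_p_new
        samples[i] = x
    return samples

# Example: sample from a standard normal, whose log density is
# -x^2 / 2 up to an additive constant.
samples = metropolis_hastings(lambda x: -0.5 * x**2, n_samples=10_000)
```

Note that only the ratio of target densities matters, so the target never needs to be normalized.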
Advantages of Gibbs sampling:
- The conditional distributions are often easy to evaluate,
- The conditionals may be conjugate, in which case we can sample from them exactly,
- The conditionals are lower dimensional, so we can apply rejection sampling or importance sampling when exact sampling is not possible.
Disadvantages of Gibbs sampling:
The major drawback is that when variables have strong dependencies, the chain finds it hard to move around the state space. We can introduce auxiliary variables to help the sampler move when such high-dimensional variables are correlated.
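Both points can be seen in a minimal Gibbs sampler for a zero-mean bivariate normal (an illustrative sketch, not necessarily the implementation in Gibbs_Sampling.py). Each full conditional is itself normal, so we can sample it exactly; but as the correlation rho approaches 1 the conditionals become very narrow and the chain takes tiny steps:

```python
import numpy as np

def gibbs_bivariate_normal(n_samples, rho=0.5, rng=None):
    """Gibbs sampler for a zero-mean bivariate normal with correlation rho.

    The full conditionals are themselves normal:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    so each coordinate can be sampled exactly (the conjugate case above).
    """
    rng = np.random.default_rng() if rng is None else rng
    samples = np.empty((n_samples, 2))
    x = y = 0.0
    cond_std = np.sqrt(1.0 - rho**2)
    for i in range(n_samples):
        x = rho * y + cond_std * rng.standard_normal()  # draw x | y
        y = rho * x + cond_std * rng.standard_normal()  # draw y | x
        samples[i] = x, y
    return samples

# With rho close to 1 the conditional standard deviation shrinks toward 0,
# so successive draws barely move and the chain mixes slowly -- the
# strong-dependency drawback described above.
samples = gibbs_bivariate_normal(10_000, rho=0.99)
```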
The code is provided in Metropolis–Hastings.py and Gibbs_Sampling.py.
We highly recommend referring to the following references if you would like to learn more about MCMC:
- https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo
- https://www.cs.cmu.edu/~epxing/Class/10708-15/notes/10708_scribe_lecture17.pdf
- http://people.duke.edu/~ccc14/sta-663/MCMC.html
- https://github.com/Joseph94m/MCMC/blob/master/MCMC.ipynb
- https://towardsdatascience.com/markov-chain-monte-carlo-in-python-44f7e609be98
- https://machinelearningmastery.com/markov-chain-monte-carlo-for-probability/
- https://link.springer.com/article/10.3758/s13423-016-1015-8
- https://www.cnblogs.com/sddai/p/6144674.html
- https://zhuanlan.zhihu.com/p/37121528