Advanced Markov Chain Monte Carlo Methods


Advanced Markov Chain Monte Carlo Methods: Learning from Past Samples (2011), by F. Liang, C. Liu and R. J. Carroll


Table of Contents

Preface.

Acknowledgments.

Publisher’s Acknowledgments.

1 Bayesian Inference and Markov Chain Monte Carlo.

1.1 Bayes.

1.1.1 Specification of Bayesian Models.

1.1.2 The Jeffreys Priors and Beyond.

1.2 Bayes Output.

1.2.1 Credible Intervals and Regions.

1.2.2 Hypothesis Testing: Bayes Factors.

1.3 Monte Carlo Integration.

1.3.1 The Problem.

1.3.2 Monte Carlo Approximation.

1.3.3 Monte Carlo via Importance Sampling.

1.4 Random Variable Generation.

1.4.1 Direct or Transformation Methods.

1.4.2 Acceptance-Rejection Methods.

1.4.3 The Ratio-of-Uniforms Method and Beyond.

1.4.4 Adaptive Rejection Sampling.

1.4.5 Perfect Sampling.

1.5 Markov Chain Monte Carlo.

1.5.1 Markov Chains.

1.5.2 Convergence Results.

1.5.3 Convergence Diagnostics.

Exercises.

2 The Gibbs Sampler.

2.1 The Gibbs Sampler.

2.2 Data Augmentation.

2.3 Implementation Strategies and Acceleration Methods.

2.3.1 Blocking and Collapsing.

2.3.2 Hierarchical Centering and Reparameterization.

2.3.3 Parameter Expansion for Data Augmentation.

2.3.4 Alternating Subspace-Spanning Resampling.

2.4 Applications.

2.4.1 The Student-t Model.

2.4.2 Robit Regression or Binary Regression with the Student-t Link.

2.4.3 Linear Regression with Interval-Censored Responses.

Exercises.

Appendix 2A: The EM and PX-EM Algorithms.

3 The Metropolis-Hastings Algorithm.

3.1 The Metropolis-Hastings Algorithm.

3.1.1 Independence Sampler.

3.1.2 Random Walk Chains.

3.1.3 Problems with Metropolis-Hastings Simulations.

3.2 Variants of the Metropolis-Hastings Algorithm.

3.2.1 The Hit-and-Run Algorithm.

3.2.2 The Langevin Algorithm.

3.2.3 The Multiple-Try MH Algorithm.

3.3 Reversible Jump MCMC Algorithm for Bayesian Model Selection Problems.

3.3.1 Reversible Jump MCMC Algorithm.

3.3.2 Change-Point Identification.

3.4 Metropolis-Within-Gibbs Sampler for ChIP-chip Data Analysis.

3.4.1 Metropolis-Within-Gibbs Sampler.

3.4.2 Bayesian Analysis for ChIP-chip Data.

Exercises.

4 Auxiliary Variable MCMC Methods.

4.1 Simulated Annealing.

4.2 Simulated Tempering.

4.3 The Slice Sampler.

4.4 The Swendsen-Wang Algorithm.

4.5 The Wolff Algorithm.

4.6 The Møller Algorithm.

4.7 The Exchange Algorithm.

4.8 The Double MH Sampler.

4.8.1 Spatial Autologistic Models.

4.9 Monte Carlo MH Sampler.

4.9.1 Monte Carlo MH Algorithm.

4.9.2 Convergence.

4.9.3 Spatial Autologistic Models (Revisited).

4.9.4 Marginal Inference.

4.10 Applications.

4.10.1 Autonormal Models.

4.10.2 Social Networks.

Exercises.

5 Population-Based MCMC Methods.

5.1 Adaptive Direction Sampling.

5.2 Conjugate Gradient Monte Carlo.

5.3 Sample Metropolis-Hastings Algorithm.

5.4 Parallel Tempering.

5.5 Evolutionary Monte Carlo.

5.5.1 Evolutionary Monte Carlo in Binary-Coded Space.

5.5.2 Evolutionary Monte Carlo in Continuous Space.

5.5.3 Implementation Issues.

5.5.4 Two Illustrative Examples.

5.5.5 Discussion.

5.6 Sequential Parallel Tempering for Simulation of High Dimensional Systems.

5.6.1 Build-up Ladder Construction.

5.6.2 Sequential Parallel Tempering.

5.6.3 An Illustrative Example: the Witch’s Hat Distribution.

5.6.4 Discussion.

5.7 Equi-Energy Sampler.

5.8 Applications.

5.8.1 Bayesian Curve Fitting.

5.8.2 Protein Folding Simulations: 2D HP Model.

5.8.3 Bayesian Neural Networks for Nonlinear Time Series Forecasting.

Exercises.

Appendix 5A: Protein Sequences for 2D HP Models.

6 Dynamic Weighting.

6.1 Dynamic Weighting.

6.1.1 The IWIW Principle.

6.1.2 Tempering Dynamic Weighting Algorithm.

6.1.3 Dynamic Weighting in Optimization.

6.2 Dynamically Weighted Importance Sampling.

6.2.1 The Basic Idea.

6.2.2 A Theory of DWIS.

6.2.3 Some IWIWp Transition Rules.

6.2.4 Two DWIS Schemes.

6.2.5 Weight Behavior Analysis.

6.2.6 A Numerical Example.

6.3 Monte Carlo Dynamically Weighted Importance Sampling.

6.3.1 Sampling from Distributions with Intractable Normalizing Constants.

6.3.2 Monte Carlo Dynamically Weighted Importance Sampling.

6.3.3 Bayesian Analysis for Spatial Autologistic Models.

6.4 Sequentially Dynamically Weighted Importance Sampling.

Exercises.

7 Stochastic Approximation Monte Carlo.

7.1 Multicanonical Monte Carlo.

7.2 1/k-Ensemble Sampling.

7.3 The Wang-Landau Algorithm.

7.4 Stochastic Approximation Monte Carlo.

7.5 Applications of Stochastic Approximation Monte Carlo.

7.5.1 Efficient p-Value Evaluation for Resampling-Based Tests.

7.5.2 Bayesian Phylogeny Inference.

7.5.3 Bayesian Network Learning.

7.6 Variants of Stochastic Approximation Monte Carlo.

7.6.1 Smoothing SAMC for Model Selection Problems.

7.6.2 Continuous SAMC for Marginal Density Estimation.

7.6.3 Annealing SAMC for Global Optimization.

7.7 Theory of Stochastic Approximation Monte Carlo.

7.7.1 Convergence.

7.7.2 Convergence Rate.

7.7.3 Ergodicity and its IWIW Property.

7.8 Trajectory Averaging: Toward the Optimal Convergence Rate.

7.8.1 Trajectory Averaging for a SAMCMC Algorithm.

7.8.2 Trajectory Averaging for SAMC.

7.8.3 Proof of Theorems 7.8.2 and 7.8.3.

Exercises.

Appendix 7A: Test Functions for Global Optimization.

8 Markov Chain Monte Carlo with Adaptive Proposals.

8.1 Stochastic Approximation-Based Adaptive Algorithms.

8.1.1 Ergodicity and Weak Law of Large Numbers.

8.1.2 Adaptive Metropolis Algorithms.

8.2 Adaptive Independent Metropolis-Hastings Algorithms.

8.3 Regeneration-Based Adaptive Algorithms.

8.3.1 Identification of Regeneration Times.

8.3.2 Proposal Adaptation at Regeneration Times.

8.4 Population-Based Adaptive Algorithms.

8.4.1 ADS, EMC, NKC and More.

8.4.2 Adaptive EMC.

8.4.3 Application to Sensor Placement Problems.

Exercises.

References.

Index.