Efficient Parameter Sampling for Markov Jump Processes
Markov jump processes (MJPs) are continuous-time stochastic processes widely used in a variety of applied disciplines. Inference typically proceeds via Markov chain Monte Carlo (MCMC), the state-of-the-art being a uniformization-based auxiliary variable Gibbs sampler. This was designed for situations where the process parameters are known, and Bayesian inference over unknown parameters is typically carried out by incorporating it into a larger Gibbs sampler. This strategy of alternately sampling the parameters given the path and the path given the parameters can result in poor Markov chain mixing. In this work, we propose a simple and efficient algorithm to address this problem. Our scheme brings Metropolis–Hastings (MH) approaches for discrete-time hidden Markov models to the continuous-time setting, resulting in a complete and clean recipe for parameter and path inference in MJPs. In our experiments, we demonstrate superior performance over Gibbs sampling, a more naïve MH algorithm, as well as another popular approach, particle MCMC. We also show our sampler inherits geometric mixing from an “ideal” sampler that is computationally much more expensive. Supplementary materials for this article are available online.
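To make the alternating scheme described in the abstract concrete, the following is a minimal, hypothetical Python sketch, not the paper's uniformization-based or MH sampler: it forward-simulates a toy MJP path with the Gillespie algorithm and then performs the standard conjugate Gamma update of the transition rates given a completely observed path, i.e. the "parameters given path" half of the Gibbs strategy. All function names (`simulate_mjp_path`, `sufficient_stats`, `sample_rates_given_path`), the toy rate matrix, and the Gamma hyperparameters `a`, `b` are illustrative assumptions, not part of the article.

```python
# Illustrative sketch of the "parameters given path" Gibbs step for an MJP.
# Assumes independent Gamma(a, b) priors on the off-diagonal rates; the path
# update of the actual sampler (uniformization / MH) is NOT implemented here.
import numpy as np

rng = np.random.default_rng(0)

def simulate_mjp_path(Q, T, s0, rng):
    """Forward-simulate (Gillespie) an MJP with rate matrix Q on [0, T] from state s0."""
    times, states = [0.0], [s0]
    t, s = 0.0, s0
    while True:
        rate = -Q[s, s]                      # total rate of leaving state s
        if rate <= 0:                        # absorbing state: path stays put
            break
        t = t + rng.exponential(1.0 / rate)  # exponential holding time
        if t >= T:
            break
        probs = Q[s].copy()
        probs[s] = 0.0                       # jump to a different state
        s = rng.choice(len(Q), p=probs / probs.sum())
        times.append(t)
        states.append(s)
    return np.array(times), np.array(states), T

def sufficient_stats(times, states, T, n_states):
    """Time spent in each state and counts of each transition along the path."""
    stay = np.zeros(n_states)
    trans = np.zeros((n_states, n_states))
    for k in range(len(states)):
        t_next = times[k + 1] if k + 1 < len(states) else T
        stay[states[k]] += t_next - times[k]
        if k + 1 < len(states):
            trans[states[k], states[k + 1]] += 1
    return stay, trans

def sample_rates_given_path(stay, trans, a, b, rng):
    """Conjugate Gamma update: q_ij | path ~ Gamma(a + n_ij, rate = b + T_i)."""
    n = len(stay)
    Q = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                Q[i, j] = rng.gamma(a + trans[i, j], 1.0 / (b + stay[i]))
        Q[i, i] = -Q[i].sum()                # rows of a rate matrix sum to zero
    return Q

# Usage on a toy 3-state chain (values chosen arbitrarily for illustration).
true_Q = np.array([[-1.0, 0.6, 0.4],
                   [ 0.3, -0.8, 0.5],
                   [ 0.2, 0.7, -0.9]])
times, states, T = simulate_mjp_path(true_Q, 50.0, 0, rng)
stay, trans = sufficient_stats(times, states, T, n_states=3)
Q_draw = sample_rates_given_path(stay, trans, a=1.0, b=1.0, rng=rng)
```

A full sampler would pair this parameter draw with a conditional path update given the observations (the uniformization-based step the abstract refers to); it is the back-and-forth between those two conditional updates that can mix poorly, which is the problem the article's MH scheme is designed to address.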
| Barcode | Collection Type | Call Number | Location | Status |
|---|---|---|---|---|
| art139199 | null | Article | Gdg9-Lt3 | Available but not for loan (No Loan) |
No other versions available.