Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows
Authors | George Papamakarios, David C. Sterratt, Iain Murray |
Journal/Conference Name | AISTATS 2019 - 22nd International Conference on Artificial Intelligence and Statistics |
Paper Category | Artificial Intelligence |
Paper Abstract | We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible. SNL trains an autoregressive flow on simulated data in order to learn a model of the likelihood in the region of high posterior density. A sequential training procedure guides simulations and reduces simulation cost by orders of magnitude. We show that SNL is more robust, more accurate and requires less tuning than related neural-based methods, and we discuss diagnostics for assessing calibration, convergence and goodness-of-fit. |
Date of publication | 2018 |
Code Programming Language | Multiple |
Comment |
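The abstract describes SNL's loop: propose parameters (from the prior in round 0, from the current posterior approximation afterwards), simulate data, fit a conditional density model of the likelihood, and sample the posterior by MCMC. The sketch below is a hedged toy illustration of that loop, not the authors' implementation: the autoregressive flow is replaced by a crude conditional-Gaussian fit, the simulator is a trivial Gaussian model, and all names (`simulator`, `fit_conditional_gaussian`, `mcmc_sample`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta):
    # toy stand-in for an intractable simulator: x ~ N(theta, 1)
    return theta + rng.normal()

def fit_conditional_gaussian(thetas, xs):
    # crude stand-in for the autoregressive flow: model x | theta as
    # N(a*theta + b, sigma^2), fit by least squares on simulated pairs
    A = np.vstack([thetas, np.ones_like(thetas)]).T
    (a, b), *_ = np.linalg.lstsq(A, xs, rcond=None)
    sigma = max((xs - (a * thetas + b)).std(), 1e-3)
    return a, b, sigma

def log_likelihood(theta, x, model):
    a, b, sigma = model
    return -0.5 * ((x - (a * theta + b)) / sigma) ** 2 - np.log(sigma)

def log_prior(theta):
    # uniform prior on [-5, 5]
    return 0.0 if -5.0 <= theta <= 5.0 else -np.inf

def mcmc_sample(log_post, n, init=0.0, step=0.5):
    # simple Metropolis-Hastings over theta using the learned likelihood
    samples, theta, lp = [], init, log_post(init)
    for _ in range(n):
        prop = theta + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.array(samples)

x_obs = 1.5
thetas, xs = [], []
proposal = lambda: rng.uniform(-5, 5)  # round 0: propose from the prior

for round_ in range(3):
    # simulate at proposed parameters and grow the training set
    for _ in range(200):
        t = proposal()
        thetas.append(t)
        xs.append(simulator(t))
    # retrain the likelihood model on all simulations so far
    model = fit_conditional_gaussian(np.array(thetas), np.array(xs))
    log_post = lambda t, m=model: log_prior(t) + log_likelihood(t, x_obs, m)
    posterior = mcmc_sample(log_post, 1000)
    # next round: propose from the current posterior approximation
    proposal = lambda p=posterior: p[rng.integers(len(p))]
```

Reusing all accumulated simulations for training and narrowing the proposal toward the posterior each round is what the abstract credits for the orders-of-magnitude reduction in simulation cost.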