Differentiable Antithetic Sampling for Variance Reduction in Stochastic Variational Inference


Disclaimer: The code links provided for this paper are external links. Science Nest takes no responsibility for the accuracy, legality, or content of these links. By downloading the code, you agree to comply with the terms of use set out by its author(s).

Please contact us if you encounter a broken link.

Authors Mike Wu, Noah Goodman, Stefano Ermon
Journal/Conference Name AISTATS 2019 - 22nd International Conference on Artificial Intelligence and Statistics
Paper Abstract Stochastic optimization techniques are standard in variational inference algorithms. These methods estimate gradients by approximating expectations with independent Monte Carlo samples. In this paper, we explore a technique that uses correlated, but more representative, samples to reduce estimator variance. Specifically, we show how to generate antithetic samples that match sample moments with the true moments of an underlying importance distribution. Combining a differentiable antithetic sampler with modern stochastic variational inference, we showcase the effectiveness of this approach for learning a deep generative model.
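To give intuition for the variance-reduction idea the abstract describes, the sketch below demonstrates classic antithetic sampling for a Gaussian, not the paper's differentiable, moment-matched sampler: each draw z ~ N(mu, sigma^2) is paired with its reflection 2*mu - z, which has the same marginal distribution but is negatively correlated with z, so odd fluctuations around the mean cancel in the average. All function and variable names here are illustrative.

```python
import numpy as np

def mc_estimate(f, mu, sigma, n, rng):
    """Plain Monte Carlo estimate of E[f(z)], z ~ N(mu, sigma^2)."""
    z = rng.normal(mu, sigma, size=n)
    return f(z).mean()

def antithetic_estimate(f, mu, sigma, n, rng):
    """Antithetic estimate: pair each draw z with its reflection 2*mu - z.

    The reflected samples have the same marginal distribution, so the
    estimator stays unbiased, but the pairs are negatively correlated,
    which reduces variance for mostly-monotone integrands.
    """
    z = rng.normal(mu, sigma, size=n // 2)
    z_pairs = np.concatenate([z, 2 * mu - z])
    return f(z_pairs).mean()

if __name__ == "__main__":
    f = lambda z: z + 0.1 * z**2          # mostly-monotone test integrand
    mu, sigma, n, trials = 1.0, 1.0, 100, 2000
    plain = np.array([mc_estimate(f, mu, sigma, n, np.random.default_rng(i))
                      for i in range(trials)])
    anti = np.array([antithetic_estimate(f, mu, sigma, n, np.random.default_rng(i))
                     for i in range(trials)])
    # Both estimators target E[f(z)] = mu + 0.1*(mu^2 + sigma^2) = 1.2,
    # but the antithetic estimator's variance across trials is far smaller:
    print("plain var:", plain.var(), "antithetic var:", anti.var())
```

Here the antithetic pairs cancel the linear part of f exactly, so the remaining variance comes only from the quadratic term. The paper's contribution goes further, making such a sampler differentiable and matching higher sample moments so it can be used inside stochastic variational inference.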
Date of publication 2018
Code Programming Language Python

Copyright Researcher 2022