On Sparse, Spectral and Other Parameterizations of Binary Probabilistic Models


Authors David Buchman, Mark W. Schmidt, Shakir Mohamed, David Poole, Nando de Freitas
Journal/Conference Name International Conference on Artificial Intelligence and Statistics
Paper Abstract This paper studies issues relating to the parameterization of probability distributions over binary data sets. Several such parameterizations of models for binary data are known, including the Ising, generalized Ising, canonical and full parameterizations. We also discuss a parameterization that we call the "spectral parameterization", which has received significantly less coverage in the existing literature. We give this parameterization a spectral interpretation by casting log-linear models in terms of orthogonal Walsh-Hadamard harmonic expansions. Using various standard and group sparse regularizers for structure learning, we provide a comprehensive theoretical and empirical comparison of these parameterizations. We show that the spectral parameterization, along with the canonical, achieves the best performance and sparsity levels, and that, unlike the canonical, the spectral parameterization does not depend on any particular reference state. The spectral interpretation also provides a new starting point for analyzing the statistics of binary data sets; we measure the magnitude of higher-order interactions in the underlying distributions for several data sets.
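To illustrate the idea behind the spectral parameterization described in the abstract, the sketch below expands the log-probabilities of a distribution over binary states in an orthogonal Walsh-Hadamard basis. This is a minimal illustration in Python/NumPy, not the authors' MATLAB code; the natural (Sylvester) ordering of the Hadamard matrix and the example distribution are assumptions for demonstration.

```python
import numpy as np

def hadamard(n):
    # 2^n x 2^n Walsh-Hadamard matrix with entries +/-1,
    # built by the Sylvester recursion (natural ordering assumed).
    H = np.array([[1.0]])
    for _ in range(n):
        H = np.block([[H, H], [H, -H]])
    return H

n = 3
rng = np.random.default_rng(0)

# A random strictly positive distribution over the 2^n binary states.
p = rng.random(2 ** n)
p /= p.sum()

H = hadamard(n)
# Spectral parameters: coefficients of log p in the Walsh basis.
# H is symmetric and H @ H = 2^n * I, so its inverse is H / 2^n.
theta = H @ np.log(p) / (2 ** n)

# Reconstruction: log p = H @ theta up to numerical error.
assert np.allclose(H @ theta, np.log(p))

# Coefficient theta[k] corresponds to an interaction among the
# variables indicated by the binary expansion of k, so its order
# is the popcount of k; this grouping underlies the paper's
# measurement of higher-order interaction magnitudes.
orders = [bin(k).count("1") for k in range(2 ** n)]
```

Because the Walsh basis is orthogonal, no reference state is singled out, which is the property the abstract contrasts with the canonical parameterization.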
Date of publication 2012
Code Programming Language MATLAB

Copyright Researcher II 2021