Simultaneous Bayesian Sparse Approximation With Structured Sparse Models




Authors: W. Chen, D. Wipf, Y. Wang, Y. Liu, I. Wassell
Journal/Conference Name: IEEE Transactions on Signal Processing
Paper Abstract: Sparse approximation is key to many signal processing, image processing, and machine learning applications. If multiple signals maintain some degree of dependency, for example, the support sets are statistically related, then it will generally be advantageous to jointly estimate the sparse representation vectors from the measurement vectors as opposed to solving for each signal individually. In this paper, we propose simultaneous sparse Bayesian learning (SBL) for joint sparse approximation with two structured sparse models (SSMs), where one is row-sparse with embedded element-sparse and the other one is row-sparse plus element-sparse. While SBL has attracted much attention as a means to deal with a single sparse approximation problem, it is not obvious how to extend SBL to SSMs. By capitalizing on a dual-space view of existing convex methods for SSMs, we showcase the precision component model and covariance component model for SSMs, where both models involve a common hyperparameter and an innovation hyperparameter that together control the prior variance for each coefficient. The statistical perspective of precision component versus covariance component models unfolds the intrinsic mechanism in SSMs, and also leads to our development of SBL-inspired cost functions for SSMs. Centralized algorithms that include ℓ1 and ℓ2 reweighting algorithms and consensus-based decentralized algorithms are developed for simultaneous sparse approximation with SSMs. In addition, theoretical analysis is conducted to provide valuable insights into the proposed approach, which includes global minima analysis of the SBL-inspired nonconvex cost functions and convergence analysis of the proposed ℓ1 reweighting algorithms for SSMs. Superior performance of the proposed algorithms is demonstrated by numerical experiments.
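The abstract centers on iteratively reweighted algorithms for jointly recovering row-sparse coefficient matrices from multiple measurement vectors (MMV). As a rough illustration only, not the paper's algorithm, the sketch below implements a generic reweighted ℓ2 (FOCUSS/SBL-style) update for the plain row-sparse MMV problem in Python; the function name `reweighted_l2_mmv`, the regularizer `lam`, and the specific weight update are all assumptions made for this example.

```python
import numpy as np

def reweighted_l2_mmv(Phi, Y, n_iter=50, lam=1e-3, eps=1e-8):
    """Illustrative iteratively reweighted least-squares solver for
    Y ≈ Phi @ X with a row-sparse X (MMV problem).

    Phi : (m, n) dictionary, Y : (m, L) stacked measurement vectors.
    Returns an (n, L) estimate whose rows are driven toward zero
    except on the shared support.  A generic sketch, not the
    SSM-specific algorithms proposed in the paper.
    """
    m, n = Phi.shape
    L = Y.shape[1]
    gamma = np.ones(n)                       # per-row variance-like weights
    X = np.zeros((n, L))
    for _ in range(n_iter):
        G = Phi * gamma                      # Phi @ diag(gamma), broadcast over columns
        # Weighted ridge solution, shared across all L measurement vectors
        X = G.T @ np.linalg.solve(Phi @ G.T + lam * np.eye(m), Y)
        # Re-estimate row weights from current row energies; eps keeps
        # pruned rows numerically alive
        gamma = np.sqrt(np.sum(X**2, axis=1) / L) + eps
    return X
```

On a noiseless synthetic problem with a few active rows, the off-support row weights shrink geometrically toward `eps`, so the shared support can be read off from the row norms of the returned estimate.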
Date of Publication: 2016
Code Programming Language: Matlab

Copyright Researcher 2021