EUSIPCO 2020, 28th European Signal Processing Conference, January 18-22, 2021, Amsterdam, The Netherlands (Virtual Conference)
Sparse Bayesian Learning (SBL), initially proposed in the Machine Learning (ML) literature, is an efficient and well-studied framework for sparse signal recovery. SBL uses hierarchical Bayes with a decorrelated Gaussian prior in which the variance profile is also to be estimated. This is more sparsity-inducing than e.g. a Laplacian prior. However, SBL does not scale with problem dimensions due to the computational complexity associated with the matrix inversion in Linear Minimum Mean Squared Error (LMMSE) estimation. To address this issue, various low-complexity approximate Bayesian inference techniques have been introduced for the LMMSE component, including Variational Bayesian (VB) inference, Space Alternating Variational Estimation (SAVE), and Message Passing (MP) algorithms such as Belief Propagation (BP), Expectation Propagation (EP), and Approximate MP (AMP). These algorithms may converge to the correct LMMSE estimate. However, in ML we are often also interested in having posterior variance information. We observed that SBL via SAVE provides (largely) underestimated variance estimates. AMP-style algorithms may provide more accurate variance information (per component), as we have shown recently. However, one practical issue associated with most AMP versions is that they may diverge even for a slight deviation from i.i.d. Gaussian or right orthogonally invariant measurement matrices. To this end, we extend here the more robust Swept AMP (SwAMP) algorithm to Generalized SwAMP (GSwAMP), which handles independent but non-i.i.d. priors, and to the case of dynamic SBL. The simulations illustrate the desirable convergence behavior of the proposed GSwAMP-SBL under different scenarios on the measurement matrix.
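To make the scaling issue concrete, the following is a minimal sketch (not the paper's algorithm) of classical EM-based SBL: each iteration performs an LMMSE posterior step whose N x N matrix inversion is the bottleneck that VB, SAVE, and AMP-style methods approximate. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def sbl_em(A, y, noise_var=0.01, n_iters=50):
    """EM-based SBL sketch: estimate the per-component prior variances
    gamma, with an LMMSE posterior (mean + covariance) step each
    iteration. The O(N^3) inversion below is the scaling bottleneck
    that approximate-inference variants aim to avoid."""
    M, N = A.shape
    gamma = np.ones(N)  # variance profile of the decorrelated Gaussian prior
    for _ in range(n_iters):
        # LMMSE posterior: Sigma = (A^T A / sigma^2 + diag(1/gamma))^-1
        Sigma = np.linalg.inv(A.T @ A / noise_var + np.diag(1.0 / gamma))
        mu = Sigma @ A.T @ y / noise_var
        # EM update of the variance profile (floored for numerical safety)
        gamma = np.maximum(mu**2 + np.diag(Sigma), 1e-10)
    return mu, np.diag(Sigma)  # posterior means and per-component variances

# toy example: a sparse vector recovered from underdetermined measurements
rng = np.random.default_rng(0)
M, N, K = 40, 80, 5
A = rng.standard_normal((M, N)) / np.sqrt(M)
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = A @ x + 0.01 * rng.standard_normal(M)
mu, var = sbl_em(A, y)
```

Note that the returned per-component posterior variances are exactly the quantities the abstract argues SAVE underestimates and AMP-style schemes can track more accurately.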
Communication Systems
© 2020 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.