Energy-efficient sparse Bayesian learning using learned approximate message passing

Thomas, Christo Kurisummoottil; Mundlamuri, Rakesh; Murthy, Chandra R; Kountouris, Marios
SPAWC 2021, 22nd IEEE International Workshop on Signal Processing Advances in Wireless Communications, 27-30 September 2021, Lucca, Italy

Sparse Bayesian learning (SBL) is a well-studied framework for sparse signal recovery, with numerous applications in wireless communications, including wideband (millimeter wave) channel estimation and user activity detection. SBL is known to be more sparsity-inducing than other priors (e.g., the Laplacian prior) and to better handle ill-conditioned measurement matrices, hence providing superior sparse recovery performance. However, the complexity of SBL does not scale well with the dimensionality of the problem, owing to the matrix inversion required in each expectation-maximization (EM) iteration. A computationally efficient version of SBL can be obtained by exploiting approximate message passing (AMP) to avoid the inversion, coined AMP-SBL. However, this algorithm still requires a large number of iterations and careful hand-tuning to guarantee convergence for arbitrary measurement matrices.
In this work, we revisit AMP-SBL from an energy-efficiency perspective. We propose a fast version of AMP-SBL leveraging deep neural networks (DNNs). The main idea is to use deep learning to unfold the iterations of the AMP-SBL algorithm into very few (no more than 10) neural network layers. The sparse vector estimation is performed by the DNN, while the hyperparameters are learned using the EM algorithm, making the method robust to different measurement matrix models. Our results show a reduction in energy consumption, primarily due to lower complexity and a faster convergence rate. Moreover, training the neural network is simple, since the number of parameters to be learned is relatively small.
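To make the unfolding idea concrete, the following is a minimal NumPy sketch of an AMP-SBL recovery loop truncated to a fixed number of "layers". It is an illustration of the general technique, not the authors' implementation: the per-layer scalar weights `theta` stand in for the quantities a learned (LAMP-style) network would train, and here they simply default to 1.0. Each layer performs the AMP matched-filter step with an Onsager correction, applies a Gaussian-prior MMSE denoiser, and then updates the SBL prior variances `gamma` via an EM-style M-step.

```python
import numpy as np

def amp_sbl_unfolded(y, A, n_layers=10, sigma2=1e-4, theta=None):
    """Illustrative unfolded AMP-SBL loop (hypothetical sketch).

    y        : (M,) measurements, y = A x + noise
    A        : (M, N) measurement matrix (columns roughly unit-norm)
    n_layers : number of unfolded AMP iterations ("layers")
    theta    : per-layer scalars; placeholders for learned weights
    """
    M, N = A.shape
    if theta is None:
        theta = np.ones(n_layers)      # would be trained in the learned version
    gamma = np.ones(N)                 # SBL per-component prior variances
    x = np.zeros(N)
    z = y.copy()                       # AMP residual
    for t in range(n_layers):
        tau = sigma2 + z @ z / M       # empirical effective-noise variance
        r = x + theta[t] * (A.T @ z)   # pseudo-data (matched-filter step)
        # MMSE denoiser for a zero-mean Gaussian prior with variance gamma:
        g = gamma / (gamma + tau)
        x_new = g * r                  # posterior mean
        v = g * tau                    # posterior variance
        # Onsager correction keeps r ~ x + i.i.d. Gaussian noise across layers
        z = y - A @ x_new + (z / M) * np.sum(g)
        x = x_new
        # EM M-step: update the SBL hyperparameters from posterior statistics
        gamma = x**2 + v
    return x
```

In the learned variant described in the abstract, the handful of scalars in `theta` (and possibly per-layer step sizes in the denoiser) are the trainable parameters, which is why the training problem stays small; the `gamma` hyperparameters remain EM-updated rather than learned, which is what gives robustness across measurement matrix models.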

Communication systems
Eurecom Ref:
© 2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.