Approximate message passing for not so large niid generalized linear models

Zhao, Zilu; Xiao, Fangqing; Slock, Dirk
SPAWC 2023, 24th IEEE International Workshop on Signal Processing Advances in Wireless Communications, 25-28 September 2023, Shanghai, China

Best Student Paper Award

Many signal processing problems involve a Generalized Linear Model (GLM), which is a linear model in which the unknowns may be independently but non-identically distributed (n.i.i.d.). Vector Approximate Message Passing (VAMP) is a computationally efficient belief propagation technique used for Bayesian inference. However, the posterior variances obtained from (limited-complexity) VAMP are only exact when an independent and identically distributed (i.i.d.) prior is assumed, due to the averaging operations involved. In many problems, it is desirable to obtain not only estimates of the unknowns but also correct posterior distributions. Whereas VAMP, and especially AMP, are applicable to high-dimensional problems, in many applications the dimensions are not very high, which allows for more complex operations. Moreover, in finite dimensions, the asymptotic regime that leads to correct variances under certain measurement matrix model assumptions does not hold. To address these challenges, we propose a revisited version of VAMP, called reVAMP, which provides both a multivariate Gaussian posterior approximation (including inter-parameter correlations) and accurate posterior marginals that only require the extrinsic distributions to become Gaussian.
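To illustrate the abstract's distinction between averaged and exact posterior variances, the following is a minimal sketch (not the paper's reVAMP algorithm) of a linear-Gaussian GLM instance with an n.i.i.d. Gaussian prior. At "not so large" dimensions, the full multivariate Gaussian posterior, including inter-parameter correlations, can be computed directly; all variable names and values are illustrative assumptions.

```python
import numpy as np

# Minimal linear-Gaussian instance of the model discussed above:
#   y = A x + n,  n ~ N(0, sigma2 * I),  x_i ~ N(0, tau_i)  (n.i.i.d. prior).
# Illustrative sketch only; this is not the reVAMP algorithm from the paper.
rng = np.random.default_rng(0)
M, N = 8, 5                       # "not so large" dimensions
A = rng.standard_normal((M, N))
tau = rng.uniform(0.5, 2.0, N)    # per-component prior variances (non-identical)
sigma2 = 0.1
x_true = rng.standard_normal(N) * np.sqrt(tau)
y = A @ x_true + np.sqrt(sigma2) * rng.standard_normal(M)

# Exact multivariate Gaussian posterior (LMMSE), feasible at these dimensions:
#   C = (A^T A / sigma2 + diag(1/tau))^{-1},  x_hat = C A^T y / sigma2.
C = np.linalg.inv(A.T @ A / sigma2 + np.diag(1.0 / tau))
x_hat = C @ (A.T @ y) / sigma2

# Per-component posterior variances versus a single averaged (scalar) variance,
# the kind of quantity an i.i.d.-style averaging step would report.
per_component_var = np.diag(C)
averaged_var = per_component_var.mean()
print("per-component variances:", per_component_var)
print("averaged variance:", averaged_var)
```

The off-diagonal entries of C in this sketch are the inter-parameter correlations that a scalar averaged variance cannot capture, which is the gap the multivariate posterior approximation in the abstract targets.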

Type: Conference
City: Shanghai
Date: 2023-09-25
Department: Communication Systems
Eurecom Ref: 7412
Copyright: © 2023 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

PERMALINK: https://www.eurecom.fr/publication/7412