The normalization of the step size in the least-mean-square (LMS) algorithm makes it easy both to control the range of stable operation of the normalized step size in the normalized LMS (NLMS) algorithm and to determine the step size that yields maximum convergence speed. A generalized step-size normalization is proposed for stochastic gradient algorithms in which the gradient vector differs from the data vector. Three applications are considered in detail: the stochastic Newton scheme, the sign-data LMS algorithm, and an instrumental variable method proposed to speed up the convergence of the LMS algorithm.
On the normalization of the step size in nonsymmetric stochastic gradient algorithms
ASILOMAR 1992, 26th IEEE Asilomar Conference on Signals, Systems and Computers, October 26-28, 1992, Pacific Grove, CA, USA
© 1992 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
PERMALINK: https://www.eurecom.fr/publication/590
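To make the step-size normalization the abstract refers to concrete, the following is a minimal sketch of the standard (symmetric) NLMS update in a system-identification setting, where the step size is normalized by the energy of the data vector. This is only the baseline case in which gradient and data vectors coincide; the paper's generalized normalization for nonsymmetric algorithms (sign-data LMS, stochastic Newton, instrumental variables) is not reproduced here, and the function and parameter names (`nlms_identify`, `mu_bar`, `eps`) are illustrative, not from the paper.

```python
import numpy as np

def nlms_identify(x, d, order, mu_bar=1.0, eps=1e-8):
    """Sketch of NLMS adaptive FIR identification.

    mu_bar is the normalized step size: stable for 0 < mu_bar < 2,
    with mu_bar = 1 giving the fastest convergence in the noiseless
    case. eps is a small regularizer guarding against division by a
    near-zero data-vector energy.
    """
    w = np.zeros(order)
    for n in range(order - 1, len(x)):
        # Data (regressor) vector: [x[n], x[n-1], ..., x[n-order+1]]
        u = x[n - order + 1:n + 1][::-1]
        # A priori estimation error
        e = d[n] - w @ u
        # Step size normalized by the data-vector energy ||u||^2
        w += (mu_bar / (eps + u @ u)) * e * u
    return w

# Usage: recover a known 4-tap filter from its noiseless output
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)]   # desired signal d[n] = sum_k h[k] x[n-k]
w_hat = nlms_identify(x, d, order=4)
```

Because the update is scaled by `1 / ||u||^2`, the stability range of `mu_bar` is independent of the input signal power, which is the ease-of-control property the abstract highlights for the symmetric case.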