Summary: Recently, the leaky diffusion least-mean-square (DLMS) algorithm has attracted much attention because of its good performance under high input eigenvalue spread and low signal-to-noise ratio. However, the leaky DLMS algorithm may suffer performance deterioration in sparse systems. To overcome this drawback, the leaky zero-attracting DLMS algorithm is developed in this paper, which adds an l<sub>1</sub>-norm penalty to the cost function to exploit system sparsity. A leaky reweighted zero-attracting DLMS algorithm is also put forward, which improves the estimation performance in the presence of time-varying sparsity; in the reweighted version, a log-sum function replaces the l<sub>1</sub>-norm penalty. Based on the weight-error variance relation and several common assumptions, we analyze the transient behavior of the proposed algorithms and determine the stability bound on the step size. Moreover, we carry out a steady-state theoretical analysis of the proposed algorithms. Simulations in the context of distributed network system identification show that the proposed schemes outperform various existing algorithms and validate the accuracy of the theoretical results.
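To make the penalized update concrete, the following is a minimal single-node sketch of a leaky zero-attracting LMS adaptation step, under common textbook assumptions (the diffusion combine step across neighbors is omitted; the parameter names `mu`, `gamma`, `rho`, and `eps` are illustrative, not taken from the paper). The l<sub>1</sub>-norm penalty contributes a uniform sign-based zero attractor, while the log-sum penalty yields a reweighted attractor that weakens for large-magnitude coefficients.

```python
import numpy as np

def leaky_za_lms_update(w, u, d, mu=0.01, gamma=0.01, rho=1e-4,
                        reweighted=False, eps=10.0):
    """One leaky zero-attracting LMS step (illustrative parameters).

    The leakage term (1 - mu*gamma) shrinks all weights; the zero
    attractor, the gradient of the sparsity penalty, pulls small
    coefficients toward zero.
    """
    e = d - u @ w                                  # a-priori error
    w_new = (1.0 - mu * gamma) * w + mu * e * u    # leaky LMS step
    if reweighted:
        # log-sum penalty gradient: attraction fades for large |w|
        w_new -= rho * np.sign(w) / (1.0 + eps * np.abs(w))
    else:
        # l1-norm penalty gradient: uniform zero attraction
        w_new -= rho * np.sign(w)
    return w_new, e

# Toy usage: identify a sparse 16-tap system at a single node.
rng = np.random.default_rng(0)
w_true = np.zeros(16)
w_true[[2, 9]] = [1.0, -0.5]                       # sparse unknown system
w = np.zeros(16)
for _ in range(2000):
    u = rng.standard_normal(16)
    d = u @ w_true + 0.01 * rng.standard_normal()  # noisy measurement
    w, _ = leaky_za_lms_update(w, u, d)
```

In a full diffusion implementation, each node would additionally combine the intermediate estimates of its neighbors with combination weights after this adaptation step.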