Robust distributed estimation based on a generalized correntropy logarithmic difference algorithm over wireless sensor networks☆
Introduction
With the development of the Internet of Things (IoT), distributed signal processing over Wireless Sensor Networks (WSNs) has attracted growing interest. WSNs usually consist of a large number of distributed, small nodes that are constrained by limited power and wireless bandwidth. These networks have a broad range of practical applications: area monitoring, industrial automation, measurement of physical phenomena, etc. [1], [2], [3]. Distributed estimation is of great importance for distributed signal processing [4], [5], [6]. Its aim is to estimate parameters of interest from noisy measurements through cooperation between the nodes of a network. Distributed estimation algorithms fall mainly into three categories: incremental strategies [7], consensus strategies [8] and diffusion strategies [9], [10], [11]. Diffusion strategies are of particular significance because they are robust, flexible and accurate [12], [13]. For these reasons, we focus on diffusion strategies for distributed estimation here.
Most prior diffusion-based distributed estimation algorithms have been derived from the well-known mean square error (MSE) criterion, such as diffusion Least Mean Square (d-LMS) [14], [15], diffusion normalized least-mean-square (d-NLMS) [16], and diffusion Recursive Least Squares (d-RLS) [17]. These algorithms achieve ideal performance when the noise follows a Gaussian distribution. In real-world applications, however, this assumption often does not hold. MSE-based algorithms are extremely sensitive to non-Gaussian noise, especially outliers (impulsive noise) [18], [19], [20], so their performance over networks may deteriorate dramatically in such environments.
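To see why MSE-based updates break down under impulsive noise, the following self-contained sketch (our illustration, not code from the paper; all names and parameter values are our own choices) runs a plain LMS filter with and without occasional large outliers and compares the final mean-square deviation:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N_iter, mu = 4, 2000, 0.01
w_o = rng.standard_normal(M)          # unknown parameter vector to estimate

def lms_msd(impulsive: bool) -> float:
    """Run plain LMS; return the final mean-square deviation ||w - w_o||^2."""
    w = np.zeros(M)
    for _ in range(N_iter):
        u = rng.standard_normal(M)            # regressor
        v = 0.1 * rng.standard_normal()       # Gaussian background noise
        if impulsive and rng.random() < 0.05:
            v += 50.0 * rng.standard_normal()  # occasional large outlier
        e = u @ w_o + v - u @ w
        w = w + mu * e * u                     # MSE (LMS) update: linear in e
    return float(np.sum((w - w_o) ** 2))

print(lms_msd(False), lms_msd(True))  # impulsive case is orders of magnitude worse
```

Because the LMS correction is proportional to the raw error e, a single impulse of amplitude 50 moves the weights hundreds of times farther than a typical error of amplitude 0.1, which is exactly the sensitivity the robust criteria discussed next are designed to suppress.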
To address these issues, robust diffusion strategies for distributed estimation have been proposed, such as diffusion least-mean power (d-LMP) and diffusion sign-error Least Mean Square (dSE-LMS) [21], [22]. In recent years, Chen and co-workers conducted in-depth research and proposed several novel algorithms based on kernel methods to deal with non-Gaussian noise; these methods provide a nonlinear measure of similarity between two random variables [23], [24]. The diffusion maximum correntropy criterion (d-MCC) algorithm is an efficient kernel method based on information theoretic learning (ITL) and can handle non-Gaussian noise effectively [25]. Compared to correntropy, generalized correntropy and the minimum kernel risk-sensitive loss (MKRSL) provide more flexible frameworks and show better performance in many problems such as parameter estimation [26], [27], [28], [29], [30]. Other effective methods have also been proposed to resist impulsive noise, such as the maximum versoria criterion-based robust adaptive filtering algorithm [31], a family of robust M-shaped error weighted least mean square algorithms [32], and a family of robust adaptive filtering algorithms based on sigmoid cost [33].
Among the anti-impulsive-noise algorithms discussed above, the dSE-LMS algorithm (in its adapt-then-combine form) is flexible and effective without introducing any signal delay, but its relatively poor steady-state error is its main defect. The d-LMP algorithm is one of the most classical robust diffusion algorithms, yet the robustness of d-LMP and its variants depends strongly on the parameter p. Based on the instantaneous gradient-descent method to minimize the cost function, the diffusion Huber-based NLMS (d-NHuber LMS) algorithm was proposed [34]. However, some important parameters controlling its robustness must be tuned according to the practitioner's knowledge of the noise distribution, which may not always be available.
In this paper, a robust diffusion-based algorithm is proposed to improve estimation performance over networks in various environments. The algorithm combines a logarithm operation with the generalized correntropy criterion to construct a new cost function, which is shown to be an effective measure for rejecting outliers (or impulsive noise). Furthermore, inspired by the basic implementation of the diffusion strategy, namely the adapt-then-combine (ATC) d-LMS algorithm [14], the algorithm is designed for distributed estimation over networks. Simulations confirm the advantages of the proposed algorithm under Gaussian noise and its robustness to impulsive noise. The algorithm has also been verified on a nonlinear model and in a nonstationary environment, where it still maintains its desirable performance. The main contributions of this paper are summarized as follows: (i) a new cost function is designed, which provides anti-noise capacity; (ii) a robust diffusion algorithm based on the newly designed cost function is derived and shows superior performance for distributed estimation; (iii) the stability of the proposed d-GCLD algorithm is analyzed; (iv) simulation results demonstrate that the proposed algorithm has potential applications in WSNs, where the sensors may suffer from various conditions, including Gaussian and impulsive noise.
This paper is organized as follows. In Section 2, we briefly introduce some preliminary knowledge about the data model and diffusion LMS. The diffusion generalized correntropy logarithmic difference (d-GCLD) algorithm is formulated in Section 3. In Section 4, the stability of the proposed algorithm is analyzed theoretically. In Section 5, simulation results are presented to show the advantages of the proposed algorithm. Finally, we state the conclusions in Section 6.
Notation: In this paper, (·)T denotes transposition and E[·] stands for expectation. sign(·) and ⊗ are the sign and Kronecker product operators, respectively. Im denotes an m × m identity matrix, and 1 is an N × 1 all-ones vector. We use Tr{·} to denote the trace of a matrix and |·| the absolute value of a scalar.
Data model
Let us consider a connected network with N nodes. At time i, each node k communicates only with its neighborhood Nk. The node collects a scalar measurement dk(i) and a regression vector uk,i of length M. Our goal is to estimate a vector wo of length M from the noisy measurements. The data model of node k can be written as dk(i) = uk,iT wo + vk(i), where vk(i) is the background noise with variance σ²v,k, and each node observes a different realization of vk(i). According to the general settings in signal processing, the
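The per-node linear measurement model above can be simulated directly; the sketch below is our illustration (the function name, dimensions and noise level are assumptions, not values from the paper):

```python
import numpy as np

def generate_node_data(w_o, n_samples, sigma_v, rng):
    """Synthesize (u_{k,i}, d_k(i)) pairs for one node k under the
    linear model d_k(i) = u_{k,i}^T w_o + v_k(i)."""
    M = len(w_o)
    U = rng.standard_normal((n_samples, M))       # regressors u_{k,i}
    v = sigma_v * rng.standard_normal(n_samples)  # background noise v_k(i)
    d = U @ w_o + v
    return U, d

rng = np.random.default_rng(1)
w_o = np.array([0.5, -1.0, 2.0])   # example unknown parameter vector
U, d = generate_node_data(w_o, 500, 0.1, rng)
```

In a network simulation, each node k would call this generator with its own noise variance σ²v,k, matching the assumption that the noise realizations differ across nodes.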
Proposed algorithm
The basic concept of generalized correntropy is first introduced in this section. Moreover, the diffusion generalized correntropy logarithmic difference (d-GCLD) algorithm for distributed estimation is proposed.
The correntropy criterion was proposed by Liu et al. [35], [36] as an important part of ITL. Correntropy is a nonlinear measure of the similarity of random processes. Given random variables X and Y, the correntropy criterion is defined by V(X, Y) = E[κσ(X − Y)], where κσ(·) is a shift-invariant Mercer kernel, typically the Gaussian kernel with bandwidth σ.
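As a concrete illustration, correntropy with a Gaussian kernel can be estimated from samples as below; the generalized variant replaces the squared error with |e/β|^α. This is our own sketch (function names are ours, and normalization constants are dropped for simplicity, as is common in adaptive filtering):

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimate of correntropy V(X, Y) = E[kappa_sigma(X - Y)]
    with the (unnormalized) Gaussian kernel exp(-e^2 / (2 sigma^2))."""
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.mean(np.exp(-e**2 / (2.0 * sigma**2))))

def generalized_correntropy(x, y, alpha=2.0, beta=1.0):
    """Generalized correntropy estimate using the generalized Gaussian
    kernel exp(-|e/beta|^alpha); alpha = 2 recovers the Gaussian case."""
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.mean(np.exp(-np.abs(e / beta) ** alpha)))
```

Note the property that makes these measures robust: a huge error contributes a kernel value near 0 rather than a huge squared term, so outliers saturate instead of dominating the cost.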
Performance analysis
In this section, the stability of the diffusion generalized correntropy logarithmic difference algorithm is analyzed theoretically. To make the analysis tractable, we adopt the following assumptions:
Assumption 1. The input vector uk,i is independent and identically distributed (i.i.d.). Moreover, .
Assumption 2. The noise vk,i is independent of uk,i for all k and i, and independent of vl,j for l ≠ k or i ≠ j. Additionally, the noise vk,i is a mixture signal of zero
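For background, under an i.i.d.-regressor assumption of this kind, the textbook mean-stability condition for an LMS-type adapt step is 0 < μ < 2/λmax(Ru). The sketch below computes that classical bound only; it is not the exact d-GCLD condition derived in this section, and the function name and example covariance are our own:

```python
import numpy as np

def lms_step_size_bound(R_u):
    """Classical mean-stability bound for an LMS-type adapt step:
    0 < mu < 2 / lambda_max(R_u), with R_u the regressor covariance.
    (Background result only; the paper derives the d-GCLD condition.)"""
    lam_max = np.max(np.linalg.eigvalsh(R_u))  # largest eigenvalue of R_u
    return 2.0 / lam_max

R_u = np.diag([1.0, 4.0])          # example regressor covariance
print(lms_step_size_bound(R_u))    # -> 0.5
```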
Simulation
In this section, we first evaluate the performance of the d-GCLD algorithm over networks under Gaussian and impulsive noise. The proposed algorithm is also verified in a non-stationary environment and on a non-linear system. In all simulations, 1000 independent Monte Carlo trials are performed to obtain averaged results.
The noise vk(i) is modeled as the sum of two independent zero-mean Gaussian noises [38], [44], [45], where ν1,i is the
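One common way to realize such a two-component impulsive mixture is a small-variance Gaussian background plus sparse large-variance Gaussian bursts gated by a Bernoulli process. The sketch below is our assumption of that construction (the exact mixing in the paper's truncated equation, and all parameter values here, are not taken from the source):

```python
import numpy as np

def mixed_noise(n, sigma1=0.1, sigma2=10.0, p=0.05, rng=None):
    """Two-component impulsive noise: nu_1 (background, std sigma1) plus
    nu_2 (impulses, std sigma2) switched on by a Bernoulli(p) gate."""
    rng = rng or np.random.default_rng()
    nu1 = sigma1 * rng.standard_normal(n)   # always-on background noise
    gate = rng.random(n) < p                # impulse occurrence indicator
    nu2 = sigma2 * rng.standard_normal(n)   # large-variance impulse component
    return nu1 + gate * nu2

v = mixed_noise(10000, rng=np.random.default_rng(2))
```

With p = 0.05 and sigma2/sigma1 = 100, roughly 5% of samples are impulses that dwarf the background, which is the regime where MSE-based diffusion algorithms degrade.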
Conclusion
In this article, a robust diffusion algorithm for distributed estimation over networks, called the d-GCLD algorithm, is proposed. The algorithm combines a logarithmic function and generalized correntropy in its cost function. We also investigate the stability of the proposed algorithm: through theoretical analysis, its stability condition is derived. Numerical simulations show that the d-GCLD algorithm achieves superior performance compared with related algorithms in both Gaussian and impulsive noise environments.
CRediT authorship contribution statement
Xinyu Li: Software, Data curation, Writing - original draft. Mingyu Feng: Writing - review & editing. Feng Chen: Conceptualization, Methodology. Qing Shi: Visualization, Software, Validation. Jurgen Kurths: Investigation, Supervision.
Declaration of Competing Interest
We declare that we have no financial or personal relationships with other people or organizations that could inappropriately influence our work, and no professional or other personal interest of any nature in any product, service and/or company that could be construed as influencing the position presented in, or the review of, the manuscript entitled “Robust distributed estimation based on a generalized correntropy logarithmic difference algorithm over wireless sensor networks”.
References (49)
- et al., Diffusion fused sparse LMS algorithm over networks, Signal Process., 2020.
- et al., Broken-motifs diffusion LMS algorithm for reducing communication load, Signal Process., 2017.
- et al., Diffusion least logarithmic absolute difference algorithm for distributed estimation, Signal Process., 2018.
- et al., Robust diffusion LMS over adaptive networks, Signal Process., 2019.
- et al., Mixture correntropy for robust learning, Pattern Recognit., 2018.
- et al., Diffusion maximum correntropy criterion algorithms for robust distributed estimation, Digit. Signal Process., 2016.
- et al., Diffusion generalized maximum correntropy criterion algorithm for distributed estimation over multitask network, Digit. Signal Process., 2018.
- et al., A family of robust adaptive filtering algorithms based on sigmoid cost, Signal Process., 2018.
- et al., Fast linear iterations for distributed averaging, Syst. Control Lett., 2004.
- et al., Distributed maximum a posteriori estimation under non-stationary condition, Inf. Sci., 2019.
- A survey on sensor networks, IEEE Commun. Mag.
- The design space of wireless sensor networks, IEEE Wirel. Commun.
- Distributed collaborative control for industrial automation with wireless sensor and actuator networks, IEEE Trans. Ind. Electron.
- Joint optimization of dimension assignment and compression in distributed estimation fusion, IEEE Trans. Signal Process.
- Optimum distributed estimation of a spatially correlated random field, IEEE Trans. Signal Inf. Process. Netw.
- Partial diffusion Kalman filtering for distributed state estimation in multiagent networks, IEEE Trans. Neural Netw. Learn. Syst.
- Incremental adaptive strategies over distributed networks, IEEE Trans. Signal Process.
- Consensus-based algorithms for distributed network-state estimation and localization, IEEE Trans. Signal Inf. Process. Netw.
- Robust distributed diffusion recursive least squares algorithms with side information for adaptive networks, IEEE Trans. Signal Process.
- Diffusion minimum generalized rank norm over distributed adaptive networks: formulation and performance analysis, IEEE Trans. Signal Inf. Process. Netw.
- Diffusion adaptation strategies for distributed optimization and learning over networks, IEEE Trans. Signal Process.
- Diffusion strategies outperform consensus strategies for distributed estimation over adaptive networks, IEEE Trans. Signal Process.
- Diffusion LMS strategies for distributed estimation, IEEE Trans. Signal Process.
- Performance limits for distributed estimation over LMS adaptive networks, IEEE Trans. Signal Process.
☆ This work was supported in part by the Fundamental Research Funds for the Central Universities (XDJK2020B034, SWU019029).