Communication

An Improved Background Normalization Algorithm for Noise Resilience in Low Frequency

Jiahua Zhu, Chengyan Peng, Bingbing Zhang, Wentao Jia, Guojun Xu, Yanqun Wu, Zhengliang Hu and Min Zhu
1 College of Meteorology and Oceanology, National University of Defense Technology, Changsha 410073, China
2 Hunan Key Laboratory for Marine Detection Technology, Changsha 410073, China
* Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2021, 9(8), 803; https://doi.org/10.3390/jmse9080803
Submission received: 30 May 2021 / Revised: 4 July 2021 / Accepted: 7 July 2021 / Published: 27 July 2021
(This article belongs to the Section Physical Oceanography)

Abstract

Background normalization algorithms suppress ambient and self-noise in sonar measurements, enhancing the detection performance and the display of weak signals. Conventional background normalization methods are usually sensitive to the accuracy of a preset filtering interval and threshold, and significant noise remains in the low frequency band. In this paper, an improved background normalization algorithm is proposed that thresholds the processing interval between several local peak values and local valley values. Compared to existing schemes, the proposed approach automatically calculates the filtering interval and threshold, with substantial resilience to the noise level at low frequency. Experimental results illustrate the effectiveness of our algorithm.

1. Introduction

In sonar detection, background normalization is a kind of constant false-alarm rate (CFAR) processing that estimates the magnitude of the background ambient and self-noise as a threshold and thresholds the weak signals of interest, providing dynamic magnitude compression for data visualization displays [1]. This technique has been widely used for searching low-level signals and detecting line spectrums, especially those at very low frequency (usually below 100 Hz) that are in most situations submerged in background environmental noise and receiver self-noise, since they may contain the power and characteristic frequencies of targets such as autonomous underwater vehicles (AUVs) and submersibles.
Over three decades ago, several classical background normalizers, including the two-pass mean (TPM) algorithm, the split three-pass mean (S3PM) algorithm (also called the two-pass split-window (TPSW) algorithm in subsequent literature [1,2,3]), the order truncate average (OTA) algorithm, and the split average exclude average (SAXA) algorithm, were introduced by Struzinski and Lowe [4]. Stergiopoulos then proposed a normalization algorithm that estimates the mean of the noise in the beam and frequency domains [1]. An improved OTA algorithm combined with a median filter was studied by Li et al. in 2000, focusing mainly on inhomogeneous and non-stationary backgrounds [5]. In 2006, Joo and Jun further evaluated the previous works and specifically compared the performances of the TPSW and OTA methods in terms of window length and threshold [6]. Wang and Zheng summarized the above approaches and employed another background normalization method, called the beam characteristics scan algorithm, in later research [3,7]. In addition, background normalizers have been applied in practice in shallow-water multipath [8] and high-clutter [9] environments for better signal detection performance against broadband interference.
Nevertheless, most of the aforementioned algorithms (except for the beam characteristics scan algorithm) require the manual setting of several parameters, such as the normalization interval and the thresholding factor, which can seriously influence the normalization effect. Moreover, all of these normalizers still perform unsatisfactorily in terms of low frequency noise resilience. Therefore, we consider an improved background normalization algorithm that focuses on noise suppression at low frequency without prior selection of empirical parameters. We show that the proposed scheme decreases the background noise level at low frequency more significantly than the conventional normalization methods.
The remainder of this paper is organized as follows. The three commonly used background normalization algorithms mentioned above (the TPSW, OTA, and beam characteristics scan algorithms) are briefly reviewed in Section 2. Section 3 formulates the problem and describes the proposed normalizer in detail. In Section 4, experimental data are employed to validate the proposed algorithm against the previous methods, particularly for normalization at low frequency, and the computational burden and the influence of the processing range are also analyzed. Finally, conclusions are drawn in Section 5.

2. Review of Existing Background Normalization Algorithms

2.1. TPSW Algorithm

The TPSW algorithm, or S3PM algorithm, has proved to be a very robust and implementation-friendly normalizer [1]. For a data string $X = [X(1), X(2), \ldots, X(N)]$, where $N$ is the length of the data, it estimates the noise level at the element of interest $X(k)$ $(k = 1, 2, \ldots, N)$ from two neighborhood windows, as shown in Figure 1.
Thus, the interval index of two windows is expressed as
$$R = [k-G,\; k-G+1,\; \ldots,\; k-E-1,\; k-E,\; k+E,\; k+E+1,\; \ldots,\; k+G-1,\; k+G]$$
For the situation in which k is close to the head or the end of the data string, the corresponding window is truncated (or even abandoned) at the boundaries of the string. Then, the local mean value of the elements in the two windows is given by
$$\tilde{X}(k) = \operatorname*{mean}_{k \in R}\,[X(k)]$$
Next, a new sequence that helps eliminate the magnitude of possible signal and estimate the noise is formed as
$$\Phi(k) = \begin{cases} X(k), & \text{if } X(k) < r\tilde{X}(k) \\ \tilde{X}(k), & \text{if } X(k) \geq r\tilde{X}(k) \end{cases}$$
where $r = 1 + C\,[(4/\pi - 1)/A]^{1/2}$ is a threshold regulator, $C$ is a constant not less than 1, and $A = G - E + 1$ is the length of each window [4].
After all of the $\Phi(k)$ $(k = 1, 2, \ldots, N)$ are obtained, the normalized outputs of TPSW are $[Z_{\mathrm{TPSW}}(k)]_{k=1}^{N}$, in which
$$Z_{\mathrm{TPSW}}(k) = X(k)/u(k)$$
where $u(k) = \operatorname*{mean}_{k \in R}\,[\Phi(k)]$ is the local mean value of the estimated noise.
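For concreteness, a minimal NumPy sketch of the TPSW procedure described above is given below; the function name tpsw_normalize, the default parameter values, and the exact boundary truncation are our own illustrative choices rather than the authors' implementation.

```python
import numpy as np

def tpsw_normalize(x, G=16, E=3, C=1.0):
    """Minimal sketch of the two-pass split-window (TPSW) normalizer.

    x : 1-D array of spectrum samples X(1..N).
    G, E : outer and inner half-window lengths; the two windows cover
           indices k-G..k-E and k+E..k+G (truncated at the edges).
    C : constant (>= 1) in the threshold regulator r.
    """
    x = np.asarray(x, dtype=float)
    N = len(x)
    A = G - E + 1                                     # length of each window
    r = 1.0 + C * np.sqrt((4.0 / np.pi - 1.0) / A)    # threshold regulator

    def split_window_mean(data, k):
        # local mean over the two split windows around index k,
        # truncated at the boundaries of the data string
        left = data[max(0, k - G): max(0, k - E + 1)]
        right = data[min(N, k + E): min(N, k + G + 1)]
        both = np.concatenate([left, right])
        return both.mean() if both.size else data[k]

    # first pass: local mean, then clip possible signal peaks
    x_tilde = np.array([split_window_mean(x, k) for k in range(N)])
    phi = np.where(x < r * x_tilde, x, x_tilde)

    # second pass: local mean of the clipped sequence estimates the noise
    u = np.array([split_window_mean(phi, k) for k in range(N)])
    return x / u
```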

2.2. OTA Algorithm

Compared to the TPSW algorithm, the OTA algorithm is more efficient and effective [1], which makes it one of the most popular background normalization schemes [10].
The algorithm achieves normalization through thresholding and median filtering. In detail, the data string $X = [X(1), X(2), \ldots, X(N)]$ is first extended as
$$X(K+1),\; X(K),\; \ldots,\; X(2),\; X(1),\; X(2),\; \ldots,\; X(N),\; X(N-1),\; \ldots,\; X(N-K)$$
and also re-written in the following format:
$$Y(1),\; Y(2),\; \ldots,\; Y(N+2K)$$
Note that K is the length of the extension window to avoid the edge effect.
The operation of OTA method is illustrated in Figure 2.
The window from $Y(i)$ to $Y(2K+i)$, centered at $Y(K+i)$, $i = 1, 2, \ldots, N$, is next sorted in ascending order as
$$y(1),\; y(2),\; \ldots,\; y(2K+1),$$
and we calculate the truncated mean value $\bar{y}$ as the estimate of the noise level:
$$\bar{y} = \frac{1}{K+1} \sum_{\kappa=1}^{K+1} y(K+\kappa)$$
Then, the outputs of the OTA algorithm are $[Z_{\mathrm{OTA}}(i)]_{i=1}^{N}$, in which
$$Z_{\mathrm{OTA}}(i) = \begin{cases} 0, & \text{if } Y(K+i) < \alpha\bar{y} \\ Y(K+i) - \bar{y}, & \text{otherwise} \end{cases}$$
where α is an empirical threshold regulator.
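Similarly, the OTA steps can be sketched as follows; the function name ota_normalize is ours, and subtracting $\bar{y}$ in the second branch of the output is our reading of the thresholded output expression above.

```python
import numpy as np

def ota_normalize(x, K=25, alpha=1.05):
    """Minimal sketch of the order truncate average (OTA) normalizer.

    K is the half-length of the sliding window (the string is extended by
    K samples on each side to avoid edge effects) and alpha is the
    empirical threshold regulator.
    """
    x = np.asarray(x, dtype=float)
    N = len(x)

    # symmetric extension: X(K+1)..X(2), X(1..N), X(N-1)..X(N-K)
    y = np.concatenate([x[1:K + 1][::-1], x, x[N - K - 1:N - 1][::-1]])

    z = np.empty(N)
    for i in range(N):
        window = np.sort(y[i:i + 2 * K + 1])   # ascending order y(1..2K+1)
        y_bar = window[K:].mean()              # truncated mean of y(K+1..2K+1)
        center = y[K + i]                      # the sample being normalized
        # zero below the threshold, subtract the noise estimate otherwise
        z[i] = 0.0 if center < alpha * y_bar else center - y_bar
    return z
```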

2.3. Beam Characteristics Scan Algorithm

In fact, the beam characteristics scan algorithm is an improved OTA approach that automatically defines the length of the processing window as well as the threshold [11]. The peaks of the data string $X = [X(1), X(2), \ldots, X(N)]$ are recorded at the beginning of the algorithm. For an arbitrary peak $p$, denote its left and right neighboring valleys as $v(1) = v_L$ and $v(M) = v_R$; the data vector (of length $M$) between the two valleys is then expressed as
$$v(1),\; v(2),\; \ldots,\; v(M)$$
Similar to the OTA algorithm, the vector is sorted in ascending order to be
$$w(1),\; w(2),\; \ldots,\; w(M),$$
and the following $\bar{w}$ is employed as the normalization threshold, together with a self-calculated thresholding regulator $q$:
$$\bar{w} = \begin{cases} \dfrac{2}{M+1}\displaystyle\sum_{\varepsilon=(M+1)/2}^{M} w(\varepsilon), & M \text{ is odd} \\[2ex] \dfrac{2}{M}\displaystyle\sum_{\varepsilon=M/2+1}^{M} w(\varepsilon), & M \text{ is even} \end{cases}$$
$$q = \frac{\min(|p - v_L|,\, |p - v_R|)}{\max(|p - v_L|,\, |p - v_R|)}$$
Thus, the normalization results are
$$Z_{\mathrm{BCS}}(m) = \begin{cases} 0, & \text{if } v(m) < \bar{w} \\ q\,[v(m) - \bar{w}], & \text{otherwise} \end{cases}, \qquad m = 1, 2, \ldots, M$$
and the whole string is denoted as $[Z_{\mathrm{BCS}}(n)]_{n=1}^{N}$.
The beam characteristics scan algorithm is schematically demonstrated in Figure 3.
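One possible reading of the beam characteristics scan procedure is sketched below with NumPy and SciPy. The helper name bcs_normalize, the use of scipy.signal.find_peaks, and the interpretation of $|p - v_L|$ and $|p - v_R|$ as amplitude differences are our assumptions; the extra parameter d (the number of valleys counted on each side, with d = 1 giving the neighboring valleys of the standard algorithm) is added only so that the sketch can be reused with d = 3 by the proposed approach in Section 3.

```python
import numpy as np
from scipy.signal import find_peaks

def bcs_normalize(x, d=1):
    """Minimal sketch of the beam characteristics scan (BCS) normalizer.

    For every peak p of x, the samples between the d-th valley to its left
    and the d-th valley to its right are thresholded by the mean of the
    upper half of the sorted interval and scaled by the regulator q.
    """
    x = np.asarray(x, dtype=float)
    peaks, _ = find_peaks(x)
    valleys, _ = find_peaks(-x)            # valleys are peaks of -x
    z = np.zeros_like(x)                   # edge samples with no enclosing
                                           # peak remain unprocessed (zero)
    for p in peaks:
        left = valleys[valleys < p]
        right = valleys[valleys > p]
        if left.size < d or right.size < d:
            continue                       # blind area near the string edge
        vL, vR = left[-d], right[d - 1]    # d-th valley on each side
        seg = x[vL:vR + 1]                 # data vector v(1..M)
        M = len(seg)
        w = np.sort(seg)                   # ascending order
        w_bar = w[(M + 1) // 2 - 1:].mean() if M % 2 else w[M // 2:].mean()
        # regulator q from the peak/valley amplitudes (one possible
        # reading of |p - vL| and |p - vR|)
        dl, dr = abs(x[p] - x[vL]), abs(x[p] - x[vR])
        q = min(dl, dr) / max(dl, dr)
        z[vL:vR + 1] = np.where(seg < w_bar, 0.0, q * (seg - w_bar))
    return z
```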

3. Problem Formulation and Proposed Approach

3.1. Problem Formulation

According to the previous discussion, the performance of the TPSW and OTA algorithms depends significantly on the proper choice of window length and threshold [12]. The beam characteristics scan scheme is free of empirical parameters, but it may fail to process the edges of the data string (corresponding to the low and high frequency bands of the delay-frequency map when normalizing along the frequency axis), since a peak may not exist in these areas (as illustrated by the shaded parts of Figure 3). In addition, the low frequency ambient and receiver self-noise cannot be well suppressed by any of the current approaches, which will be further validated in Section 4.
Therefore, the problems of the traditional background normalization methods that worsen the detection of target line spectrums are summarized below:
  • Unstable performance due to the experientially set parameters (TPSW and OTA);
  • High level environmental and self-noise in low frequency cannot be fully suppressed (beam characteristics scan algorithm).

3.2. Proposed Approach

To combine the advantages of the three algorithms, we design an improved approach based on the beam characteristics scan algorithm, since it is the only one of the three with adaptively set parameters. Note that we do not apply an extension window, as in the OTA algorithm, to illuminate the blind areas of the beam characteristics scan (although this may seem an obvious and simple idea), because it would introduce empirical values again.
The approach we propose is an inverse beam characteristics scan (IBCS) algorithm that normalizes the data in an interval between two peaks instead of two valleys, followed by a pointwise nonlinear combination with a beam characteristics scan method.
As a complementary process, the inverse beam characteristics scan algorithm normalizes the edges of the data string (the shaded parts in Figure 3) that are neglected by the traditional beam characteristics scan algorithm. The scheme consists of the following steps:
  • Step 1. Search all the valley values of the data string $X = [X(1), X(2), \ldots, X(N)]$.
  • Step 2. For a given valley $v$, find the third closest left and right peaks as $p(1) = p_L$ and $p(M) = p_R$, and represent the data vector (of length $M$) between the two peaks as
    $$p(1),\; p(2),\; \ldots,\; p(M)$$
Remark 1.
Here, we extend the processing boundaries to the third closest left and right peaks so that the whole procedure of the proposed approach remains comparable in complexity to the existing beam characteristics scan algorithm. This choice is further examined in Section 4 as a compromise between the normalized noise level and the computing burden.
  • Step 3. Sort the data vector from Step 2 in ascending order as
    $$w(1),\; w(2),\; \ldots,\; w(M)$$
    and derive the threshold $\bar{w}$ and regulator $q$ from
    $$\bar{w} = \begin{cases} \dfrac{2}{M+1}\displaystyle\sum_{\xi=(M+1)/2}^{M} w(\xi), & M \text{ is odd} \\[2ex] \dfrac{2}{M}\displaystyle\sum_{\xi=M/2+1}^{M} w(\xi), & M \text{ is even} \end{cases}$$
    $$q = \frac{\min(|p_L - v|,\, |p_R - v|)}{\max(|p_L - v|,\, |p_R - v|)}$$
  • Step 4. Export the normalized data vector
    $$Z_{\mathrm{IBCS}}(m) = \begin{cases} 0, & \text{if } p(m) < \bar{w} \\ q\,[p(m) - \bar{w}], & \text{otherwise} \end{cases}, \qquad m = 1, 2, \ldots, M$$
    and represent the whole normalized data string as $[Z_{\mathrm{IBCS}}(n)]_{n=1}^{N}$.
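A minimal sketch of the inverse scan is given below, mirroring the bcs_normalize sketch in Section 2.3 with the roles of peaks and valleys exchanged; the helper name ibcs_normalize and the amplitude-based reading of the regulator are again our assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def ibcs_normalize(x, d=3):
    """Minimal sketch of the inverse beam characteristics scan (IBCS).

    For every valley v of x, the samples between the d-th peak to its left
    and the d-th peak to its right (d = 3 in the proposed approach) are
    thresholded and scaled as in bcs_normalize, with peaks and valleys
    exchanged.
    """
    x = np.asarray(x, dtype=float)
    peaks, _ = find_peaks(x)
    valleys, _ = find_peaks(-x)
    z = np.zeros_like(x)

    for v in valleys:
        left = peaks[peaks < v]
        right = peaks[peaks > v]
        if left.size < d or right.size < d:
            continue                        # no d-th peak on one side
        pL, pR = left[-d], right[d - 1]     # d-th peak on each side
        seg = x[pL:pR + 1]                  # data vector p(1..M)
        M = len(seg)
        w = np.sort(seg)
        w_bar = w[(M + 1) // 2 - 1:].mean() if M % 2 else w[M // 2:].mean()
        dl, dr = abs(x[pL] - x[v]), abs(x[pR] - x[v])
        q = min(dl, dr) / max(dl, dr)       # self-calculated regulator
        z[pL:pR + 1] = np.where(seg < w_bar, 0.0, q * (seg - w_bar))
    return z
```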
Meanwhile, the standard beam characteristics scan algorithm is applied in parallel, in which $v_L$ and $v_R$ are correspondingly replaced by the third closest left and right valleys, and the whole processed data string is written as $[Z_{\mathrm{BCS}}(n)]_{n=1}^{N}$.
Next, a pointwise minimization operator [13] is applied to combine the results of the beam characteristics scan and inverse beam characteristics scan algorithms, and the final output is expressed as $[Z(n)]_{n=1}^{N}$, where
$$Z(n) = \min[\,Z_{\mathrm{BCS}}(n),\; Z_{\mathrm{IBCS}}(n)\,]$$
This operation selects the lower level of background noise, while retaining the energy of target line spectrums with acceptable magnitudes to form the final data string.
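Under the same assumptions as the sketches above, the final output of the proposed approach then reduces to a single pointwise minimum of the two normalized strings:

```python
import numpy as np

# x: a 1-D spectrum slice to be normalized (illustrative)
z_bcs = bcs_normalize(x, d=3)     # peaks bounded by the third closest valleys
z_ibcs = ibcs_normalize(x, d=3)   # valleys bounded by the third closest peaks
z = np.minimum(z_bcs, z_ibcs)     # pointwise minimization operator
```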

4. Experimental Results and Further Discussion

4.1. Analysis of Experimental Results

In this subsection, a set of underwater acoustic experimental data, collected in a lake with a passive 12-element uniform circular array (radius 1 m), is employed. After conventional signal processing, the received time-domain signal is presented as a delay-frequency map in Figure 4 and used to validate the performance of the proposed approach.
The delay-frequency map contains several obvious target line spectrums at [120 Hz, 210∼245 s], [120 Hz, 399∼522 s], [190 Hz, 420∼522 s], [360 Hz, 356∼456 s], and [390 Hz, 65∼245 s], surrounded by a large amount of ambient noise. Furthermore, receiver self-noise with magnitudes well above 10 dB is mainly distributed in the 0∼50 Hz band, and full-band interferences are also observed at the 54th, 134th, and 189th second.
Figure 5 illustrates the background normalization effects (along the frequency axis) of TPSW, OTA, the beam characteristics scan, and the proposed approach, where the empirical parameters of the TPSW and OTA methods follow previous research: for TPSW, $G = 16$, $E = 3$, $r = 1.14$ [1]; for OTA, $K = 25$, $\alpha = 1.05$ [3]. All four methods suppress the background noise globally to a significant extent, and all of the full-band interferences are removed.
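Purely as an illustration, the sketch below applies the helpers from the previous sections row by row along the frequency axis of a delay-frequency map with the parameter values quoted above; the map itself is a random placeholder, and with C = 1 the TPSW regulator r evaluates to roughly 1.14 for G = 16 and E = 3.

```python
import numpy as np

# placeholder delay-frequency map: rows = delay (time) bins,
# columns = frequency bins, values in dB (illustrative only)
delay_freq_map = 10.0 * np.random.rand(522, 500)

def normalize_map(dfm, normalizer, **kwargs):
    """Apply a 1-D normalizer along the frequency axis of every row."""
    return np.vstack([normalizer(row, **kwargs) for row in dfm])

tpsw_map = normalize_map(delay_freq_map, tpsw_normalize, G=16, E=3, C=1.0)
ota_map  = normalize_map(delay_freq_map, ota_normalize, K=25, alpha=1.05)
bcs_map  = normalize_map(delay_freq_map, bcs_normalize)
proposed = np.minimum(normalize_map(delay_freq_map, bcs_normalize, d=3),
                      normalize_map(delay_freq_map, ibcs_normalize, d=3))
```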
In terms of performance, our proposed scheme shows a visual display effect comparable to that of the OTA algorithm and a much clearer noise background than the other two methods. Moreover, the proposed approach outperforms OTA in preserving target line spectrums. For instance, it renders the originally discontinuous target line spectrums at [120 Hz, 399∼522 s] and [190 Hz, 420∼522 s] more continuous, whereas OTA normalization interrupts them further and makes them even harder to recognize as “line spectrums”.
The normalized noise level of the four schemes at each frequency bin is displayed in Figure 6. The comparison shows that the proposed method yields the lowest noise level, over 12 dB and 24 dB lower than OTA and BCS, respectively, at 30 Hz, while TPSW normalizes the noise to approximately 2 dB between 0 and 500 Hz at the cost of an increased noise level at higher frequencies.

4.2. Further Discussion

First, the computational cost of the algorithms is evaluated and listed in Table 1. Note that these results only reflect the relative cost of the approaches, which varies with the size of the data. The measurements were obtained on a ThinkPad laptop with a Core i7-8565 CPU and 16 GB of memory.
The results indicate that OTA is the most efficient algorithm, costing only about 1/3 of the time of the other three schemes. The proposed approach consumes almost the same time as the beam characteristics scan algorithm, and both are faster than TPSW.
On the other hand, Remark 1 in Section 3 points out that the proposed approach includes $d = 3$ peaks (valleys) to the left and right of a given valley (peak) to form the data vector for processing, in order to balance the normalized noise level against the computing time. Hence, we further examine these two indicators under different values of $d$, as shown in Figure 7 and Figure 8.
As explained above, $d$ is the number of peaks/valleys counted outward from a given valley/peak in the proposed approach, and the first and last of these peaks/valleys determine the data vector for calculation. The normalized noise reaches its overall lowest level at $d = 3$ and then rises again. The reason is that a processing band that is too narrow may be unable to separate the target line spectrums from the background noise, while a band that is too wide may include strong receiver self-noise and relatively weak target line spectrums at the same time, so the targets would be falsely suppressed and the normalization output would be erroneous (cases with $d > 6$ are therefore not considered in this paper). Meanwhile, the reduction in processing time slows down as $d$ increases. Although the time consumption becomes less than that of the beam characteristics scan algorithm at $d = 3$, it remains more than double that of the OTA algorithm as $d$ increases from 4 to 6. Therefore, $d = 3$ is adopted for the proposed approach.
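A rough sketch of the kind of sweep behind Figure 7 and Figure 8, reusing the illustrative helpers and the placeholder map from Section 4.1; averaging the normalized map over its first 50 frequency bins as a stand-in for the low frequency noise level is our simplification, not the authors' metric.

```python
import time
import numpy as np

for d in range(1, 7):
    start = time.perf_counter()
    z = np.minimum(normalize_map(delay_freq_map, bcs_normalize, d=d),
                   normalize_map(delay_freq_map, ibcs_normalize, d=d))
    elapsed = time.perf_counter() - start
    low_band_level = z[:, :50].mean()      # crude low frequency noise measure
    print(f"d = {d}: {elapsed:.2f} s, low-band level {low_band_level:.2f} dB")
```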

5. Conclusions

This paper proposes an improved background normalization scheme comprising an inverse beam characteristics scan algorithm, derived from the conventional beam characteristics scan method, and a pointwise minimization operator that combines the two. The proposed scheme achieves a normalized noise level over 40 dB, 10 dB, and 20 dB lower (especially in the low frequency band 0∼100 Hz) than the existing TPSW, OTA, and beam characteristics scan algorithms, respectively, with almost the same (slightly better) computational cost as the beam characteristics scan.
Future work may be directed toward evaluation on other kinds of practical data, such as sea trial data, and possible modifications to the presented approach.

Author Contributions

Conceptualization, C.P. and Z.H.; methodology, J.Z. and Z.H.; software, J.Z. and B.Z.; experimental validation, G.X. and W.J.; writing—original draft preparation, J.Z.; writing—review and editing, C.P. and Y.W.; supervision, M.Z.; project administration, M.Z.; funding acquisition, J.Z., C.P., and B.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partially supported by the Natural Science Foundation of Hunan Province under Grant No. 2020JJ5677, the National Natural Science Foundation of China under Grant No. 62001490, and the Project of National University of Defense Technology under Grant Nos. ZK20-35 and ZK19-36.

Acknowledgments

The authors would like to thank the colleagues of the lake trial for their hard work in obtaining the experimental data, and also thank the editors and reviewers for their valuable suggestions on the improvement of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Stergiopoulos, S. Noise normalization technique for beamformed towed array data. J. Acoust. Soc. Am. 1995, 97, 2334–2345. [Google Scholar] [CrossRef]
  2. Filho, W.S.; de Seixas, J.M.; de Moura, N.N. Preprocessing passive sonar signals for neural classification. IET Radar Sonar Navig. 2011, 5, 605–612. [Google Scholar] [CrossRef]
  3. Zheng, J. Research on Passive Sonar Display Optimization Technology. Master’s Thesis, Harbin Engineering University, Harbin, China, 2013. (In Chinese). [Google Scholar]
  4. Struzinski, W.A.; Lowe, E.D. A performance comparison of four noise background normalization schemes proposed for signal detection systems. J. Acoust. Soc. Am. 1984, 76, 1738–1742. [Google Scholar] [CrossRef]
  5. Li, Q.; Pan, X.; Li, Y. A new algorithm of background equalization in digital sonar. Acta Acust. 2000, 25, 5–9. (In Chinese) [Google Scholar]
  6. Joo, J.H.; Jun, B.D.; Shin, K.C.; Kim, D.Y. The performance test of the background noise normalization in the narrow band detection. In Proceedings of the UDT Europe, Hamburg, Germany, 27–29 June 2006; pp. 1–4. [Google Scholar]
  7. Wang, X. Studies on Passive Sonar Broadband Display Method and Bearing Estimation Technology. Master’s Thesis, Northwestern Polytechnical University, Xi’an, China, 2009. (In Chinese). [Google Scholar]
  8. Kuhn, J.P.; Heath, T.S. Apparatus for and Method of Adaptively Processing Sonar Data. U.S. Patent US5481503A, 2 January 1996. [Google Scholar]
  9. Bentrem, F.W.; Botts, J.; Summers, J.E. Design of a Signal Normalizer for High-Clutter Active-Sonar Detection. J. Acoust. Soc. Am. 2018, 143, 1760. [Google Scholar] [CrossRef]
  10. Nielsen, R.O. Sonar Signal Processing; Artech House Publishers: Norwood, MA, USA, 1991. [Google Scholar]
  11. Qiu, J.; Wang, Y.; Ding, C.; Cheng, Y. Adaptive threshold background normalization algorithm for bearing-time recording. Ship Sci. Technol. 2019, 41, 133–137. (In Chinese) [Google Scholar]
  12. Struzinski, W.A.; Lowe, E.D. The effect of improper normalization on the performance of an automated energy detector. J. Acoust. Soc. Am. 1985, 78, 936–941. [Google Scholar] [CrossRef]
  13. Zhu, J.; Wang, X.; Huang, X.; Suvorova, S.; Moran, B. Range sidelobe suppression for using Golay complementary waveforms in multiple moving target detection. Signal Process. 2017, 141, 28–31. [Google Scholar] [CrossRef]
Figure 1. The schematic diagram of the TPSW algorithm.
Figure 2. The schematic diagram of the OTA algorithm.
Figure 3. The schematic diagram of the beam characteristics scan algorithm.
Figure 4. Delay-frequency map of experimental data. (the unit of colorbar is dB).
Figure 5. Normalization results of (a) TPSW; (b) OTA; (c) beam characteristics scan; and (d) our proposed approach. (the unit of colorbar is dB).
Figure 6. (a) Comparison of normalized noise level in different algorithms; (b) a magnified version of (a) at 10∼ 100 Hz .
Figure 7. (a) Comparison of normalized noise level under different numbers of (unilateral) peaks/valleys included for processing; (b) a magnified version of (a) at 10∼ 100 Hz .
Figure 8. Processing time varied with the (unilateral) included peaks/valleys.
Table 1. Comparison of computational complexity.

Algorithm                           Time Burden
TPSW                                2.770266 s
OTA                                 0.762545 s
Beam characteristics scan method    2.189423 s
Proposed approach                   2.155367 s
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
