Poisson denoising under a Bayesian nonlocal approach using geodesic distances with low-dose CT applications

https://doi.org/10.1016/j.dsp.2020.102835

Abstract

Poisson noise tends to be the primary source of image degradation in several applications. When the exposure time is reduced, the image may be severely degraded by noise. One filtering strategy is to adapt the Non-Local Means (NLM) algorithm, which filters an image using a nonlocal average weighted by a function of the Euclidean distance between two patches of the image. Since geodesic distances induce a metric for the dissimilarity between two distributions, they can be used to compare two patches of an image. This paper adapts the NLM algorithm to filter Poisson noise by replacing the Euclidean distance with a geodesic one that is more representative of this type of noise, adopting a general and computationally efficient Bayesian approach. We evaluate the proposed method in the context of low-dose sinogram denoising. Among the geodesic distances evaluated, we found a closed-form solution for the Shannon entropy of Gamma distributions. Comparisons are made with other NLM strategies as well as state-of-the-art methods, achieving competitive results.

Introduction

Several applications, such as X-ray tomography [1], fluorescence microscopy [2], astronomy [3], mammography [4], and tomosynthesis [5], tend to suffer from Poisson noise degradation, mainly due to low photon counts [6].

This work proposes a filter for Poisson noise and applies it in the domain of low-dose CT sinogram denoising. In this domain, tomographic data acquisition is performed through photon counting, and the noise is modeled by a Poisson probability distribution [7]. In a sinogram corrupted by Poisson noise, the noise variance equals the average rate of photons, so the noise is signal-dependent [8]. The sinogram must then be reconstructed to obtain the final tomographic image.
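This mean-variance equality is easy to check numerically (a minimal sketch, not from the paper; the rates chosen below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# For Poisson noise, the variance equals the mean rate, so the noise is
# signal-dependent: brighter sinogram bins (more photons) carry more
# absolute noise than darker ones.
for lam in (5.0, 50.0, 500.0):
    counts = rng.poisson(lam, size=100_000)
    print(f"rate={lam:6.1f}  mean={counts.mean():8.2f}  var={counts.var():8.2f}")
```

With 100,000 samples per rate, the sample variance tracks the sample mean closely at every intensity level.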

The Nonlocal Means (NLM) algorithm proposed by Buades et al. [9] introduced a method for filtering noisy images based on a nonlocal principle that is efficient at filtering additive Gaussian noise. Patches of the image are compared using the Euclidean distance through a Gaussian kernel.

Stochastic geodesic distances can be used to compare patches in an image [10], and also to classify images [11]. Santos and Mascarenhas [12] have used this approach to filter ultrasound data.

This work adapts the NLM algorithm for Poisson noise filtering with a Bayesian approach, using the conjugate Poisson and Gamma distributions and geodesic stochastic distances for the Gamma distribution. Furthermore, the proposed filters are compared with the following methods: Non-Local Means [9], Poisson-NLM [13], Stochastic Poisson-NLM [14], and BM3D [15]. Both Non-Local Means and BM3D are applied in the Anscombe domain. The comparisons are made on the reconstructed images.

This paper is organized as follows. Section 2 presents the theoretical foundation for this work, describing the ideas and methods used. Section 3 presents the modifications proposed for the Geodesic Non-Local Means. Section 4 describes the data set. Section 5 explains how the images were reconstructed and evaluated, with the results in Section 6. The conclusions are in Section 7.

Section snippets

Non-local means

Given a noisy image $Y = \{y_i \mid i \in I\}$, resulting from the degradation of an original, noise-free image $X = \{x_i \mid i \in I\}$, the algorithm proposed by Buades et al. [9] seeks to obtain an estimate $\hat{X} = \{\hat{x}_i \mid i \in I\}$ of the original image $X$, computing each filtered pixel $\hat{x}_i$ through a weighted average of the pixels $y_j$ in the search window of the noisy image $Y$: $$\hat{x}_i = \frac{1}{c_i} \sum_{j \in W(i,r)} w_{ij}\, y_j$$ where $W(i,r)$ indicates the square search window centered at $i$, of size $r$, and $0 \le w_{ij} \le 1$ are the weights corresponding to each pixel in the
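The weighted average above can be sketched as follows (a minimal NLM implementation; the flat patch window and the parameter names `patch`, `search`, and `h` are illustrative choices, not the exact configuration of [9], which uses a Gaussian-weighted patch kernel):

```python
import numpy as np

def nlm(y, patch=1, search=5, h=10.0):
    """Basic Non-Local Means: x_hat[i] = (1/c_i) * sum_j w_ij * y[j],
    with w_ij = exp(-||P_i - P_j||^2 / h^2) over a square search window."""
    y = y.astype(float)
    pad = patch + search
    yp = np.pad(y, pad, mode="reflect")
    out = np.zeros_like(y)
    H, W = y.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pad, j + pad
            p0 = yp[ci - patch:ci + patch + 1, cj - patch:cj + patch + 1]
            num = den = 0.0
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    qi, qj = ci + di, cj + dj
                    p1 = yp[qi - patch:qi + patch + 1, qj - patch:qj + patch + 1]
                    # Patch similarity via squared Euclidean distance.
                    w = np.exp(-np.sum((p0 - p1) ** 2) / h ** 2)
                    num += w * yp[qi, qj]
                    den += w
            # den plays the role of the normalizing constant c_i.
            out[i, j] = num / den
    return out
```

On a constant image every weight equals one and the output reproduces the input; on noisy data, similar patches anywhere in the search window contribute to each pixel's estimate.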

Proposed method

Our proposed method consists of first calculating the Bayesian estimates for the posterior Gamma distribution and then substituting the Euclidean distance with a geodesic one. The geodesic distances are calculated between the distributions defined by these posterior parameters.
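The paper's closed-form geodesic distances are not reproduced in this excerpt. As a stand-in illustration of a closed-form stochastic dissimilarity between two Gamma distributions (the Bhattacharyya distance, of the kind used in stochastic-distance NLM variants such as [14], not the geodesic distance itself), one can write:

```python
from math import lgamma, log

def bhattacharyya_gamma(a1, b1, a2, b2):
    """Bhattacharyya distance between Gamma(a1, b1) and Gamma(a2, b2)
    in the shape-rate parametrization: D = -ln \u222b sqrt(p(x) q(x)) dx,
    which has a closed form for the Gamma family."""
    am = 0.5 * (a1 + a2)   # mean shape
    bm = 0.5 * (b1 + b2)   # mean rate
    log_bc = (lgamma(am) - 0.5 * (lgamma(a1) + lgamma(a2))
              + 0.5 * (a1 * log(b1) + a2 * log(b2)) - am * log(bm))
    return -log_bc
```

The distance is symmetric, is zero for identical parameter pairs, and grows with the dissimilarity of the two posteriors, which is the behavior NLM weighting requires.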

To obtain the estimates of the posterior parameters, the phantom is first filtered with a mean kernel of size 3×3. The parameters are computed from this pre-filtered data using equations (7) and (8) with a window of size 3×3.
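This estimation step can be sketched as follows. The paper's equations (7) and (8) are not reproduced in this excerpt, so the hyperparameter fit below is a hypothetical moment-based variant (shape $\alpha = m^2/v$, rate $\beta = m/v$ in the shape-rate parametrization); only the 3×3 mean pre-filter, the 3×3 estimation window, and the Poisson-Gamma conjugacy are taken from the text:

```python
import numpy as np

def local_moments(img, k=3):
    """Mean and variance over a k x k sliding window (reflect-padded)."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="reflect")
    stack = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                      for i in range(k) for j in range(k)])
    return stack.mean(axis=0), stack.var(axis=0)

def gamma_posterior(counts, k=3, eps=1e-6):
    # Pre-filter the photon counts with a 3x3 mean kernel, then fit the
    # Gamma prior by moments (an assumed estimator, not eqs. (7)-(8)).
    m, _ = local_moments(counts, k)           # 3x3 mean pre-filter
    pm, pv = local_moments(m, k)              # moments of pre-filtered data
    alpha0 = pm ** 2 / (pv + eps)             # prior shape
    beta0 = pm / (pv + eps)                   # prior rate
    # Poisson-Gamma conjugacy: observing y ~ Poisson(lambda) updates
    # Gamma(alpha0, beta0) to Gamma(alpha0 + y, beta0 + 1) per pixel.
    return alpha0 + counts, beta0 + 1.0
```

The per-pixel posterior mean is simply `alpha / beta`, and the posterior parameter pair is what the geodesic patch comparison operates on.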

In

Data set

The data set used in this work consists of the Shepp-Logan phantom as well as sinograms of five real phantoms [24]. The noisy sinograms were obtained using a 3-second exposure per projection point, while the noise-free phantoms were obtained with a 20-second exposure per projection point. The data were acquired with an X-ray energy of 59.5 keV, a high voltage of 89.5 kV, and a current of 0.3 mA; the tomographic scan was performed over an angular rotation of 180° with a 1 mm step.

To simulate the noisy

Reconstruction and image evaluation

Once a sinogram is filtered, images are reconstructed by the Filtered Back-Projection (FBP) and Projection onto Convex Sets (POCS) algorithms. FBP is the standard tomographic reconstruction method. POCS generates images with less noise but also with reduced contrast, enabling a more complete assessment of the denoising methods.
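As a rough illustration of the FBP step (a minimal numpy sketch for parallel-beam geometry with a plain ramp filter and nearest-neighbor backprojection; this is not the reconstruction code used in the paper, and POCS is omitted):

```python
import numpy as np

def fbp(sinogram, thetas):
    """Minimal parallel-beam Filtered Back-Projection.
    sinogram: (n_angles, n_detectors); thetas: projection angles in radians."""
    n_angles, n_det = sinogram.shape
    # Ramp filter applied in the frequency domain, row by row.
    ramp = np.abs(np.fft.fftfreq(n_det))
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    # Backproject: smear each filtered projection back along its angle.
    mid = n_det // 2
    xs = np.arange(n_det) - mid
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n_det, n_det))
    for proj, theta in zip(filtered, thetas):
        # Detector coordinate of each pixel for this view (nearest neighbor).
        t = np.round(X * np.cos(theta) + Y * np.sin(theta)).astype(int) + mid
        recon += proj[np.clip(t, 0, n_det - 1)]
    return recon * np.pi / (2 * n_angles)
```

Reconstructing the sinogram of a centered point source concentrates the backprojected energy at the image center, which is a quick sanity check for the geometry.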

The quality evaluation of the reconstructed images is performed with the full-reference indexes PSNR and SSIM, since the original noise-free images are
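Both indexes can be sketched with numpy alone (PSNR exactly; for SSIM, a simplified single-window global form rather than the usual locally windowed average, with the standard constants $C_1 = (0.01 L)^2$ and $C_2 = (0.03 L)^2$):

```python
import numpy as np

def psnr(ref, img, data_range=255.0):
    """Peak Signal-to-Noise Ratio in dB against a noise-free reference."""
    mse = np.mean((ref.astype(float) - img.astype(float)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(data_range ** 2 / mse)

def ssim_global(x, y, data_range=255.0):
    """Single-window SSIM over the whole image (the standard index instead
    averages a locally windowed version; this global form is a sketch)."""
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    x, y = x.astype(float), y.astype(float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return (((2 * mx * my + c1) * (2 * cov + c2))
            / ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2)))
```

Identical images give infinite PSNR and an SSIM of exactly 1; any degradation lowers both, which is the ordering the evaluation relies on.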

Results

The best result achieved by the evaluated methods in each metric is presented through charts in Figs. 2-7. For each metric, the best results are highlighted in black, and the second-best in gray.

The experiments were executed on an AMD Ryzen™ 3800X (3.9 GHz). Table 2 lists the average run time, over 10 executions, to filter the Shepp-Logan phantom with each method. For the proposed method, the time for both Shannon entropy and the

Conclusion

In this paper, we developed a general Bayesian approach for Poisson denoising of images, using geodesic distances in the Non-Local Means algorithm. We applied this strategy in the context of sinogram denoising for low-dose CT.

For mathematical simplification, in this Bayesian approach, it was considered that the original noise-free sinogram follows a Gamma distribution since it is the conjugate of the Poisson distribution, used for modeling the noise. The estimates for the posterior

CRediT authorship contribution statement

Daniel A. Góes: Formal analysis, Software, Validation, Visualization, Writing - original draft. Nelson D.A. Mascarenhas: Conceptualization, Methodology, Resources, Supervision, Writing - review & editing.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Code

The code of the proposed method is published in the CodeOcean platform under the following link: https://doi.org/10.24433/CO.baa5e6c4-d046-4e20-8eb2-9077c1ce0dee

Acknowledgements

This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001.

We thank Dr. Paulo Estevão Cruvinel of EMBRAPA-CNPDIA who provided the real phantoms data used in this work.

The work of N.D.A. Mascarenhas was supported by CNPq, Brazil, through a scholarship under Grant Process 306742/2017-9.

Daniel A. Góes received a BSc degree in Computer Science from the University of Campinas (UNICAMP), Brazil, in 2012, and the MSc degree in Computer Science from Campo Limpo Paulista University Center (UNIFACCAMP), Brazil, in 2019. His current research interests include digital image processing and pattern recognition.

References (28)

  • R. Gu et al., Blind X-ray CT image reconstruction from polychromatic Poisson measurements, IEEE Trans. Comput. Imaging (2016)
  • Y. Zhang et al., A Poisson-Gaussian denoising dataset with real fluorescence microscopy images
  • E. Anisimova et al., Astronomical image denoising using curvelet and starlet transform
  • G. Hamed et al., A proposed model for denoising breast mammogram images
  • Pei Chen et al., Maximum likelihood reconstruction for tomosynthesis
  • N. Savage, Medical imagers lower the dose, IEEE Spectr. (2010)
  • J.Z. Liang et al., Guest editorial low-dose CT: what has been done, and what challenges remain?, IEEE Trans. Med. Imaging (2017)
  • M. Bertero et al., A discrepancy principle for Poisson data, Inverse Probl. (2010)
  • A. Buades et al., A review of image denoising algorithms, with a new one, SIAM J. Multiscale Model. Simul. (2005)
  • J. Naranjo-Torres et al., The geodesic distance between Gi0 models and its application to region discrimination, IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. (2017)
  • X. Yang et al., Polarimetric SAR image classification using geodesic distances and composite kernels, IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. (2018)
  • C.A.N. Santos et al., Geodesic distances in probabilistic spaces for patch-based ultrasound image processing, IEEE Trans. Image Process. (2019)
  • C. Deledalle et al., Poisson NL means: unsupervised non local means for Poisson noise
  • A.A. Bindilatti et al., A nonlocal Poisson denoising algorithm based on stochastic distances, IEEE Signal Process. Lett. (2013)

    Nelson D. A. Mascarenhas received the BSc and the MSc degrees in Electronic Engineering from the Technological Institute of Aeronautics (ITA), Brazil, in 1966 and 1969, and the PhD degree in Electrical Engineering from the University of Southern California, in 1974. He worked for ITA until 1979 and for the Brazilian National Space Research Institute (INPE) until 1994. He is now a Collaborating Retired Associate Professor at the Computing Department of the Federal University of São Carlos, Brazil and he is now associated with the MSc Program in Computer Science of UNIFACCAMP, Brazil. He is an Associate Editor of IEEE Geoscience and Remote Sensing Letters. His present research interests are in Digital Image Processing and Pattern Recognition, in general and image denoising problems, in particular.

1. During the development of this work, D. A. Goes was with the Master's Program in Computer Science.
