A non-local propagation filtering scheme for edge-preserving in variational optical flow computation

https://doi.org/10.1016/j.image.2021.116143

Abstract

The median filtering heuristic is considered an indispensable tool in the currently popular variational optical flow computation. Its attractive advantage is that outlier reduction is achieved while image edges and motion boundaries are preserved. However, it may still blur image edges and motion boundaries in the presence of large displacements, motion occlusions, complex textures, and illumination changes. In this paper, we present a non-local propagation filtering scheme to address this problem within the coarse-to-fine optical flow computation. First, we analyze the connection between weighted median filtering and the blurring of image edges and motion boundaries under the coarse-to-fine optical flow computation scheme. Second, to improve the quality of the initial flow field, we introduce a non-local propagation filter that reduces outliers while preserving the contextual information of the flow field. Furthermore, we present an optimized combination of non-local propagation filtering and weighted median filtering for flow field estimation under the coarse-to-fine scheme. Extensive experiments on public optical flow benchmarks demonstrate that the proposed scheme effectively improves the accuracy and robustness of optical flow estimation.

Introduction

Optical flow estimation aims at determining displacements for each pixel between two consecutive frames. It plays an essential role in image processing and visual tasks. The applications include video object tracking [1] and segmentation [2], action recognition [3], [4], autonomous navigation [5], medical diagnostics [6], and many other fields. Despite an abundance of literature related to this topic, optical flow computation remains a challenging problem mainly because of illumination variation, large displacement, occlusion and texture-less regions.

Various optical flow methods have been proposed during the past thirty years to overcome these challenges. Variational [7], [8], [9], [10] and convolutional neural network (CNN)-based methods [11], [12], [13] are considered the most important approaches for optical flow computation. CNN-based methods have made rapid progress recently, but they suffer from several limitations, such as a lack of labeled training datasets [14] and poor interpretability. In contrast, the variational method remains a valuable approach because of its excellent performance and well-established mathematical foundation. This paper addresses the blurring of image edges and motion boundaries in variational optical flow computation.

The variational optical flow method was introduced by Horn and Schunck [7]. It generates a dense flow field by minimizing a global energy function that contains a data term and a smoothness term coupled by a weighting factor. However, the original model exhibits many limitations in practice. For instance, illumination variation causes abrupt changes in pixel brightness, and complex motions introduce discontinuities into the flow field. Many effective modifications and extensions [10], [15], [16], [17], [18], [19] have emerged, which can generally be classified into two categories: (1) improvements in the data term, such as adding various constancy assumptions [20], pre-processing the input images [15], or using robust image data [21], [22], to make the model more robust under illumination variation, image noise, and large displacements; (2) improvements in the smoothness term, such as using robust penalty functions [8], [23], [24], improving the flow diffusion strategy [25], [26], [27], or adding adaptive weights with a spatially varying function [28], to enhance the capability to handle complex motion boundaries.
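To make the energy formulation concrete, the following is a minimal sketch (not the authors' implementation) of a discretized Horn–Schunck-style energy, with a linearized brightness constancy data term and a quadratic smoothness term; the function name and the finite-difference gradients are our own illustrative choices:

```python
import numpy as np

def horn_schunck_energy(I1, I2, u, v, alpha=0.1):
    """Discrete Horn-Schunck-style energy: a data term penalizing violations of the
    linearized brightness constancy, (I_x u + I_y v + I_t)^2, plus a smoothness term
    alpha * (|grad u|^2 + |grad v|^2), where alpha is the weighting factor."""
    Ix = np.gradient(I1, axis=1)          # spatial derivative in x
    Iy = np.gradient(I1, axis=0)          # spatial derivative in y
    It = I2 - I1                          # temporal derivative
    data = (Ix * u + Iy * v + It) ** 2
    uy, ux = np.gradient(u)
    vy, vx = np.gradient(v)
    smooth = ux ** 2 + uy ** 2 + vx ** 2 + vy ** 2
    return np.sum(data + alpha * smooth)
```

For identical frames and a zero flow field, both terms vanish and the energy is zero; minimizing this functional over (u, v) yields the dense flow field.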

In addition to modifications of the model itself, reducing outliers (e.g., image noise and flow errors) during the optical flow computation is another effective way to improve the performance of the variational model. The weighted median filtering introduced by Sun et al. [29] is currently the most successful method for reducing outliers while preserving image edges and motion boundaries. Since it can significantly improve the accuracy and robustness of almost all optical flow algorithms, it has come to be considered an indispensable operation in variational optical flow computation. However, weighted median filtering can still produce over-segmentation or blurring at image edges and motion boundaries caused by large displacements, occlusions, or complex scenes. To deal with this problem, some researchers have proposed introducing occlusion information into the median filtering weights [30] or combining post-processing methods [31] to smooth the flow field. Although these methods improve the accuracy of motion boundary computation to some extent, the edge-preserving ability of weighted median filtering remains limited: when the flow field contains outliers and fuzzy structure information, filtering weights based on inaccurate non-local information can hardly alleviate the cross-region mixing caused by the explicit spatial kernel during filtering. Consequently, the cross-region mixing evolves into blurring of image edges and motion boundaries in the final estimated flow field.
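For illustration, a weighted median of a flow component can be sketched as below. This is a simplified stand-in for the filter of Sun et al. [29]: for brevity we use a single per-offset weight vector rather than the data-driven per-pixel weights of the original method, so the names and signatures here are hypothetical:

```python
import numpy as np

def weighted_median(values, weights):
    """Weighted median: the value at which the cumulative weight first
    reaches half of the total weight."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    return v[np.searchsorted(cum, 0.5 * cum[-1])]

def weighted_median_filter_flow(u, weights, radius=1):
    """Apply a weighted median filter to one flow component u (H x W array).
    `weights` is a fixed vector of (2*radius+1)^2 per-offset weights; the
    original method instead derives per-pixel weights from image data."""
    H, W = u.shape
    pad = np.pad(u, radius, mode='edge')
    out = np.empty_like(u)
    for y in range(H):
        for x in range(W):
            patch = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1].ravel()
            out[y, x] = weighted_median(patch, weights)
    return out
```

With uniform weights this reduces to the ordinary median filter: an isolated outlier in an otherwise smooth flow field is replaced by its neighborhood median, while a clean step edge in the flow survives.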

To solve this problem, we present a non-local propagation filtering scheme (NLPF) for optical flow computation that implicitly defines the spatial filtering kernel while robustly exploiting the self-predictions and self-similarities that the intermediate flow field provides to determine the filter weights for denoising. In particular, the proposed filter acts as a pre-filtering operation before the weighted median filtering to alleviate cross-region mixing in the estimated flow field. We apply the NLPF method to several typical and state-of-the-art optical flow approaches and employ the training sequences of Middlebury, MPI-Sintel, and KITTI to show the benefits of the non-local propagation filter for edge preservation.

The remainder of this paper is organized as follows. Section 2 reviews related work. In Section 3, we introduce the non-local TV-L1 model of optical flow estimation and discuss the limitations of weighted median filtering in edge preservation. In Section 4, we propose a non-local propagation filter for flow field estimation. In Section 5, we describe the non-local propagation filtering scheme for optical flow computation. Experimental results are presented in Section 6. Our conclusions are provided in Section 7.

Related work

It is beyond the scope of this paper to review the entire literature on optical flow. Please refer to the previous literature [29], [32] for a detailed overview of optical flow algorithms. Instead, we review the work most related to our method. In particular, we will focus on the work that addresses motion boundary blur and flow noise in the variational optical flow computation.

The research on improving the quality of the flow field in the early work mainly focuses on how to satisfy the

Non-local TV-L1 model of optical flow

For an image sequence, let I(x, t): Ω ⊂ R² → R denote the gray value of pixel x in the first frame, and I(x + w, t + 1) the gray value of the corresponding pixel in the second frame. The brightness constancy assumption states that the brightness of the underlying object does not change: I(x, t) = I(x + w, t + 1), where x = (x, y)^T denotes the pixel coordinates in the spatial image domain Ω, and w = (u, v)^T is the flow vector of pixel x, with u and v denoting the displacements in the x- and y-directions, respectively. The
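The brightness constancy assumption above can be checked numerically by warping the second frame with a candidate flow and measuring the residual. The following sketch uses bilinear interpolation, a common (though not the only) choice; the function name is our own:

```python
import numpy as np

def brightness_constancy_residual(I1, I2, u, v):
    """Residual of the brightness constancy assumption I(x, t) = I(x + w, t + 1).
    I1, I2: grayscale frames (H x W); u, v: flow components.
    Backward-warps I2 by the flow with bilinear interpolation, clamping
    sample coordinates to the image domain."""
    H, W = I1.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(float)
    xq = np.clip(xs + u, 0, W - 1)
    yq = np.clip(ys + v, 0, H - 1)
    x0 = np.floor(xq).astype(int); y0 = np.floor(yq).astype(int)
    x1 = np.minimum(x0 + 1, W - 1); y1 = np.minimum(y0 + 1, H - 1)
    ax, ay = xq - x0, yq - y0
    warped = ((1 - ay) * (1 - ax) * I2[y0, x0] + (1 - ay) * ax * I2[y0, x1]
              + ay * (1 - ax) * I2[y1, x0] + ay * ax * I2[y1, x1])
    return warped - I1
```

A flow that satisfies the assumption drives the residual to zero (away from clamped image borders); the variational data term penalizes exactly this residual.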

Non-local propagation filtering for flow field estimation

For the coarse-to-fine incremental estimation process, let W = (u, v)^T denote the intermediate flow field after each warping iteration. The filtered flow field Ŵ_i = (û_i, v̂_i)^T at position i = (x, y)^T, optimized with the non-local propagation filter, is calculated by Ŵ_i = (1/z_i) Σ_{i′∈N_i} w_{ii′} W_{i′}, where N_i denotes the set of neighbors of the center position i, z_i is the normalization factor, and W_{i′} = (u_{i′}, v_{i′})^T denotes the optical flow of any pixel in the neighborhood of the center pixel. We take w_{ii′} as the weight for each flow vector to perform
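The weighted-average update Ŵ_i = (1/z_i) Σ_{i′∈N_i} w_{ii′} W_{i′} can be sketched as follows. Note that the true propagation filter derives w_{ii′} by chaining pairwise similarities along a path from i to i′ (which is what makes the spatial kernel implicit); this simplified version computes a direct Gaussian similarity between flow vectors, an assumption made for brevity:

```python
import numpy as np

def nonlocal_propagation_filter(W_field, radius=2, sigma=1.0):
    """Similarity-weighted average of flow vectors over an N_i neighborhood.
    W_field: H x W x 2 flow field. The weight w_{ii'} here is a direct Gaussian
    on flow-vector distance (a simplification of the path-chained propagation
    weights); z_i is the sum of weights, used for normalization."""
    H, Wd, _ = W_field.shape
    pad = np.pad(W_field, ((radius, radius), (radius, radius), (0, 0)), mode='edge')
    out = np.empty_like(W_field)
    for y in range(H):
        for x in range(Wd):
            patch = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1].reshape(-1, 2)
            center = W_field[y, x]
            d2 = np.sum((patch - center) ** 2, axis=1)
            w = np.exp(-d2 / (2 * sigma ** 2))        # weight w_{ii'}
            out[y, x] = (w[:, None] * patch).sum(axis=0) / w.sum()  # divide by z_i
    return out
```

Because dissimilar flow vectors receive near-zero weight, a sharp motion boundary in the flow field is averaged only within its own side, which is the edge-preserving behavior the scheme relies on.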

Scheme of non-local propagation filtering for optical flow estimation

To be robust against the blurring of image edges or motion boundaries caused by the coarse-to-fine computation with heuristic median filtering, we design the non-local propagation filter as a pre-processing operation of optical flow estimation at each layer of the image pyramid. A schematic of coarse-to-fine estimation with non-local propagation filtering is shown in Fig. 3. For any layer k in the image pyramid, the output optical flow (u^{k+1}, v^{k+1})^T is first obtained by adding the initialization (u^k, v^k
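The per-level placement of the filter can be summarized by the following sketch of a coarse-to-fine loop; `estimate_increment`, `upsample_flow`, `nlp_filter`, and `median_filter` are hypothetical placeholders for the incremental solver, the flow upsampling, the proposed pre-filter, and the weighted median filter, respectively:

```python
import numpy as np

def coarse_to_fine(pyramid1, pyramid2, estimate_increment, upsample_flow,
                   nlp_filter, median_filter):
    """Coarse-to-fine loop with the non-local propagation filter applied as a
    pre-processing step on the initial flow at each pyramid level. All four
    callbacks are placeholders supplied by the caller."""
    u = v = None
    for I1, I2 in zip(pyramid1, pyramid2):          # coarsest level first
        if u is None:
            u, v = np.zeros(I1.shape), np.zeros(I1.shape)
        else:
            u, v = upsample_flow(u, v, I1.shape)    # initialize from coarser level
        u, v = nlp_filter(u, v)                     # pre-filter the initial flow
        du, dv = estimate_increment(I1, I2, u, v)   # solve for the flow increment
        u, v = median_filter(u + du, v + dv)        # heuristic weighted median step
    return u, v
```

The key design choice is the ordering: the propagation filter cleans the upsampled initialization before the increment is estimated, so outliers inherited from the coarser level are not propagated into the finer-level solve.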

Experimental results

In this section, we evaluate the effectiveness of our proposed method in improving optical flow accuracy and preserving image edges and motion boundaries. Specifically, we first introduce a measure of performance for optical flow computation, and then we test whether the proposed method is effective on three state-of-the-art evaluation benchmarks: Middlebury [32], MPI-Sintel [14] and KITTI [38], [46]. Finally, we evaluate the effect of filtering window radius and filtering sequence on the

Conclusion

In this paper, we present a non-local propagation filtering scheme to alleviate the blurring of image edges and motion boundaries during the coarse-to-fine optical flow computation. We begin with the analysis of the connection between the weighted median filtering and the blurring of motion boundary. Second, to address the blurring of motion boundaries, we propose a non-local propagation filtering scheme to optimize the coarse-to-fine optical flow computation. In this scheme, the non-local

CRediT authorship contribution statement

Chong Dong: Conceptualization, Methodology, Writing. Zhisheng Wang: Original draft preparation, Reviewing and editing. Jiaming Han: Visualization, Investigation. Changda Xing: Software, Validation. Shufang Tang: Data analysis.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgment

This research is supported by the Natural Science Foundation of China [grant number 61473144].

References (47)

  • Horn, B.K., et al., Determining optical flow
  • Bruhn, A., et al., Lucas/Kanade meets Horn/Schunck: Combining local and global optic flow methods, Int. J. Comput. Vis. (2005)
  • Brox, T., et al., High accuracy optical flow estimation based on a theory for warping
  • Dosovitskiy, A., Fischer, P., Ilg, E., Hausser, P., Hazirbas, C., Golkov, V., Van Der Smagt, P., Cremers, D., Brox, T., FlowNet: ...
  • Ilg, E., Mayer, N., Saikia, T., Keuper, M., Dosovitskiy, A., Brox, T., FlowNet 2.0: Evolution of optical flow estimation with ...
  • Sun, D., Yang, X., Liu, M.-Y., Kautz, J., PWC-Net: CNNs for optical flow using pyramid, warping, and cost volume, in: ...
  • Butler, D.J., et al., A naturalistic open source movie for optical flow evaluation
  • Wedel, A., et al., An improved algorithm for TV-L1 optical flow
  • Brox, T., et al., Large displacement optical flow: Descriptor matching in variational motion estimation, IEEE Trans. Pattern Anal. Mach. Intell. (2011)
  • Xu, L., et al., Motion detail preserving optical flow estimation, IEEE Trans. Pattern Anal. Mach. Intell. (2012)
  • Sun, D., et al., Secrets of optical flow estimation and their principles
  • Chen, Z., Jin, H., Lin, Z., Cohen, S., Wu, Y., Large displacement optical flow from nearest neighbor fields, in: The IEEE ...
  • Papenberg, N., et al., Highly accurate optic flow computation with theoretically justified warping, Int. J. Comput. Vis. (2006)