Structured-light-field 3D imaging without phase unwrapping
Introduction
Fringe-projection-based three-dimensional (3D) imaging, with the advantages of non-contact, high-accuracy, and high-density 3D data acquisition, has been widely used in a broad range of scenarios, such as industry, biomedicine, and entertainment [1], [2], [3], [4]. A crucial step of fringe-projection technology is extracting the phase information from the captured fringe image(s) using either phase-shifting [5, 6] or Fourier-transform [7, 8] methods. However, because of the inverse-tangent calculation, only a wrapped phase, defined modulo 2π, can be obtained. The wrapped phase causes phase ambiguity and must be unwrapped correctly or used under proper constraint conditions [9].
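The phase-shifting route can be illustrated with a minimal four-step example; the function and variable names below are ours, not from the paper, and the fringes are synthetic:

```python
import numpy as np

def wrapped_phase(I1, I2, I3, I4):
    # Four-step phase shifting with shifts 0, pi/2, pi, 3*pi/2:
    # the arctangent confines the result to (-pi, pi], which is the
    # origin of the 2*pi ambiguity discussed above.
    return np.arctan2(I4 - I2, I1 - I3)

# Synthetic fringes over a phase ramp whose true phase exceeds 2*pi.
x = np.linspace(0.0, 4 * np.pi, 256)                  # true (absolute) phase
I1, I2, I3, I4 = (128 + 100 * np.cos(x + d)
                  for d in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2))
phi_w = wrapped_phase(I1, I2, I3, I4)
# phi_w equals x only modulo 2*pi: it jumps where x crosses odd multiples of pi.
```

The recovered `phi_w` agrees with the true phase only up to an unknown integer multiple of 2π per pixel, which is exactly the ambiguity that phase unwrapping must resolve.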
Spatial and temporal phase unwrapping are the two classical families of methods. The former searches for phase jumps among adjacent pixels but is quite sensitive to sensor noise and depth discontinuity, e.g., when the measured surface undergoes an abrupt change in depth or the scene contains multiple isolated objects [10], [11], [12], [13]. The latter employs additional encoding information to process the wrapped phase pixel-wise [14]. This additional information can be encoded in time-series signals, such as frequency-varying fringe patterns [15], [16], [17], [18], Gray code [19, 20], and pseudorandom fringes [21], or embedded in the background or modulation intensity of the original fringe patterns [22], [23], [24]. Unfortunately, temporal phase unwrapping suffers either from low measuring speed, owing to the additional image-sequence acquisition, or from low robustness and accuracy, owing to problematic signal separation and reduced fringe contrast. Alternatively, multi-view geometric constraints can resolve the phase ambiguity without temporally acquiring or embedding additional encoding information [25], [26], [27], [28], [29]. Generally, a sufficient number of views is required to handle the phase ambiguity robustly.
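As a concrete illustration of the temporal approach, a two-frequency (hierarchical) scheme recovers the fringe order pixel-wise, so depth discontinuities and isolated objects pose no problem, at the cost of extra images. The sketch below is generic, with our own symbol names, and assumes the low-frequency phase is unambiguous within one period:

```python
import numpy as np

def temporal_unwrap(phi_h, phi_l, freq_ratio):
    # Two-frequency temporal phase unwrapping (generic sketch):
    # phi_h -- wrapped phase of the high-frequency fringes, in (-pi, pi]
    # phi_l -- phase of a unit-frequency fringe, unambiguous in [0, 2*pi)
    # freq_ratio -- ratio of high to low fringe frequency.
    # The fringe order K is recovered independently at every pixel.
    K = np.round((freq_ratio * phi_l - phi_h) / (2 * np.pi))
    return phi_h + 2 * np.pi * K

wrap = lambda p: np.angle(np.exp(1j * p))      # wrap into (-pi, pi]

phi_abs = np.linspace(0.0, 8 * np.pi, 512)     # ground-truth absolute phase
ratio = 4.0
phi_h = wrap(phi_abs)                          # wrapped high-frequency phase
phi_l = phi_abs / ratio                        # unit-frequency phase, < 2*pi
phi_u = temporal_unwrap(phi_h, phi_l, ratio)   # recovers phi_abs pixel-wise
```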
Recently, we presented a method for absolute phase unwrapping based on light-field imaging [30]. From the light field recorded under structured illumination, i.e., the structured light field (SLF), a wrapped phase-encoded field (PEF) can be retrieved and resampled at the possible image planes associated with several candidate spatial points in the measurement volume. Correct fringe orders can be determined by checking the phase consistency in the resampled wrapped PEF. The candidate spatial points with absolute phases can be reconstructed as long as the SLF system is structure-fixed and pre-calibrated. If the system parameters change, the SLF system must be recalibrated, which is time-consuming; moreover, frequent recalibration is impractical. More recently, we proposed another method for SLF depth estimation, in which the correspondence depth cue, evaluated as the weighted angular variance of the unwrapped PEF, is used to estimate depths accurately [31]. That work indicates that the correspondence cue, which focuses on the local angular variance of the light field, may provide constraint conditions for dealing with the phase ambiguity in the wrapped PEF.
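The phase-consistency check of Ref. [30] can be caricatured as picking, among candidate spatial points, the one whose resampled angular phase samples agree modulo 2π. A minimal stand-in, using the mean resultant length of the samples on the unit circle (our simplification, not the paper's exact criterion):

```python
import numpy as np

def phase_consistency(phi_samples):
    # Circular consistency of wrapped phase samples: 1.0 when all samples
    # agree modulo 2*pi, smaller when they scatter. Computed as the
    # magnitude of the mean unit phasor (mean resultant length).
    z = np.exp(1j * np.asarray(phi_samples))
    return np.abs(z.mean())

# Candidate A: samples identical modulo 2*pi -> consistency ~ 1.
a = np.array([0.1, 0.1 + 2 * np.pi, 0.1 - 2 * np.pi])
# Candidate B: scattered samples -> low consistency.
b = np.array([0.1, 2.0, -2.5])
best = max([a, b], key=phase_consistency)      # candidate A wins
```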
In this paper, we propose a method for SLF 3D imaging without phase unwrapping. We show that the correspondence cue can be applied to the wrapped PEF for depth estimation and 3D reconstruction, provided the wrapped PEF is processed to be locally continuous with respect to the phases in specific angular coordinates. Compared with our previous methods [30, 31], the proposed method has two major advantages: on the one hand, the wrapped PEF is processed to provide sufficient constraints for effective SLF depth estimation without the need for phase unwrapping; on the other hand, system fixation and calibration are no longer mandatory, so the SLF system gains more degrees of freedom to flexibly adapt to changes in the measurement environment, which is of special value in practice.
Section snippets
Principle
In this section, the basic principle of the phase computation and SLF depth estimation is briefly described, followed by detailed explanations and discussions of the proposed method in the next sections.
Method
The PEF retrieved using Eq. (2) is wrapped, i.e., ϕw(s, u). Using Eq. (4) to compute the refocused phase maps of the wrapped PEF is problematic, because the angular distribution of the wrapped PEF may be discontinuous in regions of phase jumps. A common remedy is to determine the fringe orders K(s, u) to obtain the absolute unwrapped PEF ϕu(s, u), which can then be used to compute the correspondence response via Eqs. (4) and (5). According to the method of
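Under our reading of the method, each wrapped angular patch can instead be made locally continuous by referencing every sample to the central view, after which its angular variance serves as a correspondence response. This is a hedged sketch of the idea, not the paper's exact processing:

```python
import numpy as np

def local_continuity(phi_patch):
    # Shift each wrapped sample by the multiple of 2*pi that brings it
    # closest to the phase of the central angular sample, so the patch
    # becomes continuous even when it straddles a phase jump.
    center = phi_patch[tuple(n // 2 for n in phi_patch.shape)]
    return center + np.angle(np.exp(1j * (phi_patch - center)))

def correspondence_response(phi_patch):
    # Angular variance of the locally continuous patch; a small value
    # indicates the tested depth hypothesis is consistent.
    return np.var(local_continuity(phi_patch))

# A 3x3 angular patch whose samples agree modulo 2*pi but straddle a wrap:
patch = np.full((3, 3), 3.1)
patch[0, 0] = 3.1 - 2 * np.pi   # same underlying phase, wrapped differently
# np.var(patch) is inflated by the 2*pi jump; after local-continuity
# processing the variance collapses to (numerically) zero.
```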
Experiments and analysis
A microlens-array-based unfocused plenoptic camera (Lytro Illum) and a digital projector (Casio XJ-VC100, 1024 × 768 pixels) were combined to set up an experimental configuration of the SLF system, as shown in Fig. 4. This experimental layout was used to demonstrate and verify the proposed method. Raw images captured by the plenoptic camera were decoded with spatial and angular resolutions of 434 × 625 pixels and 15 × 15 pixels, respectively [34]. The plenoptic camera was calibrated so that
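The decoding step can be sketched as arranging the micro-images into a 4D light-field array L(s, t, u, v). The layout below is an idealization (real Lytro decoding also involves demosaicing, devignetting, and hexagonal-to-rectangular resampling); only the resolutions are taken from the text:

```python
import numpy as np

# Decoded resolutions from the experiment: 434 x 625 spatial, 15 x 15 angular.
S, T, U, V = 434, 625, 15, 15

# Idealized lenslet image: one U x V micro-image behind each of S x T lenses.
raw = np.zeros((S * U, T * V), dtype=np.uint8)

# Split rows into (lens row s, angular row u) and columns into (t, v),
# then reorder axes to the conventional (s, t, u, v) light-field layout.
lf = raw.reshape(S, U, T, V).transpose(0, 2, 1, 3)

# A sub-aperture (perspective) image is a fixed-angle slice; the central
# view is the one typically used as the reference.
central_view = lf[:, :, U // 2, V // 2]
```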
Conclusion
In this paper, we proposed a method for SLF 3D imaging without phase unwrapping. The wrapped PEF was processed to make the phase distribution in each microlens unit continuous. As a result, the correspondence cue focusing on the local angular variance of the light field could be applied to the processed wrapped PEF for depth estimation and 3D reconstruction without requiring system pre-calibration. Additionally, we performed comparative experiments under various measurement
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgement
This work was supported by the Natural Science Foundation of Guangdong Province under the grant 2018A030313831, the Sino-German Research Cooperation group under the grant GZ 1391, and the National Natural Science Foundation of China under the grant 11804231 and 61875137.
References (35)
- Fringe projection techniques: whither we are? Opt Lasers Eng (2010)
- Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques. Opt Lasers Eng (2012)
- Calibration of fringe projection profilometry using an inaccurate 2D reference target. Opt Lasers Eng (2017)
- Dynamic 3-D shape measurement method: a review. Opt Lasers Eng (2010)
- Absolute phase retrieval methods for digital fringe projection profilometry: a review. Opt Lasers Eng (2018)
- Reliability-guided phase unwrapping algorithm: a review. Opt Lasers Eng (2004)
- Temporal phase unwrapping algorithms for fringe projection profilometry: a comparative review. Opt Lasers Eng (2016)
- Multi-resolution reconstruction of 3-D image with modified temporal unwrapping algorithm. Opt Commun (2003)
- 3-D shape measurement based on complementary Gray-code light. Opt Lasers Eng (2012)
- Fast phase measurement profilometry for arbitrary shape objects without phase unwrapping. Opt Lasers Eng (2013)
- Fringe projection 3D microscopy with the general imaging model. Opt Express
- Automated phase-measuring profilometry of 3-D diffuse objects. Appl Opt
- Wavefront reconstruction and three-dimensional shape measurement by two-step dc-term-suppressed phase-shifted intensities. Opt Lett
- Fourier transform profilometry for the automatic measurement of 3-D object shapes. Appl Opt
- Satellite radar interferometry: two-dimensional phase unwrapping. Radio Sci
- Robust two-dimensional weighted and unweighted phase unwrapping that uses fast transforms and iterative methods. J Opt Soc Am A
- Least-squares two-dimensional phase unwrapping using FFT's. IEEE Trans Geosci Remote Sens