Feasibility investigation of a monostatic imaging lidar with a parallel-placed image sensor for atmospheric remote sensing
Introduction
Light detection and ranging (lidar), an optical remote sensing technology with high spatial and temporal resolution, has been widely applied to sensing various atmospheric parameters, e.g., humidity, temperature, pressure, wind, trace gases, aerosols and clouds [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11]. Elastic backscattering lidar techniques are the most widespread; a nanosecond-scale laser pulse is transmitted into the atmosphere and the range-resolved backscattered light is collected according to the time-of-flight principle [12], [13], [14]. However, the pulsed lidar technique still faces challenges such as incomplete overlap of the field-of-view, high cost, large dynamic-range requirements and after-pulse effects [15,16].
Another approach developed for range-resolved atmospheric remote sensing is the imaging lidar technique [17,18], which utilizes image sensors, e.g., charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensors, as receivers and/or detectors. The backscattering image of the transmitted laser beam is acquired, and the range information is obtained from the angle of the backscattered light rather than its time-of-arrival. By employing area image sensors, imaging lidar techniques do not require calibration of the overlap factor. Meanwhile, the detected lidar signal does not decrease with the square of the measurement distance, placing less demand on the dynamic range. Depending on the separation between the transmitter and the receiver units, imaging lidar techniques fall into two categories: the bi-static imaging lidar and the monostatic imaging lidar. For the bi-static imaging lidar, the transmitter and the receiver units are often placed far apart, from a few meters up to 100 m, as has been demonstrated in [19,20]. The monostatic imaging lidar employs closely spaced transmitter and receiver units (co-axial or bi-axial configurations), which can be readily integrated into a single setup [21,22].
In the 1990s, Meki and co-workers demonstrated a bi-static imaging lidar technique employing commercial CCD cameras with a small-aperture wide-angle receiving lens as the receiver, historically referred to as the camera lidar (CLidar) [19]. The baseline between the camera and the laser source is typically tens to over a hundred meters, e.g., 52 m, 122 m and 158 m [17,23,24]. The CLidar technique can acquire the backscattering image of a pulsed or continuous-wave (CW) laser beam from about 10 m up to several kilometers [17,25]. However, the aerosol phase function must be considered, as the scattering angle of the side-scattered light may vary significantly with the measurement altitude [26]. Although the wide-angle receiving lens provides a large depth-of-field, allowing sharp imaging of the transmitted laser beam over several kilometers, it essentially restricts the CLidar technique to nighttime operation because of the low light-collection efficiency (small aperture), the poor performance of the optical filter caused by the large cone half-angle of the incoming light (small f-number), etc.
In recent years, another imaging lidar technique based on the Scheimpflug principle has been proposed and demonstrated for atmospheric remote sensing, referred to as the Scheimpflug lidar (SLidar) [27,28]. The SLidar technique, employing high-power CW laser diodes as light sources and tilted CCD/CMOS image sensors (typically about 45°) as detectors, has shown great potential in atmospheric remote sensing [29], [30], [31]. Meanwhile, recent comparison studies with the pulsed lidar technique have successfully validated the lidar signal measured by the SLidar technique [32]. Because of the short separation, e.g., about 0.8 m, the transmitter and the receiver units can be readily integrated into a single setup, so the SLidar can be considered a monostatic lidar system [21]. Meanwhile, the variation of the phase function can be neglected in most situations [18]. The SLidar technique achieves an infinite depth-of-focus while employing a large-aperture telescope (large f-number) based on the Scheimpflug configuration. Thus, it offers much higher collection efficiency for the backscattered light, a small field-of-view for a single pixel enabling operation under strong sunlight background, and a narrow field-of-view allowing the use of narrowband filters. As a result, all-day operation is feasible for SLidar systems operating at various wavelengths, e.g., 407 nm, 450 nm and 808 nm. However, the CMOS/CCD camera must be tilted by about 45° to satisfy the Scheimpflug principle, namely that the image plane, the lens plane and the object plane intersect in a common line. Unfortunately, the quantum efficiency (QE) of an image sensor depends strongly on the incident angle of the incoming light: the QE decreases significantly as the incident angle increases from 0° to, e.g., 45° [33], [34], [35]. It can drop substantially beyond 20°, and at 45° the QE may be as low as 20% (or even less) of the maximum QE for many image sensors [36].
Although a 45° tilted camera can achieve sharp focus for the backscattering light from the near range (tens of meters) to the far range (kilometers), the signal-to-noise ratio (SNR) of the detected lidar signal is not optimal because of the low QE at large incident angles (e.g., 45°).
In this work, a monostatic imaging lidar, employing a large-aperture receiving telescope as the receiver and a parallel-placed CMOS camera as the detector, is proposed for atmospheric aerosol sensing. Under this configuration, the Scheimpflug principle is not satisfied. As a result, the depth-of-field (DoF) is limited by the large-aperture receiving telescope, leading to defocusing of the backscattering image in either the near range or the far range. To distinguish it from the SLidar technique, the monostatic imaging lidar proposed in this work is referred to as the Shallow Depth-Of-Field Imaging Lidar (SDOFI-Lidar). The SDOFI-Lidar features large receiving efficiency (large aperture), a small field-of-view for a single pixel (large f-number), and high quantum efficiency for the incoming light. The pixel-distance relationship of the SDOFI-Lidar has been established theoretically from geometrical optics. An experimental setup integrating the SDOFI-Lidar and the SLidar has been developed for simultaneous atmospheric measurements. The influence of the out-of-focus backscattering image on the lidar signal has also been evaluated, and the feasibility of the SDOFI-Lidar for atmospheric remote sensing has been investigated through inter-comparison measurements with the SLidar technique.
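The pixel-distance relationship mentioned above can be illustrated with a minimal triangulation sketch. Under a thin-lens model with the laser beam parallel to the telescope optical axis at baseline L, backscatter from distance z arrives at an angle θ with tan θ = L/z and lands at a transverse offset x = f·tan θ on the sensor, so z ≈ f·L/x. This is only an idealized sketch; the focal length, baseline and pixel pitch below are hypothetical values, not the parameters of the actual system.

```python
# Triangulation sketch for a monostatic imaging lidar with a
# parallel-placed image sensor (thin-lens approximation).
# A pixel at transverse offset x from the optical axis collects light
# scattered at angle theta, with tan(theta) = L / z = x / f, so z = f*L/x.

def pixel_to_distance(pixel_index, pixel_pitch_m, focal_length_m, baseline_m):
    """Map a pixel index (counted from the optical axis) to the
    backscattering distance z in meters."""
    x = pixel_index * pixel_pitch_m          # transverse offset on the sensor [m]
    if x <= 0:
        raise ValueError("pixel must lie on the far side of the optical axis")
    return focal_length_m * baseline_m / x

# Hypothetical parameters: f = 0.5 m focal length, L = 0.8 m baseline,
# 5 um pixel pitch (none of these values are taken from the paper).
f, L, pitch = 0.5, 0.8, 5e-6
for n in (20, 80, 400):
    print(n, round(pixel_to_distance(n, pitch, f, L), 1))
```

Note the strongly nonuniform range resolution implied by z ∝ 1/x: each pixel step spans a much larger distance interval at long range than at short range, a characteristic shared by triangulation-based imaging lidars.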
Section snippets
Principle of the shallow depth-of-field imaging Lidar
The schematic diagrams of the SDOFI-Lidar and the SLidar are shown in Fig. 1. The SDOFI-Lidar employs an image sensor parallel to the receiving lens/telescope, while the SLidar utilizes a 45° tilted image sensor to satisfy the Scheimpflug principle. When a laser beam is transmitted into the atmosphere, the backscattering image can be clearly observed by the SLidar system with infinite depth-of-focus as long as the image plane, the lens plane and the object plane
Atmospheric measurements and discussion
The experimental setup, integrating the SDOFI-Lidar and the SLidar, was deployed on the rooftop of the Graduate Education Building, Dalian University of Technology (DLUT), Dalian, China (38.886°N, 121.532°E), about 33 m above the ground. A tall building located at approximately 2.3 km from the lidar system was utilized to calibrate the relationship between the pixel and the distance. Atmospheric vertical and horizontal measurements under different atmospheric conditions have been carried out in
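The hard-target calibration mentioned above can be sketched under the same idealized thin-lens triangulation model, z ≈ C/x, where the constant C lumps together the focal length and the baseline: a single echo from a building at a known distance fixes C, after which every pixel can be mapped to a distance. The model and all numbers except the 2.3 km target distance are illustrative assumptions, not the calibration procedure actually used in the paper.

```python
# Hard-target calibration sketch (hypothetical model and numbers).
# Under the triangulation model z = C / x, one return from a target at a
# known distance determines the lens constant C = z_target * x_target.

def calibrate(target_pixel, target_distance_m, pixel_pitch_m):
    """Return the lens constant C from a single known hard target."""
    return target_distance_m * target_pixel * pixel_pitch_m

def distance_axis(n_pixels, C, pixel_pitch_m):
    """Distance assigned to each pixel index 1..n_pixels."""
    return [C / (n * pixel_pitch_m) for n in range(1, n_pixels + 1)]

# Suppose the building return peaks at pixel 35 at the known 2300 m range
# (the pixel index and 5 um pitch are made-up illustrative values).
C = calibrate(35, 2300.0, 5e-6)
z = distance_axis(100, C, 5e-6)
print(round(z[34], 1))   # pixel 35 maps back to the 2300.0 m target
```

One hard target suffices here only because the single-parameter model has no offset; a real calibration would also need to locate the pixel corresponding to the optical axis.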
Conclusions and outlook
In this work, a monostatic imaging lidar, featuring large receiving efficiency, a small field-of-view for a single pixel, and high quantum efficiency for the incoming backscattering light, has been developed by employing a parallel-placed CMOS camera as the detector and a large-aperture receiver. Since the Scheimpflug principle is not satisfied when employing a large-aperture receiver, the depth-of-field (DoF) is limited. Thus, the monostatic imaging lidar demonstrated in this work
Funding
National Natural Science Foundation of China (61705030); Fundamental Research Funds for the Central Universities (DUT18JC22).
CRediT authorship contribution statement
Zheng Kong: Methodology, Data curation, Writing - original draft. Teng Ma: Software, Resources. Yuan Cheng: Validation, Formal analysis. Zhen Zhang: Investigation. Yichen Li: Visualization. Kun Liu: Project administration, Supervision. Liang Mei: Conceptualization, Funding acquisition, Writing - review & editing.
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgments
The authors greatly acknowledge the help of Ruonan Fei and Qingqing Lu for the experimental work. The authors also appreciate the reviewers for their detailed reviews that have substantially improved the work.
References (41)
- et al. Fast retrieval of atmospheric CO2 concentration based on a near-infrared all-fiber integrated path coherent differential absorption lidar. Infrared Phys Techn (2018)
- et al. Rayleigh lidar temperature measurements in the upper troposphere and lower stratosphere. J Atmos Sol-Terr Phy (2004)
- et al. Investigate the relationship between multiwavelength lidar ratios and aerosol size distributions using aerodynamic particle sizer spectrometer. J Quant Spectrosc Rad Transf (2017)
- et al. A method of determining multi-wavelength lidar ratios combining aerodynamic particle sizer spectrometer and sun-photometer. J Quant Spectrosc Rad Transf (2018)
- et al. Optical properties of mixed aerosol layers over Japan derived with multi-wavelength Mie-Raman lidar system. J Quant Spectrosc Rad Transf (2017)
- et al. Ground-based network observation using Mie-Raman lidars and multi-wavelength Raman lidars and algorithm to retrieve distributions of aerosol components. J Quant Spectrosc Rad Transf (2017)
- et al. Smoke plume optical properties and transport observed by a multi-wavelength lidar, sunphotometer and satellite. Atmos Environ (2012)
- et al. Noise modeling, evaluation and reduction for the atmospheric lidar technique employing an image sensor. Opt Commun (2018)
- et al. Development and application of an airborne differential absorption lidar for the simultaneous measurement of ozone and water vapor profiles in the tropopause region. Appl Opt (2019)
- et al. Retrieval and analysis of a polarized high-spectral-resolution lidar for profiling aerosol optical properties. Opt Express (2013)
- Generalized high-spectral-resolution lidar technique with a multimode laser for aerosol remote sensing. Opt Express
- Hygroscopic growth of atmospheric aerosol particles based on active remote sensing and radiosounding measurements: selected cases in southeastern Spain. Atmos Meas Tech
- Measurements of wind turbulence parameters by a conically scanning coherent Doppler lidar in the atmospheric boundary layer. Atmos Meas Tech
- Characterization of atmospheric aerosol optical properties based on the combined use of a ground-based Raman lidar and an airborne optical particle counter in the framework of the Hydrological Cycle in the Mediterranean Experiment - Special Observation Period 1. Atmos Meas Tech
- Retrieval of Aerosol Components Using Multi-Wavelength Mie-Raman Lidar and Comparison with Ground Aerosol Sampling. Remote Sens-Basel
- Lidar return signals for coaxial and noncoaxial systems with central obstruction. Appl Opt
- Full-time, eye-safe cloud and aerosol lidar observation at atmospheric radiation measurement program sites: instruments and data processing. J Atmos Ocean Tech
- Boundary layer scattering measurements with a charge-coupled device camera lidar. Appl Opt
- Atmospheric aerosol monitoring by an elastic Scheimpflug lidar system. Opt Express
- Range-resolved bistatic imaging lidar for the measurement of the lower atmosphere. Opt Lett