Early this year, DJI announced its newest payloads, the Zenmuse L1 and P1. The L1 pairs a LiDAR module with an RGB camera on a UAS platform, and both products become available in April 2021. So, does a UAS carrying LiDAR have the potential to change the way we do field observation over the next five years?
Above-ground biomass (AB) is an important parameter in ecology and agriculture. It reflects the carbon accumulated in plant tissue as well as information about canopy structure. The traditional way to observe AB is extremely time-consuming (measuring and recording every single tree in each sample plot), and sometimes impossible. Airborne LiDAR has proved effective and accurate for projecting regional AB (Zhao et al., 2018; see Fig. 1), but such campaigns cost a huge amount of money. The new DJI products may make it possible to do the same observation at, perhaps, 10% of the former cost.
Fig. 1. How LiDAR observes AB. Equipment on a plane (or a UAS in the future) emits laser pulses toward the ground, then receives the returns and records the distance between the platform and each target point. Usually several laser points (more than 3) fall within one square meter, which is enough to produce a DSM (Digital Surface Model) for the target region. The difference between the DSM and a DEM (Digital Elevation Model) of the soil surface is the CHM (Canopy Height Model), which is generally assumed to have a positive relationship with AB.
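The CHM computation described above is just a per-pixel subtraction once the two rasters are on the same grid. A minimal sketch with NumPy, using tiny made-up 3×3 elevation grids (in practice the DSM and ground-return DEM would be rasterized from the LiDAR point cloud with tools such as PDAL or LAStools):

```python
import numpy as np

# Toy 3x3 rasters in meters; values are illustrative, not real survey data.
dsm = np.array([[12.0, 13.5, 11.0],
                [10.2,  9.8, 14.1],
                [ 8.0,  8.3,  8.1]])   # first-return surface (canopy top)
dem = np.array([[ 8.0,  8.1,  8.0],
                [ 8.2,  8.0,  8.1],
                [ 8.0,  8.2,  8.1]])   # ground (soil) surface

# Canopy Height Model: per-pixel difference, clipped so noise below
# the ground surface does not produce negative heights.
chm = np.clip(dsm - dem, 0.0, None)

print(chm.max())  # tallest canopy element in this toy scene
```

The clipping step matters in practice because interpolation noise in either raster can otherwise yield spurious negative canopy heights over bare soil.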
Let us guess at the potential applications of UAS-LiDAR. There has already been research on crop monitoring based on UAS (without LiDAR; see Ashapure et al., 2019), proving that the workflow is applicable. A regional AB map could certainly help with yield prediction and with model building and validation, and UAS-LiDAR makes it much easier and more accurate to project canopy height and AB than a UAS without LiDAR. Thus, I predict an increasing amount of research in the coming years on field-scale observation, or on using the observed results to calibrate models.
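The "CHM has a positive relationship with AB" assumption is usually operationalized as a simple regression calibrated on a handful of field plots. A hedged sketch, with entirely hypothetical plot numbers standing in for field measurements:

```python
import numpy as np

# Hypothetical calibration plots: plot-mean CHM (m) from UAS-LiDAR,
# and AB (t/ha) from destructive or allometric field measurement.
mean_chm = np.array([2.1, 3.4, 4.0, 5.2, 6.8])
ab       = np.array([18.0, 29.5, 33.0, 44.1, 58.2])

# Ordinary least squares fit: AB ≈ slope * CHM + intercept
slope, intercept = np.polyfit(mean_chm, ab, 1)

def predict_ab(chm_height):
    """Map a CHM value (m) to estimated AB (t/ha) via the fitted line."""
    return slope * chm_height + intercept
```

Applying `predict_ab` pixel-by-pixel over a CHM raster would yield the kind of regional AB map discussed above; real studies typically use power-law allometry or multiple LiDAR metrics rather than a single linear fit.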
LiDAR also provides detailed information about canopy height and structure, which is usually neglected or simplified in current optical remote sensing analyses and radiative transfer models (RTMs). Maimaitijiang et al. (2020) showed that combining canopy structure with optical information achieved better yield prediction performance, based on a deep learning network. The method still needs a process-based explanation before it can be transplanted to natural biomes such as forests, but it points to a way of improving the use of optical remote sensing information.
Ashapure A, Jung J, Yeom J, et al. A novel framework to detect conventional tillage and no-tillage cropping system effect on cotton growth and development using multi-temporal UAS data[J]. ISPRS Journal of Photogrammetry and Remote Sensing, 2019, 152: 49-64.
Maimaitijiang M, Sagan V, Sidike P, et al. Soybean yield prediction from UAV using multimodal data fusion and deep learning[J]. Remote Sensing of Environment, 2020, 237: 111599.
Zhao Y, Zeng Y, Zheng Z, et al. Forest species diversity mapping using airborne LiDAR and hyperspectral data in a subtropical forest in China[J]. Remote Sensing of Environment, 2018, 213: 104-114.