Abstract
This study presents a Deep Learning-based approach for accurate object segmentation and detection in high-resolution imagery captured by Unmanned Aerial Vehicles (UAVs). The methodology employs three existing algorithms, each tailored to a specific class of objects: Res-UNet for roads and buildings, DeepForest for trees, and WaterDetect for water bodies. To evaluate the effectiveness of this approach, the performance of each algorithm is compared with state-of-the-art (SOTA) models for the corresponding class. The results demonstrate that the methodology outperforms SOTA models on all three tasks, achieving an accuracy of 93% for roads and buildings with Res-UNet, 95% for trees with DeepForest, and 98% for water bodies with WaterDetect. By employing three smaller, task-specific models rather than a single multi-class network, the approach achieves superior performance to SOTA models with reduced overfitting and faster training.
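The per-class pipeline described above (one smaller, dedicated model per object category instead of a single multi-class network) can be sketched as follows. This is a minimal illustrative sketch only: the stub functions stand in for the actual Res-UNet, DeepForest, and WaterDetect models, whose real APIs and weights are not shown here.

```python
from typing import Callable, Dict, List

# Hypothetical stubs standing in for the three models named in the
# abstract; real implementations would load trained weights and
# return masks or bounding boxes rather than class labels.
def resunet_segment(image) -> List[str]:
    # In the described setup, a single Res-UNet covers both roads
    # and buildings.
    return ["road", "building"]

def deepforest_detect(image) -> List[str]:
    return ["tree"]

def waterdetect_segment(image) -> List[str]:
    return ["water"]

# Route each detection task to its dedicated, smaller model.
PIPELINE: Dict[str, Callable] = {
    "roads_buildings": resunet_segment,
    "trees": deepforest_detect,
    "water_bodies": waterdetect_segment,
}

def run_pipeline(image) -> Dict[str, List[str]]:
    """Apply every per-class model to the same UAV image and
    collect the detections keyed by task."""
    return {task: model(image) for task, model in PIPELINE.items()}
```

Because each task is handled independently, the three models can be trained, tuned, and replaced separately, which is the source of the reduced-overfitting and faster-training claims in the abstract.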
Graphical Abstract