Abstract
Background: The development of deep learning has promoted industrial intelligence, and automatic driving vehicles have become a hot research direction. Because pavement potholes threaten the safety of automatic driving vehicles, pothole detection under complex environmental conditions is studied.
Objective: The goal of this work is to propose a new pavement pothole detection model based on a convolutional neural network. The main contributions are the design of a Multi-level Feature Fusion Block and a Detector Cascading Block, in which a series of detectors are cascaded together to improve the detection accuracy of the proposed model.
Methods: A pothole detection model is designed on the basis of an existing object detection model. The Transfer Connection Block in the Object Detection Module is removed, and the Multi-level Feature Fusion Block is redesigned. In addition, a Detector Cascading Block with multi-step detection is designed: the detectors are connected directly to the feature maps and cascaded, and the structure skips the transformation step.
Results: The proposed method detects potholes efficiently. After adjusting the network parameters and redesigning the model structure, both the real-time performance and the accuracy of the model are improved. The maximum detection accuracy of the proposed model is 75.24%.
Conclusion: The designed Multi-level Feature Fusion Block enhances the fusion of high- and low-level feature information, which helps the model extract a large amount of target information. The Detector Cascading Block is a detector with a cascade structure, which enables more accurate prediction of the object. In summary, the designed model greatly improves detection accuracy and speed, laying a solid foundation for pavement pothole detection under complex environmental conditions.
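The abstract does not specify the internals of the Multi-level Feature Fusion Block. As an illustrative sketch only (all function names are hypothetical, and element-wise addition is just one common fusion strategy), fusing high- and low-level feature information can be pictured as upsampling a coarse, semantically rich feature map to the resolution of a finer, detail-rich map and merging the two:

```python
# Illustrative sketch only: the paper's Multi-level Feature Fusion Block is not
# specified at this level of detail. All names here are hypothetical, and
# element-wise addition stands in for whatever fusion operation the model uses.

def upsample_nearest(fmap, factor):
    """Nearest-neighbour upsampling of a 2-D feature map (list of rows)."""
    out = []
    for row in fmap:
        wide = [v for v in row for _ in range(factor)]  # repeat each column
        out.extend([wide] * factor)                     # repeat each row
    return out

def fuse(high_res, low_res, factor):
    """Merge a high-resolution (shallow) map with an upsampled
    low-resolution (deep) map by element-wise addition."""
    up = upsample_nearest(low_res, factor)
    return [[h + u for h, u in zip(hr, ur)] for hr, ur in zip(high_res, up)]

high = [[1, 2], [3, 4]]   # 2x2 shallow feature map (fine spatial detail)
low = [[10]]              # 1x1 deep feature map (coarse semantics)
print(fuse(high, low, 2)) # [[11, 12], [13, 14]]
```

In the cascaded-detector design described above, each detector in the cascade would then operate directly on fused maps like this one, refining the previous stage's predictions.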
Keywords: Pavement pothole detection, automatic driving, security, convolutional neural network, cascade, detector