Abstract
Background: Vehicles have become one of the most important means of transportation, and the license plate is a vehicle's unique identifying mark. License plate recognition technology is applied in a range of scenarios, such as detecting road traffic violations, recovering stolen vehicles, monitoring wanted vehicles, and dispatching special-purpose vehicles. However, license plate tilt, which arises for various reasons, greatly hinders recognition.
Objective: Identifying tilted license plates efficiently and accurately is key to the automatic management of large numbers of vehicles. Therefore, this paper proposes a real-time location, correction, and segmentation algorithm for tilted license plates.
Methods: First, an end-to-end deep convolutional neural network (CNN) is proposed for locating tilted license plates. The CNN builds on the state-of-the-art RefineDet algorithm; by improving its object detection module, the network's ability to represent small features and the detector's accuracy are enhanced, making the location regression and label prediction of detected objects more precise. Second, an optimized perspective transformation algorithm corrects the tilted license plate: the plate region cropped from the original image according to the vertex coordinates of the bounding box detected by the CNN retains a certain tilt angle, which the perspective transformation removes. Finally, digital image processing techniques segment the license plate characters.
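The perspective correction step can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the four plate vertex coordinates are hypothetical, the 440x140 target size is chosen only because it matches the aspect ratio of a standard Chinese plate, and the perspective matrix is solved with a plain direct linear transform.

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve for the 3x3 perspective (homography) matrix H that maps
    each src point (x, y) to the corresponding dst point (u, v),
    using the direct linear transform with h33 fixed to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.append(v)
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Apply H to a single (x, y) point, with perspective division."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

# Hypothetical vertex coordinates of a tilted plate in the source image,
# ordered top-left, top-right, bottom-right, bottom-left.
src = [(112, 58), (305, 74), (298, 131), (105, 112)]
# Target rectangle: 440x140 pixels, the aspect ratio of a Chinese plate.
dst = [(0, 0), (439, 0), (439, 139), (0, 139)]

H = homography_from_points(src, dst)
```

In practice the resulting matrix would be applied to every pixel of the cropped region (e.g. via `cv2.warpPerspective` in OpenCV) to produce an upright, rectangular plate image ready for character segmentation.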
Results: Experimental results on the Chinese City Parking (CCP) dataset show that the proposed algorithm improves location average precision by 2.4-5.4% over competing algorithms.
Conclusion: The proposed algorithm achieves real-time, high-accuracy location, correction, and character segmentation of tilted license plates.
Graphical Abstract