Abstract
At present, image recognition typically first classifies an image with a neural network, which outputs category information; retrieval then follows, and before retrieval a feature database must be built and matched one to one. This pipeline is tedious, time-consuming, and not very accurate. In computer vision research, many image recognition methods have been proposed for diverse applications and have achieved notable results, but current accuracy, stability, and time efficiency still fall short of practical requirements. UAV image recognition in particular demands high accuracy and low resource consumption, whereas previous methods require large databases that increase the load on the UAV. Taking aerial transmission line images as the research object, this paper proposes an image recognition method based on chaotic synchronization. First, the image is used as a function to construct a dynamical system, and the function structure and parameters are adjusted to achieve chaos synchronization; in this process, different types of images are distinguished. We also study the characteristics of this dynamical system and explain the mechanism by which it performs recognition. On a self-built aerial image dataset for bird's nest, iron frame, and insulator identification, the proposed method achieves a higher recognition rate with less computation time than competing methods. These results offer preliminary evidence that synchronization-based image recognition is practical and merits further research, verification, and analysis.
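The abstract does not specify the concrete dynamics used, so the following is only an illustrative sketch of the general idea of chaos-synchronization-based matching: pixel values of an image set the parameters of per-pixel logistic maps, a response system built from a test image is coupled to a drive system built from a reference image, and the residual synchronization error serves as a dissimilarity score. The function `sync_error`, the logistic-map choice, and all parameter values here are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def sync_error(ref_img, test_img, eps=0.7, steps=300, discard=100):
    """Toy dissimilarity score via chaos synchronization (illustrative only).

    Each pixel fixes the parameter of a logistic map. A response system
    (parameters from the test image) is diffusively coupled to a drive
    system (parameters from the reference image); the two synchronize,
    and the residual error stays near zero, only where the images agree.
    """
    # Map pixel values in [0, 255] to logistic parameters in [3.6, 4.0],
    # a range where the logistic map is largely chaotic.
    r_ref = 3.6 + 0.4 * ref_img.astype(float) / 255.0
    r_test = 3.6 + 0.4 * test_img.astype(float) / 255.0
    x = np.full(r_ref.shape, 0.3)   # drive states
    y = np.full(r_test.shape, 0.7)  # response states, different start
    total = 0.0
    for t in range(steps):
        x = r_ref * x * (1.0 - x)                             # autonomous drive
        y = (1.0 - eps) * (r_test * y * (1.0 - y)) + eps * x  # coupled response
        if t >= discard:                                      # skip transient
            total += float(np.mean(np.abs(x - y)))
    return total / (steps - discard)
```

With identical images the coupled maps synchronize and the score collapses toward zero; with mismatched pixel values the parameter mismatch keeps the error bounded away from zero, which is the kind of discriminative signal the paper attributes to adjusting "function structure and parameters".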