Abstract
Background: In retrieving Spatiotemporal Information of Chorography (STIC), one of the most important problems is how to quickly pinpoint the desired STIC text in massive chorography databases. Domestically, the means of retrieving spatiotemporal information from chorography databases remain limited. Emerging techniques such as data mining, Artificial Intelligence (AI), and Natural Language Processing (NLP) should be introduced into the informatization of chorography.
Objective: This study aims to devise a deep learning-based information retrieval method for STIC and to fully demonstrate its feasibility.
Methods: First, the authors explained the workflow for retrieving STIC texts and analyzing their data features, and established a deep hash model for STIC texts. Next, the data matching workflow for STIC texts was defined: the learned hash code was adopted as the memory address of each STIC text, and the Hamming distance between the hash codes of text information was computed through a linear search, thereby completing the STIC retrieval task.
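As a rough illustration of the matching step described in the Methods, the following Python sketch ranks stored texts by the Hamming distance between their learned hash codes and a query code using a linear search. This is a minimal sketch under the assumption that binary hash codes have already been produced by the deep hash model; the function names and toy data are illustrative, not the authors' implementation.

```python
# Minimal sketch of hash-based linear-search retrieval over STIC texts.
# Assumes binary hash codes (e.g., 32 or 64 bits) have already been produced
# by a deep hash model; all names and data here are illustrative.

def hamming_distance(code_a: int, code_b: int) -> int:
    """Number of differing bits between two integer-packed hash codes."""
    return bin(code_a ^ code_b).count("1")

def retrieve(query_code: int, database: dict[int, str], top_k: int = 5):
    """Linear search: rank stored STIC texts by Hamming distance to the query.

    `database` maps each learned hash code (used as the text's memory
    address) to the corresponding STIC text.
    """
    ranked = sorted(
        database.items(),
        key=lambda item: hamming_distance(query_code, item[0]),
    )
    return ranked[:top_k]

if __name__ == "__main__":
    # Toy 8-bit codes standing in for learned hash codes.
    db = {
        0b10110010: "Flood record, 1823, northern county",
        0b10110011: "Flood record, 1824, northern county",
        0b01001100: "Temple reconstruction, 1901, east district",
    }
    for code, text in retrieve(0b10110000, db, top_k=2):
        print(f"{hamming_distance(0b10110000, code)} bits  ->  {text}")
```

Because the comparison reduces to bitwise operations on compact codes, the linear search over hash codes remains fast even for large text collections.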
Results: The proposed STIC text feature extraction model learned better text features than the comparison methods. With a large number of hash bits, it learned rich hash features and differentiated well between different pieces of information.
Conclusion: The proposed hash algorithm achieved the best retrieval accuracy among the compared methods, and the hash features it acquires can accelerate the retrieval of STIC texts. These experimental results demonstrate the effectiveness of the proposed model and algorithm.
Keywords: Deep Learning, Spatiotemporal Information of Chorography (STIC), Information Retrieval, Hash Code, Hamming Distance, Classification