Abstract
Background: Facemask detection is of great significance during the COVID-19 outbreak: in crowded places, checking whether people are wearing masks is difficult and the workload is heavy.
Objective: The study aims to explore a new deep learning network that accurately detects facemasks and has an improved ability to extract multi-level features and contextual information. In addition, the proposed network effectively avoids interference from mask-like objects. The new network should ultimately be able to detect mask wearers in a crowd.
Methods: A Multi-stage Feature Fusion Block (MFFB) and a Detector Cascade Block (DCB) are proposed and connected to the deep learning network for facemask detection, improving the network's ability to obtain information. The proposed network is a Double Convolutional Neural Network (DCNN), which fuses mask features with face position information. During facemask detection, the network extracts the feature information of the object and then feeds it into the data fusion layer.
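The abstract does not give the layer configurations of the MFFB, the DCB, or the two branches; the following is a minimal, hypothetical PyTorch sketch of two of the ideas it names: a multi-stage feature fusion block that aligns and concatenates feature maps from several depths, and a two-branch network whose mask-feature and face-position streams meet in a simple data fusion layer. All layer sizes, channel counts, and the concatenation-based fusion rule are illustrative assumptions, not the authors' actual design.

# Hypothetical sketch of multi-stage feature fusion and a two-branch
# detector with a data fusion layer; all dimensions are assumptions.
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    # Conv -> BatchNorm -> ReLU, a common convolutional building block.
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

class MultiStageFeatureFusion(nn.Module):
    # Assumed MFFB: project feature maps from several network stages to a
    # common channel width, upsample them to the shallowest map's spatial
    # size, and concatenate, so the detector sees multi-level features.
    def __init__(self, channels=(64, 128, 256), out_ch=128):
        super().__init__()
        self.proj = nn.ModuleList([nn.Conv2d(c, out_ch, 1) for c in channels])

    def forward(self, feats):
        target = feats[0].shape[-2:]  # spatial size of the shallowest map
        fused = [
            nn.functional.interpolate(p(f), size=target, mode="bilinear",
                                      align_corners=False)
            for p, f in zip(self.proj, feats)
        ]
        return torch.cat(fused, dim=1)

class DualBranchDetector(nn.Module):
    # Assumed DCNN layout: one branch extracts mask features, the other
    # face position cues; concatenation followed by a conv acts as the
    # data fusion layer, and a 1x1 conv head emits per-location outputs.
    def __init__(self):
        super().__init__()
        self.mask_branch = nn.Sequential(ConvBlock(3, 64), ConvBlock(64, 128, 2))
        self.face_branch = nn.Sequential(ConvBlock(3, 64), ConvBlock(64, 128, 2))
        self.fusion = ConvBlock(256, 128)   # data fusion layer (assumed)
        self.head = nn.Conv2d(128, 6, 1)    # e.g. 4 box coords + 2 class scores

    def forward(self, x):
        m = self.mask_branch(x)
        f = self.face_branch(x)
        return self.head(self.fusion(torch.cat([m, f], dim=1)))

if __name__ == "__main__":
    net = DualBranchDetector()
    print(net(torch.randn(1, 3, 224, 224)).shape)   # [1, 6, 112, 112]
    mffb = MultiStageFeatureFusion()
    feats = [torch.randn(1, 64, 56, 56),
             torch.randn(1, 128, 28, 28),
             torch.randn(1, 256, 14, 14)]
    print(mffb(feats).shape)                        # [1, 384, 56, 56]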
Results: The experimental results show that the proposed network can detect masks and faces in complex environments and dense crowds. The detection accuracy of the network improves effectively, and the detection model also shows excellent real-time performance.
Conclusion: The two branch networks of the DCNN effectively obtain the feature and position information of facemasks, overcoming the disadvantage that a single CNN is susceptible to interference from suspected mask objects. Verification shows that the MFFB and the DCB improve the network's ability to obtain object information, and the proposed DCNN achieves excellent detection performance.
Keywords: Face mask detection, MFFB, DCB, DCNN, data fusion, deep learning.