Abstract
Introduction: Under complex illumination conditions, such as weak light sources and rapidly changing light, the gamma transform currently used to pre-process face images has two disadvantages: first, the transformation parameter must be set empirically; second, details in the transformed image are not sufficiently pronounced.
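For context, the classical gamma transform referred to here is the standard power-law mapping of a normalized input intensity $r$ to an output intensity $s$ (this is the textbook definition, not a formula taken from this abstract):

\[
s = c\, r^{\gamma}, \qquad 0 \le r \le 1,
\]

where both the scaling constant $c$ and the exponent $\gamma$ are conventionally chosen by hand, which is the first drawback noted above.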
Objective: To improve the gamma transform currently used for face image pre-processing.
Methods: This study proposes a weighted fusion algorithm combining an adaptive gamma transform with edge feature extraction. First, an adaptive gamma transform is proposed for face image pre-processing, in which the transformation parameter is computed from the gray values of the input face image. Second, a Sobel edge detection operator is applied to the transformed image to extract its edge information and produce an edge image. Finally, the adaptively transformed image and the edge image are combined by weighted fusion to obtain the final result.
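A minimal sketch of this pipeline is given below, using OpenCV and NumPy. The abstract does not specify how the gamma parameter is derived from the gray values, so the rule used here (mapping the mean intensity to 0.5) is a common adaptive heuristic and only an assumption; the fusion weight w is likewise a hypothetical tunable parameter.

```python
import cv2
import numpy as np

def adaptive_gamma(img_gray: np.ndarray) -> np.ndarray:
    """Gamma-correct a uint8 grayscale image with a parameter derived from its gray values."""
    norm = img_gray.astype(np.float64) / 255.0
    mean = norm.mean()
    # Hypothetical adaptive rule: choose gamma so the mean intensity maps to 0.5,
    # brightening dark images and darkening bright ones. The paper's actual
    # formula may differ; only "computed from the gray values" is stated.
    gamma = np.log(0.5) / np.log(mean + 1e-6)
    return (np.power(norm, gamma) * 255.0).astype(np.uint8)

def sobel_edges(img_gray: np.ndarray) -> np.ndarray:
    """Extract an edge image as the Sobel gradient magnitude."""
    gx = cv2.Sobel(img_gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(img_gray, cv2.CV_64F, 0, 1, ksize=3)
    return cv2.convertScaleAbs(np.sqrt(gx ** 2 + gy ** 2))

def enhance_face(img_gray: np.ndarray, w: float = 0.8) -> np.ndarray:
    """Weighted fusion of the adaptively transformed image and its Sobel edge image."""
    transformed = adaptive_gamma(img_gray)
    edges = sobel_edges(transformed)
    # cv2.addWeighted computes w*transformed + (1-w)*edges per pixel.
    return cv2.addWeighted(transformed, w, edges, 1.0 - w, 0)
```

Because every step is a per-pixel operation or a small fixed convolution, and no parameter is set interactively, this structure is consistent with the low computational cost and absence of human-computer interaction claimed in the conclusion.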
Results: The pre-processed face image has appropriate contrast, and its details are clearly visible.
Conclusion: The method proposed in this paper enhances the face image while retaining more facial detail, requires no human-computer interaction, and has low computational complexity.
Keywords: Complex illumination, face image, gamma transform, adaptive, Sobel operator, light source.