Abstract
Background: To address the color distortion, low clarity, and poor visibility of underwater images caused by complex underwater environments, a wavelet fusion method for underwater image enhancement, UIPWF, is proposed.
Methods: First, an improved NCB color balance method is designed to identify and clip abnormal pixels and to balance the R, G, and B channels with an affine transformation. The color-corrected image is then converted to CIELab color space, and contrast-limited adaptive histogram equalization (CLAHE) is applied to the L component to obtain a brightness-enhanced image. Finally, separate fusion rules are designed for the low-frequency and high-frequency components, and pixel-level wavelet fusion of the color-balanced and brightness-enhanced images is performed to improve edge-detail contrast while preserving the contours of the underwater image.
Results: Experiments demonstrate that, compared with existing underwater image processing methods, UIPWF is highly effective for underwater image enhancement, improves the objective metrics considerably, and produces visually pleasing results with clear edges and plausible color.
Conclusion: The UIPWF method effectively mitigates color distortion and improves clarity and contrast, making it applicable to underwater image enhancement in a range of environments.
Keywords: Image enhancement, underwater image, color correction, luminance enhancement, wavelet fusion, Dark Channel Prior (DCP).
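As a rough illustration of the pipeline summarized above, the sketch below chains percentile-based channel clipping and stretching (an illustrative stand-in for the improved NCB color balance), CLAHE on the L channel in CIELab, and single-level wavelet fusion. The fusion rules shown (averaging the approximation coefficients, keeping the larger-magnitude detail coefficients), all parameter values, and the file names are assumptions for illustration rather than the paper's exact design; OpenCV, NumPy, and PyWavelets are assumed to be available.

```python
import cv2
import numpy as np
import pywt

def color_balance(img, clip_percent=1.0):
    # Clip abnormal (outlier) pixels per channel, then stretch each of
    # R, G, B to the full range with an affine transform (illustrative
    # stand-in for the improved NCB color balance step).
    out = np.zeros_like(img)
    for c in range(3):
        ch = img[:, :, c].astype(np.float32)
        lo, hi = np.percentile(ch, (clip_percent, 100 - clip_percent))
        ch = np.clip(ch, lo, hi)
        out[:, :, c] = np.uint8(255 * (ch - lo) / max(hi - lo, 1e-6))
    return out

def brightness_enhance(img):
    # Convert to CIELab and apply CLAHE to the L channel only.
    lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    lab = cv2.merge((clahe.apply(l), a, b))
    return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)

def wavelet_fuse(img1, img2, wavelet="db2"):
    # Per-channel single-level DWT fusion: average the low-frequency
    # (approximation) bands, keep the larger-magnitude high-frequency
    # (detail) coefficients (assumed fusion rules, for illustration).
    fused = np.zeros_like(img1)
    for c in range(3):
        cA1, (cH1, cV1, cD1) = pywt.dwt2(img1[:, :, c].astype(np.float32), wavelet)
        cA2, (cH2, cV2, cD2) = pywt.dwt2(img2[:, :, c].astype(np.float32), wavelet)
        cA = 0.5 * (cA1 + cA2)
        details = tuple(np.where(np.abs(d1) >= np.abs(d2), d1, d2)
                        for d1, d2 in ((cH1, cH2), (cV1, cV2), (cD1, cD2)))
        rec = pywt.idwt2((cA, details), wavelet)
        rec = rec[:img1.shape[0], :img1.shape[1]]  # trim possible padding
        fused[:, :, c] = np.clip(rec, 0, 255).astype(np.uint8)
    return fused

img = cv2.imread("underwater.png")   # hypothetical input image
cb = color_balance(img)              # step 1: color balance
be = brightness_enhance(cb)          # step 2: CLAHE on L in CIELab
enhanced = wavelet_fuse(cb, be)      # step 3: pixel-level wavelet fusion
cv2.imwrite("enhanced.png", enhanced)
```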