Abstract
Deep Learning (DL) models have demonstrated remarkable proficiency in image classification and recognition tasks, in some cases surpassing human performance. Much of this gain can be attributed to training on extensive datasets; consequently, DL models have large data requirements, and extending their learning capability from limited samples remains a challenge given the intrinsic constraints of small datasets. The combined challenges of limited labeled data, privacy concerns, poor generalization, and costly annotation further compound the difficulty of achieving robust model performance. To address this critical issue, our study conducts a meticulous examination of established methodologies, Data Augmentation and Transfer Learning, which offer promising solutions to data scarcity. Data Augmentation enlarges small datasets through a diverse array of strategies, including geometric transformations, kernel-filter manipulations, neural style transfer, random erasing, Generative Adversarial Networks, feature-space augmentation, and adversarial and meta-learning training paradigms.
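As an illustration of the first of these families, the short sketch below composes a few of the geometric, kernel-filter, and random-erasing augmentations named above using torchvision; the particular transforms and parameter values are assumptions chosen for illustration, not the configuration evaluated in this survey.

# Minimal augmentation sketch (assumed parameters; requires torchvision).
import torchvision.transforms as T

augment = T.Compose([
    T.RandomHorizontalFlip(p=0.5),               # geometric transformation
    T.RandomRotation(degrees=15),                # geometric transformation
    T.GaussianBlur(kernel_size=3),               # kernel-filter manipulation
    T.ColorJitter(brightness=0.2, contrast=0.2), # photometric perturbation
    T.ToTensor(),
    T.RandomErasing(p=0.25),                     # random erasing (acts on tensors)
])

# Each call on a PIL image returns a freshly perturbed training sample,
# effectively enlarging a small dataset on the fly:
# augmented_sample = augment(pil_image)

Because the perturbations are re-sampled at every epoch, the model sees many plausible variants of each scarce example rather than the same image repeatedly.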
Furthermore, Transfer Learning emerges as a crucial complementary tool, leveraging pre-trained models to transfer knowledge to a new task or to be retrained on analogous datasets. Through our comprehensive investigation, we provide insights into how the synergistic application of these two techniques can significantly enhance classification performance by effectively enlarging scarce datasets. This increase in usable data not only addresses the immediate challenges posed by limited datasets but also opens the way to Big Data-scale methods, enabling a new range of possibilities in DL applications.
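Analogously, a minimal transfer-learning sketch, assuming an ImageNet-pre-trained ResNet-18 backbone and a hypothetical ten-class target problem, shows how a pre-trained model can be adapted to a scarce dataset by freezing the backbone and retraining only the classification head; it illustrates the general idea rather than the exact procedure examined in this survey.

# Transfer-learning sketch (assumed backbone and class count; requires torchvision).
import torch.nn as nn
from torchvision import models

num_target_classes = 10  # assumption: size of the small target label set

model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
for param in model.parameters():
    param.requires_grad = False  # freeze pre-trained ImageNet features

# Replace the classifier head; only this new layer is trainable.
model.fc = nn.Linear(model.fc.in_features, num_target_classes)

Only the new head (and, if desired, the last few blocks) is then trained on the scarce dataset, so most of the knowledge extracted from the large source dataset is reused rather than relearned.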