Abstract
Background: Gait-based age estimation across multiple cameras is important in computer vision applications, particularly when the same person is followed from different viewpoints.
Introduction: Gait-based age estimation is a challenging task because it involves several hurdles, such as changes in the viewpoint of the subject. The proposed system handles this problem through a sequence of steps: forming the GEI from silhouettes, applying the DCT to the GEI to extract features, and finally using an MLP for age estimation. Its effectiveness is demonstrated by comparing its performance with state-of-the-art methods, both conventional and deep learning-based. Performance is evaluated on the OU-MVLP and OULP-Age datasets, and the experimental results show that the system is robust to viewing-angle variations.
Objective: This study aimed to implement a system that adopts a lightweight approach to gait-based age estimation.
Methods: The proposed system uses a combination of the discrete cosine transform (DCT) and a multilayer perceptron (MLP) applied to the gait energy image (GEI) to perform age estimation.
Results: The performance of the system is extensively evaluated on the OU-MVLP and OULP-Age datasets.
Conclusion: The proposed system attains a best mean absolute error (MAE) of 5.05 years on the OU-MVLP dataset and 5.65 years on the OULP-Age dataset.
Keywords: Gait, Age Estimation, GEI, OU-MVLP, OULP
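For readers who want the processing steps in concrete form, the following is a minimal sketch of the GEI, DCT, and MLP stages described in the Methods section. The helper names, the size of the retained low-frequency DCT block, and the MLP layer sizes are illustrative assumptions rather than the authors' exact configuration.

```python
# Minimal, hypothetical sketch of a GEI -> DCT -> MLP age-estimation pipeline.
# The 16x16 low-frequency block and the (128, 64) hidden layers are illustrative
# assumptions, not the exact configuration used in the paper.
import numpy as np
from scipy.fft import dctn
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error


def gait_energy_image(silhouettes: np.ndarray) -> np.ndarray:
    """Average an aligned stack of binary silhouettes (T, H, W) into a GEI (H, W)."""
    return silhouettes.astype(np.float32).mean(axis=0)


def dct_features(gei: np.ndarray, block: int = 16) -> np.ndarray:
    """2-D DCT of the GEI; keep the low-frequency block as the feature vector."""
    coeffs = dctn(gei, norm="ortho")
    return coeffs[:block, :block].ravel()


def evaluate(train_seqs, train_ages, test_seqs, test_ages):
    """Train an MLP regressor on DCT-of-GEI features and report MAE in years."""
    x_train = np.stack([dct_features(gait_energy_image(s)) for s in train_seqs])
    x_test = np.stack([dct_features(gait_energy_image(s)) for s in test_seqs])
    mlp = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
    mlp.fit(x_train, train_ages)
    return mean_absolute_error(test_ages, mlp.predict(x_test))
```

Retaining only a small block of low-frequency DCT coefficients keeps the feature vector short, which is consistent with the lightweight approach stated in the Objective.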