Abstract
Human pose estimation is a prominent area of research in computer vision and sensing. Recent advances in the field have benefited applications in sports, surveillance, healthcare, and related domains. Yoga is an ancient discipline intended to improve physical, mental, and spiritual wellbeing, and it comprises many kinds of asanas, or postures, that a practitioner can perform. The benefits of pose estimation can therefore be extended to Yoga to help users assume postures more accurately: a practitioner's current posture can be detected in real time, and the pose estimation method can provide corrective feedback when mistakes are made. Yoga pose estimation can also support remote instruction by an expert teacher, which is especially valuable during a pandemic. This paper reviews recent research and the Machine Learning and Artificial Intelligence techniques available for real-time pose estimation. We classify these techniques by the input modality they use to estimate an individual's pose, examine several Yoga posture estimation systems in detail, and survey the keypoint estimation techniques most commonly used in the existing literature. We also assess the real-time performance of the presented works, and further discuss the datasets and evaluation metrics available for pose estimation.
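To make the idea of keypoint-based corrective feedback concrete, the sketch below is an illustration only, not a system from the surveyed works. It assumes MediaPipe Pose and OpenCV are available, reads frames from a webcam, estimates body keypoints, computes the left-elbow joint angle, and overlays a correction message; the chosen landmarks, the 150-degree threshold, and the feedback text are all illustrative assumptions rather than recommendations from this review.

```python
# Minimal sketch of real-time keypoint-based posture feedback (illustrative only).
import cv2
import numpy as np
import mediapipe as mp

mp_pose = mp.solutions.pose

def joint_angle(a, b, c):
    """Angle at keypoint b (degrees) formed by the segments b-a and b-c."""
    a, b, c = np.array(a), np.array(b), np.array(c)
    cosine = np.dot(a - b, c - b) / (np.linalg.norm(a - b) * np.linalg.norm(c - b) + 1e-6)
    return np.degrees(np.arccos(np.clip(cosine, -1.0, 1.0)))

cap = cv2.VideoCapture(0)  # RGB webcam input modality
with mp_pose.Pose(min_detection_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            lm = results.pose_landmarks.landmark
            get = lambda p: (lm[p].x, lm[p].y)
            elbow = joint_angle(get(mp_pose.PoseLandmark.LEFT_SHOULDER),
                                get(mp_pose.PoseLandmark.LEFT_ELBOW),
                                get(mp_pose.PoseLandmark.LEFT_WRIST))
            # Corrective feedback: flag a bent arm when the target asana needs it straight
            # (150 degrees is an arbitrary, illustrative threshold).
            if elbow < 150:
                cv2.putText(frame, "Straighten your left arm", (30, 40),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
        cv2.imshow("Yoga pose feedback", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

The same angle-thresholding pattern could be repeated for other joints and asanas, but the surveyed systems differ in how they define and evaluate such corrections.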
Graphical Abstract