
International Journal of Sensors, Wireless Communications and Control

ISSN (Print): 2210-3279
ISSN (Online): 2210-3287

Systematic Review Article

Knowledge Mapping of Human Activity Recognition Techniques for Assistive Living

Author(s): Preeti Agarwal* and Mansaf Alam

Volume 13, Issue 4, 2023

Published on: 18 September, 2023

Pages: 203-225 (23 pages)

DOI: 10.2174/2210327913666230911113149


Abstract

Purpose: Human Activity Recognition (HAR) is a field of research concerned with identifying an individual's activities for assistive living. The proliferation of ICT and sensor technology has enabled HAR to flourish, with a wide range of human-centric applications. Developing an accurate HAR system involves complex statistical and computational tasks, from signal acquisition to activity classification. This research aims to conduct a systematic review of recent techniques proposed for each stage of HAR application development.

Methodology: The review was conducted following Kitchenham's principles, using the Scopus and Web of Science databases. First, research questions were formulated, followed by the definition of a search strategy. Based on the assessment criteria, 193 papers were shortlisted and thoroughly analyzed to extract research-related information.

Results: The techniques identified in the 193 articles are comprehensively mapped along four aspects: data acquisition, data preprocessing and feature engineering, learning algorithms, and evaluation. Each technique is examined for its strengths and limitations to help application developers select the one best suited to their needs. Prevailing challenges and upcoming research opportunities are also thoroughly explored.
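The four-stage pipeline mapped above (acquisition, preprocessing/feature engineering, learning, evaluation) can be illustrated with a minimal, self-contained sketch. The window size, hand-crafted features, synthetic signals, and nearest-centroid classifier below are illustrative assumptions for exposition only, not techniques endorsed by the review:

```python
import math
import statistics
from collections import defaultdict

def sliding_windows(signal, size, step):
    """Preprocessing: segment a 1-D signal into fixed-size overlapping windows."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

def extract_features(window):
    """Feature engineering: common time-domain statistics per window."""
    mean = statistics.fmean(window)
    std = statistics.pstdev(window)
    energy = sum(x * x for x in window) / len(window)
    return (mean, std, energy)

class NearestCentroid:
    """Learning: a toy classifier standing in for the learning-algorithm stage."""
    def fit(self, X, y):
        groups = defaultdict(list)
        for feats, label in zip(X, y):
            groups[label].append(feats)
        # One centroid per activity label, averaged feature-wise.
        self.centroids = {
            label: tuple(statistics.fmean(col) for col in zip(*rows))
            for label, rows in groups.items()
        }
        return self

    def predict(self, X):
        return [min(self.centroids, key=lambda lbl: math.dist(f, self.centroids[lbl]))
                for f in X]

# Acquisition (simulated): accelerometer magnitudes for two activities --
# low variance for "still", oscillatory for "walking".
still = [1.0 + 0.01 * ((i % 3) - 1) for i in range(128)]
walking = [1.0 + 0.5 * math.sin(i / 2.0) for i in range(128)]

X, y = [], []
for sig, label in [(still, "still"), (walking, "walking")]:
    for w in sliding_windows(sig, size=32, step=16):
        X.append(extract_features(w))
        y.append(label)

# Evaluation (trivial here): classify one window of each activity.
clf = NearestCentroid().fit(X, y)
print(clf.predict([extract_features(walking[:32])])[0])  # "walking"
```

In a real system each stage would be swapped for the techniques surveyed in the review (e.g. deep feature learning in place of hand-crafted statistics); the sketch only fixes the data flow between stages.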

Conclusion: The ever-expanding literature in the field necessitates an up-to-date account of HAR research. Unlike other reviews that focus on specific methods, fields of application, or data types, this is, to the best of our knowledge, the first evaluation of its kind to provide a broader mapping of HAR approaches. The findings of this analysis offer researchers and newcomers an up-to-date, holistic view of the complete body of work in this area.

Graphical Abstract


© 2024 Bentham Science Publishers