Abstract
Introduction: Position and pose measurement plays an important role in the rehabilitation movement of patients, so non-contact, real-time measurement of the robot's position and pose is of great significance. Because rehabilitation training is a relatively complicated process, it is essential to monitor the training process of the rehabilitation robot in real time and with high accuracy. Deep learning methods are well suited to monitoring the state of the rehabilitation robot.
Methods: The structure sketch and the 3D model of the 3-PRS ankle rehabilitation robot are established, and the mechanism kinematics is analyzed to obtain the relationship between the driving inputs (the three slider heights) and the position and pose parameters. The position and pose measurement network consists of two stages: (1) a convolutional neural network (CNN) that measures the slider heights from the robot image, and (2) a backpropagation neural network (BPNN) that calculates the position and pose parameters from the measured slider heights. Because the slider heights vary continuously, a CNN with a regression output is proposed to measure them. The BPNN is trained on data generated by the inverse kinematics of the 3-PRS ankle rehabilitation robot and solves the forward kinematics for the position and pose, as sketched in the code below.
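For concreteness, the following is a minimal PyTorch sketch of how such a two-stage pipeline could be wired together. It is an illustration, not the paper's exact architecture: the 224x224 RGB input size, the layer widths, and the three-parameter pose output are assumptions introduced here.

```python
# Two-stage pipeline sketch: a regression CNN maps the robot image to the
# three slider heights, and a BPNN (small fully connected network) maps those
# heights to the position and pose parameters. All layer sizes and the 3-DOF
# pose output are illustrative assumptions, not the paper's design.
import torch
import torch.nn as nn

class SliderHeightCNN(nn.Module):
    """Stage 1: regression CNN, robot image -> three slider heights."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.regressor = nn.Linear(64, 3)  # continuous outputs, no softmax

    def forward(self, img):
        return self.regressor(self.features(img).flatten(1))

class PoseBPNN(nn.Module):
    """Stage 2: BPNN, slider heights -> position and pose (forward kinematics)."""
    def __init__(self, pose_dim=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, 64), nn.Tanh(),
            nn.Linear(64, 64), nn.Tanh(),
            nn.Linear(64, pose_dim),
        )

    def forward(self, heights):
        return self.net(heights)

# End-to-end measurement: image -> slider heights -> position and pose.
cnn, bpnn = SliderHeightCNN(), PoseBPNN()
image = torch.randn(1, 3, 224, 224)  # placeholder robot image
pose = bpnn(cnn(image))              # both stages are regression tasks
print(pose.shape)                    # torch.Size([1, 3])
```

In the setup described above, the BPNN would be trained on (slider height, pose) pairs generated by the robot's inverse kinematics, while the CNN is trained on labeled robot images; since both stages are regression tasks, a mean squared error loss is the natural choice for each.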
Results: The experimental results show that the regression CNN measures the slider heights accurately and the BPNN then recovers the corresponding position and pose, so the position and pose parameters are obtained directly from the robot image. Compared with traditional robot position and pose measurement methods, the proposed method offers significant advantages.
Conclusion: The proposed position and pose measurement method for the 3-PRS ankle rehabilitation robot not only reduces the experimental period and cost but also offers excellent timeliness and precision. It can help medical staff monitor the status of the rehabilitation robot and assist patients in rehabilitation training.
Discussion: The goal of this work is to construct a new position and pose detection network that combines a regression CNN with a BPNN. The main contribution is real-time measurement of the position and pose of the 3-PRS ankle rehabilitation robot, which improves measurement accuracy and the efficiency of medical staff.
Keywords: Position and pose, rehabilitation robot, regression CNN, BPNN, position and pose measurement, rehabilitation training.