
Towards Rehabilitation at Home After Total Knee Replacement

Wenbing Zhao, Shunkun Yang, and Xiong Luo
Department of Electrical Engineering and Computer Science, Cleveland State University, Cleveland, OH 44115, USA
School of Reliability and Systems Engineering, Beihang University, Beijing 100191, China
School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing 100083, China

Abstract

In this paper, we present the design and implementation of an avatar-based interactive system that facilitates rehabilitation for people who have undergone total knee replacement surgery. The system enables patients to carry out clinician-prescribed exercises more effectively in the home setting and helps improve accountability for both patients and clinicians. The primary sensing modality is the Microsoft Kinect sensor, a depth camera that comes with a Software Development Kit (SDK). The SDK gives software developers access to 3-dimensional skeleton joint positions, which significantly reduces the challenges in developing accurate motion tracking systems, especially for use at home. However, the Kinect sensor is not well-suited to tracking foot orientation and subtle foot movements. To overcome this limitation, we augment the system with a commercial off-the-shelf Inertial Measurement Unit (IMU). The two sensing modalities are integrated such that the Kinect serves as the primary modality and the IMU is used for exercises where the Kinect fails to produce accurate measurements. In this pilot study, we experiment with four rehabilitation exercises, namely, the quad set, side-lying hip abduction, straight leg raise, and ankle pump. The Kinect is used to assess the first three exercises, and the IMU is used to assess the ankle pump exercise.
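To make the angle-based assessment described above concrete, the sketch below computes a joint angle from three 3-dimensional skeleton joint positions (for example, hip, knee, and ankle for the quad set and straight leg raise) and approximates foot pitch from a static IMU accelerometer reading as a stand-in for the ankle pump measurement. This is a minimal illustration, not the authors' implementation; the function names, the sample joint coordinates, and the accelerometer-only pitch estimate are assumptions made for the example.

    import numpy as np

    def joint_angle(a, b, c):
        # Angle (degrees) at joint b formed by 3D points a-b-c,
        # e.g., hip-knee-ankle for knee extension assessment.
        v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
        v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

    def foot_pitch(ax, ay, az):
        # Approximate foot pitch (degrees) from a static accelerometer
        # reading; a simplified stand-in for IMU-based foot orientation.
        return np.degrees(np.arctan2(-ax, np.hypot(ay, az)))

    # Hypothetical Kinect joint positions in meters (x, y, z).
    hip, knee, ankle = (0.10, 0.90, 2.00), (0.10, 0.50, 2.10), (0.10, 0.10, 2.00)
    print(round(joint_angle(hip, knee, ankle), 1))  # ~151.9, a nearly extended knee

One plausible way to connect such measurements to the range-of-motion and repetition-count assessments named in the keywords is to track the peak angle reached within a movement and to count a repetition whenever the angle crosses an exercise-specific threshold and returns.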

Keywords: virtual reality, rehabilitation, physical therapy, total knee replacement, avatar, repetition count, range of motion


Publication history

Received: 08 August 2020
Accepted: 08 September 2020
Published: 09 June 2021
Issue date: December 2021

Copyright

© The author(s) 2021.

Rights and permissions

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
