The protocol described here aims to enhance the quantitative evaluation of upper limb deficits, with the goal of developing additional technology for remote assessment both in the clinic and at home. Virtual reality and biosensor technologies are combined with standard clinical techniques to provide insights into the functioning of the neuromuscular system.
The ability to move allows us to interact with the world. When this ability is impaired, it can significantly reduce one's quality of life and independence and may lead to complications. The importance of remote patient evaluation and rehabilitation has recently grown due to limited access to in-person services. For example, the COVID-19 pandemic unexpectedly resulted in strict regulations, reducing access to non-emergent healthcare services. Additionally, remote care offers an opportunity to address healthcare disparities in rural, underserved, and low-income areas where access to services remains limited.
Improving accessibility through remote care options would limit the number of hospital or specialist visits and render routine care more affordable. Finally, the use of readily available commercial consumer electronics for at-home care can enhance patient outcomes due to improved quantitative observation of symptoms, treatment efficacy, and therapy dosage. While remote care is a promising means to address these issues, there is a crucial need to quantitatively characterize motor impairment for such applications. The following protocol seeks to address this knowledge gap to enable clinicians and researchers to obtain high-resolution data on complex movement and underlying muscle activity. The ultimate goal is to develop a protocol for remote administration of functional clinical tests.
Here, participants were instructed to perform a medically-inspired Box and Block task (BBT), which is frequently used to assess hand function. This task requires subjects to transport standardized cubes between two compartments separated by a barrier. We implemented a modified BBT in virtual reality to demonstrate the potential of developing remote assessment protocols. Muscle activation was captured for each subject using surface electromyography. This protocol allowed for the acquisition of high-quality data to better characterize movement impairment in a detailed and quantitative manner. Ultimately, these data have the potential to be used to develop protocols for virtual rehabilitation and remote patient monitoring.
Movement is how we interact with the world. While everyday activities such as picking up a glass of water or walking to work may seem simple, even these movements rely on complex signaling between the central nervous system, muscles, and limbs1. As such, personal independence and quality of life are highly correlated to the level of an individual's limb function2,3. Neurological damage, as in spinal cord injury (SCI) or peripheral nerve injury, can result in permanent motor deficits, thereby diminishing one's ability to execute even simple activities of daily living4,5. According to the National Institute of Neurological Disorders and Stroke, over 100 million people in the United States experience motor deficits, with stroke as one of the leading causes6,7,8. Due to the nature of these injuries, patients often require prolonged care in which quantitative motor assessment and remote treatment may be beneficial.
Current practices for treating movement disorders often require both initial and ongoing clinical assessment of function through observation by trained experts such as physical or occupational therapists. Standard validated clinical tests often require trained professionals to administer them, with specific time constraints and subjective scoring of predefined movements or functional tasks. However, even in healthy individuals, identical movements can be accomplished with varying combinations of joint angles. This concept is termed musculoskeletal redundancy.
Functional clinical tests often do not account for the individual redundancy underlying inter-subject variability. For clinicians and researchers alike, distinguishing between normal variability caused by redundancy and pathological changes in movement remains a challenge. Standardized clinical assessments performed by well-trained raters utilize low-resolution scoring systems to reduce inter-rater variability and improve test validity. However, this introduces ceiling effects, thus lowering the sensitivity and predictive validity for subjects who may have mild movement deficits9,10. Furthermore, these clinical tests cannot differentiate whether deficits are caused by passive body mechanics or active muscle coordination, a distinction that may be important during initial diagnosis and when designing a patient-specific rehabilitation plan. Randomized clinical trials have revealed inconsistent efficacy of treatment plans formulated based on evidence provided by these clinical tests11,12,13. Several studies have emphasized the need for quantitative, user-friendly clinical metrics that may be used to guide the design of future interventions14,15.
In previous studies, we demonstrated the implementation of automated movement assessment using readily available consumer motion capture devices in post-stroke arm impairment, as well as the evaluation of shoulder function after chest surgery in breast cancer patients16,17. Additionally, we have shown that using active joint moments to estimate muscle moments of specific active movements is a more sensitive measure of motor deficits after stroke compared to joint angles18. Motion capture and surface electromyography (EMG) may therefore be of critical importance in the assessment of patients who are diagnosed as asymptomatic by standard clinical tests, but who may still be experiencing movement difficulties, fatigue, or pain. This paper describes a system that may enable detailed and quantitative characterization of movement during standard clinical tests for the future development of methods for at-home evaluation and rehabilitation in movement-impaired patient populations.
Virtual reality (VR) can be used to construct an immersive user experience while modeling everyday tasks. Typically, VR systems track the hand movements of the user to allow for simulated interactions with the virtual environment. The protocol we describe here uses consumer VR products for motion capture to quantify the assessment of motor deficits, similar to other studies demonstrating the use of off-the-shelf video game controllers in quantitative evaluation of impairment after stroke or shoulder surgery16,17. In addition, EMG is a non-invasive measure of the neural activity underlying muscular contraction19. As such, EMG may be used to indirectly evaluate the quality of the neural control of movement and provide a detailed assessment of motor function. Muscle and nerve damage may be detected by EMG, and disorders such as muscular dystrophy and cerebral palsy are commonly monitored using this technique20,21. Furthermore, EMG may be used to track changes in muscle strength or spasticity, which may not be evident in kinematic assessments22,23, as well as fatigue and muscle coactivation. Metrics such as these are critical in considering rehabilitation progress23,24,25.
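As an illustration of how such EMG-derived metrics might be computed offline, the following sketch estimates smoothed activation envelopes for an agonist-antagonist pair and a simple coactivation index; the channel names, filter settings, and 2,000 Hz sampling rate are illustrative assumptions rather than values prescribed by this protocol.

```matlab
% Minimal sketch: EMG envelopes and a coactivation index for one muscle pair.
% emgBiceps and emgTriceps are assumed raw EMG vectors from a single trial.
fs = 2000;                                            % assumed sampling rate (Hz)
[bBP, aBP] = butter(4, [20 450]/(fs/2), 'bandpass');  % band-pass for raw EMG
[bLP, aLP] = butter(4, 6/(fs/2), 'low');              % low-pass for the envelope

envelope      = @(x) filtfilt(bLP, aLP, abs(filtfilt(bBP, aBP, x(:))));
envAgonist    = envelope(emgBiceps);                  % rectified, smoothed activity
envAntagonist = envelope(emgTriceps);

% Coactivation index: shared activity relative to total activity (0 = none, 1 = full overlap)
coactivation = 2 * sum(min(envAgonist, envAntagonist)) / sum(envAgonist + envAntagonist);
```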
The experimental paradigm described here seeks to leverage a combination of VR and EMG to address the limitations of traditional clinical assessment tools. Here, participants were asked to perform a modified Box and Block task (BBT)26, both with real objects and in VR. The standard BBT is a clinical tool used in the general assessment of gross upper extremity function, in which subjects are asked to move as many 2.5 cm blocks as possible from one compartment, over a partition, to an adjoining compartment within one minute. While often used to reliably assess deficits in patients with stroke or other neuromuscular conditions (e.g., upper extremity paresis, spastic hemiplegia), normative data have also been reported for healthy children and adults, ages 6-8926. A virtual movement assessment is used to simulate functional aspects of the validated clinical test performed in real life. VR is used here to decrease required hardware while allowing for the provision of standardized instructions and programmed, automated scoring. As such, constant supervision by trained professionals would no longer be necessary.
The BBT in this study has been simplified to focus on capturing the reaching and grasping of one block at a time that appears in the same location. This maximized the reproducibility of the movements and minimized the inter-subject variability in recorded data. Lastly, virtual reality headsets can be purchased for as little as $300 and have the potential to house multiple assessments. Once programmed, this would significantly decrease the cost associated with typical professional evaluation and allow for increased accessibility of these standard, validated clinical tests in both clinical and remote/at-home settings.
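To make the idea of programmed, automated scoring concrete, the sketch below counts successful block transfers from a logged event table; the log format (columns time, type, and side) and field values are hypothetical assumptions for illustration, since the actual task logic runs inside the VR application.

```matlab
% Minimal sketch of automated modified-BBT scoring from a hypothetical event log.
% eventLog is assumed to be a MATLAB table with columns time (s), type, and side.
trialDuration = 60;                                   % standard BBT window (s)

isRelease = strcmp(eventLog.type, 'block_released');  % block released by the virtual hand
inTime    = eventLog.time <= trialDuration;           % within the one-minute window
onTarget  = strcmp(eventLog.side, 'target');          % released over the far compartment

score = sum(isRelease & inTime & onTarget);           % blocks successfully transferred
fprintf('Modified BBT score: %d blocks/min\n', score);
```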
Experimental procedures were approved by the West Virginia University Institutional Review Board (IRB), protocol # 1311129283, and adhered to the principles of the Declaration of Helsinki. Risks from this protocol are minor, but all procedures and potential risks must be explained to participants; written informed consent was obtained using documentation approved by the institutional ethical review board.
1. System characteristics and design
NOTE: The setup for this protocol consists of the following elements: (1) EMG sensors and base, (2) EMG data acquisition (DAQ) software, (3) a motion capture system, and (4) a VR headset with corresponding software. These components are visualized in Figure 1.
Figure 1: Experimental equipment setup. (A) The marker motion capture cameras are positioned on the floor and in the ceiling around the experimental space, establishing an optimal space for tracking motion. A dedicated computer is used to run the motion capture software and save the data. (B) The headset used to display the modified BBT in VR is connected to a dedicated computer where the virtual assessment and task data are saved. (C) The EMG base is connected to a dedicated computer where muscle activity data is recorded and saved during the task execution. EMG sensors and LED markers for motion capture are both placed on the subject's arm during the session (see Figure 2). Abbreviations: VR = virtual reality; EMG = electromyography.
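Because the EMG, motion capture, and VR task data are recorded on separate dedicated computers, the streams generally must be brought onto a common timebase before joint analysis. A minimal sketch is shown below, assuming the recordings share a common start event and differ only in sampling rate; the variable names and rates are illustrative, and the actual synchronization method used in a given setup (e.g., a shared trigger) may differ.

```matlab
% Minimal sketch: resampling marker trajectories onto the EMG timebase so the
% two streams can be analyzed together. Assumes both recordings start at the
% same event; emg is channels x samples, markerXYZ is frames x 3.
fsEmg   = 2000;                                  % assumed EMG sampling rate (Hz)
fsMocap = 120;                                   % assumed motion capture rate (Hz)

tEmg   = (0:size(emg, 2) - 1) / fsEmg;           % EMG time vector (s)
tMocap = (0:size(markerXYZ, 1) - 1) / fsMocap;   % marker time vector (s)

% Linearly interpolate each marker coordinate onto the EMG time vector
markerOnEmgTime = interp1(tMocap, markerXYZ, tEmg, 'linear', 'extrap');
```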
2. Experimental procedures
NOTE: A visual representation of the experimental flow described in this protocol is shown in Figure 2.
Figure 2: Experimental protocol, VR task, and subject setup. (A) Flow diagram describing the experimental protocol used here. (B) Example view of modified BBT implemented in VR environment. Anatomical measurements are used to calibrate the VR task, ensuring that the virtual table spawns at the correct relative location. (C) Placement of LED motion capture markers and EMG sensors on the subject. EMG sensors are placed on the muscles of interest and LED motion capture markers are positioned over bony landmarks. Abbreviations: VR = virtual reality; EMG = electromyography; LED = light-emitting diode.
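As an illustration of how anatomical measurements might be translated into a spawn location for the virtual table, a minimal sketch is given below; the specific measurements, scaling factors, and coordinate convention are assumptions for illustration, not the calibration implemented in the VR application.

```matlab
% Minimal sketch: placing the virtual BBT table based on anatomical measurements.
% The measurements, scaling factors, and coordinate convention are illustrative
% assumptions; the VR application performs its own calibration internally.
seatedShoulderHeight = 0.55;   % shoulder height above the seat (m), measured per subject
upperArmLength       = 0.33;   % acromion to lateral epicondyle (m)
forearmLength        = 0.27;   % lateral epicondyle to wrist (m)

reach         = upperArmLength + forearmLength;     % approximate functional reach
tableHeight   = seatedShoulderHeight - 0.55*reach;  % table surface below shoulder level
tableDistance = 0.75*reach;                         % keep blocks within comfortable reach

spawnPosition = [0, tableDistance, tableHeight];    % assumed [x y z] relative to the seat origin
```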
EMG, kinematic, and force data obtained from subjects using this protocol can be used to characterize movements across repetitions of the same task, as well as during different tasks. Data shown here represent results from healthy control participants to demonstrate the feasibility of this setup. Representative EMG profiles recorded from a healthy subject performing the modified BBT in VR are shown in Figure 3. High muscle activation of the anterior deltoid (DELT_A) can be seen, suggesting t...
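One common way to characterize such data across repetitions is to time-normalize each reach-and-grasp cycle and average the resulting EMG envelopes. The sketch below illustrates this step, assuming envelopes and movement onset/offset samples have already been extracted; all variable names are illustrative.

```matlab
% Minimal sketch: time-normalizing an EMG envelope over repeated reaches so that
% trials of different durations can be averaged and compared.
% env is an assumed envelope vector; onsets/offsets are assumed sample indices.
nPoints    = 101;                                % 0-100% of the movement cycle
normalized = zeros(numel(onsets), nPoints);

for k = 1:numel(onsets)
    segment = env(onsets(k):offsets(k));         % one reach-and-grasp repetition
    normalized(k, :) = interp1(linspace(0, 1, numel(segment)), ...
                               segment, linspace(0, 1, nPoints));
end

meanProfile = mean(normalized, 1);               % average activation profile
stdProfile  = std(normalized, 0, 1);             % variability across repetitions
```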
EMG system
The hardware of the EMG system consists of 15 EMG sensors used to obtain muscle activation data. A commercially available Application Programming Interface (API) was used to generate custom EMG recording software. The VR system hardware consists of a virtual reality headset used to display the immersive VR environment and a cable to link the headset to the dedicated computer where the virtual assessment task is stored. The software consists of 3D computer graphics software to create and ...
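As a schematic of how custom recording software built on a commercial API might be organized, a minimal acquisition loop is sketched below; connectToEmgBase, readEmgFrame, and disconnectEmgBase are hypothetical placeholder functions standing in for vendor-specific calls, not functions from the actual API.

```matlab
% Schematic acquisition loop for custom EMG recording software.
% connectToEmgBase, readEmgFrame, and disconnectEmgBase are hypothetical
% placeholders standing in for vendor-specific API calls.
nChannels  = 15;                        % number of EMG sensors in this setup
recordTime = 60;                        % seconds to record (one trial)
fs         = 2000;                      % assumed sampling rate (Hz)
buffer     = zeros(nChannels, 0);       % growing buffer of EMG samples

hBase  = connectToEmgBase();            % open a connection to the base station
tStart = tic;
while toc(tStart) < recordTime
    frame  = readEmgFrame(hBase, nChannels);   % latest block of samples (channels x n)
    buffer = [buffer, frame];                  %#ok<AGROW> append to the buffer
end
disconnectEmgBase(hBase);

save('emg_trial.mat', 'buffer', 'fs');  % store raw data for offline analysis
```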
The authors have no conflicts of interest to declare.
This work was supported by the Office of the Assistant Secretary of Defense for Health Affairs through the Restoring Warfighters with Neuromusculoskeletal Injuries Research Program (RESTORE) under Award No. W81XWH-21-1-0138. Opinions, interpretations, conclusions, and recommendations are those of the authors and are not necessarily endorsed by the Department of Defense.
Name | Company | Catalog Number | Comments
Armless Chair | N/A | | A chair for subjects to sit in should be armless so that their arms are not interfered with.
Computer | Dell Technologies | | Three computers were used to accompany the data acquisition equipment.
Leap Motion Controller | Ultraleap | | Optical hand tracking module that captures hand and finger movement. The controller has two 640 x 240-pixel near-infrared cameras (120 Hz), capable of tracking movement up to 60 cm from the device within a 140 x 120° field of view. This device was attached to the VR headset or secured above the head during movement.
MATLAB | MathWorks, Inc. | | Programming platform used to develop custom data acquisition software.
Oculus Quest 2 | Meta | | Immersive virtual reality headset with hand tracking through four built-in infrared cameras (72-120 Hz). Can be substituted with other similar devices (e.g., HTC Vive, HP Reverb, PlayStation VR).
Oculus Quest 2 Link cable | Meta | | Used to connect the headset to the computer where the VR game was stored.
PhaseSpace Motion Capture | PhaseSpace, Inc. | | Markered motion capture system consisting of a server, cameras with a 60° field of view, red light-emitting diode (LED) markers, and a calibration object.
Trigno Wireless System | Delsys, Inc. | | Wireless system that includes EMG sensors, accelerometers, force sensors, a base station, and collection software. The Trigno-MATLAB Application Programming Interface (API) was used to develop custom recording software.
Unreal Engine 4 | Epic Games | | Software used to create and run the modified Box and Block Task in VR.