Published on 09.03.2022 in Vol 9, No 1 (2022): Jan-Mar

Remote Assessments of Hand Function in Neurological Disorders: Systematic Review

Authors of this article:

Arpita Gopal1; Wan-Yu Hsu1; Diane D Allen2; Riley Bove1

Review

1Weill Institute of Neurosciences, University of California San Francisco, San Francisco, CA, United States

2Department of Physical Therapy and Rehabilitation Science, University of California San Francisco/San Francisco State University, San Francisco, CA, United States

Corresponding Author:

Arpita Gopal, BSc, DPT

Weill Institute of Neurosciences

University of California San Francisco

1651 4th Street

Room 622A

San Francisco, CA, 94143

United States

Phone: 1 415 353 8903

Fax: 1 415 353 2633

Email: arpita.gopal@ucsf.edu


Background: Loss of fine motor skills is observed in many neurological diseases. Hand function can be assessed regularly to monitor this loss in people with central nervous system disorders; however, in-clinic assessments pose logistical challenges. Remotely assessing hand function could facilitate ongoing monitoring and support early diagnosis and intervention when warranted.

Objective: This study aims to systematically review the existing evidence regarding remote assessment of hand function in populations with chronic neurological dysfunction.

Methods: PubMed and MEDLINE, CINAHL, Web of Science, and Embase were searched for studies that reported remote assessment of hand function (ie, outside of traditional in-person clinical settings) in adults with chronic central nervous system disorders. We excluded studies that included participants with orthopedic upper limb dysfunction or used tools for intervention and treatment. We extracted data on the evaluated hand function domains, validity and reliability, feasibility, and stage of development.

Results: In total, 74 studies met the inclusion criteria, covering Parkinson disease (n=57, 77%), stroke (n=9, 12%), multiple sclerosis (n=6, 8%), spinal cord injury (n=1, 1%), and amyotrophic lateral sclerosis (n=1, 1%). Three assessment modalities were identified: external device (eg, wrist-worn accelerometer), smartphone or tablet, and telerehabilitation. The feasibility and overall participant acceptability were high. The most commonly assessed hand function domains were finger tapping speed (fine motor control and rigidity), hand tremor (pharmacological and rehabilitation efficacy), finger dexterity (manipulation of small objects required for daily tasks), and handwriting (coordination). Although validity and reliability data were heterogeneous across studies, statistically significant correlations with traditional in-clinic metrics were most commonly reported for telerehabilitation and smartphone or tablet apps. The most readily implementable assessments were smartphone or tablet-based.

Conclusions: The findings show that remote assessment of hand function is feasible in neurological disorders. Although varied, the assessments allow clinicians to objectively record performance in multiple hand function domains, improving the reliability of traditional in-clinic assessments. Remote assessments, particularly via telerehabilitation and smartphone- or tablet-based apps that align with in-clinic metrics, facilitate clinic to home transitions, have few barriers to implementation, and prompt remote identification and treatment of hand function impairments.

JMIR Rehabil Assist Technol 2022;9(1):e33157

doi:10.2196/33157


Background

Normally functioning human hands allow everyday participation in self-care, work, and leisure activities that involve precise grip and object manipulation [1]. Specifically, daily activities and fine motor tasks require finger dexterity, thumb-finger opposition, and hand opening-closing, which adapt to task requirements, including those needed to navigate the digital world [2]. Unfortunately, chronic disorders of the central nervous system (CNS) can impair hand function even during the early stages of the disease [3]. Damage to the CNS, including the spinal cord, can result in tremor, spasticity, sensory loss, weakness, and coordination loss in the upper limbs, which can negatively impact the ability to adapt to task requirements, thus limiting independence in activities of daily living (ADL) and quality of life [3]. For example, most individuals with Parkinson disease (PD) develop hand tremors over the course of the disorder, leading to difficulty with precise finger and hand movements [4]. In addition, ischemic strokes occur most commonly in the cortical regions supplied by the middle cerebral artery [5], affecting areas of the motor and sensory cortices responsible for the fine motor activity of the hands [6]. In these disorders and others, evaluating hand function at regular intervals can detect changes signaling neurological decline, or monitor response to disease-modifying therapies, symptomatic therapies, or rehabilitation.

Although assessments of hand function are routinely performed in clinics, clinicians have an increasing interest in deploying tools to measure hand function remotely. In-home remote monitoring of function, in general, provides benefits to patients by increasing convenience, reducing travel, and providing the ability to capture data more frequently. Over the past decade, many studies have examined remote monitoring devices in neurological and nonneurological populations [7,8]. For example, in multiple sclerosis (MS), studies have shown that continuous remote monitoring of ambulatory step count can capture—and even predict—changes in MS-related disability and can serve as a longitudinal outcome measure for targeted interventions [9,10]. To date, reviews have mainly focused on lower extremity function or overall physical activity [11]; in fact, the methodological discrepancies in remote device use and reporting regarding hand function have yielded conflicting results in terms of validity, reliability, and ease of clinical use.

Objectives

In this systematic review, we evaluate the existing evidence regarding remote assessment devices for hand function in populations with chronic CNS disorders. We specifically examine evidence of validity, reliability, and feasibility for each domain of hand function and the stage of development of the assessments. Our findings are expected to facilitate ready implementation of remote assessment of hand function in prevalent neurological disorders.


Eligibility Criteria

This review was structured using the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) [12] framework. Studies were included based on the following criteria: (1) participants had chronic neurological pathologies of the CNS, (2) participants were aged ≥18 years, (3) the studies were peer reviewed and original, (4) the studies were designed to objectively assess hand function, and (5) the assessments were deployable remotely (ie, outside of traditional in-person clinical settings). Studies were excluded if they were (1) conducted in participants with orthopedic impairments of the wrist or hand, (2) conducted in nonhuman primates, (3) designed as an intervention to improve an aspect of hand function (as the intent was to focus on assessment tools rather than a change of function), or (4) not published in English.

Search Procedures

A literature search was performed using the following databases: PubMed and MEDLINE, CINAHL, Web of Science, and Embase. The search was conducted using both Medical Subject Heading terms and the following keywords independently and in combination: remote, assessment, outcome, test, measurement, hand, upper extremity, arm, and function. Independently, 2 researchers (AG and WYH) assessed articles for relevance and adherence to the eligibility criteria. The included studies were also searched recursively to identify articles they cited and articles citing them.
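For illustration only, the keyword combinations above can be assembled into Boolean query strings. The sketch below shows one way to do this in Python; the exact syntax submitted to each database (and the fielded MeSH tags) is not reproduced here and is an assumption.

```python
# Illustrative assembly of the reported keywords into Boolean search strings.
# Database-specific field tags and MeSH syntax are omitted (assumption).
from itertools import product

concept_terms = ["remote", "assessment", "outcome", "test", "measurement"]
anatomy_terms = ["hand", "upper extremity", "arm", "function"]

# Pairwise combinations, eg '"remote" AND "hand"'.
pairwise_queries = [f'"{a}" AND "{b}"' for a, b in product(concept_terms, anatomy_terms)]

# A single combined query using OR within each concept block.
combined_query = (
    "(" + " OR ".join(f'"{t}"' for t in concept_terms) + ")"
    + " AND "
    + "(" + " OR ".join(f'"{t}"' for t in anatomy_terms) + ")"
)
print(combined_query)
```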

Data Extraction and Categorization

To evaluate the methodological quality of the included studies, we used the National Institutes of Health quality assessment for observational cohort and cross-sectional studies [13]. Each study was evaluated according to 8 criteria. The overall study quality was rated as good (≥6 criteria met), fair (4-5 criteria met), or poor (≤3 criteria met).
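As a minimal sketch of the rating rule described above (assuming the thresholds as stated), the mapping from the number of criteria met to the overall rating can be written as:

```python
def quality_rating(criteria_met: int) -> str:
    """Map the number of NIH quality-assessment criteria met (0-8) to the
    overall rating used in this review: good (6-8), fair (4-5), poor (0-3)."""
    if criteria_met >= 6:
        return "good"
    if criteria_met >= 4:
        return "fair"
    return "poor"

# Quick checks of the thresholds.
assert quality_rating(7) == "good"
assert quality_rating(4) == "fair"
assert quality_rating(3) == "poor"
```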

The data were extracted (AG) and checked (WYH); discrepancies were resolved through discussion with the senior author (RB). The variables of interest included participant demographics, study design and duration, device type and modality, disease-specific severity levels, comparison assessments, and stage of development and implementation (to understand whether assessments were currently available for use). Participant satisfaction with the study protocol and assessment and time taken to complete the novel assessment were extracted when available. Extracted statistical data included concurrent validity (defined as the comparison between a new test and a well-established one [14]) and reliability (defined as a measure of stability or consistency [15]).
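To illustrate the two statistics extracted, the sketch below computes concurrent validity and test-retest stability for hypothetical paired scores. The data and variable names are invented, and many included studies reported other coefficients (eg, intraclass correlation coefficients) rather than the ones shown.

```python
# Illustrative sketch (not the included studies' analysis code): concurrent
# validity as the correlation between a remote score and an established
# in-clinic score, and test-retest reliability as the correlation between
# two administrations of the remote assessment. All values are hypothetical.
from scipy.stats import spearmanr, pearsonr

remote_scores = [12.0, 15.5, 9.0, 20.0, 17.5, 11.0]      # hypothetical remote metric
in_clinic_scores = [14.0, 16.0, 10.0, 22.0, 18.0, 12.5]  # hypothetical in-clinic metric
remote_retest = [12.5, 15.0, 9.5, 19.5, 18.0, 11.5]      # hypothetical repeat session

rho, p_validity = spearmanr(remote_scores, in_clinic_scores)   # concurrent validity
r, p_reliability = pearsonr(remote_scores, remote_retest)      # test-retest stability
print(f"concurrent validity: rho={rho:.2f} (P={p_validity:.3f})")
print(f"test-retest reliability: r={r:.2f} (P={p_reliability:.3f})")
```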

The selected studies evaluated many variables relating to hand function. To compare the most salient domains across studies, we classified assessments into the following hand function domains based on the Functional Repertoire of the Hand established by the American Journal of Occupational Therapy [16]: (1) finger tapping, which is the speed and accuracy of finger taps onto a prespecified target; (2) whole hand grasp, which is the range of motion and coordination of full hand movement; (3) pincer grasp, which is the range of motion and coordination of thumb to index finger movement; (4) hand tremor, which is the quantification of tremor distal to the wrist at rest; (5) reaction time, which is the time taken to respond to a predetermined stimulus using only fingers; (6) pinch and grip strength, which is the quantification of the maximum pinching and gripping strength; (7) finger dexterity, which is the in-hand manipulation of an object; (8) handwriting, which is the clarity and accuracy in drawing or writing; (9) ADL, encompassing tasks required for self-care independence [17]; and (10) instrumental ADL (IADL), encompassing tasks required for household or community-level independence [18].
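As an illustration of how these 10 domains could be encoded for structured data extraction, the following sketch defines them as an enumeration; the identifiers and short descriptions are our own labels, not part of the source taxonomy.

```python
from enum import Enum

class HandFunctionDomain(Enum):
    """The 10 domains used to classify outcomes in this review.
    Identifiers are illustrative labels only."""
    FINGER_TAPPING = "speed and accuracy of finger taps onto a target"
    WHOLE_HAND_GRASP = "range of motion and coordination of full hand movement"
    PINCER_GRASP = "range of motion and coordination of thumb-to-index movement"
    HAND_TREMOR = "tremor distal to the wrist at rest"
    REACTION_TIME = "time to respond to a stimulus using only the fingers"
    PINCH_AND_GRIP_STRENGTH = "maximum pinching and gripping strength"
    FINGER_DEXTERITY = "in-hand manipulation of an object"
    HANDWRITING = "clarity and accuracy in drawing or writing"
    ADL = "tasks required for self-care independence"
    IADL = "tasks required for household or community-level independence"

# Example: tagging a hypothetical extracted outcome with its domain.
extracted_outcome = {"study": "example", "metric": "taps per 30 s",
                     "domain": HandFunctionDomain.FINGER_TAPPING}
```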


Search Strategy

A search of databases in June 2021 identified 1295 studies, and 33 additional studies were identified through recursive searches. After title and abstract screening and removal of duplicates, 9.42% (122/1295) of studies remained, and the full texts were assessed for eligibility based on the inclusion and exclusion criteria. Approximately 41% (50/122) of full-text studies were excluded for not meeting the inclusion criteria. The final 74 studies were confirmed by a second reviewer (WYH) to have met all eligibility criteria. The PRISMA diagram of the search process is outlined in Figure 1, and individual studies are summarized in Multimedia Appendix 1 [19-90]. Of the 74 studies reviewed, 49 (66%) were rated good in terms of overall methodological quality, 14 (19%) were rated fair, and 9 (12%) were rated poor. Study quality is summarized in Multimedia Appendix 2 [19-90].

Figure 1. PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) diagram outlining study selection.

Modalities of Hand Function Assessment

Across the included studies, 3 different modalities of assessment devices were used, summarized in Multimedia Appendix 1. The most frequently used assessment was an external device specific to hand assessment, with the most common types being wrist-worn accelerometers [19-37] and specialized keyboards [38-47]. These designated external devices allowed the collection of information on reaction time, finger tapping speed, and finger dexterity. Although many study authors noted that their external devices were able to capture granular, specific data, many devices were developed under proprietary agreements and are not currently commercially available. The second most common type of assessment was generic smartphone- or tablet-based electronic devices adapted for hand assessment [48-59] or suites of assessments [60-66]. These assessments included an app designed to test finger tapping speed and the accuracy of drawing and tracing various shapes. Such apps facilitated the gathering of data on specific hand function domains at a relatively low cost for people who already had these electronic devices. Finally, 4% (3/74) of studies used telerehabilitation platforms to validate remote administration of well-established in-clinic assessments [67-69]. For example, Amano et al [69] validated the administration of the Fugl-Meyer Assessment and Action Research Arm Test (ARAT) via telehealth platforms, allowing clinical researchers to gather standardized outcome data through secure telehealth tools.
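To make the external-device modality concrete, the sketch below illustrates the general signal-processing approach behind wrist-worn accelerometer tremor metrics: band-pass filtering the acceleration magnitude around a typical parkinsonian rest-tremor band and summarizing its amplitude. It is not the algorithm of any included study, and the band limits, filter order, and sampling rate are assumptions.

```python
# Illustrative tremor-amplitude estimate from a wrist-worn accelerometer.
import numpy as np
from scipy.signal import butter, filtfilt

def tremor_amplitude(acc_xyz: np.ndarray, fs: float,
                     band: tuple = (3.5, 7.5)) -> float:
    """Root-mean-square amplitude of the tremor-band component.

    acc_xyz : (n_samples, 3) accelerometer signal in m/s^2
    fs      : sampling rate in Hz
    band    : assumed tremor frequency band in Hz
    """
    magnitude = np.linalg.norm(acc_xyz, axis=1)
    magnitude = magnitude - magnitude.mean()          # remove gravity/DC offset
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    tremor_component = filtfilt(b, a, magnitude)
    return float(np.sqrt(np.mean(tremor_component ** 2)))

# Hypothetical usage: 10 s of simulated 50 Hz data containing a 5 Hz tremor.
fs = 50.0
t = np.arange(0, 10, 1 / fs)
sim = np.column_stack([0.3 * np.sin(2 * np.pi * 5 * t)
                       + np.random.normal(0, 0.05, t.size) for _ in range(3)])
print(f"estimated tremor amplitude: {tremor_amplitude(sim, fs):.3f} m/s^2")
```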

Most of the included studies (51/74, 69%) performed same-day, cross-sectional validation experiments where participants completed the novel and comparative assessments at the same time point. However, 28% (21/74) of studies [23,25,36,37,42,46,47,61,63,64,66,70-80] remotely monitored participants’ hand function longitudinally. The duration of the remote monitoring period ranged from 3 days [37] to 3 years [77]. Participant retention and adherence were reported by 5% (4/74) of studies [61,66,75,76], all of which had >90% participant retention.

Target Population

The included studies targeted 5 populations of patients with neurological conditions. Most studies (57/74, 77%) included individuals with PD [37,45,65,67,68,70,73,81]. Other populations evaluated were those with stroke (9/74, 12%) [71,72,82] and MS (6/74, 8%) [59,66,91]. Spinal cord injury [83] and amyotrophic lateral sclerosis [47] were each examined in 1% (1/74) of studies.

Most included studies evaluated individuals with mild to moderate disease severity on average, as graded by established disease-specific metrics (eg, the Movement Disorder Society–Unified Parkinson’s Disease Rating Scale [MDS-UPDRS] and the Expanded Disability Status Scale for people with MS) [37,45,59,65,67,69,70,73,82]. Only 8% (6/74) of studies specified the inclusion criteria to limit recruitment to participants with mild to moderate disease severity [37,41,53,58,63,71].

Sample sizes in the experimental groups ranged from 1 participant (a case study) [26] to 495 participants [66]. Most studies (41/74, 55%) included control groups of healthy individuals or those with nonneurological conditions to determine the discriminant validity of the assessments (Multimedia Appendix 1).

Validity and Reliability

Validity data, although heterogeneous across studies, were reported by 73% (54/74) of studies in comparison with well-established in-clinic assessments (Table 1). Approximately 12% (9/74) of studies examining external devices reported high, statistically significant correlations with well-established assessments [19,20,47,50,52,72,73,83,91]. In addition, 8% (6/74) of studies using smartphone assessments [28,49,52,66,79,84] and 1% (1/74) of studies using telerehabilitation [69] found moderate to high, statistically significant correlations with well-established assessments.

Table 1. Validity and reliability.
Each entry lists the study, its comparison assessment (in parentheses, where reported), and the available validity and reliability statistics.

Adams [46]a
  Validity: Hand tremor (AUCb=0.76)
Aghanavesi et al [48] (comparison: MDS-UPDRSc)
  Validity: Finger tapping (r=0.23); Handwriting (r=0.46)
  Interrater reliability: Finger tapping (r=0.61); Handwriting (r=0.65)
Akram et al [38] (comparison: MDS-UPDRS)
  Validity: Finger tapping (r=−0.49; P<.001)
Albani et al [73] (comparison: MDS-UPDRS)
  Validity: Finger tapping (ICCd=0.73)
Amano et al [69] (comparison: in-clinic assessment)
  Validity: Finger dexterity (r=0.99); Whole hand grasp (r=0.99); Pincer grasp (r=0.99)
  Interrater reliability: Finger dexterity (r=0.99)
Arora et al [70] (comparison: MDS-UPDRS)
  Validity: Finger tapping (mean error of 1.26 UPDRSe points)
Arroyo-Gallego et al [49] (comparison: MDS-UPDRS)
  Validity: Finger tapping (AUC=0.85; P<.001)
Bazgir et al [50] (comparison: MDS-UPDRS)
  Validity: Hand tremor (97% accuracy)
Bochniewicz et al [82] (comparison: ARATf)
  Validity: IADLg (r=−0.14; P=.70)
Boroojerdi et al [37] (comparison: MDS-UPDRS)
  Validity: Finger tapping (r=0.291); Hand tremor (r=0.746)
Burdea et al [71]
Cabrera-Martos et al [67] (comparison: in-clinic assessment)
  Interrater reliability: Finger dexterity (r=0.89); Finger tapping (r=1.0); Hand tremor (r=0.99)
Cai et al [19] (comparison: MDS-UPDRS)
  Validity: Hand tremor (r2=0.95)
Channa et al [20] (comparison: MDS-UPDRS)
  Validity: Hand tremor (91.7% accuracy)
Cole et al [21] (comparison: MDS-UPDRS)
Creagh et al [59] (comparison: 9HPTh)
  Validity: Handwriting, dominant hand (r2=0.39) and nondominant hand (r2=0.41)
Cunningham et al [74]
Dai et al [22] (comparison: MDS-UPDRS)
  Validity: Finger tapping (r=−0.970; P<.01); Hand tremor (r=0.93; P<.001)
  Interrater agreement (Kendall W): Finger tapping (0.86); Hand tremor (0.84)
Dubuisson et al [91] (comparison: 9HPT)
  Validity: Finger dexterity (r=0.9; P<.001)
Ferreira et al [23] (comparison: MDS-UPDRS)
Giancardo et al [39] (comparison: MDS-UPDRS)
  Validity: Finger tapping (AUC=0.75)
Giuffrida et al [24] (comparison: MDS-UPDRS)
  Validity: Hand tremor (r=0.89)
Goetz et al [75] (comparison: MDS-UPDRS)
Halloran et al [25] (comparison: CAHAIi)
  Validity: ADLj (r=0.63; P<.001)
Heijmans et al [26] (comparison: ESMk app [tremor questionnaire])
  Validity: Hand tremor (r=0.43)
Hoffman et al [68] (comparison: in-clinic assessment)
  Validity: Hand tremor (83.3% agreement); Handwriting (41.6% agreement)
  Interrater reliability: Finger dexterity (r=0.99)
Hssayeni et al [27] (comparison: MDS-UPDRS)
  Validity: Hand tremor (r=0.84)
Iakovakis et al [52] (comparison: MDS-UPDRS)
  Validity: Finger tapping (AUC=0.92)
Iakovakis et al [51] (comparison: MDS-UPDRS)
  Validity: Finger tapping (r=0.66)
Jeon et al [28] (comparison: MDS-UPDRS)
  Validity: Hand tremor (85.5% agreement)
Jha et al [60] (comparison: MDS-UPDRS)
  Validity: Hand tremor (κ=0.68; P<.001, substantial); Finger tapping (κ=0.54; P<.001, moderate)
  Interrater agreement: Hand tremor (96%); Finger tapping (50%)
Kim et al [29] (comparison: MDS-UPDRS)
  Validity: Hand tremor (85% accuracy)
  Interrater reliability: Hand tremor (r=0.78)
Kleinholdermann et al [85] (comparison: MDS-UPDRS)
  Validity: Finger tapping (r=0.445)
Kostikis et al [81] (comparison: MDS-UPDRS)
  Validity: Hand tremor, right hand (r=0.75; P<.001) and left hand (r=0.85; P<.001)
Lam et al [41] (comparison: 9HPT)
  Validity: Finger dexterity (r=−0.553)
  Test–retest reliability: Finger dexterity (ICC=0.601)
Lee et al [53] (comparison: MDS-UPDRS)
  Validity: Finger tapping (AUC=0.92, 95% CI 0.88-0.96)
Lee et al [84] (comparison: FMAl)
  Validity: Whole hand grasp (92% accuracy)
Lee et al [54] (comparison: MDS-UPDRS)
Lin et al [88]
Lipsmeier et al [61] (comparison: MDS-UPDRS)
  Validity: Finger tapping (t=2.18; P=.03); Hand tremor (t=2.17; P=.03)
  Test–retest reliability: Finger tapping (ICC=0.64); Hand tremor (ICC=0.90)
Londral et al [47]
  Test–retest reliability: r=0.96; P=.09
Lopez-Blanco et al [76] (comparison: MDS-UPDRS)
  Validity: Hand tremor (r=0.81; P<.001)
  Interrater reliability: Hand tremor (ICC=0.89)
Mahadevan et al [30] (comparison: MDS-UPDRS)
  Validity: Hand tremor (r=0.67; P<.001)
  Interrater reliability: Hand tremor (ICC=0.75)
Matarazzo et al [42] (comparison: UPDRS-3)
Memedi et al [77] (comparison: visual assessment)
  Validity: Handwriting (85% accuracy)
  Test–retest reliability: Handwriting (ICC=0.69)
Mera et al [31]
Mitsi et al [65] (comparison: MDS-UPDRS)
Noyce et al [43] (comparison: MDS-UPDRS)
  Validity: Finger tapping (r=0.53)
Orozco-Arroyave et al [62] (comparison: UPDRS-3)
Pan et al [63] (comparison: MDS-UPDRS)
  Validity: Hand tremor (r=0.81)
Papadopoulos et al [40,55] (comparison: MDS-UPDRS)
Powers et al [78] (comparison: MDS-UPDRS)
  Validity: Hand tremor (r=0.72)
Pratap et al [66] (comparison: longitudinal Neuro-QoLm scores)
  Validity: Finger tapping (β=.40; P<.001)
Prochazka and Kowalczewski [83] (comparison: ARAT and FMA)
  Validity: Finger dexterity (r2=0.49); Whole hand grasp (r2=0.88); Pincer grasp (r2=0.88)
  Test–retest reliability: 0.67% (SD 3.6)
Rigas et al [32] (comparison: MDS-UPDRS)
  Validity: Hand tremor (87% accuracy)
Salarian et al [87] (comparison: MDS-UPDRS)
  Validity: Hand tremor (r=0.87; P<.001)
San-Segundo et al [33]
Sanchez-Perez et al [34] (comparison: MDS-UPDRS)
Schallert et al [56]
Shribman et al [44] (comparison: 9HPT)
  Validity: Finger tapping (r=0.926)
Sigcha et al [79] (comparison: MDS-UPDRS)
  Validity: Hand tremor (r=0.969)
Simonet et al [57] (comparison: MDS-UPDRS)
  Validity: Finger tapping (r=−0.49)
Stamatakis et al [35] (comparison: MDS-UPDRS)
  Validity: Finger tapping (Goodman–Kruskal index=0.961)
Tavares et al [86] (comparison: MDS-UPDRS)
  Validity: Finger tapping (r=0.67; P<.001)
Trager et al [45] (comparison: MDS-UPDRS)
  Validity: Finger dexterity (r=0.14; P=.43); Finger tapping (r=0.58; P<.001)
Westin et al [80] (comparison: MDS-UPDRS)
  Validity: Handwriting (r=0.41)
  Test–retest reliability: Handwriting (r=0.71)
Wissel et al [58] (comparison: MDS-UPDRS)
  Validity: Finger tapping (r=0.55)
  Test–retest reliability: Finger tapping (r>0.75)
Wu et al [89] (comparison: MDS-UPDRS)
  Validity: Hand tremor (r=−0.798)
Yu et al [72] (comparison: FMA)
  Validity: Finger dexterity (r2=0.70); Pinch strength (r2=0.72)
Zambrana et al [90]
Zhan et al [64] (comparison: MDS-UPDRS)
  Validity: Finger tapping (mean 71%, SD 0.4%)
Zhang et al [36] (comparison: MDS-UPDRS)
  Validity: Hand tremor (85.9% accuracy)

aData unavailable.

bAUC: area under the curve.

cMDS-UPDRS: Movement Disorder Society–Unified Parkinson’s Disease Rating Scale.

dICC: intraclass correlation coefficient.

eUPDRS: Unified Parkinson’s Disease Rating Scale.

fARAT: Action Research Arm Test.

gIADL: instrumental activities of daily living.

h9HPT: 9-hole peg test.

iCAHAI: Chedoke Arm and Hand Inventory.

jADL: activities of daily living.

kESM: experience sampling method.

lFMA: Fugl-Meyer Assessment.

mQoL: quality of life.

Of the 74 studies, 15 (20%) reported reliability statistics, which were heterogeneous: 2 (3%) telerehabilitation assessments [68,69] showed high, statistically significant interrater reliability, and 1 (1%) external device assessment [76] showed high but statistically nonsignificant reliability.

Hand Function Domain, Based on the Functional Repertoire of the Hand

Finger Tapping Speed

The most common hand function domain assessed was finger tapping speed [22,31,35,37-39,42-45,48,49,51-54,57,58,60-62,64-67,70,73,75,80,85,86,92]. Finger tapping can provide clinicians with an understanding of fine motor control and stiffness, especially in individuals with spasticity. Of the included studies that examined finger tapping, Albani et al [73] reported the highest correlation with MDS-UPDRS scores in participants with PD. In their study, the authors used an external device, a gesture-based tracking system involving a specialized depth camera and gloves with colored markers, to track and quantify fine hand movements. The MDS-UPDRS item on finger tapping relies on visual assessment of the tapping pattern (eg, interruptions in the tapping rhythm), and specialized equipment such as an external device can aid in quantifying finger tapping capability [73].
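As an illustration of how a touchscreen app can quantify this domain, the sketch below summarizes a tapping trial from logged tap timestamps. The metric names and the example data are illustrative and are not drawn from any included study.

```python
# Illustrative summary of a remote finger-tapping trial from tap timestamps.
import numpy as np

def tapping_summary(tap_times_s: list, trial_duration_s: float = 30.0) -> dict:
    """Tap count, tapping rate, and rhythm variability from tap timestamps."""
    taps = np.asarray(sorted(tap_times_s))
    intervals = np.diff(taps)                  # inter-tap intervals in seconds
    return {
        "tap_count": int(taps.size),
        "taps_per_second": taps.size / trial_duration_s,
        # Coefficient of variation of inter-tap intervals: a simple proxy for
        # the rhythm interruptions assessed visually on the MDS-UPDRS item.
        "rhythm_cv": float(np.std(intervals) / np.mean(intervals))
                     if intervals.size else float("nan"),
    }

# Hypothetical 30-second trial with slightly irregular tapping.
example_taps = np.cumsum(np.random.normal(0.35, 0.05, 80)).tolist()
print(tapping_summary(example_taps))
```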

Hand Tremor

The second most commonly assessed domain was hand tremor, a prevalent impairment in many neurological disorders. Quantifying tremor can help determine the efficacy of pharmacological and rehabilitative therapies. The studies that examined this domain were conducted in participants with PD [19-21,23,24,26-30,32-34,36,37,46,50,55,60,61,63,64,67,68,74-76,78,79,81,87]. In their evaluation of telerehabilitation administration of the MDS-UPDRS, Hoffman et al [68] found 100% agreement between remote and in-clinic visual examination of hand tremor at rest. Sigcha et al [79] developed a novel smartphone app using the internal gyroscope and accelerometer to measure resting hand tremor. This method had a strong correlation (r=0.97) with in-clinic MDS-UPDRS resting hand tremor scores.

Finger Dexterity

The third most commonly assessed domain was finger dexterity [41,45,47,67,68,72,83,88,91]. Finger dexterity assessment tasks included manipulation of small objects (eg, the 9-hole peg test [9HPT] and the coin rotation test), which are useful metrics of the fine motor control required for ADL, such as buttoning clothing. Finger dexterity was assessed in all 5 of the neurological conditions included in this review. Of the included studies examining participants with PD, Cabrera-Martos et al [67] found a mean difference of 0.3 (SD 1.2) in scores between telerehabilitation and in-clinic administration of the coin rotation task [93] in the affected limb. Similarly, using telerehabilitation to examine the pincer grasp domain in participants with stroke, Amano et al [69] reported a Spearman ρ of 0.99 between telerehabilitation-administered and in-clinic-administered items. In participants with MS, Dubuisson et al [91] validated an external device, a cardboard 9HPT, with a correlation of 0.96 between this novel assessment tool and the standard plastic 9HPT.

Handwriting

Approximately 8% (6/74) of studies [48,56,59,68,77,80] examined handwriting accuracy, a specific and sensitive measure of fine motor coordination. The greatest accuracy in comparison with in-clinic assessments was reported by Hoffman et al [68], who found a high percentage of agreement (85%) between in-clinic measures and an external telemetry device for the MDS-UPDRS handwriting item.

Specific Functions

Specific functional domains were evaluated by 11% (8/74) of studies. Grip and pinch strength were examined in 4% (3/74) of studies [68,72,83] using remote deployment of these standard in-clinic metrics. Prochazka and Kowalczewski [83] evaluated the validity of a novel external device to collect force data from grip and pinch tasks and found a coefficient of determination (R2) of 0.88 between the remote device and the in-clinic-administered ARAT. Only 4% (3/74) of studies [25,68,82] specifically examined ADL and IADL. Hoffman et al [68] compared in-clinic and telerehabilitation-administered functional independence measures and found 100% agreement in scores for eating and 91.7% agreement for dressing. Bochniewicz et al [82] developed a wrist-worn accelerometer to capture and quantify disability in individuals after stroke. The protocol simulated IADL such as doing laundry and shopping in a grocery store, and the authors reported 88.4% accuracy compared with ARAT scores of upper extremity functional use.

Participant Acceptability

In populations with PD, 9% (7/74) of studies reported participant acceptability and usability of assessments. Albani et al [73] found that participants rated the hand gesture–based tracking system 5.9/7 on a poststudy usability questionnaire, indicating ease of use, high interface quality, and usefulness. In 4% (3/74) of studies [24,30,37], participants using wearable sensors to monitor hand tremors and finger tapping found the devices comfortable and easy to use. Both Goetz et al [75] and Ferreira et al [23] reported >80% participant satisfaction with external devices to examine hand tremors. Mitsi et al [65] found that 76% of participants using a tablet-based assessment of finger tapping and reaction time found it easy to use, and 63% reported willingness to use it long-term to monitor disease activity.

In populations with stroke, Burdea et al [71] asked both participants and caregivers to provide feedback on their video game–like assessment and intervention using a 5-point study-specific Likert scale (higher scores indicating statement agreement). Participants reported that the device was moderately easy to use (mean score 3.1/5.0), that they would encourage others to use it (mean score 4.3/5.0), and that they liked the system overall (mean score 4.2/5.0). However, participants encountered some technical difficulties during use (mean score 2.2/5.0). Caregivers also found the device setup appropriate for the home environment and easy to use (mean score 3.5/5.0).

In people with MS, Dubuisson et al [91] reported that 66.7% of participants preferred the portable in-home 9HPT in comparison with the standard in-clinic version.

Safety

Only 3% (2/74) of studies reported safety data [37,68]. Hoffman et al [68] reported that participants who received assessment via telerehabilitation were accompanied by a researcher to ensure safety. Boroojerdi et al [37] used a wearable patch and reported no adverse skin reactions at the application site and no device malfunctions. No adverse events were reported in any of the included studies.

Stage of Development and Implementation

As the assessments in this review were novel, their availability for clinical implementation varied. Most studies (44/74, 59%) evaluated assessments requiring specialized equipment for implementation. These devices included specialized cameras, wearable devices, electromyography, and specialized keyboards. Although not an application, the cardboard 9HPT developed by Dubuisson et al [91] was designed specifically to be environmentally friendly, cost-effective, and usable by patients at home. The remaining external devices evaluated in this review were designated as developmental, with a need for subsequent safety and prospective usability studies before clinical use.

Approximately 3% (2/74) of studies using telerehabilitation methods required videoconferencing devices and a stable internet connection for both providers and patients for implementation. However, although Hoffman et al [68] similarly used telerehabilitation methods, their protocol required participants to use clinical equipment during in-home assessments (eg, a hand dynamometer and the 9HPT), potentially limiting widespread implementation.

A smartphone- or tablet-based application was used in 27% (20/74) of studies to administer assessments. The FLOODLIGHT application studied by Creagh et al [59] is currently available for download on iOS and Android devices. The remaining applications were study-specific developments but, given the availability of compatible devices and a secure broadband internet connection, have few barriers to implementation.


Principal Findings

The purpose of this review was to systematically gather the available literature on remote assessments for monitoring hand function in people with chronic neurological diseases of the CNS. The search yielded 74 studies that met the inclusion criteria, and 71 unique assessments were examined for validity, reliability, and clinical implementation. A wide variety of metrics were collected across a number of hand function domains, including the amplitude of finger tapping, finger dexterity, hand tremor, and ADL independence. Altogether, the studies provide a number of insights; however, to date, no single tool, or combination of tools, validly and reliably captures hand function across these major neurological conditions.

Many of the studies were of good quality, and several study characteristics enhanced their quality. Including controls with nonneurological conditions as a comparison, when available, helped demonstrate the discriminant validity of the novel assessments examined. Most studies included participants with lower disability status, which likely allowed for more dynamic testing of hand function domains. Unfortunately, most of the included studies reported statistically nonsignificant associations with standard in-clinic metrics. As prior literature suggests that traditional in-clinic assessments have limited granularity for upper limb function in populations with neurological conditions, differences between the novel assessments and these traditional in-clinic tests could indicate that the new tools capture aspects of function (eg, quantifying pincer grasp) that the traditional assessments do not, or vice versa. In addition, few studies reported reliability, especially interrater reliability, suggesting the need for more research and indicating that the included tools remain primarily in the development phase.

The most commonly assessed hand function domain was finger tapping speed, with moderate to high agreement across comparison assessments. The finger tapping test is a valid and reliable measure of bradykinesia in PD [94] and a predictor of ADL independence in acute stroke [95]. It is relatively simple to quantify finger tapping in-clinic or via a smartphone or tablet app by counting the number of finger taps within a specific time frame. Although overall construct validity and participant satisfaction were high, further work in other hand function domains will help determine the most salient predictors of ADL independence and response to treatment and intervention.

This review highlights important aspects of the feasibility of remote evaluations. Participant and caregiver satisfaction, when reported, were moderate to high for these technologically innovative assessments. This suggests that participants found the novel assessments easy to use and effective in evaluating their hand function despite being nontraditional. Further, 28% (21/74) of the included studies demonstrated the feasibility of remotely monitoring hand function over multiple days. This is a key finding, as long-term monitoring of hand function in a patient’s natural environment has the potential to identify changes in real time, allowing for timely intervention modifications.

Regarding patient safety, although the included assessments were noninvasive and posed a relatively low safety risk, ensuring the secure transfer of data, especially with internet-based communication (eg, telerehabilitation and smartphone or tablet-based apps) between patient and clinician, is critical to confidentiality and Health Insurance Portability and Accountability Act compliance. Future studies should report on data storage and encryption methodologies.
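As one example of the kind of safeguard future studies could report, the sketch below encrypts an assessment record on the device before transmission using the third-party cryptography package. It is illustrative only; it does not describe any included study's data pipeline, and key management and transport-layer security are beyond its scope.

```python
# Illustrative only: symmetric encryption of an assessment record before it
# leaves the device, using the "cryptography" package (Fernet). Key
# provisioning and secure transport are assumed to be handled elsewhere.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, provisioned and stored securely
cipher = Fernet(key)

record = {"participant_id": "P001", "domain": "finger tapping", "taps_per_second": 4.2}
ciphertext = cipher.encrypt(json.dumps(record).encode("utf-8"))

# ...ciphertext is what would be transmitted or stored...
restored = json.loads(cipher.decrypt(ciphertext).decode("utf-8"))
assert restored == record
```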

The assessments evaluated were in varying stages of development and implementation. The most readily implementable types of assessment were those using telerehabilitation or smartphone- or tablet-based apps. According to 2019 data, 85% of Americans own a smartphone, and 93% use the internet regularly, of whom 75% use a home high-speed broadband network [96]. Given these statistics, telerehabilitation and application-based assessments, if interoperable across devices, might be relatively accessible for most patients. Lower costs could make clinical implementation less of a challenge. Furthermore, with no specialized devices to purchase or distribute to patients, clinics could similarly benefit from these cost-effective measures.

Limitations

A major limitation of this review is the heterogeneity of hand function domains evaluated, which, when compounded with the methodological variability (in comparison assessments, inclusion criteria, and statistical approaches), made it difficult to compare the various tools. Future studies that include more homogeneous patient populations and standardized reporting of correlation coefficients with comparison assessments will facilitate analysis across domains and assessment types. A second limitation was the paucity of studies conducting repeated trials of the assessments, limiting the identification of any practice effects with use of a new device. In repeated trials of smartphone-based assessments, performance improved over the first 10 trials because of a practice effect, followed by a narrowing of variance as the practice effect waned and familiarity with the assessment increased [97]. Follow-up studies should include repeated trials, preferably over multiple days, to capture these effects and fluctuations in disease progression. Third, the effect of confounding variables (eg, disease-modifying therapies, age, and disease duration) was infrequently described in validity statistics; therefore, the findings of this review should be generalized with caution. Fourth, all of the included tools require active participant engagement as opposed to passive monitoring (eg, collecting data on dexterity as a participant types to complete a survey). Passive monitoring may be able to capture similar metrics with a reduced participant time burden. Finally, we may have missed relevant studies published in non-English languages.

Conclusions

This review suggests that remote assessments can be valid and reliable tools for measuring hand function impairments in chronic neurological diseases and that doing so is clinically feasible and acceptable to patients. In the past decade, personal smartphone and computer ownership have become commonplace; with it, patients and health care providers are able to communicate in real time, opening new avenues for care delivery and disease monitoring. We highlight the current potential to implement remote assessments via telerehabilitation and smartphone- or tablet-based apps. As interventions for ambulation and lower extremity function become increasingly robust, these methods will allow clinicians to reliably assess multiple domains of hand function to monitor disease progression and response to interventions.

Conflicts of Interest

WYH is supported by the National Multiple Sclerosis Society (FG-1908-34831). RB receives research support to University of California, San Francisco from Biogen and Roche Genentech, as well as personal fees for consulting from Alexion, Biogen, EMD Serono, Genzyme Sanofi, Novartis, and Roche Genentech.

Multimedia Appendix 1

Summary of studies.

DOCX File , 41 KB

Multimedia Appendix 2

Quality assessment of studies.

DOCX File , 31 KB

  1. Institute for Quality and Efficiency in Health Care (IQWiG). How do hands work? Informedhealth. Cologne, Germany: Institute for Quality and Efficiency in Health Care (IQWiG); 2021.   URL: https://www.informedhealth.org/how-do-hands-work.html [accessed 2021-06-17]
  2. Chen CC, Kasven N, Karpatkin HI, Sylvester A. Hand strength and perceived manual ability among patients with multiple sclerosis. Arch Phys Med Rehabil 2007;88(6):794-797. [CrossRef] [Medline]
  3. Butler DP, Murray A, Horwitz M. Hand manifestations of neurological disease: some alternatives to consider. Br J Gen Pract 2016;66(647):331-332 [FREE Full text] [CrossRef] [Medline]
  4. Baumann CR. Epidemiology, diagnosis and differential diagnosis in Parkinson's disease tremor. Parkinsonism Relat Disord 2012;18 Suppl 1:S90-S92. [CrossRef] [Medline]
  5. Ng YS, Stein J, Ning M, Black-Schaffer RM. Comparison of clinical characteristics and functional outcomes of ischemic stroke in different vascular territories. Stroke 2007;38(8):2309-2314. [CrossRef] [Medline]
  6. Jang SH, Chang MC. Motor outcomes of patients with a complete middle cerebral artery territory infarct. Neural Regen Res 2013;8(20):1892-1897 [FREE Full text] [CrossRef] [Medline]
  7. Soon S, Svavarsdottir H, Downey C, Jayne DG. Wearable devices for remote vital signs monitoring in the outpatient setting: an overview of the field. BMJ Innov 2020;6(2):55-71. [CrossRef]
  8. Ganeshan R, Enriquez AD, Freeman JV. Remote monitoring of implantable cardiac devices: current state and future directions. Curr Opin Cardiol 2018;33(1):20-30. [CrossRef] [Medline]
  9. Block VJ, Lizée A, Crabtree-Hartman E, Bevan CJ, Graves JS, Bove R, et al. Continuous daily assessment of multiple sclerosis disability using remote step count monitoring. J Neurol 2017;264(2):316-326 [FREE Full text] [CrossRef] [Medline]
  10. Block VJ, Bove R, Zhao C, Garcha P, Graves J, Romeo AR, et al. Association of continuous assessment of step count by remote monitoring with disability progression among adults with multiple sclerosis. JAMA Netw Open 2019;2(3):e190570 [FREE Full text] [CrossRef] [Medline]
  11. Block VA, Pitsch E, Tahir P, Cree BA, Allen DD, Gelfand JM. Remote physical activity monitoring in neurological disease: a systematic review. PLoS One 2016;11(4):e0154335 [FREE Full text] [CrossRef] [Medline]
  12. Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ 2009;339:b2535 [FREE Full text] [CrossRef] [Medline]
  13. Study quality assessment tools. National Heart, Lung, and Blood Institute. 2021.   URL: https://www.nhlbi.nih.gov/health-topics/study-quality-assessment-tools [accessed 2021-05-12]
  14. Glen S. Concurrent validity definition and examples. Statistics How To. 2015.   URL: https://www.statisticshowto.com/concurrent-validity/ [accessed 2021-04-06]
  15. Glen S. Reliability and validity in research: definitions, examples. Statistics How To. 2016.   URL: https://www.statisticshowto.com/reliability-validity-definitions-examples/ [accessed 2021-04-06]
  16. Kimmerle M, Mainwaring L, Borenstein M. The functional repertoire of the hand and its application to assessment. Am J Occup Ther 2003;57(5):489-498. [CrossRef] [Medline]
  17. Edemekong PF, Bomgaars DL, Sukumaran S, Levy SB. Activities of daily living. Treasure Island, FL: StatPearls Publishing; 2021.
  18. Guo HJ, Sapra A. Instrumental activity of daily living. Treasure Island, FL: StatPearls Publishing; 2021.
  19. Cai G, Lin Z, Dai H, Xia X, Xiong Y, Horng SJ, et al. Quantitative assessment of parkinsonian tremor based on a linear acceleration extraction algorithm. Biomed Signal Process Control 2018;42:53-62. [CrossRef]
  20. Channa A, Ifrim RC, Popescu D, Popescu N. A-WEAR bracelet for detection of hand tremor and bradykinesia in Parkinson's patients. Sensors (Basel) 2021;21(3):981 [FREE Full text] [CrossRef] [Medline]
  21. Cole BT, Roy SH, De Luca CJ, Nawab SH. Dynamical learning and tracking of tremor and dyskinesia from wearable sensors. IEEE Trans Neural Syst Rehabil Eng 2014;22(5):982-991. [CrossRef] [Medline]
  22. Dai H, Cai G, Lin Z, Wang Z, Ye Q. Validation of inertial sensing-based wearable device for tremor and bradykinesia quantification. IEEE J Biomed Health Inform 2021;25(4):997-1005. [CrossRef] [Medline]
  23. Ferreira JJ, Godinho C, Santos AT, Domingos J, Abreu D, Lobo R, et al. Quantitative home-based assessment of Parkinson's symptoms: the SENSE-PARK feasibility and usability study. BMC Neurol 2015;15:89 [FREE Full text] [CrossRef] [Medline]
  24. Giuffrida JP, Riley DE, Maddux BN, Heldman DA. Clinically deployable Kinesia technology for automated tremor assessment. Mov Disord 2009;24(5):723-730. [CrossRef] [Medline]
  25. Halloran S, Tang L, Guan Y, Shi JQ, Eyre J. Remote monitoring of stroke patients' rehabilitation using wearable accelerometers. In: Proceedings of the 23rd International Symposium on Wearable Computers. New York, NY: Association for Computing Machinery; 2019 Presented at: ISWC '19; September 9-13, 2019; London, UK p. 72-77. [CrossRef]
  26. Heijmans M, Habets J, Kuijf M, Kubben P, Herff C. Evaluation of Parkinson's disease at home: predicting tremor from wearable sensors. Annu Int Conf IEEE Eng Med Biol Soc 2019;2019:584-587. [CrossRef] [Medline]
  27. Hssayeni MD, Jimenez-Shahed J, Burack MA, Ghoraani B. Wearable sensors for estimation of Parkinsonian tremor severity during free body movements. Sensors (Basel) 2019;19(19):4215 [FREE Full text] [CrossRef] [Medline]
  28. Jeon H, Lee W, Park H, Lee HJ, Kim SK, Kim HB, et al. Automatic classification of tremor severity in Parkinson's disease using a wearable device. Sensors (Basel) 2017;17(9):2067 [FREE Full text] [CrossRef] [Medline]
  29. Kim HB, Lee WW, Kim A, Lee HJ, Park HY, Jeon HS, et al. Wrist sensor-based tremor severity quantification in Parkinson's disease using convolutional neural network. Comput Biol Med 2018;95:140-146. [CrossRef] [Medline]
  30. Mahadevan N, Demanuele C, Zhang H, Volfson D, Ho B, Erb MK, et al. Development of digital biomarkers for resting tremor and bradykinesia using a wrist-worn wearable device. NPJ Digit Med 2020;3:5 [FREE Full text] [CrossRef] [Medline]
  31. Mera TO, Heldman DA, Espay AJ, Payne M, Giuffrida JP. Feasibility of home-based automated Parkinson's disease motor assessment. J Neurosci Methods 2012;203(1):152-156 [FREE Full text] [CrossRef] [Medline]
  32. Rigas G, Tzallas AT, Tsipouras MG, Bougia P, Tripoliti EE, Baga D, et al. Assessment of tremor activity in the Parkinson's disease using a set of wearable sensors. IEEE Trans Inf Technol Biomed 2012;16(3):478-487. [CrossRef] [Medline]
  33. San-Segundo R, Zhang A, Cebulla A, Panev S, Tabor G, Stebbins K, et al. Parkinson's disease tremor detection in the wild using wearable accelerometers. Sensors (Basel) 2020;20(20):5817 [FREE Full text] [CrossRef] [Medline]
  34. Sanchez-Perez LA, Sanchez-Fernandez LP, Shaout A, Martinez-Hernandez JM, Alvarez-Noriega MJ. Rest tremor quantification based on fuzzy inference systems and wearable sensors. Int J Med Inform 2018;114:6-17. [CrossRef] [Medline]
  35. Stamatakis J, Ambroise J, Crémers J, Sharei H, Delvaux V, Macq B, et al. Finger tapping clinimetric score prediction in Parkinson's disease using low-cost accelerometers. Comput Intell Neurosci 2013;2013:717853 [FREE Full text] [CrossRef] [Medline]
  36. Zhang A, De la Torre F, Hodgins J. Comparing laboratory and in-the-wild data for continuous Parkinson's disease tremor detection. Annu Int Conf IEEE Eng Med Biol Soc 2020;2020:5436-5441. [CrossRef] [Medline]
  37. Boroojerdi B, Ghaffari R, Mahadevan N, Markowitz M, Melton K, Morey B, et al. Clinical feasibility of a wearable, conformable sensor patch to monitor motor symptoms in Parkinson's disease. Parkinsonism Relat Disord 2019;61:70-76. [CrossRef] [Medline]
  38. Akram N, Li H, Ben-Joseph A, Budu C, Gallagher DA, Bestwick JP, et al. Developing and assessing a new web-based tapping test for measuring distal movement in Parkinson's disease: a Distal Finger Tapping test. Sci Rep 2022;12(1):386 [FREE Full text] [CrossRef] [Medline]
  39. Giancardo L, Sánchez-Ferro A, Arroyo-Gallego T, Butterworth I, Mendoza CS, Montero P, et al. Computer keyboard interaction as an indicator of early Parkinson's disease. Sci Rep 2016;6:34468 [FREE Full text] [CrossRef] [Medline]
  40. Papadopoulos A, Iakovakis D, Klingelhoefer L, Bostantjopoulou S, Chaudhuri KR, Kyritsis K, et al. Unobtrusive detection of Parkinson's disease from multi-modal and in-the-wild sensor data using deep learning techniques. Sci Rep 2020;10(1):21370 [FREE Full text] [CrossRef] [Medline]
  41. Lam KH, Meijer KA, Loonstra FC, Coerver E, Twose J, Redeman E, et al. Real-world keystroke dynamics are a potentially valid biomarker for clinical disability in multiple sclerosis. Mult Scler 2021;27(9):1421-1431 [FREE Full text] [CrossRef] [Medline]
  42. Matarazzo M, Arroyo-Gallego T, Montero P, Puertas-Martín V, Butterworth I, Mendoza CS, et al. Remote monitoring of treatment response in Parkinson's disease: the habit of typing on a computer. Mov Disord 2019;34(10):1488-1495. [CrossRef] [Medline]
  43. Noyce AJ, Nagy A, Acharya S, Hadavi S, Bestwick JP, Fearnley J, et al. Bradykinesia-akinesia incoordination test: validating an online keyboard test of upper limb function. PLoS One 2014;9(4):e96260 [FREE Full text] [CrossRef] [Medline]
  44. Shribman S, Hasan H, Hadavi S, Giovannoni G, Noyce AJ. The BRAIN test: a keyboard-tapping test to assess disability and clinical features of multiple sclerosis. J Neurol 2018;265(2):285-290 [FREE Full text] [CrossRef] [Medline]
  45. Trager MH, Wilkins KB, Koop MM, Bronte-Stewart H. A validated measure of rigidity in Parkinson's disease using alternating finger tapping on an engineered keyboard. Parkinsonism Relat Disord 2020;81:161-164 [FREE Full text] [CrossRef] [Medline]
  46. Adams WR. High-accuracy detection of early Parkinson's disease using multiple characteristics of finger movement while typing. PLoS One 2017;12(11):e0188226 [FREE Full text] [CrossRef] [Medline]
  47. Londral A, Pinto S, de Carvalho M. Markers for upper limb dysfunction in Amyotrophic Lateral Sclerosis using analysis of typing activity. Clin Neurophysiol 2016;127(1):925-931. [CrossRef] [Medline]
  48. Aghanavesi S, Nyholm D, Senek M, Bergquist F, Memedi M. A smartphone-based system to quantify dexterity in Parkinson's disease patients. Inform Med Unlocked 2017;9:11-17. [CrossRef]
  49. Arroyo-Gallego T, Ledesma-Carbayo MJ, Sanchez-Ferro A, Butterworth I, Mendoza CS, Matarazzo M, et al. Detection of motor impairment in Parkinson's disease via mobile touchscreen typing. IEEE Trans Biomed Eng 2017;64(9):1994-2002. [CrossRef] [Medline]
  50. Bazgir O, Habibi SA, Palma L, Pierleoni P, Nafees S. A classification system for assessment and home monitoring of tremor in patients with Parkinson's disease. J Med Signals Sens 2018;8(2):65-72 [FREE Full text] [Medline]
  51. Iakovakis D, Chaudhuri KR, Klingelhoefer L, Bostantjopoulou S, Katsarou Z, Trivedi D, et al. Screening of Parkinsonian subtle fine-motor impairment from touchscreen typing via deep learning. Sci Rep 2020;10(1):12623 [FREE Full text] [CrossRef] [Medline]
  52. Iakovakis D, Hadjidimitriou S, Charisis V, Bostantzopoulou S, Katsarou Z, Hadjileontiadis LJ. Touchscreen typing-pattern analysis for detecting fine motor skills decline in early-stage Parkinson's disease. Sci Rep 2018;8(1):7663 [FREE Full text] [CrossRef] [Medline]
  53. Lee CY, Kang SJ, Hong SK, Ma HI, Lee U, Kim YJ. A validation study of a smartphone-based finger tapping application for quantitative assessment of bradykinesia in Parkinson's disease. PLoS One 2016;11(7):e0158852 [FREE Full text] [CrossRef] [Medline]
  54. Lee U, Kang SJ, Choi JH, Kim YJ, Ma HI. Mobile application of finger tapping task assessment for early diagnosis of Parkinson's disease. Electron lett 2016;52(24):1976-1978. [CrossRef]
  55. Papadopoulos A, Kyritsis K, Klingelhoefer L, Bostanjopoulou S, Chaudhuri KR, Delopoulos A. Detecting Parkinsonian tremor from IMU data collected in-the-wild using deep multiple-instance learning. IEEE J Biomed Health Inform 2020;24(9):2559-2569. [CrossRef] [Medline]
  56. Schallert W, Fluet MC, Kesselring J, Kool J. Evaluation of upper limb function with digitizing tablet-based tests: reliability and discriminative validity in healthy persons and patients with neurological disorders. Disabil Rehabil 2020:1-9. [CrossRef] [Medline]
  57. Simonet C, Galmes MA, Lambert C, Rees RN, Haque T, Bestwick JP, et al. Slow motion analysis of repetitive tapping (SMART) test: measuring bradykinesia in recently diagnosed Parkinson's disease and idiopathic anosmia. J Parkinsons Dis 2021;11(4):1901-1915. [CrossRef] [Medline]
  58. Wissel BD, Mitsi G, Dwivedi AK, Papapetropoulos S, Larkin S, López Castellanos JR, et al. Tablet-based application for objective measurement of motor fluctuations in Parkinson disease. Digit Biomark 2017;1(2):126-135 [FREE Full text] [CrossRef] [Medline]
  59. Creagh AP, Simillion C, Scotland A, Lipsmeier F, Bernasconi C, Belachew S, et al. Smartphone-based remote assessment of upper extremity function for multiple sclerosis using the Draw a Shape Test. Physiol Meas 2020;41(5):054002. [CrossRef] [Medline]
  60. Jha A, Menozzi E, Oyekan R, Latorre A, Mulroy E, Schreglmann SR, et al. The CloudUPDRS smartphone software in Parkinson's study: cross-validation against blinded human raters. NPJ Parkinsons Dis 2020;6(1):36 [FREE Full text] [CrossRef] [Medline]
  61. Lipsmeier F, Taylor KI, Kilchenmann T, Wolf D, Scotland A, Schjodt-Eriksen J, et al. Evaluation of smartphone-based testing to generate exploratory outcome measures in a phase 1 Parkinson's disease clinical trial. Mov Disord 2018;33(8):1287-1297 [FREE Full text] [CrossRef] [Medline]
  62. Orozco-Arroyave JR, Vásquez-Correa JC, Klumpp P, Pérez-Toro PA, Escobar-Grisales D, Roth N, et al. Apkinson: the smartphone application for telemonitoring Parkinson's patients through speech, gait and hands movement. Neurodegener Dis Manag 2020;10(3):137-157. [CrossRef] [Medline]
  63. Pan D, Dhall R, Lieberman A, Petitti DB. A mobile cloud-based Parkinson's disease assessment system for home-based monitoring. JMIR Mhealth Uhealth 2015;3(1):e29 [FREE Full text] [CrossRef] [Medline]
  64. Zhan A, Mohan S, Tarolli C, Schneider RB, Adams JL, Sharma S, et al. Using smartphones and machine learning to quantify Parkinson disease severity: the mobile Parkinson disease score. JAMA Neurol 2018;75(7):876-880 [FREE Full text] [CrossRef] [Medline]
  65. Mitsi G, Mendoza EU, Wissel BD, Barbopoulou E, Dwivedi AK, Tsoulos I, et al. Biometric digital health technology for measuring motor function in Parkinson's disease: results from a feasibility and patient satisfaction study. Front Neurol 2017;8:273 [FREE Full text] [CrossRef] [Medline]
  66. Pratap A, Grant D, Vegesna A, Tummalacherla M, Cohan S, Deshpande C, et al. Evaluating the utility of smartphone-based sensor assessments in persons with multiple sclerosis in the real-world using an app (elevateMS): observational, prospective pilot digital health study. JMIR Mhealth Uhealth 2020;8(10):e22108 [FREE Full text] [CrossRef] [Medline]
  67. Cabrera-Martos I, Ortiz-Rubio A, Torres-Sánchez I, López-López L, Rodríguez-Torres J, Carmen Valenza M. Agreement between face-to-face and tele-assessment of upper limb functioning in patients with Parkinson disease. PM R 2019;11(6):590-596. [CrossRef] [Medline]
  68. Hoffmann T, Russell T, Thompson L, Vincent A, Nelson M. Using the Internet to assess activities of daily living and hand function in people with Parkinson's disease. NeuroRehabilitation 2008;23(3):253-261. [Medline]
  69. Amano S, Umeji A, Uchita A, Hashimoto Y, Takebayashi T, Kanata Y, et al. Reliability of remote evaluation for the Fugl-Meyer assessment and the action research arm test in hemiparetic patients after stroke. Top Stroke Rehabil 2018;25(6):432-437. [CrossRef] [Medline]
  70. Arora S, Venkataraman V, Zhan A, Donohue S, Biglan KM, Dorsey ER, et al. Detecting and monitoring the symptoms of Parkinson's disease using smartphones: a pilot study. Parkinsonism Relat Disord 2015;21(6):650-653. [CrossRef] [Medline]
  71. Burdea GC, Grampurohit N, Kim N, Polistico K, Kadaru A, Pollack S, et al. Feasibility of integrative games and novel therapeutic game controller for telerehabilitation of individuals chronic post-stroke living in the community. Top Stroke Rehabil 2020;27(5):321-336 [FREE Full text] [CrossRef] [Medline]
  72. Yu L, Xiong D, Guo L, Wang J. A remote quantitative Fugl-Meyer assessment framework for stroke patients based on wearable sensor networks. Comput Methods Programs Biomed 2016;128:100-110. [CrossRef] [Medline]
  73. Albani G, Ferraris C, Nerino R, Chimienti A, Pettiti G, Parisi F, et al. An integrated multi-sensor approach for the remote monitoring of Parkinson's disease. Sensors (Basel) 2019;19(21):4764 [FREE Full text] [CrossRef] [Medline]
  74. Cunningham L, Mason S, Nugent C, Moore G, Finlay D, Craig D. Home-based monitoring and assessment of Parkinson's disease. IEEE Trans Inf Technol Biomed 2011;15(1):47-53. [CrossRef] [Medline]
  75. Goetz CG, Stebbins GT, Wolff D, DeLeeuw W, Bronte-Stewart H, Elble R, et al. Testing objective measures of motor impairment in early Parkinson's disease: feasibility study of an at-home testing device. Mov Disord 2009;24(4):551-556 [FREE Full text] [CrossRef] [Medline]
  76. López-Blanco R, Velasco MA, Méndez-Guerrero A, Romero JP, Del Castillo MD, Serrano JI, et al. Smartwatch for the analysis of rest tremor in patients with Parkinson's disease. J Neurol Sci 2019;401:37-42. [CrossRef] [Medline]
  77. Memedi M, Sadikov A, Groznik V, Žabkar J, Možina M, Bergquist F, et al. Automatic spiral analysis for objective assessment of motor symptoms in Parkinson's disease. Sensors (Basel) 2015;15(9):23727-23744 [FREE Full text] [CrossRef] [Medline]
  78. Powers R, Etezadi-Amoli M, Arnold EM, Kianian S, Mance I, Gibiansky M, et al. Smartwatch inertial sensors continuously monitor real-world motor fluctuations in Parkinson's disease. Sci Transl Med 2021;13(579):eabd7865. [CrossRef] [Medline]
  79. Sigcha L, Pavón I, Costa N, Costa S, Gago M, Arezes P, et al. Automatic resting tremor assessment in Parkinson's disease using smartwatches and multitask convolutional neural networks. Sensors (Basel) 2021;21(1):291 [FREE Full text] [CrossRef] [Medline]
  80. Westin J, Ghiamati S, Memedi M, Nyholm D, Johansson A, Dougherty M, et al. A new computer method for assessing drawing impairment in Parkinson's disease. J Neurosci Methods 2010;190(1):143-148. [CrossRef] [Medline]
  81. Kostikis N, Hristu-Varsakelis D, Arnaoutoglou M, Kotsavasiloglou C. A smartphone-based tool for assessing Parkinsonian hand tremor. IEEE J Biomed Health Inform 2015;19(6):1835-1842. [CrossRef] [Medline]
  82. Bochniewicz EM, Emmer G, McLeod A, Barth J, Dromerick AW, Lum P. Measuring functional arm movement after stroke using a single wrist-worn sensor and machine learning. J Stroke Cerebrovasc Dis 2017;26(12):2880-2887. [CrossRef] [Medline]
  83. Prochazka A, Kowalczewski J. A fully automated, quantitative test of upper limb function. J Mot Behav 2015;47(1):19-28. [CrossRef] [Medline]
  84. Lee S, Lee YS, Kim J. Automated evaluation of upper-limb motor function impairment using Fugl-Meyer assessment. IEEE Trans Neural Syst Rehabil Eng 2018;26(1):125-134. [CrossRef] [Medline]
  85. Kleinholdermann U, Wullstein M, Pedrosa D. Prediction of motor Unified Parkinson's Disease Rating Scale scores in patients with Parkinson's disease using surface electromyography. Clin Neurophysiol 2021;132(7):1708-1713. [CrossRef] [Medline]
  86. Taylor Tavares AL, Jefferis GS, Koop M, Hill BC, Hastie T, Heit G, et al. Quantitative measurements of alternating finger tapping in Parkinson's disease correlate with UPDRS motor disability and reveal the improvement in fine motor control from medication and deep brain stimulation. Mov Disord 2005;20(10):1286-1298. [CrossRef] [Medline]
  87. Salarian A, Russmann H, Wider C, Burkhard PR, Vingerhoets FJ, Aminian K. Quantification of tremor and bradykinesia in Parkinson's disease using a novel ambulatory monitoring system. IEEE Trans Biomed Eng 2007;54(2):313-322. [CrossRef] [Medline]
  88. Lin SD, Butler JE, Boswell-Ruys CL, Hoang P, Jarvis T, Gandevia SC, et al. The frequency of bowel and bladder problems in multiple sclerosis and its relation to fatigue: a single centre experience. PLoS One 2019;14(9):e0222731 [FREE Full text] [CrossRef] [Medline]
  89. Wu H, Zhang Y, Wu X, Yang F. Assessment of upper limb tremors in patients with Parkinson’s disease based on displacement and acceleration information. In: 5th International Conference on Automation, Control and Robotics Engineering. 2020 Presented at: CARE '20; September 19-20, 2020; Dalian, China p. 177-182. [CrossRef]
  90. Zambrana C, Idelsohn-Zielonka S, Claramunt-Molet M, Almenara-Masbernat M, Opisso E, Tormos JM, et al. Monitoring of upper-limb movements through inertial sensors – preliminary results. Smart Health 2019;13:100059. [CrossRef]
  91. Dubuisson N, Bauer A, Buckley M, Gilbert R, Paterson A, Marta M, et al. Validation of an environmentally-friendly and affordable cardboard 9-hole peg test. Mult Scler Relat Disord 2017;17:172-176. [CrossRef] [Medline]
  92. de Araújo AC, Santos EG, de Sá KS, Furtado VK, Santos FA, de Lima RC, et al. Hand resting tremor assessment of healthy and patients with Parkinson's disease: an exploratory machine learning study. Front Bioeng Biotechnol 2020;8:778 [FREE Full text] [CrossRef] [Medline]
  93. Mendoza JE, Apostolos GT, Humphreys JD, Hanna-Pladdy B, O'Bryant SE. Coin rotation task (CRT): a new test of motor dexterity. Arch Clin Neuropsychol 2009;24(3):287-292. [CrossRef] [Medline]
  94. Růžička E, Krupička R, Zárubová K, Rusz J, Jech R, Szabó Z. Tests of manual dexterity and speed in Parkinson's disease: not all measure the same. Parkinsonism Relat Disord 2016;28:118-123. [CrossRef] [Medline]
  95. de Groot-Driessen D, van de Sande P, van Heugten C. Speed of finger tapping as a predictor of functional outcome after unilateral stroke. Arch Phys Med Rehabil 2006;87(1):40-44. [CrossRef] [Medline]
  96. Mobile fact sheet. Pew Research Center. 2021.   URL: https://www.pewresearch.org/internet/fact-sheet/mobile/ [accessed 2021-05-13]
  97. Bove R, White CC, Giovannoni G, Glanz B, Golubchikov V, Hujol J, et al. Evaluating more naturalistic outcome measures: a 1-year smartphone study in multiple sclerosis. Neurol Neuroimmunol Neuroinflamm 2015;2(6):e162 [FREE Full text] [CrossRef] [Medline]


9HPT: 9-hole peg test
ADL: activities of daily living
ARAT: Action Research Arm Test
CNS: central nervous system
IADL: instrumental activities of daily living
MDS-UPDRS: Movement Disorder Society–Unified Parkinson’s Disease Rating Scale
MS: multiple sclerosis
PD: Parkinson disease
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses


Edited by T Leung; submitted 25.08.21; peer-reviewed by D Costa, K Harrington; comments to author 07.01.22; revised version received 17.01.22; accepted 26.01.22; published 09.03.22

Copyright

©Arpita Gopal, Wan-Yu Hsu, Diane D Allen, Riley Bove. Originally published in JMIR Rehabilitation and Assistive Technology (https://rehab.jmir.org), 09.03.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Rehabilitation and Assistive Technology, is properly cited. The complete bibliographic information, a link to the original publication on https://rehab.jmir.org/, as well as this copyright and license information must be included.