Original Paper
Abstract
Background: Artificial intelligence (AI)–based gait analysis systems are increasingly applied in rehabilitation settings for objective and quantitative assessment of gait function. However, despite their potential, clinical adoption remains limited due to insufficient consideration of usability, user experience, and integration into actual clinical workflows.
Objective: This study aimed to conduct a formative evaluation of a prototype AI-based gait analysis system (MediStep M).
Methods: A mixed methods formative usability evaluation was conducted with 5 licensed physical therapists. Qualitative data were collected through focus group interviews, and quantitative usability was measured using the system usability scale (SUS). A scenario-based usability assessment was applied to identify user interface challenges, workflow issues, and potential design improvements.
Results: Participants identified major usability barriers, including limited accessibility of the power button, absence of battery status indicators, burdensome manual calibration, and insufficient clinical detail in the gait analysis reports. They also emphasized the need for wireless operation, improved portability, and integration with hospital electronic medical record systems. The mean SUS score was 57 (grade D), indicating suboptimal usability and the need for iterative design refinements.
Conclusions: Although AI-based gait analysis systems hold promise for enhancing rehabilitation outcomes, key usability challenges must be resolved before clinical implementation. Improvements in hardware portability, automated calibration, data management, and user interface design are essential to ensure safety, efficiency, and clinical applicability. These findings provide evidence-based insights to guide iterative development and promote user-centered innovation in AI-based rehabilitation technologies.
doi:10.2196/80748
Keywords
Introduction
Gait analysis plays a critical role in the rehabilitation of individuals with neurological or musculoskeletal impairments [-]. Quantitative assessment of gait function enables clinicians to objectively evaluate motor performance, establish evidence-based treatment plans, and monitor functional recovery over time [,]. However, conventional gait analysis systems such as treadmill-based or marker-dependent motion capture platforms require specialized laboratories, trained operators, and controlled environments [,]. These limitations restrict their feasibility for routine clinical application and hinder broader accessibility in diverse rehabilitation settings [,].
Recent advancements in artificial intelligence (AI), computer vision, and depth-sensing technologies have facilitated the development of markerless, AI-driven gait analysis systems capable of automatically estimating joint coordinates and computing spatiotemporal gait parameters [-]. Such systems have the potential to make gait analysis more portable, cost-effective, and clinically scalable [,,]. Compared to conventional systems such as Zebris, which rely on treadmill-based assessments within constrained spatial setups, emerging overground AI-based systems are designed to enable natural walking conditions with minimal setup requirements [,].
Despite these technological advancements, many AI-based gait analysis systems continue to face significant usability challenges that limit their integration into real-world rehabilitation environments. Reported issues include labor-intensive calibration procedures, unclear user feedback interfaces, hardware nonportability, and poor alignment with clinical workflows. Previous research in human factors and digital rehabilitation has emphasized that usability—particularly intuitiveness of the user interface, workflow efficiency, and error prevention—is a primary determinant of successful clinical adoption and user satisfaction [].
To address these challenges, we conducted a formative evaluation of a prototype AI-based gait analysis system, MediStep M. The evaluation was performed with physical therapists, who represent the primary end users of gait analysis technologies in clinical practice. A mixed methods approach was employed, combining qualitative focus group interviews (FGIs) with quantitative assessments using the system usability scale (SUS). This approach aimed to identify use-related problems, interface limitations, and potential design improvements to guide iterative enhancement of AI-based rehabilitation technologies and to ensure their alignment with clinical usability requirements.
Methods
Study Design
This study employed a formative evaluation design to identify use-related problems and interface improvement needs of a prototype AI-based gait analysis system (MediStep M). The evaluation followed the usability engineering framework described in IEC 62366-1:2020, focusing on user-system interaction and workflow alignment.
Materials
MediStep M is a prototype AI-based gait analysis system commissioned by the manufacturer for formative evaluation. The system was designed to enable objective assessment of gait deterioration in older adults and to monitor functional recovery following therapeutic interventions. Currently in the prototype stage, the system consists of general-purpose hardware (a tablet device) running the MediStep M software.


Using the tablet’s built-in camera, the system captures side-view videos of participants performing back-and-forth walking tasks. Through markerless pose estimation technology, the software automatically extracts full-body joint coordinates and computes spatiotemporal gait parameters. These parameters are visualized and quantified to generate comprehensive gait reports, including stride length, walking speed, arm swing amplitude, upper-body inclination, and gait asymmetry. The generated reports are intended to assist clinicians in evaluating gait characteristics and identifying deviations from normative patterns.
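The parameter extraction step described above can be sketched in code. The following is a hypothetical Python illustration of how spatiotemporal parameters such as stride length and walking speed might be derived from side-view ankle coordinates; it is not the MediStep M implementation, and the function name, the pixel-to-meter scaling, and the peak-based heel-strike heuristic are all illustrative assumptions.

```python
# Hypothetical sketch: estimating spatiotemporal gait parameters from
# 2D joint coordinates, as a markerless side-view system might do.
# NOT the actual MediStep M algorithm; all names and heuristics are
# assumptions for illustration.
import numpy as np

def gait_parameters(left_ankle_x, right_ankle_x, fps, px_per_m):
    """Estimate stride length (m) and walking speed (m/s) from
    side-view ankle x-coordinates (pixels) sampled at `fps` Hz."""
    # Approximate left heel strikes as local maxima of the signed
    # ankle separation (left foot farthest forward of the right).
    sep = np.asarray(left_ankle_x) - np.asarray(right_ankle_x)
    strikes = [i for i in range(1, len(sep) - 1)
               if sep[i] > sep[i - 1] and sep[i] >= sep[i + 1]]
    if len(strikes) < 2:
        return None  # not enough gait events detected
    # Stride length: left-ankle displacement between successive left heel strikes.
    stride_px = np.mean([abs(left_ankle_x[b] - left_ankle_x[a])
                         for a, b in zip(strikes, strikes[1:])])
    stride_m = stride_px / px_per_m
    # Stride time: mean frame interval between strikes, converted to seconds.
    stride_time_s = np.mean(np.diff(strikes)) / fps
    return {"stride_length_m": stride_m,
            "walking_speed_m_s": stride_m / stride_time_s}
```

A production system would replace the fixed `px_per_m` scale with camera calibration and use a more robust gait-event detector, which is precisely why the calibration workflow discussed later in this paper matters.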
Participant Recruitment
Five licensed physical therapists participated in the study. All participants had a minimum of 5 years of professional experience in gait assessment and rehabilitation. Recruitment was conducted through the National Rehabilitation Center via an internal call for participation.
The inclusion criteria were as follows: (1) licensed physical therapists with clinical experience in gait analysis or rehabilitation, (2) current engagement in gait-related clinical practice, and (3) prior exposure to gait analysis systems or equivalent medical devices.
Participants who were not actively involved in patient treatment or who performed administrative duties only were excluded. The final sample consisted of 4 male therapists and 1 female therapist, reflecting diverse experience levels across inpatient and outpatient rehabilitation contexts.
Procedures
Formative Evaluation Procedures
The formative evaluation was conducted at the National Rehabilitation Center in Seoul, Republic of Korea. Each session was conducted in a controlled clinical environment and lasted approximately 90 minutes.
| Composition | Details | Time (min) |
| Orientation | | 10 |
| Guidance on consent | | 10 |
| Perform focus group interview | | 50 |
| Perform SUSa | | 20 |
aSUS: system usability scale.
The evaluation consisted of 3 main stages:
- Product demonstration: The facilitator conducted a live demonstration of MediStep M according to predefined clinical use scenarios. This included device setup, calibration, gait data acquisition, and report interpretation. The demonstration followed the manufacturer’s standard workflow and was intended to simulate typical clinical usage conditions.
- Participant interaction and observation: Participants observed the demonstration, and when appropriate, were given the opportunity to interact with the system interface (eg, initiating data capture, reviewing gait reports) to better understand its functionality. During this phase, the facilitator encouraged participants to verbalize their thoughts and perceptions regarding usability issues and potential improvements.
- FGI and SUS assessment: After the demonstration, participants took part in a 60-minute FGI to provide detailed feedback on user interface intuitiveness, workflow integration, and areas for improvement. Subsequently, the SUS was administered to quantify perceived usability. Reverse-scored items (2, 4, 6, 8, and 10) were appropriately adjusted before calculating individual and mean SUS scores.
Task Scenarios
The table below shows the task scenarios used in the product demonstration, covering device setup, calibration, gait data acquisition, and report interpretation.
| Task | Subtask |
| 1 | Powering on the device |
| 2 | Camera calibration |
| 3 | Using the MediStep application |
| 4 | Gait analysis |
| 5 | Interpretation of results |
| 6 | Member management |
aAI: artificial intelligence.
FGI Method
An FGI is a qualitative research method that collects data through in-depth discussions with intentionally selected participants on a specific topic [,]. This approach encourages individuals to openly share their experiences and engage with others’ insights, thereby providing diverse perspectives and rich, detailed information on the subject matter []. FGIs are an appropriate method for identifying areas for improvement in gait analysis software interfaces from the perspective of clinicians with experience in using similar medical devices. The FGI questionnaire was structured by question type, including introductory, transitional, key, and closing questions [].

| Stage | Questions |
| Opening questions | |
| Transition questions | |
| Key questions | |
| Ending questions | |
SUS Evaluation Tool
The SUS is a quantitative evaluation tool used to assess how users perceive system usability []. The SUS comprises 10 items rated on a 5-point Likert scale (1=strongly disagree to 5=strongly agree). The total score was converted to a range of 0-100 points to evaluate overall system usability. Moreover, the SUS scores were assigned a letter grade from “F” (0-60 points) to “A” (91-100 points) according to a standardized grading system [,].
| Items | Statement |
| Utility | I think that I would like to use this system frequently. |
| Complexity | I found the system unnecessarily complex. |
| Simplicity | I thought the system was easy to use. |
| Professionalism (technician support) | I think that I would need the support of a technical person to be able to use this system. |
| Integration | I found the various functions in the system were well integrated. |
| Unity | I thought there was too much inconsistency in this system. |
| Learnability | I would imagine that most people would learn to use this system very quickly. |
| Convenience | I found the system very cumbersome to use. |
| Satisfaction | I felt very confident using the system. |
| Professionalism (prior learning) | I needed to learn a lot of things before I could get going with this system. |
Data Analysis
Qualitative data from FGIs were transcribed verbatim and analyzed using an inductive thematic analysis approach. Two independent researchers coded the transcripts to identify recurring patterns, usability barriers, and improvement suggestions. Themes were derived through iterative comparison and consensus discussions.
Quantitative data from SUS were analyzed descriptively to calculate mean scores, standard deviations, and percentile rankings based on established SUS benchmarks. Both qualitative and quantitative findings were triangulated to derive comprehensive insights into the system’s usability and user needs.
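The SUS computation described above can be sketched generically. This is a minimal illustration of the standard Brooke scoring rule, not the study's actual analysis code; the `scaled_item` helper is an assumed name that mirrors the per-item 0-100 scaling reported in the results table.

```python
# Standard SUS scoring (Brooke, 1996): odd-numbered items contribute
# (response - 1), even-numbered (reverse-scored) items contribute
# (5 - response); the summed contributions are multiplied by 2.5
# to yield a 0-100 score.
def sus_score(responses):
    """Compute a 0-100 SUS score from 10 Likert responses (1-5)."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    contributions = [(r - 1) if i % 2 == 1 else (5 - r)
                     for i, r in enumerate(responses, start=1)]
    return sum(contributions) * 2.5

def scaled_item(item_no, mean_response):
    """Per-item 0-100 scaling (contribution x 25), as in the results table."""
    raw = (mean_response - 1) if item_no % 2 == 1 else (5 - mean_response)
    return raw * 25
```

For example, a uniformly neutral respondent (all 3s) scores 50, and item 1's mean response of 2.40 scales to 35, matching the reported value.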
Ethical Considerations
The study protocol was reviewed and approved by the institutional review board of the National Rehabilitation Center (government-affiliated national rehabilitation center in Seoul, Republic of Korea; NRC-2024-06-046). All participants provided written informed consent prior to participation and were informed that participation was voluntary and that they could withdraw from the study at any time without any penalty or disadvantage. Participants received a nonmonetary gift with a value of up to KRW 50,000 (US $34.05) as a token of appreciation for their participation. To ensure confidentiality, all identifiable personal data were removed, and all photographic materials were anonymized prior to analysis and publication.
Results
Participant Characteristics
Five licensed physical therapists participated in the evaluation, with clinical experience ranging from 7 to 23 years. All participants had prior experience using similar gait analysis systems (eg, Zebris, Joynt, or Morning Walk) and reported regular use ranging from 2-3 times per month to 3 times per week.
| ID | Sex | Experience | Used before | Model name | Experience of use | Frequency of use |
| P1 | Female | 8 y 8 mo | Yes | Zebris | 1 y | 1 time/wk |
| P2 | Male | 18 y 9 mo | Yes | Zebris | 1 y | 3 times/wk |
| P3 | Male | 23 y | Yes | Joynt | 1 y | 1 time/wk |
| P4 | Male | 7 y 8 mo | Yes | Morning Walk | 2 y 2 mo | 3 times/wk |
| P5 | Male | 15 y 7 mo | Yes | Zebris | 1 y | 2-3 times/mo |
Qualitative Findings (FGI)
FGIs revealed several key themes regarding usability, functionality, and areas for improvement of the AI-based gait analysis system.
Theme 1: Importance of Real-Time, Quantitative, and Understandable Feedback
The participants (P1-P5) appreciated the ability of gait analysis devices to provide quantitative assessments, particularly for aspects that are difficult to measure visually, such as gait symmetry and alignment. They emphasized that real-time feedback not only enhances clinicians' evaluation capabilities but also motivates patients by enabling them to monitor their own gait patterns. However, some participants noted that complex setup procedures and spatial constraints could limit the practical utility of such devices in busy clinical environments.
…The advantage is that machines can accurately measure what we cannot visually quantify, such as distance and symmetry, without requiring large spaces.
[P1]
…Real-time visualization of movements could help patients self-correct during therapy sessions.
[P2]
Theme 2: Necessity for Simplified and Portable Hardware Design
Although the hardware components and instructions were generally understandable, several participants stressed the need for wireless operation to enhance portability and convenience. They recommended minimizing wired connections, incorporating a wireless printer, and adding basic features such as a built-in level gauge for easier setup on uneven surfaces.
…The product should be portable. Wireless operation would greatly improve usability.
[P2]
…Adding a leveling tool on the device would help ensure proper setup on uneven floors.
[P3]
Theme 3: Challenges Related to Power Button and Battery Display
Many participants expressed frustration with the placement of the power button, which was obscured by the device frame. They suggested relocating it for easier access and improving the visibility of the battery status, either through a larger indicator light or by displaying the battery percentage on the screen.
…The power button is hidden behind the frame, making it very inconvenient to press.
[P1]
…It would be helpful to see the battery level on the main screen.
[P2]
Theme 4: Calibration Process Needs Automation
The participants criticized the manual calibration process, noting that it could introduce human error depending on the user’s expertise and environmental factors. They recommended an automated calibration to enhance the reliability and usability of the system.
…Manual calibration can vary the results based on a user’s skill. AI-based automatic calibration would significantly increase reliability.
[P1]
…It should not rely on the inspector’s height but rather on more objective standards like leg length.
[P2]
Theme 5: User Interface and Patient Management System Improvements
Participants generally found the software interface to be user-friendly but identified specific areas for improvement. They requested enhancements to the patient management system, such as the introduction of an auto-complete search, ID-based patient identification, and simplified patient registration procedures.
…When selecting patients, it’s inconvenient to search by full name. An auto-complete or list view would help.
[P3]
…It would be better if patients were identified by number rather than name for privacy and speed.
[P5]
Theme 6: Enhancing Gait Analysis Reporting and Interpretation
Participants appreciated the effort to make the results accessible but criticized the oversimplification of the analysis outputs. They called for more detailed, clinically meaningful metrics and stronger reference explanations, particularly for cases involving pathological gait patterns.
…Simple statements like slow walking speed aren’t enough. Specific advice is required, such as suggesting an increase in stride length by 10 cm.
[P4]
…The report should distinguish between normal and pathological gait patterns.
[P1]
Theme 7: Additional Requirements for Clinical Usability
Finally, the participants highlighted the need for expanded language support for international use, improved data extraction features (eg, exporting to Excel), and tighter integration with hospital electronic medical record (EMR) systems. Some participants also suggested adding a remote-start feature to minimize device manipulation during testing and prevent device tilting.
…If the device is intended for hospital use, data should be easily exportable and integrated with the EMR.
[P3]
…Remote control start could prevent tilting issues during test initiation.
[P2]
Quantitative Findings (SUS)
The mean SUS score was 57, corresponding to a grade of “D” on the SUS grading scale.

| Item | Min (point) | Max (point) | Mean (SD) | Scaled mean (SD) |
| Total | —a | — | 2.88 (0.44) | 57.00 (17.98) |
| Utility (Q1) | 1.00 | 3.00 | 2.40 (0.89) | 35.00 (20.00) |
| Complexity (Q2) | 1.00 | 3.00 | 2.00 (0.71) | 75.00 (15.81) |
| Simplicity (Q3) | 4.00 | 5.00 | 4.20 (0.45) | 80.00 (10.00) |
| Professionalism (technician support) (Q4) | 2.00 | 4.00 | 3.40 (0.89) | 40.00 (20.00) |
| Integration (Q5) | 1.00 | 3.00 | 2.00 (0.71) | 25.00 (15.81) |
| Unity (Q6) | 1.00 | 4.00 | 3.20 (1.30) | 45.00 (29.15) |
| Learnability (Q7) | 3.00 | 5.00 | 4.00 (0.71) | 75.00 (15.81) |
| Convenience (Q8) | 1.00 | 4.00 | 2.20 (1.10) | 70.00 (24.49) |
| Satisfaction (Q9) | 3.00 | 4.00 | 3.20 (0.45) | 55.00 (10.00) |
| Professionalism (prior learning) (Q10) | 1.00 | 3.00 | 2.20 (0.84) | 70.00 (18.71) |
aNot applicable.
Discussion
Principal Findings
This formative usability evaluation identified key strengths and limitations of the prototype AI-based gait analysis system MediStep M from the perspective of clinical end users. Overall, the findings demonstrate that while the system shows strong potential to support objective gait assessment in rehabilitation, several usability issues must be addressed to enable efficient integration into clinical practice. The mean SUS score of 57 (grade D) indicates below-average usability, aligning with qualitative feedback obtained through FGIs.
Participants acknowledged the clinical value of quantitative gait analysis enabled by AI-based systems, particularly the ability to objectively visualize spatiotemporal gait parameters that are difficult to assess through observation alone. However, multiple usability barriers were identified across both hardware and software components. Hardware-related issues included limited portability, poor accessibility of the power button, and lack of a battery status indicator. Software-related challenges involved the need for automated calibration, improved patient management features, and more detailed gait reporting to support clinical decision-making. These findings highlight that usability and clinical applicability depend not only on algorithmic performance but also on the intuitiveness and efficiency of the overall user interface and workflow design.
The results of this study are consistent with prior usability research on AI-driven and markerless gait analysis systems, which similarly reported barriers such as setup complexity, calibration burden, and limited interoperability with EMR systems. Previous studies have also emphasized that successful implementation of AI technologies in rehabilitation requires a user-centered approach that reflects real-world clinical workflows [-]. Compared to treadmill-based systems such as Zebris, MediStep M offers an overground and portable configuration that allows a more natural gait pattern to be analyzed. Nonetheless, iterative refinement remains necessary to enhance user experience, workflow compatibility, and data management efficiency.
The findings from this study suggest several priorities for future development. First, hardware design should improve ergonomics and mobility through compact structure, wireless operation, and automated leveling mechanisms. Second, calibration procedures should be automated using AI-assisted alignment algorithms to reduce operator variability. Third, software feedback should evolve beyond raw numerical data to include graphical trends, interpretive summaries, and actionable recommendations for rehabilitation. Finally, interoperability features such as secure EMR integration and standardized data export formats are essential for broader clinical adoption.
Importantly, several of the usability issues identified in this formative evaluation were subsequently addressed by the manufacturer. As of March 2025, the finalized version of MediStep M incorporated significant hardware and software refinements, including a redesigned rear magnetic detachable cover, removal of the front frame obstructing the power button, and addition of a main-screen battery status indicator. Wireless printing capability using Apple AirPrint, an AI-based automated calibration algorithm achieving 97% accuracy compared to the manual process, and a CSV-based data export function were also implemented. This iterative process underscores how formative usability evaluations can directly inform product improvement and regulatory readiness.
Limitations
Several limitations should be acknowledged. The small sample size (N=5) limits the generalizability of our findings, and the predominance of male participants (4 out of 5) may introduce gender bias. As a formative study, the evaluation did not measure long-term usability or clinical efficacy outcomes. Future studies should include larger and more heterogeneous samples, repeated testing cycles, and longitudinal assessments to validate the system’s usability and clinical performance after iterative refinement.
Conclusion
This study provides evidence on the usability of an AI-based gait analysis system from the perspective of clinical end users. Although the prototype demonstrated technical feasibility and clinical potential, several usability challenges were identified that require further optimization. The subsequent improvements implemented by the manufacturer exemplify how formative feedback can guide user-centered development of rehabilitation technologies, supporting safer and more effective integration of AI systems into clinical workflows.
Acknowledgments
This study was supported by grants (NRCRSP-24TB01 and 25TB01) from the Rehabilitation Research & Development Support Program, Korea National Rehabilitation Center, Ministry of Health & Welfare, Korea.
Data Availability
The datasets generated and/or analyzed in this study are available from the corresponding author upon reasonable request.
Conflicts of Interest
None declared.
References
- Bonanno M, De Nunzio AM, Quartarone A, Militi A, Petralito F, Calabrò RS. Gait analysis in neurorehabilitation: from research to clinical practice. Bioengineering (Basel). Jun 30, 2023;10(7):785. [FREE Full text] [CrossRef] [Medline]
- Ali F, Padilla H, Blazek AM, Barnard L, Kaufman KR. Gait analysis in neurologic disorders: methodology, applications, and clinical considerations. Neurology. Oct 21, 2025;105(8):e214154. [CrossRef] [Medline]
- Fung V. Functional gait disorder. Handb Clin Neurol. 2016;139:263-270. [CrossRef] [Medline]
- Richards CL, Malouin F, Dean C. Gait in stroke: assessment and rehabilitation. Clin Geriatr Med. Nov 1999;15(4):833-855. [Medline]
- Chiou-Tan FY, Bloodworth D. Approach to gait disorders and orthotic management in adult onset neuromuscular diseases. Muscle Nerve. May 2025;71(5):857-868. [CrossRef] [Medline]
- Ota M. Markerless motion analysis using new digital technology. Phys Ther Res. 2025;28(2):77-84. [FREE Full text] [CrossRef] [Medline]
- Muro-de-la-Herran A, Garcia-Zapirain B, Mendez-Zorrilla A. Gait analysis methods: an overview of wearable and non-wearable systems, highlighting clinical applications. Sensors (Basel). Feb 19, 2014;14(2):3362-3394. [FREE Full text] [CrossRef] [Medline]
- Li X, Xu H, Cheung JT. Gait-force model and inertial measurement unit-based measurements: a new approach for gait analysis and balance monitoring. J Exerc Sci Fit. Dec 2016;14(2):60-66. [FREE Full text] [CrossRef] [Medline]
- Han X, Guffanti D, Brunete A. A comprehensive review of vision-based sensor systems for human gait analysis. Sensors (Basel). Jan 16, 2025;25(2):498. [FREE Full text] [CrossRef] [Medline]
- Avogaro A, Cunico F, Rosenhahn B, Setti F. Markerless human pose estimation for biomedical applications: a survey. Front Comput Sci. Jul 6, 2023;5:1153160. [CrossRef]
- Dion G, Tessier-Poirier A, Chiasson-Poirier L, Morissette J, Brassard G, Haman A, et al. In-sensor human gait analysis with machine learning in a wearable microfabricated accelerometer. Commun Eng. Mar 16, 2024;3(1):1-10. [CrossRef]
- Hasan S, D'auria BG, Mahmud MAP, Adams SD, Long JM, Kong L, et al. AI-aided gait analysis with a wearable device featuring a hydrogel sensor. Sensors (Basel). Nov 19, 2024;24(22):7370. [FREE Full text] [CrossRef] [Medline]
- Prasanth H, Caban M, Keller U, Courtine G, Ijspeert A, Vallery H, et al. Wearable sensor-based real-time gait detection: a systematic review. Sensors (Basel). Apr 13, 2021;21(8):2727. [FREE Full text] [CrossRef] [Medline]
- Kim HY, An YS, Oh SH, Lee HC. Clinical feasibility of a markerless gait analysis system. Clin Orthop Surg. Jun 2024;16(3):506-516. [FREE Full text] [CrossRef] [Medline]
- Pedroli E, Greci L, Colombo D, Serino S, Cipresso P, Arlati S, et al. Characteristics, usability, and users experience of a system combining cognitive and physical therapy in a virtual environment: positive bike. Sensors (Basel). Jul 19, 2018;18(7):2343. [FREE Full text] [CrossRef] [Medline]
- Guest G, Namey E, Taylor J, Eley N, McKenna K. Comparing focus groups and individual interviews: findings from a randomized study. International Journal of Social Research Methodology. Feb 13, 2017;20(6):693-708. [CrossRef]
- Krueger R. Focus Groups: A Practical Guide for Applied Research. Thousand Oaks, CA. SAGE Publications; 2014.
- Kim J, Jo S, Jang W. An exploratory study on the development of pressure ulcer prevention cushion for pregnant women with spinal cord injury: based on focus group interviews. The Journal of Korean Society of Assistive Technology. 2022;14(1):23-35. [CrossRef]
- Brooke J. SUS: a "quick and dirty" usability scale. In: Usability Evaluation in Industry. London: Taylor & Francis; 1996:189-194. [CrossRef]
- Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: adding an adjective rating scale. Journal of Usability Studies. May 1, 2009;4(3):114-123. [FREE Full text]
- Sauro J. A Practical Guide to the System Usability Scale: Background, Benchmarks, and Best Practices. Denver. Measuring Usability LLC; 2011.
- Sumner J, Lim HW, Chong LS, Bundele A, Mukhopadhyay A, Kayambu G. Artificial intelligence in physical rehabilitation: a systematic review. Artif Intell Med. Dec 2023;146:102693. [FREE Full text] [CrossRef] [Medline]
- Hassan M, Kushniruk A, Borycki E. Barriers to and facilitators of artificial intelligence adoption in health care: scoping review. JMIR Hum Factors. Aug 29, 2024;11:e48633. [FREE Full text] [CrossRef] [Medline]
- Wu P, Cao B, Liang Z, Wu M. The advantages of artificial intelligence-based gait assessment in detecting, predicting, and managing Parkinson's disease. Front Aging Neurosci. 2023;15:1191378. [FREE Full text] [CrossRef] [Medline]
- Keogh A, Argent R, Anderson A, Caulfield B, Johnston W. Assessing the usability of wearable devices to measure gait and physical activity in chronic conditions: a systematic review. J Neuroeng Rehabil. Sep 15, 2021;18(1):138. [FREE Full text] [CrossRef] [Medline]
Abbreviations
| AI: artificial intelligence |
| EMR: electronic medical record |
| FGI: focus group interview |
| SUS: system usability scale |
Edited by S Munce; submitted 16.Jul.2025; peer-reviewed by PE Roos, J Brooke, L Sabbe; comments to author 03.Oct.2025; revised version received 27.Oct.2025; accepted 03.Dec.2025; published 18.Dec.2025.
Copyright©Seojin Hong, Hyun Choi, Hyosun Kweon. Originally published in JMIR Rehabilitation and Assistive Technology (https://rehab.jmir.org), 18.Dec.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Rehabilitation and Assistive Technology, is properly cited. The complete bibliographic information, a link to the original publication on https://rehab.jmir.org/, as well as this copyright and license information must be included.

