This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Rehabilitation and Assistive Technology, is properly cited. The complete bibliographic information, a link to the original publication on https://rehab.jmir.org/, as well as this copyright and license information must be included.
Immersive technologies like virtual reality can enable clinical care that meaningfully aligns with real-world deficits in cognitive functioning. However, options in immersive 3D environments are limited, partly because of the unique challenges presented by the development of a clinical care platform. These challenges include selecting clinically relevant features, enabling tasks that capture the full breadth of deficits, ensuring longevity in a rapidly changing technology landscape, and performing the extensive technical and clinical validation required for digital interventions. Complicating development is the need to integrate recommendations from domain experts at all stages.
The Cognitive Health Technologies team at the National Research Council Canada aims to overcome these challenges with an iterative process for the development of bWell, a cognitive care platform providing multisensory cognitive tasks for adoption by treatment providers.
The team harnessed the affordances of immersive technologies while taking an interdisciplinary research and developmental approach, obtaining active input from domain experts with iterative deliveries of the platform. The process made use of technology readiness levels, agile software development, and human-centered design to advance four main activities: identification of basic requirements and key differentiators, prototype design and foundational research to implement components, testing and validation in lab settings, and recruitment of external clinical partners.
bWell was implemented according to the findings from the design process. The main features of bWell include multimodal (fully, semi-, or nonimmersive) and multiplatform (extended reality, mobile, and PC) implementation, configurable exercises that pair standardized assessment with adaptive and gamified variants for therapy, a therapist-facing user interface for task administration and dosing, and automated activity data logging. bWell has been designed to serve as a broadly applicable toolkit, targeting general aspects of cognition that are commonly impacted across many disorders, rather than focusing on 1 disorder or a specific cognitive domain. It comprises 8 exercises targeting different domains: states of attention (Egg), visual working memory (Theater), relaxation (Tent), inhibition and cognitive control (Mole), multitasking (Lab), self-regulation (Butterfly), sustained attention (Stroll), and visual search (Cloud). The prototype was tested and validated with healthy adults in a laboratory environment. In addition, a cognitive care network (5 sites: 4 across Canada and 1 in Japan) was established, enabling access to domain expertise and providing iterative input throughout the development process.
Implementing an interdisciplinary and iterative approach considering technology maturity brought important considerations for the development of bWell. Altogether, this harnesses the affordances of immersive technology and design for a broad range of applications, and for use in both cognitive assessment and rehabilitation. The technology has attained a maturity level of prototype implementation with preliminary validation carried out in laboratory settings, with next steps to perform the validation required for its eventual adoption as a clinical tool.
Mental health issues are increasing worldwide [
To best manage a mental health condition, it is important to understand the different ways in which functioning is impacted in an individual. A core feature of psychopathology is cognitive dysfunction, in which impairments can occur broadly and nonspecifically among domains such as attention, response inhibition, and visual memory, cutting across disorder boundaries [
Neuropsychological assessments are currently experiencing a shift, moving away from traditional
Although existing solutions have shown great promise for the use of digital cognitive health interventions in general, and even VR interventions specifically, these interventions have not yet been widely adopted. The majority of the VR platforms mentioned above comprise a single exercise tailored to a specific disorder, require manual exercise reconfiguration to support repeated measures, and support limited and often specific user display and interaction hardware [
The aim of this work is to create a toolkit for clinicians to perform assessment and rehabilitation on a platform enabled with immersive technologies. Because of the novel, interdisciplinary nature of developing software for immersive cognitive care, a secondary goal is to outline a process for the iterative development of cognitive care software in collaboration with domain experts.
A human-centered approach and design thinking were used to identify key requirements for the proposed platform such that it would satisfy the criteria of being desirable, viable, and feasible (
Venn diagram showing innovative solution sweet spot that lies at the intersection of desirability, viability, and feasibility.
Specifically, an interdisciplinary approach [
Schematic demonstrating the stages of clinical collaboration at the different TRLs: early TRL (1-3: foundational research), mid TRL (4-6: technology development), and late TRL (7-9: deployment). TRL: technology readiness level.
The development of the bWell platform began in 2017, and activities to date have focused on research to demonstrate technical feasibility as defined in the early TRLs. During this phase of technology development, 4 main activities were advanced: (1) identification of basic requirements and key differentiating criteria (TRL-1), (2) prototype design and foundational research to implement key components (TRL 2-3), (3) testing and validation in laboratory settings (TRL 3-4), and (4) the establishment of the CCN early adopter group. The CCN was formed in 2019 to prepare for the time when the platform would reach an intermediate TRL (TRL 4-6), ready to be shifted from validation in the laboratory to validation in clinical settings.
The fundamental objective of this work is to harness the affordances of immersive technologies to enable assessment and therapy that can meaningfully align with real-world problems in cognitive functioning. By creating simulations that engage users through multiple senses (visual, audio, and touch) while permitting natural movement, immersive VR creates a sense of presence (
Taking into consideration the needs for cognitive care, technology affordances, and innovation potential, four main requirements were identified for the bWell platform:
Support for third-party hardware with varying levels of technical maturity facilitates the planning of clinical interventions and research in the ever-changing technical landscape of VR hardware. In addition, different modes of immersion allow for flexible content delivery in cases where immersive technology is not available or where a fully immersive environment is not well tolerated by a patient. Integrating this support also enabled the use of a range of low-cost to high-end consumer devices for clinical and home settings.
This approach allows clinical partners to choose from the available tasks and options based on the needs of a specific patient or target clinical population (eg, pediatric and older adult patients). Common core features required by all tasks were standardized to enable a faster, more agile software development process and to open up possibilities for adding customized features. Tasks were selected to address aspects of cognition common to a variety of mental health disorders rather than a specific disorder. Additionally, configurable exercises permit pairing standardized assessment with corresponding adaptive and gamified variants for therapy, providing therapeutics that do not stand in isolation from assessment.
This interface had to contain a wide range of adjustable parameters, exposing the extensive design options afforded by the platform. Exposed parameters would permit the clinician to administer and prescribe interventions (dosing, duration, and frequency) as well as facilitate their surveillance and control for a trial.
Behavioral and experimental data recorded with precise timing are required for the study of cognitive processes. To enable intra- and intersubject analysis of user response and physical interaction, recording of movement data, user performance, and simulation cues and events was also required.
The implementation consisted of developing a platform enabled by immersive technologies and translating the requirements identified through the active co-design process (outlined above) into hardware and software components, including the design of the content.
bWell was developed using the Unity 3D game engine and was implemented with multiple components (
bWell architecture components (Unity 3D engine): (1) in-house input or output manager for hardware, (2) comprehensive user interface for customization of patient and clinical settings, (3) content displayed immersively to the patient or nonimmersively for clinician administration, control and monitoring, and (4) resulting data are streamed for real-time adaptive control and automatically logged for offline analysis. AR: augmented reality; HMD: head-mounted display; VR: virtual reality; XR: extended reality.
The research team made use of agile software development, including feature-oriented code implementation, common code repository, software testing, user tests, bug tracking, and frequent software releases. Standardized user testing was developed to solidify the integration of hardware and software features. Individuals new to the platform were included as testers to reinforce usability and to reveal issues that those familiar with the system could no longer identify. Furthermore, to help maintain the major features of each of the exercises and the core features, unit tests developed with the integrated Unity 3D tools were put in place to automate the process. To obtain active input from domain experts, the agile methodology included regular delivery of technology to early adopters to obtain iterative feedback. This feedback informed the successive phases of software development.
The technical feasibility activities for bWell led to the development of a prototype at a TRL-4 maturity level that has been tested and validated in a laboratory environment (
bWell laboratory set-up with a user immersed in a virtual reality environment and the point of view displayed on an auxiliary monitoring screen.
The prototype was designed to be administered by clinicians in various clinical environments. Its architecture was kept flexible, accommodating various input modalities and administration hardware, because it was identified that each clinic had different hardware needs depending on their patient constraints or limitations, price, availability, and the hardware that they already owned. Thus, the prototype was implemented as multimodal (fully, semi-, or nonimmersive) and multiplatform (Oculus, HTC Vive, Hololens, tablet [iOS and Android], and desktop). Automatic detection of the connected devices is executed when the platform is launched, making it a seamless feature for the user, thereby facilitating the long-term goal of functioning in a home environment with remote monitoring by a clinician.
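The launch-time detection and fallback behavior described above can be sketched as a simple priority cascade. This is a minimal illustration of the idea, not bWell's actual implementation: the device names and probe function are hypothetical stand-ins for a platform query such as an XR plugin enumerating connected headsets.

```python
# Hypothetical sketch of launch-time device detection with graceful fallback:
# probe for connected display hardware and select the most immersive mode it
# supports. Device identifiers and probe logic are illustrative only.

def probe_connected_devices():
    """Stand-in for a platform query (eg, an XR plugin enumerating HMDs)."""
    return []  # no HMD or tablet found in this example


def select_delivery_mode(devices):
    """Choose the most immersive delivery mode the detected hardware supports."""
    if any(d in devices for d in ("oculus", "htc_vive")):
        return "fully_immersive"       # VR head-mounted display
    if "hololens" in devices:
        return "semi_immersive"        # see-through AR headset
    if "tablet" in devices:
        return "nonimmersive_touch"    # iOS or Android tablet
    return "nonimmersive_desktop"      # always-available fallback


mode = select_delivery_mode(probe_connected_devices())
print(mode)  # → nonimmersive_desktop (nothing detected in this example)
```

Because the cascade always terminates in a desktop mode, the same content remains deliverable when no immersive hardware is present, which supports the home-use scenario mentioned above.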
The exercises provided in the prototype were developed to target cognitive domains common to multiple mental health disorders, including attention, memory, and executive control. To promote presence and immersion in the simulation, all exercises make use of multisensory feedback (visual, audio, and touch). Some exercises were designed to target a specific domain of interest, whereas others simultaneously engage multiple cognitive domains to be representative of everyday tasks. A total of 8 exercises were implemented (
Cognitive exercises: (Top row) states of attention (Egg), multitasking (Lab), visual working memory (Theater), relaxation (Tent); (Bottom row) response inhibition and cognitive control (Mole), self-regulation (Butterfly), sustained attention (Stroll), visual search (Cloud).
The user must first scan the environment for eggs. They are then required to direct and hold their gaze on a located egg long enough to make it hatch (attentional focus). Audio and visual distractors in the environment challenge the user, and bonus points are awarded if the user reacts to a cue while fixating (covert attention).
The user must complete 2 recipes simultaneously to investigate their ability to accomplish a range of tasks by swapping between them strategically or by planning the order in which they can be performed most efficiently. This requires the user to closely follow the recipe steps (ability to monitor) displayed on the virtual tablet screens in front of them.
Inspired by matching tasks for visual and short-term memory, the user is presented with target shapes ordered from left to right. After a set viewing time, the targets are hidden. After a specified delay (recall delay), objects fall into the user’s view, some of which are targets and others are not (comparison objects). The user is required to select the targets from all the objects and place them in the order presented within a limited time.
A Whack-A-Mole variation is used where the user has a hammer in each hand and has to hit cylinders that pop up in front of them. The colors of the hammers change over time, and the cylinders also have different colors. The user should only hit cylinders with a hammer of the same color (go signal).
The user engages in motor self-regulation through an activity that rewards self-restraint. The user is instructed to catch butterflies with a net but must do so in a gentle fashion because brisk movements scare them away. A motion speed indicator is visible on the net to promote self-awareness. The user can monitor, control, and inhibit unproductive motor responses that may be triggered when the butterflies are near.
This is an immersive version of a sustained attention to response task, a go or no-go task with infrequent no-go events to measure user attention. The user is provided a self-avatar, taking a stroll in a natural scene. Shapes continuously appear in front of the user, and a button must be pressed when each new shape appears, except when it is a green diamond.
This is an immersive version of a visual search and attention test, in which a grid of orange and blue
In relaxation and sensory exploration, the user is immersed in scenic views and asked to look around while focusing on their breathing. A rhythmic object is present to guide their breathing pace. Close adherence to the rhythmic object should have a calming effect. The user is free to explore the scene in which they are or to select a different scene from a set of options presented to them as pages in a book. This exercise was also designed for eventual self-guided stress management, where future efforts will include the integration of heart rate variability biofeedback based on slow-paced breathing within virtual nature scenes.
The clinician-facing UI was developed to allow the exercise parameters to be customized and for functionalities to be turned on or off during the exercises. A scrolling menu (
Clinician-facing user interface with clinical exercise settings interface under Clinical tab.
Clinician user interface 2D icons: (1) exercise selection, (2) real-time intervention, and (3) settings user interface menu access.
bWell was designed to serve as a toolkit, so task parameters were designed to be highly configurable. Several personalization elements were included to promote a sense of embodiment. The self-avatar can be personalized with elements such as gender, skin color, height, and dominant hand that can be modified in a patient settings tab added to the clinician-facing UI. bWell’s design also considers potential physical limitations of users, giving them the option to play while standing or sitting. The patient settings can also be used to adjust the height of certain elements inside an exercise if the auto-adjusting features are insufficient. In addition, the patient VR interactions in bWell use multiple input types, such as gaze direction, teleportation, and grabbing or hitting objects, to accommodate participants with conditions that limit dexterous hand movements.
Task difficulty parameters were configured to allow for personalized assessments and rehabilitation plans. Difficulty levels can be set at fixed levels by the clinician, ensuring trial reproducibility for comparison with past performance or across participants, or set to use an adaptive algorithm configured to achieve an 80% success rate. Intra- and intersession difficulty progression settings are also available.
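An adaptive algorithm targeting a fixed success rate is commonly realized as a staircase over a sliding window of recent trials. The following sketch illustrates the general technique under assumed parameters (window size, tolerance band, and step rule are our choices for illustration); the algorithm actually used in bWell is not specified here.

```python
# Illustrative staircase targeting an ~80% success rate: watch a window of
# recent trial outcomes and step difficulty up or down when the observed
# rate drifts outside a tolerance band. All parameter values are assumptions.

from collections import deque


class AdaptiveDifficulty:
    def __init__(self, level=1, target=0.80, window=10, band=0.10):
        self.level = level                    # current difficulty level (>= 1)
        self.target = target                  # desired success rate
        self.window = window                  # recent trials considered
        self.band = band                      # tolerance around the target
        self.results = deque(maxlen=window)   # rolling outcome buffer

    def record(self, success):
        """Log one trial outcome; adjust difficulty once a full window exists."""
        self.results.append(bool(success))
        if len(self.results) < self.window:
            return self.level                 # not enough evidence yet
        rate = sum(self.results) / len(self.results)
        if rate > self.target + self.band:
            self.level += 1                   # too easy: make it harder
            self.results.clear()
        elif rate < self.target - self.band:
            self.level = max(1, self.level - 1)  # too hard: ease off
            self.results.clear()
        return self.level


ad = AdaptiveDifficulty()
for outcome in [True] * 10:                   # ten straight successes
    level = ad.record(outcome)
print(level)  # → 2: difficulty steps up after a full window of successes
```

Clearing the window after each step prevents a single burst of performance from driving several consecutive adjustments, which keeps the exercise within the "never too easy, never too hard" band discussed in the rehabilitation mode section.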
All task parameters are logged using a completely automated logging system to facilitate offline analysis. In addition to logging exercise settings, event orders, and difficulty levels, the system records an array of timestamped, synchronized data, including motion tracking, exercise events, cues presented to the user, and performance measures. The logs, anonymized for confidentiality, are output as JSON and CSV files, which are compatible with standard data analysis software.
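The shape of such a timestamped, multistream event log can be sketched as follows. Field names, stream labels, and the anonymized session ID are illustrative assumptions; the actual bWell log schema is not published here.

```python
# Minimal sketch of timestamped, synchronized event logging across multiple
# data streams (motion, cues, performance), dumped as JSON for offline
# analysis. The schema is a hypothetical illustration, not bWell's format.

import json
import time


class SessionLogger:
    def __init__(self, session_id="anon-001"):
        self.session_id = session_id  # anonymized identifier, no patient data
        self.records = []

    def log(self, stream, payload):
        """Append one timestamped record for a given data stream."""
        self.records.append({
            "t": time.monotonic(),    # monotonic clock for reliable ordering
            "stream": stream,         # eg, "motion", "cue", "performance"
            "data": payload,
        })

    def dump(self, path):
        """Write the whole session as JSON for offline analysis."""
        with open(path, "w") as f:
            json.dump({"session": self.session_id,
                       "records": self.records}, f, indent=2)


logger = SessionLogger()
logger.log("cue", {"exercise": "Mole", "event": "cylinder_up", "color": "red"})
logger.log("performance", {"hit": True, "reaction_ms": 412})
```

Keying every record to one monotonic clock is what makes streams synchronizable afterward: motion samples, stimulus onsets, and responses can be aligned on the shared time axis during offline analysis.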
Other features were included to add variety to gameplay and to promote the repetition of tasks typically required in cognitive rehabilitation protocols. Gamification elements, such as rewards and animations, were integrated to promote adherence and engagement and to provide feedback on performance. Visual cues, audio cues, and sometimes distracting elements were also incorporated. In certain exercises, the displayed virtual environment can be explored with a teleportation feature, or the user can select between different virtual environments.
bWell is designed to operate in three modes: (1) tutorial, (2) assessment, and (3) rehabilitation. Each mode is designed to address a specific set of clinical requirements determined through early adopter feedback.
In the tutorial mode, participants engage in structured learning of the actions required for each task. This was implemented to ensure that participants begin assessment and rehabilitation exercises with comparable familiarity with the required actions, even if their baseline familiarity with XR differs. During the tutorial for each exercise, a standardized set of instructions is presented in writing and verbally by a humanoid avatar. If an action is not executed within a specified time interval, a revised version of the instructions is presented to increase the likelihood that participants properly understand what is required. The humanoid avatar also provides verbal feedback about successes and failures throughout the tutorial (
bWell Mole exercise as an example of modes and configurable parameters. The left image shows the tutorial mode with the score that must be achieved before the exercise can begin. The humanoid avatar provides instructions and verbal feedback on successes and failures. The center image shows the mole exercise in progress at a fixed, low difficulty level (4 cylinders) without any feedback on successes and failures. The right image shows the mole exercise in progress with adaptive difficulty leveling and a score visually displayed. This configuration aims for gamified rehabilitation.
In the assessment mode, participants completed the exercise with events always occurring in the same order and with the same timing (ie, based on a fixed randomization seed) and with fixed difficulty levels [
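Fixing the randomization seed, as in assessment mode, makes every administration present the same events in the same order and timing. The sketch below illustrates this standard technique with an assumed stimulus set and timing range; it is not bWell's actual trial generator.

```python
# Sketch of seeded trial-sequence generation: with the same seed, every
# administration yields an identical stimulus order and timing, enabling
# comparison across sessions and participants. Stimuli and timing ranges
# are illustrative assumptions.

import random


def generate_trial_sequence(seed, n_trials=5):
    rng = random.Random(seed)          # fixed seed -> reproducible sequence
    stimuli = ["circle", "square", "triangle", "diamond"]
    return [(rng.choice(stimuli), round(rng.uniform(0.8, 1.5), 2))
            for _ in range(n_trials)]  # (shape, onset interval in seconds)


# Identical seeds yield identical assessments; changing the seed produces a
# fresh sequence, as used by the gamified rehabilitation variants.
a = generate_trial_sequence(42)
b = generate_trial_sequence(42)
assert a == b
```

Using a dedicated `random.Random` instance (rather than the module-level generator) keeps the assessment sequence isolated from any other randomness in the application, so reproducibility cannot be broken by unrelated code consuming random numbers.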
In the rehabilitation mode, the emphasis is on parameters that promote patient adherence by increasing engagement and providing feedback on performance. Adaptive algorithms adjust the current difficulty level based on performance to ensure that the exercise is never too easy (boring) or too hard (demoralizing) for the user [
The authors conducted 2 preliminary acceptability studies with healthy participants and have described them in previous reports [
The results of these preliminary studies demonstrate the acceptability of the bWell platform. The careful attention paid to cybersickness during design enabled successful administration even in participants who reported being highly susceptible to motion sickness. bWell tasks in immersive VR, both in static scenes and those involving more complex user motion, were well tolerated and engaging for healthy participants, providing the support required for testing in clinical populations as the next step. In addition, because engagement was found to differ between immersive and nonimmersive delivery of the exercises, the fully, semi-, or nonimmersive modes may yield different user performance. As such, and in particular for cognitive assessment, comparisons of task results should be performed within a given mode of immersion.
Although bWell has shown general acceptability and tolerability, a small number of users experienced mild cybersickness. The Biomedical Data Intelligence team at the NRC investigated the use of an avatar within the bWell environment to monitor user discomfort. The dialog agent provides instantaneous and interactive assistance to users in the form of personalized advice on symptom relief [
The prototype is currently being taken beyond the laboratory and into clinical settings. To provide domain-specific expertise and clinical validation, a CCN was assembled consisting of 4 institutions across Canada and 1 in Japan (
Cognitive Care Network (CCN) sites. The CCN includes 5 clinical partners at 4 sites across Canada (CAMH, IUGM, SickKids, and UBC) and 1 site in Japan (ATR). Sites cover a broad range of expertise and are critical for the iterative development of bWell.
Evaluation of bWell by clinicians began in 2019 to provide domain-specific input throughout the bWell development process. bWell has thus far been installed at 4 locations, and ongoing feedback from early testing at these installations has been incorporated in iterative improvements. Adaptations of bWell exercises to target clinical populations have begun. These activities are structured according to the Birckhead et al VR trial methodology [
The use of digital solutions has the potential to address the current gap in mental health care resources. To this end, digital therapeutics have entered the pharmaceutical landscape [
A primary driver for the use of VR technology is its capacity for sensory immersion with tight experimental control, making it feasible to test or study cognitive and sensory-motor functioning that is typically prohibitive in real-world settings because of the unpredictable and uncontrolled events that occur in everyday life. Immersive and engaging tasks were selected in bWell to encourage meaningful user interactions. The simulated scenes were also designed to be multisensorial and interactive to permit naturalistic movement, increasing the likelihood that skills learned within VR would be transferable to real life.
Another driver for the use of simulation technologies is the ability to capture rich behavioral data simultaneously. As users interact in virtual environments, all activities can be recorded for analysis. bWell was implemented with two data workflows—one with data streaming for real-time adaptive exercises and the other with data logging for offline analysis. In bWell, streaming data are used in a closed loop to adapt the exercises in real time, targeting patient-specific rehabilitation. Currently, data on user performance are used as input for adaptive algorithms that adjust the difficulty levels of the exercise accordingly. The integration of wearable, physiological sensors in XR scenes is also currently in progress to enable biofeedback. In this case, features derived in real time from sensor data are used as input to adapt the exercises. As part of the collaboration with the Advanced Telecommunications Research Institute from the CCN, new exercises are being developed for cognitive training using an electroencephalography-based brain-machine interface to control a virtual third arm [
In bWell, data are also currently being logged for offline analysis, with information including motion, key presses, VR cues, events, and patient performance. In this workflow, the goal is to derive features from sensor data that cannot be obtained from real-time processing. For example, classification of user states can be obtained with predictive models built using techniques performed offline, such as cognitive modeling and machine learning. With user state obtained from quantitative measures, such as user reaction to an increase in exercise difficulty or adverse response, VR assessments coupled with physiological sensors can provide more objective measures of individual function than a traditional self-report, which is known to be an unreliable index of functional outcomes [
During the development of the platform, two lessons were learned. First, designing a generic core and providing users with customizable options is a valuable approach for developing applications for a variety of treatment providers. To accomplish this, the bWell platform was designed to target specific cognitive domains rather than specific pathologies. Clinicians select from the available exercises to address the needs of a particular patient, which provides an opportunity to address symptoms rather than disorders. The treatment provider can further customize individual exercises by choosing from the configurable settings, such as enabling 1 of the 3 basic modes (tutorial, assessment, and rehabilitation) or turning specific functionalities on or off. In the study of cognitive functioning, this means that a paradigm can be designed around a specific research question by choosing the virtual environment (eg, game-based or real-world setting), the type of experimental stimuli (eg, target, distractors, cueing, and feedback), the timing and type of presentation (multisensorial or not), and the type of user response (eg, aim to minimize errors or perform as quickly as possible).
Second, avoiding dependence on specific hardware is advantageous when developing with XR. The bWell implementation was hardware agnostic, multimodal, and multiplatform, accommodating the range of specifications arising from different potential clinical applications and increasing its staying power in the rapidly changing technological landscape. The work done on bWell focused on the use case in which clinicians administer the intervention; however, the hardware-agnostic core of the platform anticipates its eventual use at home with remote monitoring by clinicians. bWell’s core was also designed to accommodate the evolution of commercially available HMDs, which have come a long way, from unwieldy, tethered devices to performant and comfortable standalone units aimed at the mass market. bWell’s core also allows for other modes of content delivery, such as AR or mobile, when immersive VR is not suitable. For instance, Google AR glasses have shown promise for teaching children on the autism spectrum to recognize emotions in real time [
bWell, developed by the NRC, is an immersive and interactive cognitive care platform that delivers assessment and therapeutic tasks as a multisensorial experience (visual, auditory, and touch). The technology has attained the maturity level of prototype implementation with preliminary validation carried out in laboratory settings. A CCN of early adopters was formed to evaluate the system and access domain-specific perspectives. Within this network, 4 installations of bWell prototypes have been completed, and the next steps have begun to adapt the system and to co-design content targeting specific clinical populations. In addition, we plan to perform the validation required for the eventual adoption of bWell as a clinical tool.
AR: augmented reality
CCN: cognitive care network
FDA: US Food and Drug Administration
HMD: head-mounted display
NRC: National Research Council Canada
TRL: technology readiness level
UI: user interface
VR: virtual reality
XR: extended reality
None declared.