Published on 25.12.2023 in Vol 10 (2023)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/48031.
Older Adults’ Engagement and Mood During Robot-Assisted Group Activities in Nursing Homes: Development and Observational Pilot Study

Original Paper

1School of Applied Psychology, University of Applied Sciences and Arts Northwestern Switzerland, Olten, Switzerland

2City of Bern (Digital Stadt Bern), Bern, Switzerland

Corresponding Author:

Alexandra Tanner, MSc

School of Applied Psychology

University of Applied Sciences and Arts Northwestern Switzerland

Riggenbachstrasse 16

Olten

Switzerland

Phone: 41 0796040979

Email: mail@alexandratanner.net


Background: Promoting the well-being of older adults in an aging society requires new solutions. One resource might be the use of social robots for group activities that promote physical and cognitive stimulation. Engaging in a robot-assisted group activity may help slow physical and cognitive decline in older adults. Currently, little is known about whether older adults engage in group activities with humanlike social robots and whether they experience positive affect while doing so. Both are necessary preconditions for achieving the intended effects of a group activity.

Objective: Our pilot study has 2 aims. First, we aimed to develop and pilot an observational coding scheme for robot-assisted group activities because self-report data on engagement and mood of nursing home residents are often difficult to obtain and existing observation instruments have limitations. Second, we aimed to investigate older adults’ engagement and mood during robot-assisted group activities in 4 different nursing care homes in the German-speaking part of Switzerland.

Methods: We developed an observation system, inspired by existing tools, for the structured observation of older adults’ engagement and mood during a robot-assisted group activity. In this study, 85 older adult residents from 4 different care homes in Switzerland participated in 5 robot-assisted group activity sessions and were observed using the developed system. The data were collected in the form of video clips that were assessed by 2 raters regarding engagement (direction of gaze, posture and body expression, and activity) and mood (positive and negative affect). Both variables were rated on a 5-point rating scale.

Results: Our pilot study findings show that the engagement and mood of older adults can be assessed reliably using the proposed observational coding scheme. Most participants actively engaged in the robot-assisted group activities (mean 4.19, SD 0.47; median 4.0). The variables used to measure engagement were direction of gaze (mean 4.65, SD 0.49; median 5.0), posture and body expression (mean 4.03, SD 0.71; median 4.0), and activity (mean 3.90, SD 0.65; median 4.0). Further, we observed mainly positive affect in this group. Almost no negative affect was observed (mean 1.13, SD 0.20; median 1.0), while positive affect (mean 3.22, SD 0.55; median 3.2) was high.

Conclusions: The developed observational coding system can be used and further developed in future studies on robot-assisted group activities in the nursing home context and potentially in other settings. Additionally, our pilot study indicates that cognitive and physical stimulation of older adults can be promoted by social robots in a group setting. This finding encourages future technological development and improvement of social robots and points to the potential of observational research to systematically evaluate such developments.

JMIR Rehabil Assist Technol 2023;10:e48031

doi:10.2196/48031


Background

Given the global phenomenon of aging populations, strategies to reduce the risk of physical and cognitive decline and the associated consequences on the well-being of older adults and their ability to cope with everyday life are urgently needed [1]. One resource in this context might be the use of so-called social robots. According to Anzalone and colleagues [2], social robots can be understood as “machines that humans should perceive as realistic, effective partners, able to communicate and cooperate with them as naturally as possible.” The acceptance of social robots and their potential to promote the well-being of older adults have been explored and demonstrated in several studies [3-8]. Most of the robots studied are animallike, with PARO [9], a seal-shaped robot, being a prominent example [2-5,10,11]. However, animallike companion robots are not multifunctional, and their interaction capabilities are not sufficient for those who require care and support. A study comparing animallike and humanlike social robots in group settings provided first evidence that humanlike robots have greater effects on cognitive training than animallike robots [12], which brings humanlike social robots into the focus of research on group activities for older adults. This so-called third generation of social robots, including Nao, Pepper, QT, Sophia, Jack, LOVOT, and Tessa [12-18], continues to evolve as new software is developed and released into the market [13]. Their humanlike forms [19] and integrated voice capability allow for interactions through facial expressions, gestures, and voice. Thus, these robots can support cognitively and physically stimulating exercises, which, in combination, achieve the best results in maintaining cognitive abilities in older adults [1].

Few studies [7,12,13,20-25] have investigated whether older adults actively engage in and experience positive moods during these activities. Since mood and engagement are crucial for the effectiveness of such group activities with a humanlike social robot, this study aims to explore these 2 constructs empirically. In doing so, we chose the method of systematic behavioral observation, because self-report data of older adults in nursing homes are often difficult to obtain and might interfere with their experience of the activity itself [9]. As no suitable observational coding scheme could be identified in the literature, a second aim of this study was the development and piloting of an observational coding scheme. In summary, this pilot study addresses the following questions: (1) can the engagement and mood of older adults in a robot-assisted group activity be assessed through systematic behavioral observation? and (2) do older adults actively engage in a robot-assisted group activity and what mood (ie, positive or negative affect) can be observed in the group during such a robot-assisted group activity?

Related Work

A review identified group activities for older adults assisted by social robots in 5 domains: affective therapy, cognitive training, social facilitation, companionship, and physiological therapy [7]. Three studies [12,13,20] showed the great potential of humanlike robots, with their broader functionalities, for physical activities in group settings. First indications that older adults liked to participate in robot-assisted group activities involving physical exercise were reported in 2 studies [21,22]. The robot NAO was found suitable for use in group settings for movement, memory training, entertainment, music, dancing, and games [23]. One study showed that older adults in a nursing home preferred walking with a robot to walking alone [24]. Another study showed that older adults actively participated in robot-assisted cognitive therapy and physiotherapy sessions, and a trend toward improved neuropsychiatric symptoms, reduced apathy, and higher quality of life was observed [25]. Although these studies [12,13,20-25] provide first insights into the acceptance of humanlike robots assisting in group activities of older adults, we identified only 1 study that systematically developed and used an observation system for examining the engagement of groups of older adults during activity sessions assisted by a humanlike social robot [13]. Even though this observation system indicates that systematic observation is a fruitful methodological approach in this research context, it does not fully capture the psychological constructs of engagement and mood that are at the center of our pilot study and that are usually captured using self-report surveys.

Observation of Engagement and Mood During a Robot-Assisted Group Activity

Engagement

In the context of group activities providing physical and cognitive stimulation, engagement in exercises is crucial to generate the intended effects [26]. According to Perugia and colleagues [27], engagement is defined as “the psychological state of well-being, enjoyment, and active involvement that is triggered by meaningful activities and causes people with dementia to be absorbed by the activity, more energetic and in a more positive mood.” Studies with children provide evidence that children are just as willing to engage in robot-guided exercises as when a human demonstrates the exercise [28].

Mood

The mood (ie, positive or negative affect) in the group during a robot-assisted activity is of interest to determine whether older adults are enjoying themselves in the process, which is relevant to ensure participation beyond curiosity and to assess whether the intended positive effects of such robot-assisted group activities are actually attained. Assessing mood separately from engagement was also important because the stimulus for activities of older adults in the nursing home is a key factor in whether engagement occurs [29]. The general experience is that humanlike social robots, with their ability to express emotions, tend to evoke a notably positive affect. However, the counterhypothesis would be that older adults simply want to be polite and participate because something new is happening in the nursing home, without actually experiencing positive affect when interacting with the humanlike social robot.

Assessment of Engagement and Mood

The assessment of engagement and mood during a robot-assisted group activity has not been researched much [27], and collecting reliable data from older adults presents a challenge [7]. Several observational studies provided inspiration for the design of the observation tool used in this study [30-33]. One important observational instrument is the Observational Measurement of Engagement [34], and its further development has been used to gain a broad understanding of engagement in the context of telepresence robots and companion robots [20]. Although this instrument was not directly suitable to measure the predefined behavior of older adults (eg, mimicking an exercise) at the group level, it informed our methodological decisions and developments.


Study Design

We considered this as a pilot study because we developed and tested the applicability of a systematic observation system for rating participants’ engagement and mood during robot-assisted group activity sessions for older adults in nursing homes.

Recruitment Strategy

About 200 nursing homes in the German-speaking part of Switzerland were contacted by telephone and invited to participate in this study. Four nursing homes expressed their interest and were selected to participate in this observational field study. All participating nursing homes provide various services for leisure activities and physical and cognitive stimulation. They offer accommodation and care to 50-160 residents and provide specialized dementia care. As part of this study, the management of each nursing home agreed to co-organize a robot-assisted group activity together with the research team and made nursing staff available to accompany residents to the sessions. Residents of the participating nursing homes were informed about the robot-assisted group activity and the study procedure, and they were invited to participate on a voluntary basis.

Materials

The robot used for the robot-assisted group activity was the NAO robot from SoftBank Robotics [35]. We used the software of Avatarion [36] developed by Smart Companion [37]. The software was developed in collaboration with experts for leisure activities and physical and cognitive stimulation of older adults, specifically for robot-assisted leisure activities during their care. In this study, we used 3 software modules that support common elements of group activities for older adults: singing, storytelling, and gymnastics.

  1. Singing: In the first module, the robot encourages the residents to sing along with it by using friendly verbal communication and gestures. All songs implemented in this module are well-known Swiss songs that are popular with the older generation. The robot sings the songs with a human voice, and the singing is accompanied by suitable gestures. For songs with more complex lyrics, the residents received handouts of the lyrics.
  2. Storytelling: In the second module, the robot tells a story to the residents. The stories are designed to include biographical aspects. All stories implemented in this module are short and contain elements to imitate movements.
  3. Gymnastics: In the third module, the robot guides the residents to imitate physical exercises by using friendly verbal communication and gestures. The physical exercises are designed for older adults. For example, the robot shows how to stretch the arms or move the fingers.

Figures 1 and 2 illustrate the robot-assisted physical exercise sessions; the photos were taken during 2 robot-assisted group activity sessions in 2 different nursing homes. The pictures show the NAO robot demonstrating movements with its hands and residents participating in this physical exercise by imitating the robot’s movements.

Figure 1. An illustration of a robot-assisted group activity in a nursing home.
Figure 2. An illustration of a robot-assisted group activity in another nursing home.

Ethical Considerations

According to Swiss law, this study did not require formal ethics approval and was thus exempt from formal ethics review; for more information, please see the corresponding section of the Swiss Human Research Act. The participating nursing homes consented to this study and informed the residents in advance about the robot-assisted activity sessions. Participation in the robot-assisted activity sessions was voluntary. Consent was obtained for using the anonymized photographs in this paper.

Study Procedure and Data Collection

Robot-assisted group activity sessions were offered in the participating nursing homes in July and August 2019. Chairs and free spaces for wheelchair users were arranged in a way that allowed the participants to see the NAO6 robot that was placed on a table. All robot-assisted group activity sessions took place 1 hour before lunch. Participating residents arrived independently. During the session, 2-5 health care professionals were available in the room for the general support of the residents. All group activity sessions in this study were conducted by research team members and lasted 1 hour, with the actual robot-assisted group activity taking about 30 minutes.

The activity session was structured in 3 parts. First, a representative of the research team welcomed the residents, explained the procedure of the session, reminded them that participation was voluntary, informed them about data protection issues, asked for their approval regarding video recording, and introduced the persons involved. Second, a technical expert from the Smart Companion team started the robot program. The participants first performed a gymnastics exercise, then sang a song together with the robot, and toward the end of the session, they listened to a story. The research team did not interact with the participating residents during these sessions. Third, an additional exercise was conducted by the robot to get the residents in the mood for lunch. This exercise was not recorded and was not part of this study, as it did not aim at their physical and cognitive stimulation. At the end of the activity session, the robot wished the residents bon appétit and said goodbye. Finally, the research team also said goodbye and thanked the residents for their participation.

To systematically analyze residents’ engagement and mood in group activities with the robot, sessions were recorded on video. Video recording has the advantage that behavior can be observed unobtrusively and participants do not have to be bothered afterwards, as they would be with interviews. Further, for many residents in nursing homes, other forms of data collection such as surveys are an inaccurate form of assessment, since the retrieval, reporting, and ranking of relevant information may be compromised. Therefore, almost all assessment techniques for people with dementia rely on behavior observation [27]. Video recording was done in a way that did not disturb the residents and reduced the Hawthorne effect [38]. Hence, short video clips of all 3 exercises were filmed as discreetly as possible. The video clips lasted between 30 seconds and 3 minutes and were distributed across the whole duration of the 3 exercises. The starting time of each clip within an exercise was chosen randomly. For practical reasons, video clips were recorded with a smartphone camera. For ethical reasons, we collected no personal data, such as participants’ age or the presence and severity of dementia symptoms. The videos show only the number of participants during each session.

Measures

For the structured observation of engagement and mood during a robot-assisted group activity, an observation system was developed. The observation system builds on existing observation systems for the engagement and mood of individuals but was adapted for direct observation in a group setting. For example, in studies of children’s engagement during one-on-one interactions with robots [30,31], the variables used to measure engagement were direction of gaze, facial expressions, responses, or gestures. Another study of children with autism spectrum disorders interacting with social robots [32] used measures of engagement based on nonverbal behavior, focusing on social and antisocial behaviors. Another system, used to observe older adults in a session with a social robot, includes measures of engagement and mood targeted at a setting with small groups and a facilitator: engagement was measured, for example, by a participant leaning toward the collaborator, and mood was assessed by movements accompanied by a positive or negative affect [27]. Further, we analyzed the Observational Measurement of Engagement. This tool is based on a self-identity questionnaire and 3 dimensions of observational measurement, namely, duration, attention, and attitude. This instrument did not meet all our needs, as we had a predefined duration of interaction, and attitude was not the focus of our study. However, attention was of interest to us and was included in our observation instrument. Another study measured affect and social interaction during a game [33]: positive affect included smiling and clapping, and negative affect included sadness and anger [33]. Together, these studies show the relevance of gaze direction for capturing engagement and of observable behaviors for capturing positive and negative affect.

Engagement

Extending the previous research, we aimed to assess participants’ engagement in robot-assisted group activities. We adapted an established rating system that has been used for the observation of students’ attention in class [39]. This observation system captures 3 aspects of engagement: (1) direction of gaze (looks toward the teaching center vs looks elsewhere), (2) posture and body expression (oriented toward the teaching center and alert vs averted or flaccid), and (3) activity (performs the activity necessary for the task vs does something else on the side). Since we analyzed groups of nursing home residents and were not interested in individual differences, we assessed the engagement of the group as a whole. To do so, we created a 5-point rating scale reflecting the degree of engagement in the group. For example, in the original systematic behavior observation instrument [39], sequences were rated according to whether a child looked toward the teaching center. We modified the formulation so that a score of 1 means that “none” of the participants look to the center of the robot-assisted activity and a score of 5 means that “all” participants do, yielding a 5-level rating scale for engagement from very low to very high (see Table 1). The number of participants who showed the behavior was thus distributed, depending on group size, across the 5-level scale.

Table 1. Description of the rating system for engagement at the group level.
Rating 1 (very low engagement)
  • Direction of gaze: None of the participants look to the center of the robot-assisted group activity (looking elsewhere).
  • Posture and body expression: None of the participants are turned toward the center of the robot-assisted group activity; they are turned away and flaccid.
  • Activity: None of the participants perform the activity necessary for the task, for example, performing movements, singing, or listening to the story told by the robot (doing something else on the side).

Rating 2 (low engagement)
  • Direction of gaze: Most participants do not look to the center of the robot-assisted group activity.
  • Posture and body expression: Most participants are not turned toward the center of the robot-assisted group activity; they are turned away and flaccid.
  • Activity: Most participants do not perform the activity necessary for the task, for example, performing movements, singing, or listening to the story told by the robot (doing something else on the side).

Rating 3 (medium engagement)
  • Direction of gaze: Some participants look to the center of the robot-assisted group activity.
  • Posture and body expression: Some participants are turned toward the center of the robot-assisted group activity, and their body expression is alert (vs turned away and flaccid).
  • Activity: Some participants perform the activity necessary for the task, for example, performing movements, singing, and listening to the story told by the robot (vs doing something else on the side).

Rating 4 (high engagement)
  • Direction of gaze: Most participants look to the center of the robot-assisted group activity.
  • Posture and body expression: Most participants are turned toward the center of the robot-assisted group activity, and their body expression is alert (vs turned away and flaccid).
  • Activity: Most participants perform the activity necessary for the task, for example, performing movements, singing, and listening to the story told by the robot (vs doing something else on the side).

Rating 5 (very high engagement)
  • Direction of gaze: All participants look to the center of the robot-assisted group activity.
  • Posture and body expression: All participants are turned toward the center of the robot-assisted group activity, and their body expression is alert (vs turned away and flaccid).
  • Activity: All participants perform the activity necessary for the task, for example, performing movements, singing, and listening to the story told by the robot (vs doing something else on the side).

Mood

To capture participants’ mood during the robot-assisted activities, we developed an observational rating scale based on the German version of the Positive and Negative Affect Schedule (PANAS) [40]. The PANAS is frequently used in studies in which human mood states are of interest. The questionnaire consists of 20 adjectives describing different feelings, with 10 adjectives capturing positive affect and the other 10 capturing negative affect. The items of the original PANAS are shown in Textbox 1 [35]. This survey instrument was chosen because it describes mood in terms of positive and negative affect using several adjectives that we assumed were observable by a rater. Based on the findings of Reisenzein and colleagues [41] that emotions can be detected by observers using a variety of cues (eg, facial, verbal, and physical expressions), we transformed the PANAS into an observational rating scale for mood at the group level.

Textbox 1. Adjectives of the Positive and Negative Affect Schedule.

Positive affect

  • Attentive
  • Active
  • Alert
  • Excited
  • Enthusiastic
  • Determined
  • Inspired
  • Proud
  • Interested
  • Strong (this mood could not be observed reliably in our study)

Negative affect

  • Hostile
  • Irritable
  • Ashamed
  • Guilty (this mood could not be observed reliably in our study)
  • Distressed
  • Upset
  • Scared
  • Afraid
  • Jittery
  • Nervous

The original 5-level response scale contains the gradations “very slightly or not at all,” “a little,” “moderately,” “quite a bit,” and “extremely.” Again, because we were interested in the mood at the group level, we adapted the rating scale to reflect observable indicators of mood in the group, and a 5-point rating scale from “very low” to “very high” was used. For example, “very low” signified that none of the participants were attentive in the robot-assisted group activity, and “very high” signified that all participants were attentive in the robot-assisted group activity (see Table 2). To make an objective assessment of group mood during the robot-assisted group activity, sequences from the observation were rated in relation to each adjective from the PANAS. The description of the 5-point rating scale of mood according to the PANAS is shown in Table 2. We considered observation at the group level particularly relevant for capturing mood in the group because it allows situational factors, an important component in the observation of mental states [41], to be included.

Table 2. Description of the 5-point rating scale of mood according to the Positive and Negative Affect Schedule [35] for a robot-assisted group activity.
Rating 1 (very low): None of the participants are __a in the robot-assisted group activity.
Rating 2 (low): Most participants at the robot-assisted group activity are not __.
Rating 3 (medium): Some participants are __ in the robot-assisted group activity.
Rating 4 (high): Most participants are __ in the robot-assisted group activity.
Rating 5 (very high): All participants at the robot-assisted group activity are __.

aThe rating system was used for every adjective of the Positive and Negative Affect Schedule.

Coding of Video Recordings

Video clips were rated independently by 2 trained observers (rater 1 and rater 2). Rater 1 was present during all robot-assisted group activities and rater 2 during 2 randomly selected sessions. Both raters were trained in the observation of nonverbal communication and body language for assessing the items for all 3 aspects of engagement (ie, direction of gaze, posture and body expression, and activity) and for mood (ie, positive and negative affect). Clearly visible signs of dementia and severe physical limitations of the residents had to be considered, and the rating of engagement had to be adjusted to the residents’ possibilities of participation (eg, physical limitations). However, no individual was excluded from the analysis, as all ratings were performed at the group level. Each observer rated the group as a whole in every video clip by assessing whether none of the participants, some of the participants, most of the participants, or all of the participants exhibited a particular behavior indicating engagement (eg, direction of gaze, posture and body expression, activity) or positive or negative affect (eg, attentive, scared). To provide specific and context-sensitive anchors for the ratings, we counted the number of participants and distributed them proportionally across the 5-point scale. Thus, what “most” and “some” participants meant depended on the actual group size. During the initial trial, it became apparent that all variables of engagement could easily be observed and rated by both raters. However, for rating the perceived mood in the group of study participants according to the PANAS, additional coding rules had to be defined. The items “strong” and “guilty” were difficult to observe and hard to differentiate from 2 other items (ie, proud and ashamed) and were thus not considered in the analysis. Each video clip was rated independently by the 2 raters to allow for reliability assessment.
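
To make this proportional mapping concrete, the following minimal Python sketch shows one way a count of participants showing a behavior could be converted into the 1-5 group-level rating. The anchor points “none” and “all” follow the coding scheme in Table 1; the intermediate cut points are an illustrative assumption, as the exact thresholds used by the raters are not reported here.

```python
def group_rating(n_showing: int, group_size: int) -> int:
    """Map the number of participants showing a behavior onto the 1-5
    group-level scale (1 = none, 3 = some, 5 = all).

    The proportional cut points for ratings 2-4 are an assumption made
    for illustration only.
    """
    if group_size <= 0:
        raise ValueError("group_size must be positive")
    if n_showing <= 0:
        return 1  # none of the participants show the behavior
    if n_showing >= group_size:
        return 5  # all participants show the behavior
    share = n_showing / group_size
    if share < 1 / 3:
        return 2  # most participants do not show the behavior
    if share < 2 / 3:
        return 3  # some participants show the behavior
    return 4      # most participants show the behavior


# Example: 12 of 16 residents look toward the robot -> rating 4 ("most")
print(group_rating(12, 16))
```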

Data Analysis

Interrater Agreement

To evaluate the agreement between 2 raters, we calculated the intraclass correlation coefficient (ICC) with the SPSS statistics software (version 26; IBM Corp). An ICC higher than 0.61 was considered substantial, and ICC higher than 0.81 was considered an almost perfect agreement [42].
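
As an illustration only, the same reliability statistic can be computed with open-source software; the following sketch uses the Python package pingouin on long-format data with hypothetical column names. The ICC model selected from the output would need to match the SPSS settings used in this study, which are not detailed here.

```python
import pandas as pd
import pingouin as pg

# Hypothetical long-format ratings: one row per video clip and rater
ratings = pd.DataFrame({
    "clip":       [1, 1, 2, 2, 3, 3, 4, 4],
    "rater":      ["r1", "r2"] * 4,
    "engagement": [4, 4, 5, 4, 4, 3, 5, 5],
})

# Intraclass correlation coefficients (all model variants) with 95% CIs
icc = pg.intraclass_corr(data=ratings, targets="clip",
                         raters="rater", ratings="engagement")
print(icc[["Type", "ICC", "CI95%"]])
```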

Video Analysis

The number and gender of participants who attended each robot-assisted group activity session were extracted from the videos and are presented descriptively. For the analysis of engagement, the mean values, standard deviations, and medians of the aspects direction of gaze, posture and body expression, and activity, as well as the overall mean value, standard deviation, and median of engagement, were calculated from the observers’ ratings. We also calculated the mean value, standard deviation, and median for each item and for the positive and negative affect dimensions of the PANAS for each robot-assisted group activity session.
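
As an illustration of this aggregation step, the following minimal pandas sketch assumes one row per rated video clip, with illustrative column names and stand-in values (not the study data). Dimension scores are computed as the mean of their constituent items, consistent with the exclusion of the items “strong” and “guilty.”

```python
import numpy as np
import pandas as pd

engagement_items = ["gaze", "posture", "activity"]
positive_items = ["attentive", "active", "alert", "excited", "enthusiastic",
                  "determined", "inspired", "proud", "interested"]
negative_items = ["hostile", "irritable", "ashamed", "distressed", "upset",
                  "scared", "afraid", "jittery", "nervous"]
all_items = engagement_items + positive_items + negative_items

# Stand-in for the 31 rated clips: random 1-5 ratings per item
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.integers(1, 6, size=(31, len(all_items))), columns=all_items)

# Dimension scores per clip: mean of the constituent items
df["engagement"] = df[engagement_items].mean(axis=1)
df["positive_affect"] = df[positive_items].mean(axis=1)
df["negative_affect"] = df[negative_items].mean(axis=1)

# Mean, SD, and median across clips for the variables of interest
variables = engagement_items + ["engagement", "positive_affect", "negative_affect"]
summary = df[variables].agg(["mean", "std", "median"]).T
print(summary.round(2))
```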


Videos and Study Participants

Of the 34 video clips recorded during 5 robot-assisted group activities, 3 videos had to be excluded because not all participants were visible or the video was too short to be rated. Thus, we finally included 31 video clips. In the 4 participating nursing homes, 85 older residents participated in 5 robot-assisted group activity sessions. Participant characteristics are provided in Table 3.

Table 3. Characteristics of the nursing homes and attendance in the activity sessions.a
Nursing home 1: 160 residential spaces; 15 residents attended the group activity
Nursing home 2: 140 residential spaces; 20 residents attended the group activity
Nursing home 2: 140 residential spaces; 16 residents attended the group activity
Nursing home 3: 82 residential spaces; 18 residents attended the group activity
Nursing home 4: 48 residential spaces; 16 residents attended the group activity
Total residents attending the group activities: N=85

aIn nursing home 2, we conducted 2 independent robot-assisted group activity sessions.

Interrater Agreement

Agreement between the 2 raters was high for engagement and positive and negative affect. Engagement had an ICC score of 0.83 (95% CI 0.65-0.92). Negative affect reached an ICC of 0.84 (95% CI 0.67-0.93), and positive affect had an ICC of 0.90 (95% CI 0.79-0.96). Individual items, specifically adjectives that belonged to negative affect, received a rather weak ICC. These include the items “ashamed” (ICC 0.37, 95% CI –0.32 to 0.70) and “afraid” (ICC 0.39, 95% CI –1.07 to 0.52).

Engagement

As Table 4 demonstrates, the engagement of the participants in the robot-assisted group activity was high (mean 4.19, SD 0.47; median 4.0). The direction of gaze was rated as almost very high (mean 4.65, SD 0.49; median 5.0); posture and body expression (mean 4.03, SD 0.71; median 4.0) and activity (mean 3.90, SD 0.65; median 4.0) were also rated as high.

Mood

Overall, almost no negative affect could be observed (mean 1.13, SD 0.20; median 1.0). The mean value of positive affect was 3.22 (SD 0.55; median 3.2), which indicates that the observers perceived a good mood during the sessions. Adjectives of positive affect such as interested (mean 4.13, SD 0.56; median 4.0), alert (mean 4.39, SD 0.67; median 4.0), inspired (mean 3.87, SD 0.96; median 4.0), attentive (mean 4.19, SD 1.05; median 4.0), and active (mean 4.16, SD 0.64; median 4.0) received high ratings around the value 4, while enthusiastic (mean 2.42, SD 1.03; median 2.0), proud (mean 1.23, SD 0.43; median 1.0), and determined (mean 1.94, SD 0.77; median 2.0) were observed to be very low or low within the group of participants. Table 4 shows the detailed results.

Table 4. Interrater agreement as well as the mean (SD) and median values for the study variables engagement and mood (ie, positive and negative affect) observed during robot-assisted group activities.a

Engagement: ICC 0.831; mean 4.19 (SD 0.47); median 4.0
  Direction of gaze: ICC 0.661; mean 4.65 (SD 0.49); median 5.0
  Posture and body expression: ICC 0.883; mean 4.03 (SD 0.71); median 4.0
  Activity: ICC 0.811; mean 3.90 (SD 0.65); median 4.0
Positive affect: ICC 0.902; mean 3.22 (SD 0.55); median 3.2
  Interested: ICC 0.842; mean 4.13 (SD 0.56); median 4.0
  Excited: ICC 0.825; mean 2.61 (SD 0.92); median 3.0
  Enthusiastic: ICC 0.840; mean 2.42 (SD 1.03); median 2.0
  Proud: ICC 0.680; mean 1.23 (SD 0.43); median 1.0
  Alert: ICC 0.750; mean 4.39 (SD 0.67); median 4.0
  Inspired: ICC 0.884; mean 3.87 (SD 0.96); median 4.0
  Attentive: ICC 0.901; mean 4.19 (SD 1.05); median 4.0
  Determined: ICC 0.766; mean 1.94 (SD 0.77); median 2.0
  Active: ICC 0.722; mean 4.16 (SD 0.64); median 4.0
Negative affect: ICC 0.840; mean 1.13 (SD 0.20); median 1.1
  Distressed: ICC 0.842; mean 1.29 (SD 0.59); median 1.0
  Upset: ICC 0.768; mean 1.16 (SD 0.52); median 1.0
  Scared: ICC 0.659; mean 1.06 (SD 0.43); median 1.0
  Hostile: ICC 0.491; mean 1.10 (SD 0.48); median 1.0
  Irritable: ICC 0.649; mean 1.06 (SD 0.25); median 1.0
  Ashamed: ICC 0.365; mean 1.16 (SD 0.37); median 1.0
  Nervous: ICC 0.804; mean 1.29 (SD 0.53); median 1.0
  Jittery: ICC 0.665; mean 1.23 (SD 0.43); median 1.0
  Afraid: ICC 0.390; mean 1.13 (SD 0.51); median 1.0

aThe items “strong” and “guilty” were not analyzed.

Additional Observations

Although we did not collect this information systematically, we observed that more residents participated in the robot-assisted activity sessions than expected by the nursing home staff and the research team. The different types of robot-assisted exercises (ie, singing, storytelling, gymnastics) promoted a variety of cognitive and physical stimulation, as a human instructor would. Further, when watching the video recordings, we noted that the nursing home staff took time to assist and support participants individually during the robot-assisted activity session. Conversations took place between the residents and the nursing staff, and it seemed that having the robot deliver all the instructions allowed more time for personal care.


In robot-assisted group activity sessions for older adults in nursing homes, engagement and mood (ie, positive affect) can be regarded as preconditions for achieving the intended positive effects of physical and cognitive stimulation. Our observational pilot study in 4 nursing homes shows that residents actively engage in leisure activities demonstrated and guided by a humanlike social robot. Overall, the engagement of the older adults in gymnastics exercises, singing with the robot, or listening to the robot telling stories was high. Engagement in the group activity was measured using 3 variables: direction of gaze, posture and body expression, and performing the activity that the robot demonstrated. Almost all participants in the robot-assisted activity sessions kept their gaze directed toward the robot, and most had an alert posture and actively imitated the movements demonstrated by the robot. We observed a positive mood in the groups during the robot-assisted activity sessions. Overall, the items measuring positive affect received high ratings, and the mood in the groups was mainly interested, alert, inspired, and attentive.

The results of our study extend and complement existing laboratory studies as well as studies applied in areas other than the nursing home [13] by systematically using observational data to gain a better understanding of the ways in which residents engage in and experience robot-assisted group activities. From a methodological point of view, the participatory observation with video recording provided new insights. The systematic coding of video clips using structured observation systems for both study variables allowed us to reliably show whether participants engaged and did so with a positive mood. Further, the observation system developed for this study complements existing instruments for measuring engagement and positive and negative affect by focusing on group-level measures and behavior toward a humanlike social robot in a group activity.

In contrast, other instruments [2,27,34] focus on engagement-related behavior in settings where older adults interacted with the robot one-on-one rather than within a group activity. During a group activity, it is common for older adults, especially those with physical limitations and early signs of dementia, to express behavior less consistently and clearly. By observing at the group level, it was possible to assess the engagement and mood of individuals in the situation and context of the group, and the sometimes subtle cues to emotion could be reliably detected by the trained raters. This is important, as technical recognition systems still need to be greatly improved [36]. The added value of the instrument is that it allows for monitoring engagement and mood in a group of participants with limited self-report capabilities and thus broadens the insights gained with existing instruments. In combination with the initial findings from other field studies [13,21,23,25,43] specifically studying the fostering of well-being in a nursing home setting, our results show the potential of such activity sessions to make a valuable contribution. For example, a memory training program with a humanlike social robot for the cognitive training of older adults in a nursing home showed positive trends [44] and could be adapted for fun group activities.

As limitations, the following aspects should be mentioned. First, participants attended the robot-assisted group activity sessions voluntarily and were generally informed about the content beforehand; thus, when they joined the session, they may already have had a positive attitude toward robots, which could have introduced a bias toward a more positive affect. Moreover, although the reliability of the observation system could be shown with high ICC values for most items measuring affect, some mood items had to be excluded because they were difficult to distinguish through observation during the short interactions in pretests, and some items still have rather low ICC values (eg, ashamed, afraid). This indicates that negative affect was more difficult to assess, which needs to be reflected critically when interpreting our findings. This result matches the findings of a study that measured the emotions of individuals with severe intellectual disabilities, where positive emotions were also found to be more observable than negative ones [45]. Thus, the investigation of negative affect while participating in robot-assisted activities might be an interesting focus of future studies.

Second, we did not collect data on whether the participants had mild or severe dementia. Although the analysis at the group level allowed consideration of situational factors, and the constraints of individuals were taken into account by the raters, there may be differences in the expression of engagement and mood depending on the level of dementia, as previous research shows [46].

Third, each robot-assisted activity was performed only once per group. Thus, we were unable to assess sequence effects or analyze which activities are the most engaging or which activities might tire participants more quickly. Moreover, in this study, we did not have the opportunity to study engagement and mood over a long period of time; future research is needed for this [47]. Thus, novelty effects cannot be excluded. Interestingly, some studies [23] over longer periods of time did not report a loss of the robot’s attractiveness, but they did mention a loss of interest due to usability problems with the robots. The so-called novelty effect [48] theoretically predicts “a decrease in the engagement with a stimulus after its initial novelty has worn off.” Usually, it is seen as a bias that has to be overcome (eg, by repeated interaction with the robot). An experiment with well-controlled repeated interactions showed that participants’ perceptions were positively influenced by interacting with the robot [48]: a consistently positive impression was already established in the first 2 minutes of the conversation with the robot and remained stable over the subsequent sessions. In contrast, perceived threat and discomfort were the dimensions that changed the most during the interactions and decreased until the last session of the experiment [48]. With this in mind, we assume that the engagement and positive mood observed in an initial interaction, as in our study, are likely to be maintained in a relatively stable manner.

Fourth, our pilot study investigated engagement and mood across different exercises within a robot-assisted activity session. In terms of effects on health, a larger study should assess which type of exercise elicits the greatest engagement and positive affect, and it would be of interest to record the exercises continuously. This would allow for a more precise analysis of behavior during the exercises, such as fatigue, and facilitate better comparisons of participation across exercises. This knowledge would inform the future software development and implementation of social robots.

Finally, we did not investigate the practicability of the NAO robot for nursing staff or how a robot-assisted group activity can be successfully implemented in a nursing home. Research following a human-centered design approach [49] and an improved understanding of the sustainable integration of social robots into the leisure activities of older adults during their care are crucial.

Based on the positive findings of our study, questions arise about other areas of application for robot-assisted group activities. Group sessions with social robots generate a form of enthusiasm, which is why they may be particularly suitable for group activities with vulnerable groups such as children, older adults, or people with disabilities. For all these potential target groups, interactions need to be designed in a way that results in maximum benefit and does no harm. In the context of the shortage of skilled nursing staff, social robots have the potential to conduct a leisure group activity for which caregivers do not need to be continuously present, thereby enabling older adults to be physically and cognitively engaged, and to have fun, with less caregiving effort. Moreover, if the social robot demonstrates the exercises, nursing staff have more time for individual care as well as for personal conversations with the older adults. The literature shows that engagement and mood are prerequisites for health effects to be achieved [26]. Although the generalizability of our results must be established by future research, we found that older adults engage in robot-assisted group activities and that most of them were in a good mood during the session: interested, alert, inspired, and attentive. Therefore, the positive results on engagement and mood provide clear indications that humanlike social robots can improve the cognitive and physical abilities of older adults. Compared with other technologies, robots, with their ability to communicate in a humanlike manner, have the special property of supporting individuals both physically and psychologically. Further development of this new technology of social robots is thus worthwhile in terms of promoting the quality of life of older adults in nursing homes.

Acknowledgments

We would like to thank all the participants in this study and care professionals working in the nursing home facilities enrolled in our study. We would like to specially thank Gabriela Bohler from Smart Companion who provided the software for this study. This study was part of the strategic initiative “Robo-Lab FHNW” funded by the University of Applied Sciences and Arts Northwestern Switzerland (FHNW) from 2018 to 2020. The Swiss Innovation Agency Innosuisse provided additional funding for this study (Innovationsscheck 33808.1).

Authors' Contributions

All authors contributed to the study conception and design. AT and AU prepared the material, collected the data, and performed the analysis. AT, HS, and TM prepared the first draft of the manuscript. All authors read and approved the final paper.

Conflicts of Interest

None declared.

  1. Gheysen F, Poppe L, DeSmet A, Swinnen S, Cardon G, De Bourdeaudhuij I, et al. Physical activity to improve cognition in older adults: can physical activity programs enriched with cognitive challenges enhance the effects? A systematic review and meta-analysis. Int J Behav Nutr Phys Act. Jul 04, 2018;15(1):63. [FREE Full text] [CrossRef] [Medline]
  2. Anzalone SM, Boucenna S, Ivaldi S, Chetouani M. Evaluating the engagement with social robots. Int J of Soc Robotics. Apr 17, 2015;7(4):465. [CrossRef]
  3. Broadbent E, Stafford R, MacDonald B. Acceptance of healthcare robots for the older population: review and future directions. Int J of Soc Robotics. Oct 3, 2009;1(4):319-330. [CrossRef]
  4. Broekens J, Heerink M, Rosendal H. Assistive social robots in elderly care: A review. Gerontechnology. 2009;8:A. [CrossRef]
  5. Abdi J, Al-Hindawi A, Ng T, Vizcaychipi MP. Scoping review on the use of socially assistive robot technology in elderly care. BMJ Open. Feb 12, 2018;8(2):e018815. [FREE Full text] [CrossRef] [Medline]
  6. Vandemeulebroucke T, de Casterlé BD, Gastmans C. How do older adults experience and perceive socially assistive robots in aged care: a systematic review of qualitative evidence. Aging Ment Health. Feb 2018;22(2):149-167. [CrossRef] [Medline]
  7. Pu L, Moyle W, Jones C, Todorovic M. The effectiveness of social robots for older adults: a systematic review and meta-analysis of randomized controlled studies. Gerontologist. Jan 09, 2019;59(1):e37-e51. [CrossRef] [Medline]
  8. Scoglio AA, Reilly ED, Gorman JA, Drebing CE. Use of social robots in mental health and well-being research: systematic review. J Med Internet Res. Jul 24, 2019;21(7):e13322. [FREE Full text] [CrossRef] [Medline]
  9. PARO Therapeutic Robot. URL: http://www.parorobots.com/ [accessed 2021-11-29]
  10. Henschel A, Laban G, Cross ES. What makes a robot social? a review of social robots from science fiction to a home or hospital near you. Curr Robot Rep. 2021;2(1):9-19. [FREE Full text] [CrossRef] [Medline]
  11. Lorenz T, Weiss A, Hirche S. Synchrony and reciprocity: key mechanisms for social companion robots in therapy and care. Int J of Soc Robotics. Nov 2, 2015;8(1):125-143. [CrossRef]
  12. Valentí Soler M, Agüera-Ortiz L, Olazarán Rodríguez J, Mendoza Rebolledo C, Pérez Muñoz A, Rodríguez Pérez I, et al. Social robots in advanced dementia. Front Aging Neurosci. 2015;7:133. [FREE Full text] [CrossRef] [Medline]
  13. Chu M, Khosla R, Khaksar SMS, Nguyen K. Service innovation through social robot engagement to improve dementia care quality. Assist Technol. 2017;29(1):8-18. [CrossRef] [Medline]
  14. Pandey AK, Gelin R. A mass-produced sociable humanoid robot: Pepper-the first machine of its kind. IEEE Robot. Automat. Mag. Sep 2018;25(3):40-48. [CrossRef]
  15. Costa A, Steffgen G, Rodríguez LF, Nazarikhorram A, Ziafati P. Socially assistive robots for teaching emotional abilities to children with autism spectrum disorder. Presented at: Conference on Human-Robot Interaction (HRI2017); March 6-9, 2017; Vienna.
  16. Dinesen B, Hansen HK, Grønborg GB, Dyrvig A, Leisted SD, Stenstrup H, et al. Use of a social robot (LOVOT) for persons with dementia: exploratory study. JMIR Rehabil Assist Technol. Aug 01, 2022;9(3):e36505. [FREE Full text] [CrossRef] [Medline]
  17. van Dam K, Gielissen M, Reijnders R, van der Poel A, Boon B. Experiences of persons with executive dysfunction in disability care using a social robot to execute daily tasks and increase the feeling of independence: multiple-case study. JMIR Rehabil Assist Technol. Nov 03, 2022;9(4):e41313. [FREE Full text] [CrossRef] [Medline]
  18. Randall N, Kamino W, Joshi S, Chen W, Hsu L, Tsui KM, et al. Understanding the connection among Ikigai, well-being, and home robot acceptance in Japanese older adults: mixed methods study. JMIR Aging. Oct 04, 2023;6:e45442. [FREE Full text] [CrossRef] [Medline]
  19. Crowell CR, Deska JC, Villano M, Zenk J, Roddy JT. Anthropomorphism of robots: study of appearance and agency. JMIR Hum Factors. May 10, 2019;6(2):e12629. [FREE Full text] [CrossRef] [Medline]
  20. Jones C, Sung B, Moyle W. Assessing engagement in people with dementia: a new approach to assessment using video analysis. Arch Psychiatr Nurs. Dec 2015;29(6):377-382. [CrossRef] [Medline]
  21. Bäck I, Mäkelä K, Kallio J. Robot-guided exercise program for the rehabilitation of older nursing home residents. Annals of Long-Term Care. URL: https://researchportal.tuni.fi/en/publications/robot-guided-exercise-program-for-the-rehabilitation-of-older-nur [accessed 2021-08-18]
  22. Cruz-Sandoval D, Morales-Tellez A, Sandoval E, Favela J. A social robot as therapy facilitator in interventions to deal with dementia-related behavioral symptoms. Presented at: ACM/IEEE International Conference on Human-Robot Interaction (HRI2020); March 23-26, 2020; Cambridge. [CrossRef]
  23. Huisman C, Kort H. Two-year use of care robot Zora in Dutch nursing homes: an evaluation study. Healthcare (Basel). Feb 19, 2019;7(1):31. [FREE Full text] [CrossRef] [Medline]
  24. Nomura T, Kanda T, Yamada S, Suzuki T. The effects of assistive walking robots for health care support on older persons: a preliminary field experiment in an elder care facility. Intel Serv Robotics. Jan 10, 2021;14(1):25-32. [CrossRef]
  25. Martín F, Agüero CE, Cañas JM, Valenti M, Martínez-Martín P. Robotherapy with dementia patients. International Journal of Advanced Robotic Systems. Jan 01, 2013;10(1):10. [CrossRef]
  26. Kolanowski A, Buettner L, Litaker M, Yu F. Factors that relate to activity engagement in nursing home residents. Am J Alzheimers Dis Other Demen. 2006;21(1):15-22. [FREE Full text] [CrossRef] [Medline]
  27. Perugia G, van Berkel R, Díaz-Boladeras M, Català-Mallofré A, Rauterberg M, Barakova E. Understanding engagement in dementia through behavior: the Ethographic and Laban-inspired coding system of engagement (ELICSE) and the evidence-based model of engagement-related behavior (EMODEB). Front Psychol. 2018;9:2. [FREE Full text] [CrossRef] [Medline]
  28. Belpaeme T, Kennedy J, Ramachandran A, Scassellati B, Tanaka F. Social robots for education: A review. Sci Robot. Aug 15, 2018;3(21):eaat5954. [CrossRef] [Medline]
  29. Cohen-Mansfield J, Marx MS, Freedman LS, Murad H, Regier NG, Thein K, et al. The comprehensive process model of engagement. The American Journal of Geriatric Psychiatry. Oct 2011;19(10):859-870. [CrossRef]
  30. Serholt S, Barendregt W. Robots tutoring children: longitudinal evaluation of social engagement in child-robot interaction. Presented at: 9th Nordic Conference on Human-Computer Interaction; October 23-27, 2016; Gothenburg. [CrossRef]
  31. Ahmad MI, Mubin O, Orlando J. Adaptive social robot for sustaining social engagement during long-term children–robot interaction. International Journal of Human–Computer Interaction. Mar 03, 2017;33(12):943-962. [CrossRef]
  32. Simut RE, Vanderfaeillie J, Peca A, Van de Perre G, Vanderborght B. Children with autism spectrum disorders make a fruit salad with PROBO, the social robot: an interaction study. J Autism Dev Disord. Jan 2016;46(1):113-126. [CrossRef] [Medline]
  33. Shahid S, Krahmer E, Swerts M. Child–robot interaction across cultures: How does playing a game with a social robot compare to playing a game alone or with a friend? Computers in Human Behavior. Nov 2014;40:86-100. [CrossRef]
  34. Cohen-Mansfield J, Dakheel-Ali M, Marx MS. Engagement in persons with dementia: the concept and its measurement. The American Journal of Geriatric Psychiatry. Apr 2009;17(4):299-307. [CrossRef]
  35. Watson D, Clark LA, Tellegen A. Development and validation of brief measures of positive and negative affect: The PANAS scales. Journal of Personality and Social Psychology. 1988;54(6):1063-1070. [CrossRef]
  36. YEOSUITE software. Avatarion Technology AG. URL: https://www.avatarion.ch/de/yeosuite/ [accessed 2021-09-20]
  37. Social robots 4 you. Smart Companion. URL: https://www.smart-companion.ch/ [accessed 2021-08-21]
  38. Gillespie R. Manufacturing Knowledge: A History of the Hawthorne Experiments (Studies in Economic History and Policy: USA in the Twentieth Century). Cambridge, UK. Cambridge University Press; 1993.
  39. Ehrhardt KJ, Findeisen P, Marinello G, Reinartz-Wenzel H. Systematische verhaltensbeobachtung von aufmerksamkeit im unterricht: zur prüfung von objektivität und zuverlässigkeit (Systematic behavior observation of attention in class: Proof of objectivity and reliability). Fachportal-Paedagogik. 1981. URL: https://www.fachportal-paedagogik.de/literatur/vollanzeige.html?FId=143768 [accessed 2023-11-29]
  40. Breyer B, Bluemke M. Deutsche version der positive and negative affect schedule PANAS (GESIS panel). Zusammenstellung sozialwissenschaftlicher Items und Skalen (ZIS). 2014:1-23. [FREE Full text] [CrossRef]
  41. Reisenzein R, Junge M, Studtmann M, Huber O. Observational approaches to the measurement of emotions. In: International Handbook of Emotions in Education. New York. Routledge Chapman & Hall; 2013;580-606.
  42. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. Mar 1977;33(1):159. [CrossRef]
  43. Cruz-Sandoval D, Penaloza C, Favela J, Castro-Coronel A. Towards social robots that support exercise therapies for persons with dementia. Presented at: Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers; October 8-12, 2018; Singapore. [CrossRef]
  44. Pino O, Palestra G, Trevino R, De Carolis B. The humanoid robot NAO as trainer in a memory program for elderly people with mild cognitive impairment. Int J Soc Robotics. Feb 27, 2019;12(1):21-33. [CrossRef]
  45. Vos P, De Cock P, Petry K, Van Den Noortgate W, Maes B. Investigating the relationship between observed mood and emotions in people with severe and profound intellectual disabilities. J Intellect Disabil Res. May 2013;57(5):440-451. [CrossRef] [Medline]
  46. Obayashi K, Kodate N, Masuyama S. Measuring the impact of age, gender and dementia on communication-robot interventions in residential care homes. Geriatr Gerontol Int. Apr 2020;20(4):373-378. [CrossRef] [Medline]
  47. Mahmoudi Asl A, Molinari Ulate M, Franco Martin M, van der Roest H. Methodologies used to study the feasibility, usability, efficacy, and effectiveness of social robots for elderly adults: scoping review. J Med Internet Res. Aug 01, 2022;24(8):e37434. [FREE Full text] [CrossRef] [Medline]
  48. Paetzel M, Perugia G, Castellano G. The persistence of first impressions: the effect of repeated interactions on the perception of a social robot. Presented at: 15th ACM/IEEE International Conference on Human-Robot Interaction (HRI); March 23-26, 2020; Cambridge, United Kingdom. [CrossRef]
  49. Ergonomics of human-system interaction. Part 210: Human-centred design for interactive systems. ISO. URL: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/07/75/77520.html [accessed 2021-09-21]


ICC: intraclass correlation coefficient
PANAS: Positive and Negative Affect Schedule


Edited by M Mulvenna; submitted 01.05.23; peer-reviewed by L Campbell, NT Chang; comments to author 30.08.23; revised version received 23.10.23; accepted 15.11.23; published 25.12.23.

Copyright

©Alexandra Tanner, Andreas Urech, Hartmut Schulze, Tanja Manser. Originally published in JMIR Rehabilitation and Assistive Technology (https://rehab.jmir.org), 25.12.2023.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Rehabilitation and Assistive Technology, is properly cited. The complete bibliographic information, a link to the original publication on https://rehab.jmir.org/, as well as this copyright and license information must be included.