Children feel more comfortable telling robots about their mental health issues, study suggests

A new study found that children were more at ease responding to a mental health assessment posed by a childlike, question-asking robot than completing the same assessment with their parents, and in some cases they revealed information they had not shared before. Are robots the future of paediatric psychiatry? Children tend to respond very positively to interactive technologies, and while robots are unlikely to replace humans, they may be very useful for breaking the ice.

The University of Cambridge team says the findings suggest robots could play a broader role in assessing children's mental health. However, they stressed that robots are not intended to replace professional mental health support.

“Traditional methods sometimes fail to detect mental wellbeing problems in children because the changes can be incredibly subtle,” said Nida Itrat Abbasi, the study’s first author. “We wanted to see whether robots could help with this process.”

In the study, 28 children between the ages of 8 and 13 each took part in a 45-minute one-on-one session with a 60 cm humanoid robot called Nao. The robot, which speaks with a child’s voice, began with an icebreaker conversation and fist bumps to create a relaxed atmosphere. It then asked the children about happy and sad memories from the past week, administered a questionnaire on mood and feelings, and a questionnaire used to screen for anxiety, panic disorder, and depression.

Children whose answers to traditional questionnaires suggested they might have mental health problems gave more negative responses when they answered the same questions with the robot, and shared information they had not revealed when completing the questionnaires in person or online. According to the researchers, the striking finding is that these children’s responses became more negative when interacting with the robot.

The scientists suggested that children may see the robot as a confidant, allowing them to reveal their true feelings and experiences. One parent, who watched the session through a mirrored window, told the researchers that he had not realised his son was struggling until he heard him answer the robot’s questions. Other research has found that children are more likely to share private information, such as experiences of bullying, with a robot than with an adult.

“We think it’s easier to engage with the robot as a companion if the robot is child-friendly,” said Professor Hatice Gunes, who leads the Affective Intelligence and Robotics Laboratory at the University of Cambridge. With parents or psychologists, she says, children may instead respond with “what they think is expected of them, rather than what they think is true.”

Gunes suggested that in the future, robots could be used in schools to screen children for mental health problems, allowing children to receive support earlier.

Paper: https://www.repository.cam.ac.uk/handle/1810/338405