Children with autism spectrum disorder (ASD) have been observed to respond well to the use of animals or robots in therapy. They feel comfortable with robots, whose behaviour and social signals tend to be relatively simple and predictable. For this reason, various movable puppets and robots have been used in therapies for ASD. In typical robot-assisted therapy (RAT), an additional therapist, usually hidden from the child, remotely controls the robot based on their observation of the therapeutic intervention. Consequently, such interventions require significant human resources over extended periods of time.

DREAM is an EC-funded integrated project that aims at developing novel approaches to robots that can operate in a partially autonomous manner while remaining under the supervision of therapists. We refer to this next generation of RAT as robot-enhanced therapy (RET): the robot will be able to infer the children’s psychological disposition and assess their behaviour in order to select its own actions, always supervised by a therapist who will be able to interrupt and modify the robot’s behaviour when needed.

The use of partially autonomous robots in therapeutic contexts raises several ethical issues, starting with the degree of autonomy to be afforded to the robot. The more autonomous the robot, the less control therapists have over the robot-child interaction, raising the issue of where the responsibility for the robot’s actions lies. Autonomy also raises the problem of trust: are parents happy to have their child interact with a robot? Will the child trust the robot? These and other questions will be addressed in the DREAM project.

DREAM involves psychologists, cognitive scientists, roboticists, computer vision researchers, and ethicists. The project is coordinated by the University of Skövde (SE, Tom Ziemke), and the consortium further includes experts from Babeş-Bolyai University (RO), Vrije Universiteit Brussel (BE), Plymouth University (UK), Portsmouth University (UK), De Montfort University (UK), and Aldebaran Robotics (FR), creator of the humanoid robot NAO.

The main DREAM experimental platform is the 58-cm tall, 5-kg humanoid robot NAO developed by Aldebaran Robotics. NAO can detect and recognize pre-learned objects and faces, recognize words and sentences, and localise sounds in space. It has various communication devices, including LED lights, two high-fidelity speakers, and a voice synthesizer with language-specific intonation and pronunciation.

The humanoid robot NAO, developed by Aldebaran Robotics, is used to support teachers with in-class tasks and help children with autism spectrum disorder (Image credit: University of Skövde)

The second DREAM experimental platform is the social robot Probo, developed at Vrije Universiteit Brussel. This robot was designed with verbal and non-verbal communication in mind, using human-like social cues and communication modalities. It is well suited for this role since it has a fully expressive, anthropomorphic head: with 20 motors in the head, the robot is able to express attention and emotions via its gaze and facial expressions.

Probo, developed at the Vrije Universiteit Brussel, is used as a social storytelling agent for improving children’s social skills and supporting children in learning to recognise basic emotions (Image credit: Vrije Universiteit Brussel).

Both robots have already been used successfully in experiments involving children with autism. Because of its size and appearance, NAO is particularly well received by young children. They anthropomorphize NAO and readily engage in affective social interactions. The ASK NAO (Autism Solution for Kids) initiative is a program created by Aldebaran Robotics to customize NAO to support teachers with in-class tasks and help children with ASD.

Probo, meanwhile, has a layered structure of foam and fabric, which contributes to a safe, soft and “huggable” interaction. During such interactions, children are very willing to touch the robot physically. Probo has previously been used in experiments with children with ASD in a collaboration between Babeş-Bolyai University and Vrije Universiteit Brussel. These include testing whether typically developing children are able to recognize the emotions expressed by the robot, using Probo as a social storytelling agent for improving children’s social skills, and supporting children in learning to recognize basic emotions.

Probo is able to express attention and emotions via its gaze and facial expressions (Image credit: Vrije Universiteit Brussel)

Video: “The Huggable Social Robot Probo” by Vrije Universiteit Brussel on YouTube

Further information is available at:

Acknowledgements and Contact:
Prof. Tom Ziemke
University of Skövde
School of Informatics
Interaction Lab
PO Box 408
54128 Skövde