Designing robots that care

by Nicola Plant, Queen Mary University of London

(See end for links to related careers)

Think of the perfect robot companion: a robot you can hang out with, chat to, and who understands how you feel. Robots can already understand some of what we say and talk back. They can even respond to the emotions we express in the tone of our voice. But what about body language? We also show how we feel by the way we stand, we describe things with our hands and we communicate with the expressions on our faces. Could a robot use body language to show that it understands how we feel? Could a robot show empathy?

If a robot companion did show this kind of empathetic body language, we would likely feel that it understood us and shared our feelings and experiences. For robots to behave like this, though, we first need to understand more about how humans use movement to show empathy with one another.

Think about how you react when a friend talks about their headache. You wouldn’t stay perfectly still. But what would you do? We’ve used motion capture to track people’s movements as they talk to each other. Motion capture is the technology used in films to make computer-animated creatures like Gollum in The Lord of the Rings or the apes in Planet of the Apes. Lots of cameras are used together to create a very precise computer model of the movements being recorded. Using motion capture, we’ve been able to see what people actually do when chatting about their experiences.
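To give a flavour of the kind of data motion capture produces, here is a minimal Python sketch – not the researchers’ actual pipeline, and the marker positions and threshold are made up – that works out the speed of a hand marker to spot when someone starts to gesture.

```python
import math

# Hypothetical recording: (x, y, z) positions of one hand marker in metres,
# sampled at 120 frames per second (a typical motion-capture rate).
FRAME_RATE = 120
hand_positions = [
    (0.300, 1.100, 0.500),
    (0.301, 1.101, 0.500),  # barely moving: just sitting still
    (0.304, 1.110, 0.502),  # the hand starts to move...
    (0.310, 1.125, 0.505),
    (0.318, 1.142, 0.509),
]

def speed(p1, p2, frame_rate=FRAME_RATE):
    """Distance moved between two consecutive frames, in metres per second."""
    return math.dist(p1, p2) * frame_rate

for frame, (p1, p2) in enumerate(zip(hand_positions, hand_positions[1:])):
    v = speed(p1, p2)
    gesturing = v > 0.5  # made-up threshold: faster than 0.5 m/s counts as a gesture
    print(f"frames {frame}->{frame + 1}: {v:.2f} m/s, gesturing={gesturing}")
```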

It turns out that we share our understanding of things like a headache by performing it together. We share the actions of the headache as if we have it ourselves. If I hit my head, wince and say ‘ouch’, you might wince and say ‘ouch’ too – you give a multimodal performance, with actions and words, to show me you understand how I feel.

So should we just program robots to copy us? It isn’t as simple as that. We don’t copy each other exactly, and a perfect copy wouldn’t show understanding of how we feel. A robot doing that would seem like a parrot, repeating things without any understanding. For the robot to show that it understands how you feel, it must perform a headache like it owns it – as though it were really its own! That means behaving in a similar way to you, but adapted to the unique type of headache it has.
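As a toy illustration of “similar, but not a copy”, here is a hedged Python sketch – entirely hypothetical, as real robot controllers are far more sophisticated – in which the robot blends the gesture it observed with its own movement style instead of replaying it exactly.

```python
def empathetic_response(observed_gesture, own_style, similarity=0.7):
    """Blend an observed gesture with the robot's own movement style.

    similarity=1.0 would be an exact copy (a parrot);
    similarity=0.0 would ignore the person entirely.
    """
    return [
        similarity * seen + (1 - similarity) * own
        for seen, own in zip(observed_gesture, own_style)
    ]

# Hypothetical gesture: joint angles in degrees for a wince-and-clutch-head
# move (shoulder, elbow, head tilt).
person_wince = [40.0, 85.0, 10.0]
robot_style = [30.0, 70.0, 20.0]   # the robot's own habitual posture

print(empathetic_response(person_wince, robot_style))
# -> roughly [37.0, 80.5, 13.0]: a wince that clearly echoes the person's,
#    but is performed in the robot's own way.
```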

Designing the way robots should behave in social situations isn’t easy. If we can work out exactly how humans interact with each other to share their experiences, though, we can use that understanding to program robot companions. Then one day your robot friend will be able to hang out with you, chat and show it understands how you feel. Just like a real friend.

multimodal = using two or more different ways of doing something. In communication, that might be spoken words, facial expressions and hand gestures.


This article was previously published on the original CS4FN website, and a copy appears on page 16 of issue 19 of the CS4FN magazine.




See also (previous post and related career options)


We have recently written about the AMPER project, which uses a tablet-based AI tool to support people with dementia and their carers. It prompts the person to discuss events from their younger life and adapts to their needs. We also linked this with information about the types of careers people working in this area might have – the examples given were from a project based in the Netherlands called ‘Dramaturgy for Devices’, which uses lessons learned from the study of theatre and theatrical performance to design social robots whose behaviour feels more natural and friendly to the humans who’ll be using them.


See our collection of posts about Career paths in Computing.



CS4FN Advent 2023 – Day 4: Ice skate: detecting neutrinos at the South Pole, figure-skating motion capture, Frozen and a puzzle

This post is part of the CS4FN Christmas Computing Advent Calendar, in which we are publishing a small post about computer science every day until Christmas Day. This is the fourth post, and the picture on today’s door was an ice skate, so today’s theme is Very Cold.

A bright red ice skate. Image drawn and digitised by Jo Brodie.

1. IceCube

The South Pole is home to the IceCube Neutrino Observatory. It’s made of thousands of light (optical) sensors, which stretch deep down into the ice to almost 3,000 metres (3 kilometres) below the surface – this shields the sensors from background radiation so that they can focus on detecting neutrinos, which are teeny tiny particles.

Building the IceCube Observatory – photo from Wikipedia: IceCube drilling setup at the drill camp, December 2009. This file is licensed under the Creative Commons Attribution-Share Alike 3.0 Unported license.

Neutrinos can be created by nuclear reactions (lots are produced by our Sun) and by radioactive decay. They can whizz through matter harmlessly and unnoticed (as the name suggests, they are pretty neutral), but if a neutrino happens to interact with a water molecule in the ice it can produce a charged particle, and that particle gives off enough light of its own for the sensors to pick up the signal. The IceCube observatory has even detected neutrinos that may have arrived from outside our solar system.
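To give a feel for the kind of processing involved, here is an illustrative Python sketch – not IceCube’s real trigger software, and the numbers are invented – that separates lone flashes (probably background noise) from candidate events by looking for several sensors firing close together in time.

```python
# Hypothetical hits: (sensor_id, time in microseconds) for flashes of light.
hits = [(12, 100.0), (913, 250.0), (14, 251.2), (15, 252.1), (16, 253.0), (7, 900.0)]

WINDOW = 5.0      # microseconds: hits this close together belong to one cluster
MIN_SENSORS = 3   # made-up rule: at least this many sensors for a real signal

def find_events(hits, window=WINDOW, min_sensors=MIN_SENSORS):
    """Group time-sorted hits into clusters; keep clusters with enough sensors."""
    hits = sorted(hits, key=lambda h: h[1])
    events, cluster = [], [hits[0]]
    for hit in hits[1:]:
        if hit[1] - cluster[-1][1] <= window:
            cluster.append(hit)        # close enough in time: same cluster
        else:
            if len(cluster) >= min_sensors:
                events.append(cluster)
            cluster = [hit]            # start a new cluster
    if len(cluster) >= min_sensors:
        events.append(cluster)
    return events

for event in find_events(hits):
    print("candidate event seen by sensors:", [sensor for sensor, _ in event])
# -> candidate event seen by sensors: [913, 14, 15, 16]
```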

These light signals are converted to digital form and the data are stored safely on computer hard drives, which are later collected by ship (!) and taken away for further analysis. (Although there is a satellite internet connection in Antarctica, the broadband speeds are about 20 times slower than we’d have in our own homes!)
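Why ship the data rather than upload it? Some rough back-of-envelope sums in Python – the figures below are illustrative guesses, not IceCube’s real numbers – show how lopsided the comparison is.

```python
# Suppose a season's worth of data and a slow satellite link (made-up values).
data_tb = 50            # terabytes of sensor data to move
satellite_mbps = 3      # satellite link speed in megabits per second
ship_days = 30          # a ship takes roughly a month

data_megabits = data_tb * 1_000_000 * 8                # TB -> MB -> megabits
upload_days = data_megabits / satellite_mbps / 86_400  # 86,400 seconds per day

print(f"Upload over satellite: about {upload_days:,.0f} days")
print(f"Ship the hard drives:  about {ship_days} days")
# -> about 1,543 days versus 30: never underestimate the bandwidth
#    of a ship full of hard drives.
```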

2. Computer science can help skaters leap to new heights

Researchers at the University of Delaware use motion capture to map a figure skater’s movements onto a virtual version in a computer (remember the digital twins mentioned on Day Two of the advent calendar?). When a skater is struggling with a particular jump, the scientists can use mathematical models to run that jump as a computer simulation and see how fast the skater should be spinning, or the best position for their arms. They can then share that information with the skater to help them make the leap successfully (and land safely again afterwards!).
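One piece of the underlying physics is simple enough to sketch. A spinning skater’s angular momentum (L = I × ω) stays the same mid-air, so pulling the arms in – which reduces the moment of inertia I – must speed up the spin rate ω. The Python below uses made-up numbers; real simulations are far more detailed.

```python
def new_spin_rate(spin_rate, inertia_before, inertia_after):
    """Conservation of angular momentum: I1 * w1 = I2 * w2, so w2 = w1 * I1 / I2."""
    return spin_rate * inertia_before / inertia_after

# Hypothetical values for a skater with arms out versus pulled in tight.
arms_out_inertia = 3.0   # moment of inertia in kg*m^2
arms_in_inertia = 1.2
takeoff_spin = 2.0       # revolutions per second at takeoff

faster = new_spin_rate(takeoff_spin, arms_out_inertia, arms_in_inertia)
print(f"{faster:.1f} rev/s with arms pulled in")
# -> 5.0 rev/s: pulling the arms in more than doubles the spin,
#    which is how skaters squeeze triple and quadruple turns into one jump.
```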

Video from the University of Delaware via their YouTube channel.

3. Frozen defrosted

by Peter McOwan, Queen Mary University of London

The hit musical movie Frozen is a mix of catchy show tunes, 3D graphics effects, a moral message and loads of topics from computer science. The lead character, Princess Elsa, creates artificial life in the form of Olaf, the comedy-sidekick snowman; uses nanotechnology-based dressmaking to conjure an ice dress; employs a kind of 3D printing to build an ice palace simply by stamping her foot and singing; and must be complimented for the outstanding mathematical feat of getting the word ‘fractal’ into a hit song. In the USA the success of the movie has been used to get girls interested in coding, by creating new ice-skating routines for the film’s princesses and devising their own frozen fractals… and let it go, let it go… you all know the rest.
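“Frozen fractals all around”: if you fancy devising your own, here is a short Python sketch that draws the classic Koch snowflake using the standard library’s turtle module. Each edge is replaced by four smaller, kinked edges, again and again – the same self-similar trick the song celebrates.

```python
import turtle

def koch_edge(t, length, depth):
    """Draw one edge; at depth > 0, replace it with four smaller kinked edges."""
    if depth == 0:
        t.forward(length)
        return
    for angle in (60, -120, 60, 0):   # the classic Koch turn sequence
        koch_edge(t, length / 3, depth - 1)
        t.left(angle)

t = turtle.Turtle()
t.speed(0)
t.penup(); t.goto(-150, 90); t.pendown()
for _ in range(3):                    # three Koch edges make a snowflake
    koch_edge(t, 300, 3)
    t.right(120)
turtle.done()
```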

4. Today’s puzzle

This is a kriss-kross puzzle: you solve it by fitting the words into the grid (answer tomorrow). Pay attention to word length, as that tells you which word can fit where. There is only one four-letter word, one six-letter word and one eight-letter word, so each of these can only fit where the grid has four, six or eight spaces – put them in first. There are two three-letter words and two three-letter spaces; either word could fit in either space, but only one arrangement is correct (the one where the letters of crossing words match up). Strategy! Logical thinking! (Also maths [counting] and English [spelling].)
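Here is a tiny Python sketch of that strategy, with a made-up word list since the real grid isn’t reproduced in this post: group the words by length, and any length that appears exactly once is a forced first move.

```python
from collections import defaultdict

# Hypothetical word list matching the counts described above:
# two 3-letter words, one each of 4, 6 and 8 letters.
words = ["ICE", "SKI", "SNOW", "FROZEN", "NEUTRINO"]

by_length = defaultdict(list)
for word in words:
    by_length[len(word)].append(word)

for length, group in sorted(by_length.items()):
    if len(group) == 1:
        print(f"{group[0]} is the only {length}-letter word: place it first")
    else:
        print(f"{length}-letter words {group}: check which crossing letters match")
```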


EPSRC supports this blog through research grant EP/W033615/1.