This page was printed from Halmstad University's website (www.hh.se). The text was last updated on 27 April 2017. Visit the website to make sure you are reading the latest version.
Can a robot sense your feelings? Right now 198 works of robot art are competing in a Robot Art Competition in the USA. The winners are selected partly by Facebook likes. A unique contribution to the competition comes from Halmstad University, where the robot Baxter has attempted to interpret human emotions through painting, coached by artists Peter Wahlbeck and Dan Koon.
Usually Baxter lives at the School of Information Technology at Halmstad University, where he is in training to understand and interpret people's needs and emotions in order to help them feel better. But one day a few weeks ago, Baxter took part in a public event well suited for a social robot. People had gathered on campus to see Baxter try to read, interpret and draw the feelings of well-known local artist Peter Wahlbeck, assisted by artist and author Dan Koon. Both have been part of Baxter's training, coaching the robot research team in an ongoing master's thesis project in which the robot is trained to pick up on feelings and express them through painting.
Baxter, who had taken the name "Rob Boss" for the art competition, met his audience blindfolded.
“We did it to show that this is not about the robot reading facial expressions or body language. A person can smile without being happy. So we want to take it one step further by getting the robot to sense emotions through brain waves, to understand how a person truly feels,” says Martin Cooney, researcher in social robotics at Halmstad University.
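The article does not describe how the brain-wave signal is turned into an emotion estimate. One common baseline in affective computing, used here purely as an illustrative assumption, is frontal alpha asymmetry: relatively less alpha power over the left frontal hemisphere tends to accompany positive valence. A minimal sketch of that idea, not the HEARTalion team's actual method:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean spectral power of `signal` within [low, high] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return psd[(freqs >= low) & (freqs <= high)].mean()

def valence_from_asymmetry(left_channel, right_channel, fs=256):
    """Classify valence from frontal alpha-band (8-13 Hz) asymmetry.

    A positive log-ratio (more alpha on the right, i.e. relatively
    more *activity* on the left) is conventionally read as positive
    valence. Illustrative baseline only.
    """
    alpha_left = band_power(left_channel, fs, 8, 13)
    alpha_right = band_power(right_channel, fs, 8, 13)
    score = np.log(alpha_right) - np.log(alpha_left)
    return "happy" if score > 0 else "miserable"

# Synthetic two-second recordings: a stronger 10 Hz alpha rhythm on
# the right channel stands in for a "happy" reading.
fs = 256
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
left = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.1, t.size)
right = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.1, t.size)
print(valence_from_asymmetry(left, right, fs))  # → happy
```

A real system would of course need artifact rejection and calibration per person; this only shows the shape of the signal-to-emotion step.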
With a sensor measuring brain waves attached to his head, Peter Wahlbeck served as the guinea pig for the event. The emotions he was trying to convey to the robot were decided by chance, and they turned out to be two extremes – happiness and misery. Peter Wahlbeck produced his happy thoughts by thinking of a hot, sunny summer day on the beach. Misery and irritation were evoked by remembering a recent event:
“I thought about something I had ordered online that turned out to not be what I expected, which was very irritating at the time!”
The robot, reading Peter Wahlbeck's emotions, painted its impressions in a bright, happy yellow and in dark blue and black. Even though the robot chose colours representing the "right" emotions, Dan Koon was disappointed after the public event, having seen the robot perform at a more advanced level before.
“Baxter had stage fright today, the robot has done better on other occasions. But we still think that the event has proved what a large potential this research has,” says Dan Koon.
“It is positive that the robot chose the colours matching the emotions, according to the colour scheme we have been using. It makes for a promising future,” says Martin Cooney.
The public event was not the first time Dan Koon and Peter Wahlbeck had painted with Baxter – and the images entered in the Robot Art Competition are from those other occasions.
Maria Luiza Recena Menezes is a postgraduate student in affective computing, a subject exploring the emotional dimension of the interaction between human and computer. She is the co-supervisor for the master's thesis – which she describes as unique.
“Robot systems usually copy. But here we want the robot to learn how to interpret signals into images,” she says, explaining the reason why the robot has been programmed only with colours and emotions, no images, “So the robot can’t copy something it has seen before.”
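As a toy illustration of mapping a signal into an image rather than copying one, consider turning a continuous valence score into paint parameters. The scheme below is entirely hypothetical; the article only tells us that happiness came out as bright yellow and misery as dark blue and black:

```python
import colorsys

def emotion_to_stroke(valence):
    """Map a valence score in [-1, 1] to hypothetical stroke parameters.

    Illustrative only: negative valence drifts toward dark blue,
    positive toward bright yellow, echoing the colours mentioned in
    the article, but this is not the team's actual colour scheme.
    """
    v = max(-1.0, min(1.0, valence))
    hue = 0.6 - (v + 1) / 2 * 0.45      # 0.6 (blue) -> 0.15 (yellow)
    value = 0.2 + (v + 1) / 2 * 0.8     # dark -> bright
    r, g, b = colorsys.hsv_to_rgb(hue, 0.9, value)
    jitter = (1 - v) / 2                # agitated strokes when miserable
    return {"rgb": (round(r * 255), round(g * 255), round(b * 255)),
            "jitter": round(jitter, 2)}

print(emotion_to_stroke(1.0))   # bright yellow, smooth strokes
print(emotion_to_stroke(-1.0))  # dark blue, jittery strokes
```

Because the output is generated from the score rather than retrieved from stored images, the robot has nothing to copy – which is the point Recena Menezes makes above.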
But the learning process must be given time.
“You have to remember that the robot is like a child learning. It must be given the time to pick up on and understand a person’s feelings. Every time we do these attempts the robot enhances its abilities.”
Helen Fuchs, lecturer in art science at the School of Education, Humanities and Social Sciences, was invited by the research team to give her views on the project. She finds the endeavour interesting, and believes that robots learning to sense and express human emotions could open a new way of communicating for people with various disabilities.
“The cooperation between different disciplines is interesting. The researchers take help from artists in exploring the robot’s possibilities,” says Helen Fuchs.
Peter Wahlbeck found the project, and the collaboration with the research team, very interesting.
“This is an elaborate idea, that a robot can sense an emotion and express it through art. In the future this research can be of value to the health care sector, helping people who for different reasons can’t express themselves.”
The robots of today can perform complicated tasks, from difficult calculations to precision work and heavy lifting in industry. But many tasks are still beyond them – for example, tasks requiring feelings and empathy. The research team around Baxter has, in several projects, worked on training the robot's ability to sense and respond to people's feelings. Apart from the art project, the robot is also involved in a cooking project where, among other things, it is trained to decide whether an ingredient is missing. Baxter also takes part in a health technology project aiming for the robot to be able to determine if a person needs help and, if so, how the robot should act.
The Robot Art Competition is being held for the second time, and this year 198 contributions are competing for the prize sum of 100,000 dollars, a large part of which will go to charity. The winners are selected by weighing together the votes on Facebook (40 percent) and the votes of a professional jury (60 percent).
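The stated 40/60 split amounts to a simple weighted sum. Assuming both components are normalised to the same 0-to-1 scale (the competition's actual normalisation is not described here), the combination looks like:

```python
def combined_score(facebook_share, jury_share,
                   fb_weight=0.4, jury_weight=0.6):
    """Weighted total under the stated 40 % Facebook / 60 % jury split.

    Both inputs are assumed to be normalised scores in [0, 1]; how the
    real competition normalises the votes is not stated in the article.
    """
    return fb_weight * facebook_share + jury_weight * jury_share

# An entry with 80 % of the Facebook support but a middling jury score
print(round(combined_score(0.8, 0.5), 2))  # → 0.62
```

The jury's larger weight means a strong Facebook showing alone cannot carry an entry to victory.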
The research team from Halmstad competes as "HEARTalion" (Halmstad university Emotional Art RoboT). Both the project and the robot Baxter go by the artist name "Rob Boss" (a reference to the American painter Bob Ross).
Text: LOTTA ANDERSSON and LOUISE WANDEL
Photo: ANDERS ANDERSSON (if nothing else is stated)
Sowmya Vaikundham Narasimman, master’s student in intelligent systems.
Daniel Westerlund, master’s student in intelligent systems.
Maria Luiza Recena Menezes, PhD student in affective computing and the students’ co-supervisor.
Martin Cooney, robotics researcher and the students’ supervisor.
"In the future this research can be of value to the health care sector in giving help to persons whom for different reasons can’t express themselves", says Peter Wahlbeck.