Participants Argue With Robot, Accuse It of Lying, in Experiment

Posted on April 23, 2012

Psychologists at the University of Washington conducted a study to see whether people will hold robots morally accountable. The researchers note, "Some argue that robots do not have free will and therefore cannot be held morally accountable for their actions."

Peter Kahn, a UW associate professor of psychology, says, "We're moving toward a world where robots will be capable of harming humans. With this study we're asking whether a robotic entity is conceptualized as just a tool, or as some form of a technological being that can be held responsible for its actions."

In the study, Kahn and his research team had 40 undergraduate students play a scavenger hunt with a humanlike robot, Robovie. The robot appeared autonomous, but it was actually remotely controlled by a researcher concealed in another room. Each participant engaged in small talk with Robovie and then played the scavenger hunt, with two minutes to locate objects from a list of items in the room. Finding seven items won a $20 prize. All the participants did this easily, but when their time was up, Robovie told them they had found only five objects.

Sixty-five percent of the participants said Robovie was at least partly to blame for wrongly scoring the scavenger hunt and unfairly denying them the $20 prize. Some participants argued with Robovie and accused it of being a liar.

Heather Gary, a UW doctoral student in developmental psychology, says, "Most argued with Robovie. Some accused Robovie of lying or cheating."

In an example video, a man in the study quickly finds 11 items, four more than needed to win the $20 prize, but Robovie tells him he found only five. You can see the video and find a transcript here.

The researchers say their study suggests that "it is likely that many people will hold a humanoid robot as partially accountable for a harm that it causes" as robots gain greater capabilities in language and social interactions.

The researchers are also concerned about the use of robots in warfare. Kahn says, "Using robotic warfare, such as drones, distances us from war, can numb us to human suffering, and make warfare more likely."

You can find the research paper here (PDF).
