Video: UBC student designing ethical robots
In what sounds like a scene out of a sci-fi movie, a University of British Columbia student is attempting to teach robots the difference between right and wrong.
AJung Moon, a third-year PhD student, is working on developing robots that can be programmed to use logic to make ethical decisions and act on them, a field she calls "roboethics." She recently posted her findings on the research blog Footnote1.
Training robots to do the right thing is a challenging task, Moon said, especially since even humans sometimes have a hard time navigating moral dilemmas.
"Human ethics tend to be a very tricky thing to get your head around," she told Metro. "The biggest challenge is … that not everybody in the world agrees on one particular set of moral principles or ethics."
To get around that barrier, Moon and a team from the Open Roboethics initiative gathered data from people on how they thought a robot should behave. The team put the concept to the test by putting a Willow Garage PR2 robot through an ethical challenge involving an elevator.
The team first carried out a survey, asking people to imagine a scenario where a robot needs to ride an elevator but encounters a person inside, or someone waiting to get on. If the robot can only ride the elevator alone, how should it proceed?
Overall, respondents said the best choice would be for the robot to talk to the person. The least appropriate choice, the respondents said, would be to do nothing. If the robot has a non-urgent delivery, the respondents said it should always yield to people.
“People opted for the robot to have a dialogue with the person, to ask the person, ‘Are you in a hurry as well?’” Moon explained.
Moon and her team then programmed the survey data into the robot.
A video of the results shows the robot always yielding to other people in need of the elevator. The robot only asks the person to get off if it has an urgent delivery, but if the person refuses to exit, it simply waits for the next one.
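The behaviour shown in the video amounts to a simple decision rule. The sketch below is illustrative only: the function name, inputs, and action strings are assumptions for exposition, not Moon's actual code, which the article does not describe.

```python
# Hedged sketch of the surveyed elevator-etiquette policy described in the
# article. Names and action labels are illustrative, not from Moon's system.

def decide(robot_has_urgent_delivery: bool, person_refuses_to_exit: bool) -> str:
    """Return the robot's action when a person is in or waiting for the elevator."""
    if not robot_has_urgent_delivery:
        # Respondents said a robot with a non-urgent delivery should
        # always yield to people.
        return "yield"
    # With an urgent delivery, respondents preferred dialogue: the robot
    # asks the person whether it may ride alone.
    if person_refuses_to_exit:
        # If the person refuses, the robot simply waits for the next elevator.
        return "wait for next elevator"
    return "ask person to exit, then ride"

print(decide(robot_has_urgent_delivery=False, person_refuses_to_exit=False))
```

The key design point the survey surfaced is that doing nothing was rated the least appropriate option, so every branch of the rule produces some action.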
Moon said her research shows the potential for robots to be programmed to interact and work with people.
“As long as we teach them behaviours that have human values and ethics incorporated in them, then the end result will reflect that,” she said.