Would you get into an automated self-driving vehicle, knowing that in the event of an accident, it might sacrifice your life if it meant saving the lives of 10 other people?
Autonomous vehicles (AVs), also known as self-driving vehicles, are already a reality. Initial guidelines from the National Highway Traffic Safety Administration regarding this technology are expected by this summer, and road tests are currently in progress across the country.
But one barrier to the widespread use of autonomous vehicles is deciding how to program these vehicles' safety rules in the most socially acceptable and ethical way.
After a six-month survey, an international team of researchers published its findings today in the journal Science, reporting that the public holds a conflicted view of the future of self-driving technology.
The scientists used Amazon's Mechanical Turk platform, an online marketplace where people are paid to complete "human intelligence tasks" that computers cannot yet perform.
Over the course of six months, the researchers conducted six online surveys with a total of 1,928 participants, asking people to evaluate the morality of several different situations a self-driving vehicle may one day encounter.