April 5, 2018 at 2:33 pm ET
Americans Less Trusting of Self-Driving Safety Following High-Profile Accidents
Poll shows 50% of U.S. adults say self-driving cars are less safe than human drivers
Polling in January found that 36% of U.S. adults said self-driving cars are less safe than human drivers.
That figure increased to 50% in a survey that concluded earlier this month.
Americans are less trusting of self-driving cars following two deadly accidents involving autonomous or semi-autonomous vehicles, with half of U.S. adults considering those automobiles less safe than human drivers, according to a new poll.
A Morning Consult survey conducted March 29-April 1 among a national sample of 2,202 adults found that 27 percent of respondents said self-driving cars are safer than human drivers, while 50 percent said autonomous vehicles are less safe. Eight percent said the automobiles are on par with human drivers when it comes to safety.
Previous polling, conducted Jan. 11-16 among 2,201 adults, found that 33 percent of Americans considered driverless vehicles safer, while 36 percent said they were less safe, a negative net swing of 20 percentage points between the two surveys.
Both polls have a margin of error of plus or minus 2 percentage points.
The most recent survey was conducted after a self-driving sport utility vehicle being tested by Uber Technologies Inc. killed a woman crossing the street in Tempe, Ariz., on March 18 — the first known death of a pedestrian struck by a driverless vehicle on a public road. Uber has since shelved testing of its autonomous vehicles, and Arizona Gov. Doug Ducey (R) suspended the company’s testing privileges in the state.
An Uber spokeswoman said Monday that the company is cooperating with all ongoing investigations and continuing to assist investigators, including the National Transportation Safety Board, which is conducting its own probe.
On March 23, a Tesla Inc. Model X SUV crashed into a highway median near Mountain View, Calif., killing the vehicle’s driver and prompting a similar NTSB probe. The Tesla model features Autopilot, a driver-assistance system that can, for example, change lanes on its own while still requiring driver oversight.
Tesla said in a March 30 statement on its website that Autopilot was engaged moments before the accident occurred. The company did not respond to a request for comment on how the accident could affect further efforts to test autonomous technology.
Jason Levine, executive director of the Center for Auto Safety, a consumer advocacy group, said the recent incidents involving Uber and Tesla AV technology highlight the fears “many consumers have with respect to removing human control from driving.”
“The idea is that this technology will make us safer — not as safe, but safer — and when there’s a demonstration of the possibility that that won’t be the case, that does tap into people’s concerns,” Levine said in a Wednesday interview.
Amitai Bin-Nun, vice president of autonomous vehicles and mobility innovation at Securing America’s Future Energy, a pro-energy independence group that supports the development of self-driving cars, said further familiarity with driverless vehicles will likely make consumers more comfortable with the technology. But he said that linking Tesla’s accident with Uber’s could unnecessarily increase public concerns about autonomous vehicles.
“Uber was testing its vehicle with a safety driver behind the wheel,” Bin-Nun said in a Tuesday interview. By contrast, he noted, Tesla’s autopilot system is already operational in some of the company’s vehicles.
“I think in the public’s mind, it’s all often just lumped together with all autonomous vehicles,” he said.
The most recent Morning Consult survey found that 60 percent of U.S. adults have seen, read or heard at least some news about the Uber accident, while 40 percent said the same thing about the Tesla crash.
The House on Sept. 6 passed autonomous vehicle legislation by voice vote that would support the development of self-driving vehicles by allowing automakers to produce a number of vehicles that do not meet the National Highway Traffic Safety Administration’s standards for testing and development. A similar measure stalled in the Senate after the Commerce, Science and Transportation Committee voted unanimously on Oct. 4 to advance it, with several lawmakers placing holds on the bill due to safety and security concerns.
Sen. Gary Peters (D-Mich.), one of the bill’s four co-sponsors, said in a Tuesday statement to Morning Consult that the recent accidents were concerning, but added that “Congress must quickly pass legislation to ensure safe testing and deployment.”
Senate Commerce Committee Chairman John Thune (R-S.D.), the bill’s sponsor, and Peters have been holding conversations with other senators to address any concerns they have about the legislation, a spokeswoman for Peters said Monday. She added that they might try to move the bill forward as part of a larger legislative package.
Madhur Behl, an assistant professor in computer science at the University of Virginia, said the recent Uber and Tesla accidents are a reminder that “we need to take our time to work through the evolution of the self-driving car technology,” but suggested the technology’s safety benefits still outweigh issues with human nature.
“At the end of the day, self-driving cars have the ability to learn quickly from these mistakes and teach all future cars,” Behl said Monday in an email to Morning Consult. “The same cannot be said for human drivers.”