The first known death of a driver using an autopilot feature underscores some of the American public’s biggest fears about self-driving vehicles, even as a majority say they haven’t heard much about the fatal Tesla accident.
The National Highway Traffic Safety Administration is investigating the events that resulted in a Tesla Model S failing to notice a tractor trailer turning in front of it on a Florida highway on May 7. News of the accident emerged last week, but a plurality (43 percent) in a new Morning Consult poll said they had seen, read, or heard nothing at all about the fatal crash, and 17 percent said they hadn’t heard much about it.
Twelve percent of respondents said they had read, seen, or heard a lot about the accident, while 28 percent said they’d read “some” about it.
Nevertheless, voters are still wary about the dangers of riding in driverless cars. Twenty-eight percent said they were likely to purchase or lease a self-driving car in the next 10 years, while 59 percent said it was not too likely or not likely at all. In a similar January poll, 63 percent of respondents said they weren’t likely to buy a driverless car.
Even setting aside purchasing or leasing, 55 percent in the most recent poll said they wouldn’t ride in an autonomous car, compared with 27 percent who said they would.
While the specifics of the Tesla accident are still being investigated, some details are already emerging.
Tesla said the car, operating in its Autopilot mode, passed under the tractor trailer in “extremely rare circumstances” in which neither the driver nor the software detected it. The truck’s height, its position perpendicular to the Tesla and the brightness of the sunlight all contributed to the deadly collision, the company said.
“Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied,” Tesla said in a blog post. Chief Executive Elon Musk elaborated on Twitter, saying the car’s radar “tunes out what looks like an overhead road sign to avoid false braking events.”
Fifty-eight percent of poll respondents said their top concern is driverless cars suffering glitches and mistaking signage. Fewer reported being very concerned about other potential obstacles: 48 percent cited human-driven cars and autonomous cars driving on roads simultaneously; 39 percent pointed to the protection of personal tracking data from their GPS; and 56 percent cited road safety overall.
Tesla’s Autopilot mode includes semi-autonomous features that allow for automatic steering, automatic lane changing and a warning system for side collisions. But the company says that doesn’t constitute autonomous driving.
The automatic steering feature, still in a public beta phase, keeps the car in its lane and maintains its speed, but the company warns on its website that drivers must “remain engaged and aware” when using it and keep their hands on the steering wheel.
“It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled,” Tesla said on its blog. “Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert.”
Morning Consult poll respondents showed overall skepticism about self-driving cars and some of the activities their drivers might engage in. Majorities said they found it unacceptable for drivers using cars in an autonomous mode to text or email (57 percent), read a newspaper or books (61 percent), watch movies or TV shows (62 percent) or be intoxicated (77 percent).
A portable DVD player was found in the wreckage of the fatal Tesla crash, raising questions about whether the driver had been watching a movie at the time of the accident, according to a report by Reuters.
The national Morning Consult survey polled 2,001 registered voters from June 30 to July 4, with a margin of error of plus or minus 2 percentage points.