The Tesla Model S that braked sharply and triggered an eight-car crash in San Francisco in November had the automaker’s controversial driver-assist software engaged within 30 seconds of the crash, according to data the federal government released Tuesday.
The Tesla Model S slowed to 7 mph on the highway at the time of the crash, according to the data. Publicly released video also showed the car moving into the far-left lane and braking abruptly.
The Tesla’s driver told authorities that the vehicle’s “full self-driving” software braked unexpectedly and triggered the pileup on Thanksgiving Day. CNN Business was first to report last month the driver’s claim that “full self-driving” was active.
The National Highway Traffic Safety Administration then announced that it was sending a special crash investigation team to examine the incident. The agency typically conducts special investigations into about 100 crashes a year.
The pileup took place hours after Tesla CEO Elon Musk announced that the company’s “full self-driving” driver-assist system was available to anyone in North America who requested it and had paid for the option. Tesla had previously restricted access to drivers with high scores on its safety rating system.
“Full self-driving” is designed to keep up with traffic, steer in the lane and abide by traffic signals, but despite Tesla’s name for it, it requires an attentive human driver prepared to take full control of the car at any moment. It has delighted some Tesla drivers but also alarmed others with its flaws. Tesla warns drivers on an in-car screen when they install “full self-driving” that it “may do the wrong thing at the worst time.”
Tesla generally does not engage with the professional news media and did not respond to CNN’s request for comment.
“We are proud of Autopilot’s performance and its impact on reducing traffic collisions. The benefit and promise of Autopilot is clear from the Vehicle Safety Report data that we have been sharing for 4 years,” Tesla said this month in an update to its vehicle safety data.
Traffic safety experts have long questioned the merits of Tesla’s findings, which show fewer crashes when the driver-assist technologies are active, because among other things they’re generally used on highways where crashes are already rarer.
Tesla’s driver-assist technologies, Autopilot and “full self-driving,” are already under investigation by the National Highway Traffic Safety Administration following reports of unexpected braking that occurs “without warning, at random, and often repeatedly in a single drive,” the agency said in a statement.
The agency has received hundreds of complaints from Tesla users, some describing near crashes and fears for their safety.
Bryan Reimer, an autonomous vehicle researcher with the Massachusetts Institute of Technology’s AgeLab, told CNN Business the revelation that driver-assist technology was engaged raises questions about when NHTSA will act on its investigation, and what the future holds for Tesla’s driver-assist features.
“How many more crashes will there be before NHTSA releases findings?” Reimer said.
Reimer said it remains to be seen if there’s a recall of any Tesla driver-assist features, and what it means for the automaker’s future. Musk has said before the company would be “worth basically zero” if it doesn’t provide “full self-driving.”
This story has been updated to reflect that a driver-assist system was active within 30 seconds of the crash.