U.S. authorities are investigating Tesla’s self-driving technology.
What’s new: Federal regulators launched a probe of nearly two dozen accidents, some of them fatal, that involved Tesla vehicles equipped for self-driving, Reuters reported.
The inquiry: The National Highway Traffic Safety Administration is looking into 23 crashes of Tesla vehicles that occurred when the cars’ autonomous driving systems may have been engaged.
- The agency previously completed four investigations into Tesla crashes, most famously a 2016 incident in which a Florida driver was killed when his car plowed into a big rig. It found Tesla’s technology partly to blame for that crash but not the other three.
- In separate investigations of the Florida crash and another in California two years later, the National Transportation Safety Board (a separate federal agency) found Tesla’s system at fault.
- Tesla insists that its vehicles are safe. Data collected from its fleet shows that cars under autonomous control experience fewer accidents per mile than those driven by humans, the company said. It has not revealed whether Autopilot was engaged during the accidents under investigation.
Behind the news: Tesla has two self-driving modes.
- Autopilot, which comes standard on all new vehicles, controls steering, braking, and acceleration. It’s meant for use on highways with a center divider.
- Drivers can upgrade to what Tesla calls the Full Self-Driving option for $10,000. Despite the name, a Tesla lawyer told California regulators last November that the system should not be considered fully autonomous.
- Tesla advises drivers using either mode to keep their hands on the steering wheel and eyes on the road. However, the systems remain engaged even if drivers ignore these instructions, and videos on social media show drivers using Autopilot on roads that are not divided highways.
Why it matters: The new investigations are aimed at finding facts and will not directly result in new rules for Tesla or the self-driving industry at large. Still, the company’s reputation could take a battering, and hype about self-driving technology makes it harder for the AI community as a whole to gain trust and make progress.
We’re thinking: Tesla’s self-driving technology may well be safer on average than human drivers, but it doesn’t fit the description “full self-driving.” Tesla’s work to promote clean energy has had a widespread positive impact; now it’s time for the company to drop that branding and for car makers to provide clear, consistent information about their vehicles’ autonomous capabilities.