
Tesla's Self-Driving Mode Allegedly Responsible for Eight-Car Pileup in California

According to a California Highway Patrol traffic crash report, a driver told authorities that their Tesla's "full self-driving" software braked unexpectedly, causing an eight-car pileup in the San Francisco Bay Area last month. Nine people were treated for minor injuries, including one juvenile who was hospitalized. The police report describing the incident was made public on Wednesday; it is the latest in a string of mishaps attributed to Tesla technology.

Tesla CEO Elon Musk has heavily promoted the "Full Self-Driving" (FSD) software, which is sold as a $15,000 add-on to Tesla vehicles, but it faces legal, regulatory, and public scrutiny. After the San Francisco accident, the driver told police that the FSD software had malfunctioned.

According to the report, the Tesla Model S was traveling at approximately 55 mph when it abruptly braked, slowing to roughly 20 mph and setting off a chain reaction that involved eight vehicles, all of which had been traveling at normal highway speeds. The report notes that if FSD failed, the driver should have taken manual control. Tesla has repeatedly said its advanced driver-assist technology requires "active driver supervision" and that its vehicles "are not autonomous." Police were unable to determine whether the software was actually in use or whether the driver's account was accurate. The report was released following a records request.

The pileup occurred just hours after Musk announced that Tesla's "full self-driving" driver-assist software was available to anyone in North America who requested it; previously, Tesla had restricted access to drivers with high scores on its safety rating system. Vehicles running "full self-driving" are programmed to keep up with traffic, stay in their lane, and obey traffic signals, but the system requires an alert human driver who is ready to take full control of the vehicle at any time. Its limitations have delighted some drivers and alarmed others, and Tesla warns drivers who install "full self-driving" that it "may do the wrong thing at the worst time."

The National Highway Traffic Safety Administration is already investigating Tesla's driver-assist technologies, Autopilot and "full self-driving," following reports of unexpected braking that occurs "without warning, at random, and frequently in a single drive." The agency upgraded the investigation to an engineering analysis last summer. Jennifer Homendy, chair of the National Transportation Safety Board, has questioned whether "full self-driving" is an accurate description of the technology and has said Tesla must do more to prevent misuse.

By Aaem Joshi

I am a journalist who loves digging up stories that remain unheard, and I strongly believe in the knowledge of the social world.
