The U.S. government is investigating a serious accident on San Francisco’s Bay Bridge that the driver blamed on Tesla’s “Full Self-Driving” software.
On Thanksgiving Day, a Tesla Model S suddenly slowed from the 55 mph speed limit to just 20 mph, triggering a chain-reaction pileup involving seven other cars on the Bay Area’s second-most-famous bridge. The crash occurred at 12:40 p.m. in the Yerba Buena Tunnel, said to be the world’s largest-bore tunnel.
“He suddenly stopped,” one observer told local media. “We had nowhere to go at that point.”
Nine people were injured, one of whom was taken to the hospital.
According to a California Highway Patrol report seen by CNN, the driver said the car was in “full self-driving” mode at the time of the crash. Police and investigators from the National Highway Traffic Safety Administration (NHTSA) closed two lanes of the bridge for hours while they examined the scene, and four ambulances were needed to carry away the victims.
Police are investigating whether the controversial, so-called “Full Self-Driving” software was enabled and whether it caused the crash. Tesla’s own vehicle logs should provide a complete record, but its CEO seems to have other things on his mind at the moment. Nevertheless, Tesla has promised a full internal investigation.
The timing of the accident wasn’t ideal for Tesla: it happened less than a day after CEO Elon Musk promised that Tesla would make a beta version of its Full Self-Driving software available to any driver who requested it. Previously, use of the software had been limited to select customers with good driving records.
“Tesla Full Self-Driving Beta is now available to anyone in North America who has requested it through their car screen, assuming they have purchased this option,” Musk wrote on Twitter. “Congratulations to the Tesla Autopilot/AI team for achieving a huge milestone!”
Tesla is reportedly under investigation by the U.S. Department of Justice over claims it misled drivers about the effectiveness of its self-driving software. NHTSA data shows that Tesla vehicles account for more than 70 percent of reported crashes involving advanced driver-assistance software.
There have been many cases of drivers overestimating the intelligence of such software, some even recording videos of themselves indulging in NSFW activity while the car drove itself. Tesla, it seems, can’t engineer its way around the pesky Layer 8 problem. ®