Tesla driver complaint being considered by US regulators

US auto safety regulators are looking into a Tesla driver’s complaint that the company’s “full self-driving” software caused a crash.

The driver was beta testing the “full self-driving” software, and according to a complaint filed with the National Highway Traffic Safety Administration, the Tesla SUV veered into the wrong lane and collided with another vehicle.

The driver wrote, “The car went in the wrong lane and I was hit by another driver in the street next to mine.”

According to the complaint, the vehicle, a 2021 Tesla Model Y small SUV, gave the driver an alert in the middle of the turn, and the driver tried to turn the wheel to avoid other traffic. But the car took control and “forced itself into the wrong lane, making an unsafe maneuver putting everyone at risk,” the driver wrote.

No one was injured in the crash, but the Model Y was severely damaged on the driver’s side, according to a complaint filed with the agency online Monday and posted to its public complaints database.

The crash occurred on November 3. The complaint lists the driver’s location as Bray, Calif., but the location of the crash was not identified. NHTSA does not release the names of those who file complaints.

This is likely the first complaint filed with the agency alleging that “full self-driving” software caused a crash. A message seeking comment was left Friday with Tesla, which has disbanded its media relations department.

An NHTSA spokesperson said Friday night that the agency is aware of the complaint and is in communication with Tesla to obtain more information. The spokesperson said people should report safety concerns to the agency.

The investigation is another sign that NHTSA is getting more aggressive in looking at autonomous and partially automated driving systems under President Joe Biden. In the past the agency was reluctant to regulate such systems, saying it did not want to delay potentially life-saving technology.

Tesla says “autopilot” and “full self-driving” are driver-assistance systems and can’t drive themselves, despite their names. The automaker says drivers will have to be ready to intervene at any time.

Select Tesla drivers are beta testing the software on public roads, a practice that critics say puts others at risk because the software has flaws and drivers are untrained. Other companies conducting tests on public roads include human safety drivers prepared to intervene.

Beta testing is a field test of software performed by users before a full commercial release is ready.

Critics are calling for NHTSA to act after multiple videos were posted on the Internet that purportedly showed Tesla’s software making mistakes and prompting drivers to take action.

Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, wrote on Twitter, “Hopefully this gives @NHTSAgov ammunition to act on FSD now rather than waiting for Tesla to take its time sifting through a partial data release.”

In June, NHTSA ordered automakers to report any crashes involving fully autonomous vehicles or partially automated driver-assistance systems. It was not clear whether Tesla had reported the California driver’s crash to the agency. Two months later, the agency opened a formal investigation into Tesla’s Autopilot partially automated driver-assistance system after a series of collisions with parked emergency vehicles.

NHTSA has already asked Tesla for information about its beta testing, including a requirement that testers not disclose information. The agency said such non-disclosure agreements could hinder its ability to investigate.

This story has been published from a wire agency feed without modifications to the text. Only the headline has been changed.
