The widow of a man who died when his Tesla veered off the road and hit a tree while using its partially automated driving system is suing the carmaker, claiming its marketing of the technology is dangerously misleading.
According to the lawsuit filed by Nora Bass in state court on May 3, the Autopilot system failed to keep Hans von Ohain's Model 3 Tesla from leaving a Colorado road in 2022. Von Ohain died after his car hit a tree and burst into flames, the lawsuit says, but a passenger managed to escape.
According to a Colorado State Patrol report, von Ohain was intoxicated at the time of the crash.
The Associated Press sent an email to Tesla’s communications department on Friday seeking comment.
Tesla offers two partially automated systems, Autopilot and a more sophisticated “Full Self-Driving,” but the company says that despite their names, neither can drive itself.
The lawsuit, which was also filed on behalf of von Ohain and Bass’s only child, alleges that Tesla, facing financial pressures, released its Autopilot system before it was ready for real-world use. It also cites a 2016 promotional video as evidence of what it calls the company’s “reckless disregard for consumer safety and truth.”
“By showing a Tesla vehicle driving through traffic without any hands on the steering wheel, Tesla irresponsibly misled consumers into believing that their vehicles had greater capabilities than they actually did,” the lawsuit says.
Last month, Tesla paid an undisclosed amount to settle a separate lawsuit making similar claims that was brought by the family of Walter Huang, a Silicon Valley engineer who died in a 2018 crash while using Autopilot. Huang’s Model X was on Autopilot before it hit a concrete barrier at an intersection on a busy highway in Mountain View, California.
Evidence indicated that Huang was playing a video game on his iPhone when he collided with the barrier on March 23, 2018. But his family claimed Autopilot was promoted in a way that led vehicle owners to believe they did not need to stay vigilant while behind the wheel.
US auto safety regulators pressured Tesla to recall more than 2 million vehicles in December to fix a defective system that is supposed to ensure drivers pay attention when using Autopilot.
In a letter to Tesla posted on the agency’s website this week, investigators with the U.S. National Highway Traffic Safety Administration wrote that they found no difference between the warning software released in the recall and the software that existed before it. The agency says Tesla has reported 20 more crashes involving Autopilot since the recall.