Experts Lay Into Tesla Safety in Federal Autopilot Trial

For example, Cummings told the court that Tesla “clearly recognized that mode confusion is an issue—this is where people, for example, think the car is in Autopilot and don’t understand that the Autopilot has disengaged.”
Cummings also referred to the deposition of Tesla Autopilot firmware engineer Akshay Phatak. According to Phatak’s deposition, the company did not keep good track of Autopilot crashes prior to 2018, and Cummings pointed out that “it was clear they knew that they had a big problem with people ignoring the warnings. Ignoring the hands-on requests. And…as you know, prior to this accident. It was known to Tesla that they were having problems with people ignoring their warnings.”
Tesla’s abuse of statistics to make misleading claims about safety is nothing new: In 2017, Ars found that Tesla’s claim that Autopilot reduced crashes was not backed by the data, which in fact showed that the driver assist increased crash rates.
Mendel Singer, a statistician at the Case Western Reserve University School of Medicine, was very unimpressed with Tesla’s approach to crash data statistics in his testimony. Singer noted that he was “not aware of any published study, any reports that are done independently… where [Tesla] actually had raw data and could validate it to see does it tend to make sense,” and that the car company was not comparing like with like.
“Non-Tesla crashes are counted based on police reports, regardless of safety system…
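
To make Singer’s “like with like” point concrete, here is a minimal, entirely hypothetical sketch. The figures, the two fleets, and the 40 percent severity threshold are invented for illustration and do not come from the trial testimony; the only idea taken from the testimony is that the two sides of the comparison use different counting criteria. If one fleet’s crashes are tallied from police reports while the other’s are tallied only when an onboard severity criterion is met, the second fleet appears far safer even when the underlying crash rates are identical.

```python
# Hypothetical illustration of the "not comparing like with like" problem.
# All numbers below are made up for demonstration purposes.

def crashes_per_million_miles(crashes_counted: int, miles: float) -> float:
    """Crude crash rate: counted crashes per million vehicle miles."""
    return crashes_counted / (miles / 1_000_000)

# Fleet A: every police-reported crash is counted.
fleet_a_total_crashes = 900          # hypothetical
fleet_a_miles = 300_000_000          # hypothetical

# Fleet B: identical true crash rate, but only severe crashes
# (say, 40% of the total) meet the telemetry counting criterion.
fleet_b_total_crashes = 900          # hypothetical, same underlying rate
fleet_b_counted = int(fleet_b_total_crashes * 0.40)
fleet_b_miles = 300_000_000          # hypothetical

rate_a = crashes_per_million_miles(fleet_a_total_crashes, fleet_a_miles)
rate_b = crashes_per_million_miles(fleet_b_counted, fleet_b_miles)

print(f"Fleet A (police-report counting): {rate_a:.2f} crashes per million miles")
print(f"Fleet B (telemetry-threshold counting): {rate_b:.2f} crashes per million miles")
# Fleet B looks roughly 2.5x safer even though the underlying crash rates
# are identical, purely because of the mismatched counting criteria.
```
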

Author: Jonathan M. Gitlin
Source: https://arstechnica.com/cars/2025/07/experts-lay-into-tesla-safety-in-federal-autopilot-trial/