Tesla Autopilot data from NHTSA sheds light on Elon Musk's promises of autonomy - The Washington Post

SAN FRANCISCO — Tesla vehicles running its Autopilot software have been involved in 273 reported crashes over roughly the past year, according to regulators, far more than previously known, offering concrete evidence of the real-world performance of its futuristic features.

The numbers, which were published by the National Highway Traffic Safety Administration for the first time Wednesday, show that Tesla vehicles made up nearly 70 percent of the 392 crashes involving advanced driver-assistance systems reported since last July, and a majority of the fatalities and serious injuries — some of which date back further than a year. Eight of the Tesla crashes took place prior to June 2021, according to data released by NHTSA Wednesday morning.

Previously, NHTSA said it had probed 42 crashes potentially involving driver assistance, 35 of which included Tesla vehicles, in a more limited data set that stretched back to 2016.

Of the six fatalities listed in the data set published Wednesday, five were tied to Tesla vehicles — including a July 2021 crash involving a pedestrian in Flushing, N.Y., and a fatal crash in March in Castro Valley, Calif. Some dated as far back as 2019.

Tesla Autopilot is a suite of systems that allows drivers to cede physical control of their electric vehicles, though they must pay attention at all times. The cars can maintain speed and safe distance behind other cars, stay within their lane lines and make lane changes on highways. An expanded set of features, called the “Full Self-Driving” beta, adds the ability to maneuver city and residential streets, halting at stop signs and traffic lights, and making turns while navigating vehicles from point to point.

But some transportation safety experts have raised concerns about the technology’s safety, since it is being tested and trained on public roads with other drivers. Federal officials have targeted Tesla in recent months with an increasing number of investigations, recalls and even public admonishments directed at the company.

The new data set stems from a federal order last summer requiring automakers to report crashes involving driver assistance to assess whether the technology presented safety risks. In some crashes, Tesla’s vehicles have been found to shut off the advanced driver-assistance system, Autopilot, around one second before impact, according to regulators.

The NHTSA order required manufacturers to disclose crashes where the software was in use within 30 seconds of the crash, in part to mitigate the concern that manufacturers would hide crashes by claiming the software wasn’t in use at the time of the impact.

“These technologies hold great promise to improve safety, but we need to understand how these vehicles are performing in real-world situations,” NHTSA’s administrator, Steven Cliff, said in a call with media about the full data set from manufacturers.

Tesla did not immediately respond to a request for comment. Tesla has argued that Autopilot is safer than normal driving when crash data is compared. The company has also pointed to the vast number of traffic crash deaths on U.S. roadways annually, estimated by NHTSA at 42,915 in 2021, hailing the promise of technologies like Autopilot to “reduce the frequency and severity of traffic crashes and save thousands of lives each year.”

Crash data from normal driving and from Autopilot are not directly comparable, because Autopilot operates largely on highways. Tesla CEO Elon Musk, however, has described Autopilot as “unequivocally safer.”

Musk said as recently as January that there had been no crashes or injuries involving the Full Self-Driving beta software, which has been rolled out to a more limited number of drivers for testing. NHTSA officials said their data was not expected to specify whether Full Self-Driving was active at the time of the crash.

Previously, regulators relied on a piecemeal collection of data from media reports, manufacturer notifications and other sporadic sources to learn about incidents involving advanced driver-assistance systems.

Companies such as Tesla collect more data than other automakers, which might leave them overrepresented in the data, according to experts in the systems as well as some officials who spoke on the condition of anonymity to candidly describe the findings. Tesla also pilots much of the technology, some of which comes standard on its cars, putting it in the hands of users who become familiar with it more quickly and use it in a wider variety of situations.

Driver-assistance technology has grown in popularity as owners have sought to hand over more of the driving tasks to automated features, which do not make the cars autonomous but can offer relief from certain physical demands of driving. Automakers such as Subaru and Honda have added driver-assistance features that act as a more advanced cruise control, keeping set distances from other vehicles, maintaining speed and following marked lane lines on highways.

But none of them operate in as broad a set of conditions, such as residential and city streets, as Tesla’s systems do. NHTSA disclosed last week that Tesla’s Autopilot is on around 830,000 vehicles dating back to 2014.

Autopilot has spurred several regulatory probes, including into crashes with parked emergency vehicles and the cars’ tendency to halt for imagined hazards.

As part of its probe into crashes with parked emergency vehicles, NHTSA has said it is looking into whether Autopilot “may exacerbate human factors or behavioral safety risks.”

Autopilot has been tied to deaths in crashes in Williston and Delray Beach, Fla., as well as in Los Angeles County and Mountain View, Calif. The driver-assistance features have drawn the attention of NHTSA, which regulates motor vehicles, and the National Transportation Safety Board, an independent body charged with investigating safety incidents.

Federal regulators last year ordered car companies, including Tesla, to submit crash reports within a day of learning of any incident involving driver assistance that resulted in a death or hospitalization because of injury, or that involved a person being struck. Companies are also required to report crashes involving the technology that included an air bag deployment or a vehicle that had to be towed.

The agency said it was collecting the data because of the “unique risks” of the emerging technology, to determine whether manufacturers are making sure their equipment is “free of defects that pose an unreasonable risk to motor vehicle safety.”

Carmakers and hardware-makers reported 46 injuries from the crashes, including five serious injuries. But the total number of injuries could be higher: 294 of the crashes had an “unknown” number of injuries.

One additional fatality was reported, but regulators noted it wasn’t clear if the driver-assistance technology was being used.

Honda reported 90 crashes during the same time period involving advanced driver-assistance systems, and Subaru reported 10.

Some systems appear to disengage in the moments leading up to a crash, potentially allowing companies to say they were not active at the time of the incident. NHTSA is already investigating 16 incidents involving Autopilot in which Tesla vehicles slammed into parked emergency vehicles. On average in those incidents, NHTSA said: “Autopilot aborted vehicle control less than one second prior to the first impact.”

Regulators also released data on crashes reported by automated driving systems, which are commonly called self-driving cars. These cars are far less common on roads, loaded with sophisticated equipment and not commercially available. A total of 130 crashes were reported, including 62 from Waymo, a sister company to Google. That report shows no fatalities and one serious injury. There was also one report of an automated driving crash involving Tesla, which has tested autonomous vehicles in limited capacities in the past, though the circumstances of the incident were not immediately clear.

In the crashes where advanced driver assistance played a role, and where further information on the collision was known, vehicles most frequently collided with fixed objects or other cars. Among the others, 20 hit a pole or tree, 10 struck animals, two crashed into emergency vehicles, three struck pedestrians and at least one hit a cyclist.

Where damage was reported, it was most commonly to the front of the car, which was the case in 124 incidents. Damage was more often concentrated on the front left, or driver’s side, of the car, rather than the passenger’s side.

The incidents were heavily concentrated in California and Texas, the two most populous states and also the U.S. locations Tesla has made its home. Nearly a third of the crashes involving driver assistance, 125, occurred in California. And 33 took place in Texas.
