- NHTSA is reviewing additional crash data to determine how safe Tesla's Autopilot function is.
- The agency has also upgraded the probe, which is necessary before any possible recall.
- NHTSA has said that there is no vehicle currently that is fully automated or "self-driving."
A federal agency has escalated its probe into whether Tesla's Autopilot function is defective.
The National Highway Traffic Safety Administration (NHTSA) first launched its investigation into 765,000 Tesla cars ten months ago, after identifying 11 crashes in which Teslas struck first-responder vehicles.
On Thursday, NHTSA said in a release that it was widening the probe into the effectiveness of Tesla's driver assistance system. NHTSA will now review data from 830,000 Tesla cars and almost 200 new cases of collisions that involved Tesla cars with the Autopilot function operating.
The agency also said it was now treating the investigation as an "Engineering Analysis," which is a necessary step before a possible recall of the cars that come with the Autopilot function.
NHTSA's expanded probe will "explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver's supervision," per the release.
Since launching the probe ten months ago, NHTSA has added six more crashes involving first-responder vehicles to its analysis, according to the release.
The agency said it found that in most of the crashes, Tesla's collision warning system activated just before impact. It also said the "Automatic Emergency Braking" system kicked in for about half of the crashes. "On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact," per the release.
NHTSA also studied 191 crashes that did not involve ambulances, fire trucks, or police cars. It said it dropped 85 of them because they were determined to have been caused by other factors or because the cause was difficult to determine. In about 50 of the remaining cases, the agency said drivers were not sufficiently responsive while driving. In about 25 other crashes, the agency said drivers were using Autopilot in conditions that Tesla has said could limit the system's effectiveness, such as bad weather.
Tesla markets Autopilot as a feature that allows cars to automatically brake and steer within their lanes. It also labels some assistance features as "Full Self-Driving." NHTSA's website states, "There is no vehicle currently available for sale that is fully automated or 'self-driving.'"
In its Thursday release, NHTSA said that driver misuse of the Autopilot function would not necessarily rule out a finding that the system is defective. "This is particularly the case if the driver behavior in question is foreseeable in light of the system's design or operation," it said.
Democratic Senator Edward Markey of Massachusetts welcomed NHTSA's escalation of the probe. "Every day that Tesla disregards safety rules and misleads the public about its 'Autopilot' system, our roads become more dangerous," he tweeted on Thursday.
Separately, NHTSA is investigating 758 reports of "phantom braking," in which drivers complained that their Tesla cars braked suddenly while traveling at high speeds. It has given Tesla a long list of questions to answer by June 20.
Tesla did not immediately respond to Insider's request for comment.