Tesla’s Smart Summon Racked Up 97 Crashes Before The Feds Finally Walked Away

The federal government just quietly closed one of the most closely watched investigations in autonomous vehicle history — and the final crash tally is harder to ignore than regulators want to admit. Nearly 100 documented collisions later, the NHTSA says Tesla’s Smart Summon is basically fine.

That conclusion deserves a second look. Here’s what actually happened, what the data shows, and why drivers using this feature right now should still be paying close attention.

What the NHTSA investigation actually found

The agency’s Office of Defects Investigation formally closed its probe into Tesla’s Smart Summon feature in 2026, citing “low incident occurrence and low incident severity.” Out of an estimated 2,585,000 vehicles equipped with the technology, investigators tracked approximately 159 reported incidents — with 97 confirmed crashes once duplicates were filtered out.

Nearly every collision followed the same pattern: the vehicle struck a parked car, clipped a parking gate, or rolled into a short bollard. No serious injuries were reported in the cases reviewed. The government’s position is that those numbers, spread across millions of Summon sessions, amount to an incident rate of well under 1% — technically accurate, but still a number that adds up to nearly 100 real-world crashes.

Snow, gates, and the moments nobody was watching

Two crashes in the investigation summary stand out. Both involved camera blockages caused by snow accumulation. In each case, the Tesla struck an unoccupied parked vehicle — and in both cases, the user operating the Summon session did not issue a stop or pause command despite the visible obstruction. Tesla later pushed an over-the-air update in early 2026 to address camera blockage detection.

There was also a separate incident where Smart Summon failed to yield to a gate arm blocking a garage exit. The user, again, didn’t intervene. That collision triggered another software patch specifically designed to “improve vehicle reaction to dynamic gates.” What stands out isn’t the crashes themselves — it’s the pattern of users assuming the system would handle it and not stepping in when it clearly couldn’t.

The real story inside the crash numbers

Vehicles with Smart Summon: ~2,585,000
Total reported incidents: ~159 (including duplicates)
Confirmed crashes: ~97
Crash rate: less than 1% of all Summon sessions
Serious injuries reported: none confirmed in the investigation
Most common crash types: parked vehicles, gates, bollards
Investigation outcome: closed — no recall or formal action

The NHTSA’s framing emphasizes the low percentage. And statistically, fewer than 1% of sessions ending in an incident does sound reassuring when you’re talking about millions of activations. But 97 crashes still means 97 real parking lots, 97 real repair bills, and, in many cases, a moment when a human could have intervened and didn’t.
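
The session-level rate NHTSA cites wasn’t broken out in the public figures, but the per-vehicle arithmetic is easy to check from the counts above (a rough sketch using only the vehicle and crash totals from the investigation; it is not the per-session rate):

```python
# Rough per-vehicle arithmetic on the figures NHTSA reported.
# Note: this understates the point NHTSA makes, since each vehicle
# typically runs many Summon sessions.
equipped_vehicles = 2_585_000
confirmed_crashes = 97

crashes_per_vehicle_pct = confirmed_crashes / equipped_vehicles * 100
print(f"{crashes_per_vehicle_pct:.4f}% of equipped vehicles had a confirmed crash")
# → roughly 0.0038%
```

A vanishingly small percentage on paper, which is exactly why the raw count of 97 collisions is the more honest way to picture the real-world impact.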

The real finding buried in this investigation isn’t just about Tesla’s software — it’s about how drivers behave when they hand control to automation. People are stepping back, watching their phone, and trusting the system completely. That behavioral shift is arguably more dangerous than any software bug Tesla could patch with an OTA update.

Tesla patched the problems, but the habit is the issue

To Tesla’s credit, every specific failure mode identified by the NHTSA triggered a software response. Camera blockage detection was added. Gate arm recognition was improved. The company’s ability to push fixes wirelessly means the vehicle fleet is continuously being updated in ways traditional automakers simply can’t match.

But no software update can fix a driver who isn’t watching. The NHTSA investigation explicitly noted that in multiple incidents, users “failed to fully detect or respond appropriately to vehicle surroundings.” Smart Summon is designed to require active human oversight — the operator is supposed to keep a thumb on the button and eyes on the car at all times. What investigators found is that in practice, that’s not always happening. The technology is outpacing the habits of the people using it.

Why this investigation closing isn’t the end of the story

The NHTSA walking away without a recall or formal corrective order might feel like a green light for autonomous parking features across the industry. Ford, GM, Rivian, and nearly every other major EV brand are developing or already deploying similar low-speed autonomous maneuvering technology. A federal sign-off on Tesla’s approach — even an implicit one — shapes how regulators will view comparable systems going forward.

That context matters enormously right now. As fully autonomous vehicles edge closer to mainstream roads in 2026, the standards set by investigations like this one will define the accountability framework for the entire industry. The bar being set here is “no serious injuries, low percentage rate.” Whether that bar is high enough is a question worth asking before the next generation of parking automation rolls out at scale.

If you use Smart Summon — or any similar feature on your vehicle — this investigation is a useful reminder that these systems are tools, not replacements for your attention. Keep your eyes on the car, keep your thumb ready, and don’t assume the software has accounted for every bollard, snowflake, or gate arm in that parking lot. Share this with anyone you know who relies on autonomous features without thinking twice about them.
