A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media.
I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.
There have been other similar cases lately, which clearly point to problems with the car.
The driver has put up the footage from all of the car's cameras, so he has done what he can to provide evidence.
https://www.reddit.com/r/TeslaFSD/comments/1ksa79y/1328_fsd_accident/
It's very clear from the comments that some have personally experienced similar things, and others have seen reporting of it.
This is not an isolated incident. It just has better footage than most.
Also, this happened in February. He never reached out to Tesla? He never requested the data to show that FSD was engaged? In that thread he says he only just did it. There's also an official Tesla software tool you can use to get the full logs, but as expected he hasn't used that.
Dude's lying for sure.
You are so full of shit. I just checked it out:
https://www.reddit.com/r/technology/comments/1kskfqd/comment/mtmbkvm/?context=3
He never claimed it was recent.
You never provide sources for any claim you make, because you are probably just parroting hearsay.
February explains why he wasn’t on 13.2.9.
Why would he reach out to Tesla? That's not his job, it's the insurance company's.
But there is no point, because Tesla never takes responsibility in these cases.
There's just no footage from the interior camera, no proof of FSD being used.
Others have pointed out critical holes in his story, namely that he claims he was on a version of FSD that had not been released at the time of his crash.
The link I gave you is the place he posted this, and you can see what version he says he was using:
https://www.notateslaapp.com/fsd-beta/
So you are parroting bullshit; the current version is 13.2.9.
Funny how the people in the thread I linked, who drive Teslas themselves, don't question this?
Some people believe FSD saw the shadow of the pole as a curb in the road, or maybe even the base of a wall, and that's why it decided to "evade".
There are plenty of examples in the comments from people who drive Teslas themselves about how it steers into oncoming traffic: one describes how his followed black skid marks in the road, weaving wildly left to right; another describes how his made an evasive maneuver because of a patch in the road. It just goes on and on with how faulty FSD is.
IDK which Tesla cars have which cameras. But I've seen plenty of reporting on Tesla FSD, and none of it is good.
So why do you believe it's more likely to be human error? If it was a human not paying attention, the car would much more likely veer slowly, rather than making an abrupt idiotic maneuver.
To me it seems you are the one who lacks evidence for your claims.
And the problem with Tesla's logging is that it's a proprietary system that only Tesla has access to; that system needs to be open for everybody to examine.