Artificial Intelligence (AI) technology is a powerful tool, but like many powerful tools, it has the potential to let our natural human abilities atrophy. It’s the same way the invention of the jackhammer pretty much caused humans to lose the ability to pound through feet of concrete and asphalt with our bare fists. We’re already seeing effects of this with the widespread use of ChatGPT seemingly causing cognitive decline and atrophying writing skills, and now I’m starting to think advanced driver’s aids, especially more comprehensive ones like Level 2 supervised semi-automated driving systems, are doing the same thing: making people worse drivers.
I haven’t done studies to prove this in any comprehensive way, so at this point I’m still just speculating, like a speculum. I’m not entirely certain a full study is even needed at this point, though, because there are already some people just flat-out admitting to it online, for everyone to see, free of shame and, perhaps, any degree of self-reflection.
Specifically, I’m referring to this tweet that has garnered over two million views so far:
The other night I was driving in pouring rain, fully dark, and the car randomly lost GPS.
No location. No navigation.
Which also meant no FSD. I tried two software resets while driving just to get GPS back. Nothing worked. So there I was, manually driving in terrible… pic.twitter.com/hKlU6AlCEZ
— Oli (@CARN0N) December 13, 2025
Oh my. If, for some reason, you’re not able to read the tweet, here’s the full text of it:
“The other night I was driving in pouring rain, fully dark, and the car randomly lost GPS. No location. No navigation. Which also meant no FSD. I tried two software resets while driving just to get GPS back. Nothing worked. So there I was, manually driving in terrible conditions, unsure of positioning, no assistance, no guidance. And it genuinely felt unsafe. For me and for the people in the car. Then it hit me. This feeling – the stress, the uncertainty, the margin for error – this is how most drivers feel every single day. No FSD. No constant awareness. No backup. We’ve normalised danger so much that we only notice it when the safety net disappears.”
Wow. Drunk Batman himself couldn’t have beaten an admission like this out of me. There’s so much here, I’m not even really sure where to start. First, it’s night, and it’s “fully dark?” That’s kind of how night works, champ. And, sure, pouring rain is hardly ideal, but it’s very much part of life here on Earth. It’s perfectly normal to feel some stress when driving in the dark, in bad weather, but it’s not “how most drivers feel every single day.” Most drivers are used to driving, and they deal with poor conditions with awareness and caution, but, ideally, not the sort of panic suggested in this tweet.
Also, my quote didn’t replicate the weird spacing and short, staccato paragraphs that made this whole thing read like one of those weird LinkedIn posts where some fake thing someone’s kid said becomes a revelation of B2B best practices, or some shit.
It seems that the reason this guy felt the way he did when the driver aids were removed is that he’s, frankly, not used to actually driving. In fact, if you look at his profile on eX-Twitter, he notes that he’s a Tesla supervisor, which is pretty significantly different than calling yourself a Tesla driver:

This is an objectively terrible and deeply misguided way to view your relationship with your car for many reasons, not the least of which is the fact that even if you do consider yourself a “supervisor” – a deeply flawed premise to begin with – the very definition of Level 2 semi-autonomy is that the person “supervising” has to be ready to take over with zero warning, which means you need to be able to drive your damn car, no matter the situation it happens to be in.
If anything, you would think the takeaway here would have been “shit, I need to be a more competent driver and less of a candy-ass,” as opposed to coming away thinking, as stated in the tweet,
“We’ve normalised danger so much that we only notice it when the safety net disappears.”
This is so deeply and eye-rollingly misguided I almost don’t know where to start, except I absolutely do know where to start: the idea that the “safety net” is Tesla’s FSD software. Because that is exactly the opposite of how Level 2 systems are designed to work! You, the human, are the safety net! If you’ve already made the arguably lazy and questionable decision to farm out the majority of the driving task to a system that lacks redundant sensor backups and is still barely out of Beta status, then you better damn well be ready to take over when the system fails, because that’s how it’s designed to work.
To be fair, our Tesla Supervisor here did take over when his FSD went down due to loss of a GPS signal, but, based on what he said, he felt “unsafe” for himself and the passengers in the car. The lack of FSD isn’t the problem here; the problem is that the human driver didn’t feel safe operating their own motor vehicle.
Not only was he uncomfortable driving in the inclement weather and lack of light (again, that’s just nighttime, a recurring phenomenon), but the reason he had to debase himself so was a technical failure of FSD, which, it should be noted, can happen at any time, without warning. Hence the need to be able to drive a damn car, comfortably.
What does he mean when he says, referring to human driving, “no constant awareness?” Almost every driver I know is constantly aware that they are driving. That’s part of driving. Do people get distracted, look at phones, get lost in reveries, or whatever? Sure they do. That’s not ideal, but it doesn’t mean people aren’t aware.
Unsurprisingly, the poster of this admission has been getting a good bit of blowback in comments from people a little less likely to soil themselves when they have to drive in the rain. So, he provided a follow-up tweet:
To everyone that says I shouldn’t have a license:
This is a true story – it was pouring rain a couple of weeks ago on a Saturday night, with insane wind making it a recipe for disaster. I had just washed the car and was on my way to pick someone up.
About 30 seconds after… https://t.co/POIA0Ue58n
— Oli (@CARN0N) December 15, 2025
I’m not really sure what this follow-up actually clarified, but he did describe the experience in a bit more detail:
“I knew the rough direction but not exactly. I never use my phone while driving, so I rely solely on the car nav. Unfortunately, it wasn’t working, and I had to pull over to double-check where I was going.”
That’s just…driving. This is how all driving was up until about 15 years ago or so. I have an abysmal sense of direction, so I feel like I spent most of my pre-GPS driving life lost at least a quarter of the time I was driving anywhere. But you figure it out. You take some wrong turns, you end up in places you didn’t originally plan to be in, you look at maps or signs or ask someone, and you eventually get there. It wasn’t perfect, but it was what you had, and when we could finally, say, print out MapQuest directions and clip them to the dash, oh man, that was a game changer.
I took plenty of long road trips in marginal cars with no phone and just signs and vague notions to guide me where I was going. If I had to do it today, sure, there would be some significant adapting to exhume my pre-GPS navigational skills – well, skills is too generous a word, so maybe we can just say ability – but I think it could be done. And every driver really should be able to do the same thing.
FSD (Supervised) is a tool, a crutch, and if you find yourself in a position where its absence is causing you fear instead of just a bit of annoyance, you’re no longer really qualified to drive a car. Teslas (and other mass-market cars with similar L2 driver-assist systems) don’t have redundant sensors, most don’t have the means to clean camera lenses (or radar/lidar windows and domes), and none of them are rated for actual unsupervised driving. Which means that you, the person in the driver’s seat, need to actually live up to the name of that seat: you have to know how to drive a damn car.
This tweet should be taken as a warning, because while it’s fun to feel all smug because you can drive in the rain and ridicule this hapless fellow, I guarantee you he’s not alone. There are other people whose driving skills are atrophying because of reliance on systems like Tesla’s FSD, and this is a very bad path to go down. Our Tesla Supervisor here may actually have been unsafe when he had to take full control of the car and didn’t feel comfortable. And that’s not a technical problem, it’s a perception problem, and it’s not even the original poster’s fault entirely – there is a lot of encouragement from Tesla and the surrounding community to consider FSD to be far more capable than it actually is.

Driving is dangerous, and it’s good to feel that, sometimes! You should always be aware that when you’re driving, you’re in a metal-and-plastic, ton-and-a-half box hurtling down haphazardly maintained roads at a mile per minute. If that’s not a little scary to you, then you’re either a liar, a corpse, or one of those kids who started karting at four years old.
We all need to accept the reality of what driving is, and the inherent, wonderful madness behind it. I personally know myself well enough to realize how easily I can be lured into false senses of security by modern cars and start driving like a moron; to combat this, my preferred daily drivers are ridiculous, primitive machines incapable of hiding the fact that they’re just metal boxes with lots of sharp, poke-y bits that are whizzing along far too quickly. Which, in the case of my cars, can mean speeds of, oh, 45 mph.
The point is, everyone on the road should be able to capably drive, in pretty much any conditions, without the aid of some AI. Even when we have more advanced automated driving systems, this should still be the case, at least for vehicles capable of being driven by a human. But for right now, systems like FSD are not the safety net: the safety net is always us. We’re always responsible when we’re in the driver’s seat, and if we forget that, we could end up in far worse situations than just embarrassing ourselves online.
But that can happen too, of course.