u/Gorio1961
I am a disabled veteran with severe chemotherapy-induced peripheral neuropathy in my feet.
The neurologist I’ve been seeing for the past three years at the VA suggested I could be eligible for Walkasins to assist with overall mobility, stability, and fall prevention as I get older.
I have not heard anything or read anything about Walkasins. I’m looking for opinions from this group.
In Southern California my download just began. I find it interesting that this download began and is continuing while in low-power mode … not connected to Wi-Fi.
Yesterday I posted about my Tesla’s FSD doing something pretty sketchy — it got into a left-turn-only lane and then made a right turn against the directional arrow.
Clear markings. No obstructions. No ambiguity. Just… wrong.
So I went back last night to see if it was a fluke.
It wasn’t.
FSD repeated the exact same mistake — same lane choice, same committed right turn.
But here’s where it gets interesting:
Right in front of me was a human driver who did the exact same thing.
Same incorrect lane. Same illegal turn.
At that point, it stopped feeling like just a “Tesla problem.”
Now I’m wondering:
If a human and FSD both independently make the same bad decision at the same exact spot… what’s actually failing here?
Is FSD misreading the environment?
Or is the road design itself setting up both humans and AI to fail?
Because if this is reproducible (and it clearly is), then this isn’t just a quirky edge case — it’s a systemic issue, either in how Tesla interprets lane intent or how that intersection communicates it.
Either way, it’s not great.
And before anyone says “well you should’ve intervened” — yeah, obviously. That’s not the point.
The point is:
FSD didn’t hesitate. It committed. And apparently… so do humans.
Curious how many of these “FSD mistakes” are actually just mirrors of common human errors we don’t think twice about.
Dad’s been in an ALF for the past month. Nurses and the staff tell us he’s adjusting well.
He has early-onset dementia and is experiencing sundowning. He loves to play card games with his fellow ALF’rs.
His daily routine involves memory care and three meals a day.
As soon as the lights in the facility go down for the evening, he starts dialing random contacts on his cell phone. It’s not a problem when he reaches out to me or my brother, but he does it every 10 or 15 minutes, never remembering that he just called.
His cell phone is his link to the real world and has been great since I live out of state. It’s excellent to be able to talk to him daily.
It’s the evening calls that are out of control. Any suggestions?
Had a pretty concerning moment this morning exiting the freeway onto local streets.
The off-ramp splits into two clearly marked lanes — left lane must turn left, right lane must turn right. Pretty standard setup.
With Tesla Full Self-Driving engaged, the car moved into the left-turn-only lane, signaled… and then proceeded to make a right turn anyway, going directly against the directional arrow.
It wasn’t a hesitation or correction — it fully committed to the maneuver.
Luckily traffic was light, but in normal conditions this could’ve created a real conflict with vehicles correctly using the right-turn lane.
This one stood out more than usual because:
Lane markings were clear
The lane choice and the turn decision contradicted each other
There were no obstructions forcing a last-second change
I’m sharing this as another reminder that while FSD is impressive, it’s still making some pretty fundamental interpretation errors in real-world scenarios.
This morning I was driving westbound on the 91 in Orange County at normal highway speed with Tesla Full Self-Driving engaged.
Out of nowhere, the car autonomously merged across a double white line into the carpool lane.
I verified the “Use HOV Lane” selection was in the OFF position.
It did use the left turn signal, but there was very little lead time — not enough that I would have made that move myself.
There were no cars immediately next to me, and a motorcycle had just passed.
My impression is that if there had been a vehicle there, FSD probably wouldn’t have completed the maneuver.
Still — crossing a double white line like that is illegal ($490 fine) and definitely caught me off guard.
I’m posting this more as a heads-up than a complaint. I think FSD is impressive overall, but this was a reminder that it still needs constant supervision.