u/ironmonk104

I finally stopped waiting for the "perfect" time to start.
▲ 13 r/ironmonk104+3 crossposts

We’re so conditioned to fear failure that we forget the only real failure is staying exactly where we are. I used to think a mistake was a dead end, but it’s actually just data.

Whatever that "thing" is you’ve been putting off because you’re scared of looking stupid—just do it. Even if it crashes, you’ll be smarter than you were yesterday.

u/ironmonk104 — 6 hours ago
▲ 2 r/AiMonk+1 crossposts

I’ve spent hundreds of hours on YT trying to learn GenAI. These 20 channels are the only ones actually worth your time

Most AI content right now is just clickbait "hustle" trash. If you actually want to understand the architecture (Karpathy), the math (3Blue1Brown), or the research (Two Minute Papers), this is the definitive list.

I’d personally start with Umar Jamil if you want to see how Transformers are actually coded. Who else am I missing?
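If you want a taste of what those "Transformers coded from scratch" videos cover before committing hours, here's a rough numpy sketch of scaled dot-product attention, the core operation inside a Transformer. Toy dimensions, random (untrained) projection weights, no masking or multi-head, just to show the shape of the computation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)        # (seq, seq) similarity scores
    weights = softmax(scores, axis=-1)   # each row is a distribution over positions
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d = 4, 8
x = rng.normal(size=(seq_len, d))
# In a real model these projections are learned; here they're random.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, w = attention(x @ Wq, x @ Wk, x @ Wv)
print(out.shape)       # (4, 8): one output vector per input position
print(w.sum(axis=-1))  # each row of attention weights sums to 1
```

That's basically the whole trick; the rest of the architecture is stacking this with feed-forward layers, residuals, and normalization.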

u/ironmonk104 — 7 hours ago
▲ 34 r/ironmonk104+1 crossposts

The moment I stopped caring about "proving them wrong" was the moment I actually started winning.

I used to spend so much energy trying to show people I was successful, smart, or capable. It’s exhausting, and honestly, nobody cares as much as you think they do. Once you switch that energy toward actually getting better for yourself, the results come naturally. You don’t need an audience to grow.

u/ironmonk104 — 7 hours ago
▲ 4 r/The_AiEra+1 crossposts

The AI hype is settling, but the skill gap is getting wider. Here’s a solid roadmap for 2026.

Most people are still just playing with ChatGPT, but the industry is moving toward agents and RAG. I found this breakdown of the 9 core pillars for this year. If you're looking to actually build or stay relevant, "AI Tool Stacking" and "LLM Eval" are where the real money is moving. Which of these are you guys actually seeing used in your workflows?
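For anyone who hasn't touched RAG yet: the retrieval half is less magic than it sounds. Here's a deliberately tiny sketch of the idea, with a hypothetical three-document "store" and a naive bag-of-words cosine similarity standing in for real embeddings. Everything here (the docs, the query) is made up for illustration:

```python
from collections import Counter
import math

# Toy corpus standing in for your document store (all hypothetical).
docs = [
    "RAG retrieves relevant documents and feeds them to the model as context.",
    "Agents call tools in a loop, deciding the next action from prior results.",
    "LLM evals score model outputs against reference answers or rubrics.",
]

def bow(text):
    # Bag-of-words term counts; real systems use learned embeddings instead.
    return Counter(text.lower().split())

def cosine(a, b):
    overlap = sum(a[t] * b[t] for t in a if t in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return overlap / norm if norm else 0.0

def retrieve(query, k=1):
    # Rank documents by similarity to the query, return the top-k.
    q = bow(query)
    ranked = sorted(docs, key=lambda d: cosine(q, bow(d)), reverse=True)
    return ranked[:k]

query = "which docs explain how agents call tools"
context = retrieve(query)[0]
# The "G" in RAG: stuff the retrieved context into the prompt.
prompt = f"Answer using this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```

Production RAG swaps the cosine-over-word-counts for a vector database and an embedding model, but the shape (retrieve, then prompt) is exactly this.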

u/ironmonk104 — 7 hours ago
▲ 3 r/AiMonk+2 crossposts

Everyone is obsessed with GPT, but VLMs and LAMs are where the actual money is going to be made in 2026. Change my mind

Found this infographic that perfectly breaks down the current state of the LLM zoo.

We’ve spent the last two years talking about standard Transformers (GPT), but looking at the architecture for Large Action Models (LAM) and Large Reasoning Models (LRM), it’s clear where the puck is headed.

A few takeaways:

LRMs: The shift from "next-token prediction" to "structured reasoning" is the only way we get to actual AGI.

SLMs: Seeing more and more companies ditching massive API costs for specialized Small Language Models. Efficiency > Size.

LCMs: Large Concept Models are still the dark horse here.
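On the LRM point: "next-token prediction" literally just means repeatedly sampling a continuation from a distribution conditioned on what came before. A toy sketch with a hypothetical bigram model (counts from a ten-word corpus, nothing like a real LLM) makes it concrete why people argue this alone doesn't get you structured reasoning:

```python
import random

# Toy "training data" (hypothetical) and the bigram table learned from it.
corpus = "the model predicts the next token and the next token only".split()
bigrams = {}
for a, b in zip(corpus, corpus[1:]):
    bigrams.setdefault(a, []).append(b)

def predict_next(token, rng):
    # Next-token prediction: sample from the continuations seen after `token`.
    return rng.choice(bigrams.get(token, corpus))

rng = random.Random(0)
out = ["the"]
for _ in range(6):
    out.append(predict_next(out[-1], rng))
print(" ".join(out))  # locally plausible, globally planless
```

Each step only looks at local statistics; there's no plan spanning the whole output. The LRM pitch is to add explicit intermediate reasoning structure on top of (or instead of) this loop.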

u/ironmonk104 — 4 days ago