r/AiMonk

I’ve spent hundreds of hours on YT trying to learn GenAI. These 20 channels are the only ones actually worth your time

Most AI content right now is just clickbait "hustle" trash. If you actually want to understand the architecture (Karpathy), the math (3Blue1Brown), or the research (Two Minute Papers), this is the definitive list.

I’d personally start with Umar Jamil if you want to see how Transformers are actually coded. Who else am I missing?

u/ironmonk104 — 9 hours ago

Everyone is obsessed with GPT, but VLMs and LAMs are where the actual money is going to be made in 2026. Change my mind

Found this infographic that perfectly breaks down the current state of the LLM zoo.

We’ve spent the last two years talking about standard Transformers (GPT), but looking at the architectures for Large Action Models (LAMs) and Large Reasoning Models (LRMs), it’s clear where the puck is headed.

A few takeaways:

LRMs: The shift from "next-token prediction" to "structured reasoning" is the only way we get to actual AGI.

SLMs: Seeing more and more companies ditching massive API costs for specialized Small Language Models. Efficiency > Size.

LCMs: Large Concept Models are still the dark horse here.
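For anyone unclear on the "next-token prediction" baseline that LRMs are supposedly moving past: it just means picking the most likely continuation one token at a time, with no lookahead. A minimal sketch (the bigram counts below are made up for illustration, not from any real model):

```python
# Toy "next-token prediction": a hand-built bigram table where the model
# greedily picks the single most frequent continuation of the current
# token -- no planning, no reasoning, just one step at a time.

BIGRAM_COUNTS = {
    "the": {"cat": 3, "dog": 2},
    "cat": {"sat": 4, "ran": 1},
    "sat": {"down": 5},
}

def next_token(token):
    """Greedily return the most frequent continuation of `token`, or None."""
    followers = BIGRAM_COUNTS.get(token)
    if not followers:
        return None
    return max(followers, key=followers.get)

def generate(start, max_len=5):
    """Roll the model forward one token at a time until it runs dry."""
    tokens = [start]
    while len(tokens) < max_len:
        nxt = next_token(tokens[-1])
        if nxt is None:
            break
        tokens.append(nxt)
    return tokens

print(generate("the"))  # -> ['the', 'cat', 'sat', 'down']
```

The point of the LRM pitch is that real reasoning needs more than this greedy local step (search, backtracking, intermediate scratch work), which is exactly what this sketch can't do.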

u/ironmonk104 — 4 days ago