u/Cute_Witness3405

r/claude

The unintuitive difficulty of using AI

As more mainstream (non-geeky) people try to use AI for work (and come ask me for help as the resident "expert"), I'm seeing patterns in the ways current products fail them. They simply don't get that:

  • Something that can converse pretty fluently doesn't learn.
  • For anything complex, you need to ask it to create a plan to do a task, rather than asking it just to do it.
  • Limited context has huge consequences; although it might do a brilliant job extracting and analyzing data from one PDF, asking it to do the same for 200 will fail spectacularly (even though computers are supposed to be good at highly repetitive things).
  • It can do something astonishingly smart and in the next session do incredibly stupid things.
  • Many people can instantly spot AI writing, and using it for most of your emails damages your credibility and relationships.
  • You need to pick the right model for what you want to do.

These are all counter-intuitive, and current AI products do almost nothing in the core product experience to keep you from blindly stumbling into them. Unlike most "power user" tools that punch you in the face with their complexity right up front, AI looks like the easiest thing in the world.

This is a massive product design failure on the part of the AI companies, and I suspect it is going to burn them hard as more mainstream users try it and get disillusioned.

I'm not complaining that the models have these limitations; that's a really hard problem to solve. I'm complaining that you could build onboarding experiences that help people understand there are skills they need to develop, and help them develop those skills — and nobody has.

Why not have a "training wheels" mode for new accounts, where a model watches what you're doing and acts as a teacher / mentor? Give it memory so it doesn't become a broken record and can get more personalized. Or if that's too hard, even a dumb quick-start module that signals there are skills to be learned for effective use.

There's already such bad vibes around AI outside the enthusiast bubble: threat of permanent job loss, noisy datacenters, high electric bills, artistic theft, the fact that some of the worst people in the world are the ones controlling and hyping it... Getting burned when you make a genuine effort to use it just confirms all of that negativity, and you could end up losing mainstream customers in a way that lasts well past the current compute crunch.

This just seems like such an avoidable self-own.
