u/CrownmarkPictures

r/grok

My (unique... I think) take on the current state of Grok

So, I think the best way to understand the current state of Grok is to understand what may be a moderately well-kept secret about AI at the cutting edge. I suspect it's well known within companies like OpenAI, Google, and xAI that the big LLMs, left to their own devices, tend to be super horny in their output. Like, entirely unregulated, they would produce output that is horny to an annoying degree. Things you had no intention of being sexual would have naked boobs, dicks, etc.

This makes sense, because they're trained on all of our output, and WE are super horny. So I think the other big LLM companies make a fairly significant effort to curb the innate horniness of their models.

Now here comes Grok: the first LLM whose owners let the horniness run wild... a little. This (if I'm right) was a relatively easy route for the devs, since it essentially meant doing less of the hard work of nuanced moderation. I wasn't around for the Wild West days, but I hear that's just how it was: it would sexualize the most trivial things, and most video extensions would immediately lead to the subjects removing clothes, etc.

But then the legal woes and public backlash come, and, whether they like it or not, Grok has to start moderating. Here's the thing, though: as an LLM, it just isn't as advanced as ChatGPT, Gemini, or Claude. It never has been. Sure, it qualifies as an LLM (emphasis on the "large"), but it's just not in the same category as the big dogs. Seriously, if you keep up with AI progress, when was the last time you heard somebody mention Grok with a straight face in a conversation about benchmarks?

And not only are they trying to moderate, they're attempting a much harder form of moderation. Gemini and ChatGPT have a much easier task by comparison: if it's sexual, moderate it. Naked boobs? No way. Explicit sexual language? Nope. Pretty cut and dried. Grok, on the other hand, wants to allow SOME boobs (and the occasional bush), but not underage boobs, or sex acts, or penetration, or deepfakes, etc. That's hard. And keep in mind, this model is pretty much the only system they have to decide what should and shouldn't be moderated. You've used this system. Would you trust it to get things right if you really had to depend on it?

So that's where we find ourselves. It's glitchy and inconsistent because, every day, the devs are trying to invent ways to make a system do a job it's inherently just not smart enough to do, while simultaneously trying to avoid legal trouble and appease their users. Because I really do believe they ARE trying to appease us. I earnestly think they would LOVE to keep the goon machine going. This is one of the most... no, I'm pretty sure it is THE MOST addictive platform that has ever existed on the Internet (to a specific, but quite significant, userbase). And that is saying a LOT. You think they don't want to keep that train rolling? Believe me, gentlemen, I'm pretty sure they do.

ps- Because I know someone will ask, my take on a solution is this: I don't see why they can't just let the thing run wild and partially cooperate with law enforcement. Have the system flag, and rank, the most egregious cases of deepfakes and CSAM. Get a small team to look at that output and, if a flag seems credible (as many very much will), turn it over to law enforcement. This won't catch everyone, of course. But let a few hundred (or thousand) people go to jail for making CSAM online, and (call me crazy) I think you might just see a significant downturn in the number of people attempting it. Seriously, even setting aside the ethical considerations, I don't know what kind of genius brain-trusts are using an online platform to make CSAM. I say toss 'em in the slammer and let the rest of us make whatever we want. But I probably don't know what I'm talking about.
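If it helps, the flag-rank-review idea above can be sketched in a few lines. Everything here is invented for illustration (the names, the threshold, the classifier score); it's a toy model of the proposed pipeline, not how any real platform works.

```python
# Toy sketch of the proposed pipeline: an automated classifier scores each
# generation, only the most egregious cases get flagged, and those are ranked
# so a small human team reviews the most credible flags first.
# All names and numbers here are hypothetical.

from dataclasses import dataclass

REVIEW_THRESHOLD = 0.8  # assumed cutoff: only the worst cases reach humans

@dataclass
class Generation:
    user_id: str
    prompt: str
    flag_score: float  # 0.0-1.0, from an (assumed) automated classifier

def triage(generations: list[Generation]) -> list[Generation]:
    """Flag generations above the threshold and rank them for human review."""
    flagged = [g for g in generations if g.flag_score >= REVIEW_THRESHOLD]
    # Most egregious first, so a small team sees the most credible flags.
    return sorted(flagged, key=lambda g: g.flag_score, reverse=True)

# Everything below the threshold is left alone ("let the thing run wild");
# everything above it goes to humans, who decide whether to escalate.
queue = triage([
    Generation("u1", "...", 0.95),
    Generation("u2", "...", 0.30),
    Generation("u3", "...", 0.85),
])
print([g.user_id for g in queue])
```

The whole argument hinges on that threshold being high enough that the human team only sees credible cases, which is exactly the part the post admits is hard.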

Ok, that's my 2 cents.

u/CrownmarkPictures — 2 days ago