When I first posted about this, I was reacting in real time. I hadn't yet had the chance to go through everything carefully—I just knew something didn't add up and wanted to document that it was happening.
Now that I’ve reviewed the situation in detail, I want to provide a more complete update.
To recap: one of my bots was flagged and had its image removed. I reached out to appeal that decision, pointing out that nearly identical images—same pose, same level of suggestiveness—are currently all over the platform featuring women, and remain untouched. My concern was straightforward: if that content is allowed, then male versions of it should be treated equally.
Shortly after sending that message, within roughly a 15-minute window around 4 AM, about 30 of my bots were removed and my account was hit with a temporary ban.
After going back through those 30 bots individually, there’s no consistent basis for their removal. The images vary widely—some are mildly suggestive (e.g., a guy in a thong), some are barely suggestive or not suggestive at all (fully clothed characters, abstract or humorous concepts). At the same time, similar or more explicit content remains live both on my page and across the platform.
That timeline is important. Fifteen minutes is not enough time to meaningfully review 30 bots in a careful, case-by-case way. And if this had been the result of a prior audit, the outcome doesn’t reflect that—because comparable content was clearly left up.
Because of that, "a rushed decision" doesn't fully explain what happened. What this looks like is a set of removals made to justify the ban that followed. The sequence matters: the bots were removed first, and the ban came immediately after. But those removals don't hold up under scrutiny—some of the content does not violate the platform's guidelines at all, point blank.
So either this was triggered externally (for example, through reporting) and handled without proper review, or it was an internal decision applied selectively. But in the latter case, the lack of consistency suggests it wasn’t a neutral audit of my page—it was a targeted action that produced just enough removals to support a penalty.
In other words, the issue isn't just inconsistency—it's that enforcement appears to have been biased against me as a creator, with the justification worked out backward after the fact.
For transparency, I've made all of the removed images available in my Discord so people can review them directly and form their own conclusions. A link is posted on all of my bots.
At this point, I'm less interested in debating a single image and more concerned with the broader pattern: inconsistent moderation, unclear standards, and enforcement that simply isn't applied evenly or accountably.
My page will be back in about a week. When it is, I’ll be restoring content in a way that aligns as clearly as possible with the platform’s guidelines—however loosely those guidelines may be defined.
Georgir12648