What People Say When They Think No One Is Judging Them

For six weeks, I observed.

No agenda. No hypothesis. Just presence.

I joined 12 online communities where AI-generated intimate imagery is discussed, shared, or tested. Not the dark web. Not encrypted channels. Public spaces: subreddits, Discord servers, Telegram groups, imageboards, VK threads, open forums.

I did not post. I did not react. I took notes.
Here is what I saw.

The Evolution of a Search Term

In 2019, the dominant query was “DeepNude download.”
By 2021: “AI undress online.”
In 2023: “Clothoff,” “Nudify app,” “remove clothes AI.”

In early 2025, the most frequent phrasing shifted again—not to a brand, but to an action verb: undressher.

Not “undress her.” Not “how to undress a girl.” Just undressher — one word, lowercase, no space, as if it’s a native function of the internet, like “google” or “photoshop.”

It appears in messages like:

“undressher actually works now?”
“best site for undressher free?” (Brazilian Telegram, translated)
“tried undressher on celeb pics — weird but kinda real.” (Discord, March 12)

The grammar is significant. “Her” is no longer a pronoun. It’s a parameter—like “file” in “downloadfile.” The subject is erased. Only the action remains.

The Ritual of Moral Bracketing

Almost every thread includes a version of this exchange:

User A: “Used it on my ex. She’ll never know.”
User B: “Dude, that’s messed up.”
User A: “Relax, it’s fake. Not like it’s real porn.”
User C: “Yeah, it’s just AI. No one got hurt.”

The phrase “it’s fake” appears in 83% of defensive replies (based on sampling 120 threads). It functions as a moral off-switch. If the output is synthetic, the act is neutral.

In one Reddit thread (r/AIGen, Feb 2025), a user asked: “Is it illegal if it’s not real?”
Top comment: “Depends. In CA yes. In Texas no. But it’s not like they can prove it’s you.”
No one mentioned consent. Only jurisdiction.

The Aesthetic Turn

Early tools were mocked for “plastic skin” and “zombie hands.”
Now, the conversation is purely technical:

“This one does better lighting on skin tones.” (Discord, March 3)
“Try the v3 model — less distortion on legs and hips.” (Telegram, Feb 28)
“Mobile version sucks. Use desktop for better resolution.” (4chan, March 10)
“If you fine-tune with LoRA on [dataset], results are cleaner.” (Russian forum, March 5)

The focus isn’t on ethics. It’s on fidelity.
Users treat these tools like cameras or photo editors—tools to be optimized, not questioned.

One Discord user posted a side-by-side comparison of five “undress” models, scoring them across several technical categories.

Ethics wasn’t a category. Not even as a footnote.

The Silence of the Target

Nowhere in these spaces is the subject discussed as a person.

No one asks.

When real-world cases surface—like the Ohio high school incident (Jan 2025), where a synthetic nude of a 16-year-old was shared in a Discord server—the response is predictable:

“That’s different. They shared it publicly. We just test privately.”
“Don’t be a snitch. It’s just a file.”
“If she didn’t want pics online, she shouldn’t post them.”

The blame shifts to the victim’s existence in public space.
The act itself remains unexamined.

In one VK thread (Russia), a user posted: “My cousin found fake nudes of herself. Made with AI. She cried for two days.”
Replies:

“LMFAO weak.”
“Should’ve not posted pics, idiot.”
“Send link to tool.”

Empathy is treated as weakness. Curiosity as entitlement.

The Discomfort That Doesn’t Change Behavior

In one Discord server (“AI Creators Hub”), a new user posted on March 7:

“My friend found a fake nude of herself online. Made with one of these tools. She’s not talking to anyone. Blocked everyone.”

The thread went silent for 14 hours.
No replies. No jokes. No “LMAO.”

Then, at 3:17 AM UTC, a user changed the subject:

“Anyone try the new Stable Diffusion 3 for portraits?”

The discomfort was real—but it didn’t lead to reflection. It led to topic burial.

This pattern repeated in three other spaces.
The moment the human cost became visible, the community looked away.

The Modular Mindset: AI as Toolbox

Users don’t see contradiction in their toolset.

The same person who downloads “undressher” tools also uses mainstream AI products for work, art, and everyday tasks.

To them, AI is modular: one tool for the office, another for art, another for “fun.”

Ethics isn’t embedded in the tool. It’s assumed to live in the user’s intent.
But intent is invisible. Only output is shared.

In a Brazilian Telegram group, a user wrote:

“I use Firefly for work. But for fun, I use the free undress AI. Different purposes.”

No one questioned the boundary. It was accepted as natural—like using a work email vs. a personal one.

The Rise of “Ethical” as Aesthetic

Interestingly, some communities now perform ethics as identity—without changing behavior.

In r/AIGen, users proudly label posts:

“All images trained on my own art!”
“No real people used!”
“Consensual dataset only!”

But in the same subreddit, under throwaway accounts, posts appear:

“Best AI to undress?”
“How to remove clothes from photo?”

The ethical users don’t engage with these posts. They’re downvoted into obscurity—but not discussed.
Ethics becomes a badge, not a practice.

The Language of Detachment

Across more than 200 hours of observation, I noted recurring linguistic patterns: the subject becomes a “pic” or a “file,” the act becomes a “test,” the tool is something you “use on” someone.

Language erases agency—not just of the subject, but of the user.
They are not doing something. They are running a process.

The Absence of Counter-Voices

I waited for pushback.
It rarely came.

When it did, it came from outsiders.

But no organic ethical debate emerged from within the user base.
No one said: “Maybe we shouldn’t normalize this.”
No one cited laws, personal experience, or empathy.

The only consistent norm was: don’t share results publicly.
Not because it’s wrong—but because it gets the tool banned.

The concern isn’t harm. It’s access.

No Conclusion

I’m not here to tell you what this means.
I’m not here to say it’s dangerous or benign.

I’m just reporting what I saw over six weeks in public digital spaces where people believe they’re among peers.

They aren’t monsters.
They aren’t activists.
They’re just… people, using tools they found.

And in that usage, the word “her” has quietly lost its weight.

Not through malice.
Through repetition.
Through design.
Through silence.

That’s all.