this post was submitted on 14 Jan 2026
266 points (97.2% liked)


While Grok has introduced belated safeguards to prevent sexualised AI imagery, other tools have far fewer limits

“Since discovering Grok AI, regular porn doesn’t do it for me anymore, it just sounds absurd now,” one enthusiast for the Elon Musk-owned AI chatbot wrote on Reddit. Another agreed: “If I want a really specific person, yes.”

If those who have been horrified by the distribution of sexualised imagery on Grok hoped that last week’s belated safeguards could put the genie back in the bottle, there are many such posts on Reddit and elsewhere that tell a different story.

And while Grok has undoubtedly transformed public understanding of the power of artificial intelligence, it has also pointed to a much wider problem: the growing availability of tools, and means of distribution, that present regulators worldwide with what many view as an impossible task. Even as the UK announces that creating nonconsensual sexual and intimate images will soon be a criminal offence, experts say that the use of AI to harm women has only just begun.

top 50 comments
[–] LouNeko@lemmy.world 42 points 1 month ago (3 children)

Imagine not only being that horny, but also having such a peanut-sized brain that you voluntarily type all your fucked-up sexual fantasies into a website that makes all its money from data analytics and ads.

With regular porn sites you can at least use a VPN or Tor to keep some veil of privacy, but with Grok people have to make a Twitter account, verify their identity, and enter their credit card information.

People that use off-the-shelf AI like this for sexual stuff are literally retarded.

It's not cheap, either. In the long run, grabbing a used GPU from eBay and running your own model on it will not only keep your kinky shit private but also save you money.

On the other hand, I hope these fucks that generate nonconsensual real-person porn get exposed in the next big "oopsie-woopsie" data leak, because they were dumb enough to leave all their private info behind.

[–] Corkyskog@sh.itjust.works 12 points 1 month ago

Sounds like they aren't that horny if they need a very specific set of circumstances to get off.

[–] Smoogs@lemmy.world 12 points 1 month ago* (last edited 1 month ago)

I think this is less about horniness in general and more about revenge porn, centered on unreasonable anger and an inability to process rejection in a healthy way; that's the main problem here.

[–] peopleproblems@lemmy.world 2 points 1 month ago (1 children)

So it's interesting you say that, because I tried doing some faceswap stuff with Trump and Baron Harkonnen and it was extremely time-consuming, took forever to install, and the results were disappointing.

So there's another aspect too: these people have absolutely nothing else going on!

[–] AngryCommieKender@lemmy.world 1 points 1 month ago (1 children)

I would say that would be an insult to House Harkonnen, but then I considered that Drumph might actually be the ancestor of one of those evil fucks.

[–] peopleproblems@lemmy.world 2 points 1 month ago

Not the whole house, no. The house as a whole was pretty legit. But Baron Harkonnen (interesting that we only ever get his title) was a monster. And a pedophile. And grossly unhealthy. An ultimate backstabber. Traded intellect for cleverness. A power- and wealth-hungry, fuck-all-the-rest type.

They are almost indistinguishable, other than that Trump appears to be able to support his own weight.

[–] FerretyFever0@fedia.io 16 points 1 month ago (1 children)

Sorry, it "sounds absurd" to view pornography of consenting adults instead of violating the privacy of countless women (and children) who were minding their own fucking business?

[–] Deathray5@lemmynsfw.com 12 points 1 month ago (1 children)

Sex crimes are almost always about power and not about sexual gratification.

[–] arin@lemmy.world 6 points 1 month ago (2 children)

Photoshop has existed since the '90s, and so have scissors and glue. It's not AI harming women, it's the shitty retarded conservatives who don't know how to use Photoshop or have any creativity for scissors and glue, using new technology to be the same retarded clowns they were when they failed primary school.

[–] TheRealKuni@piefed.social 27 points 1 month ago (2 children)

Nonsense. AI makes the process trivial and the results far more realistic than the Photoshop or scissors-and-glue of yore.

Could you imagine finding out kids at your school were passing around extremely realistic nude pictures of you? Or having any argument you make be shut down by something producing a lurid picture of you? Even if it’s fake, that’s gotta do a number on people.

This is different.

[–] FishFace@piefed.social 1 points 1 month ago (2 children)

Why are people concerned about having a fake (but realistic) nude photo of themselves shared around?

Not because people are looking at their actual naked body, obviously, because they aren't. Rather, it's because of what the people sharing those images are thinking and feeling while doing so; it's because those people are sharing fake nudes as a way to sexually demean their victim. That aspect is wholly identical regardless of how exactly they are doing it. Sharing fake nudes should be treated the same regardless of the method: as sexual bullying. Maybe we didn't recognise how serious it was when it was rare and required effort, but we also shouldn't over-correct now.

[–] MountingSuspicion@reddthat.com 8 points 1 month ago

Also, AI output keeps getting harder to distinguish from actual images. If someone shares revenge porn but acts like it's AI, the victim should not have to prove it one way or the other. Currently, I think real or AI should be treated the same, but it's possible I'm overlooking some unintended consequences of that.

[–] arin@lemmy.world 1 points 1 month ago
[–] hector@lemmy.today 1 points 1 month ago (1 children)

It is different, but why are we only worried about women being victimized? Men are not fair game to abuse either, and are not responsible for the accumulated sins of men any more than a citizen is responsible for their government.

[–] TheRealKuni@piefed.social 0 points 1 month ago

I didn’t say anything about women or men. I never said men are to blame for the accumulated sins of men. I never said men were fair game to abuse. I’m not sure where you’re getting this nonsense.

[–] makyo@lemmy.world 9 points 1 month ago

I hear that argument a lot, but the old method required access to the software and some actual skill with it. With Grok, any smooth brain who can write at a fifth-grade level has the ability to publicly victimize the women and/or girls in their life.

[–] P1nkman@lemmy.world 3 points 1 month ago (1 children)

Can't we all just make a Shitter account and ask Grok to create unflattering images of Musk? He'd hate that.

[–] hector@lemmy.today 2 points 1 month ago

Musk's actual body is unflattering; he's a fat sloppy bitch. There's a shirtless pic of him from some boat floating around, and his tailors hide his fat. Probably wears a man girdle too.

[–] RememberTheApollo_@lemmy.world -1 points 1 month ago* (last edited 1 month ago) (1 children)

I’m highly ambivalent about the use of AI for porn. On one hand it’s unequivocally bad when used to make porn or images of unwilling and unknowing people, doubly so when shared publicly and/or used maliciously.

OTOH it allows the creation of whatever fetish or kink someone might have that might be difficult to deal with outside of fantasy, and it harms nobody so long as it remains contained.

I can't get into the psychology of it; I'm not smart enough, and I don't know if casual use of such imagery bleeds over into real-life problems. Obsessive use would obviously be a problem, but that would likely be a problem with or without AI.

[–] Rekorse@sh.itjust.works 1 points 1 month ago

It's not complicated: people want to have their cake and eat it too.

There's no way to practice abusive things in isolation that will not affect you around other people. It's a stupid thing to want to figure out in the first place, in my opinion.

[–] hector@lemmy.today -1 points 1 month ago

To hurt people, more like. As if men will not be victimized. But parts of society celebrate men getting hurt, blaming all men for the ones that hurt others.