401 points · submitted 11 months ago by L4s@lemmy.world to c/technology@lemmy.world

‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

[-] Crow@lemmy.world 119 points 11 months ago

I remember being a dumb & horny kid and Photoshopping my crush’s face onto a porn photo. And even then I felt what I did was wrong and never did it again.

[-] stebo02@sopuli.xyz 65 points 11 months ago

Post nut clarity can be truly eye opening

[-] agitatedpotato@lemmy.world 12 points 11 months ago

or closing, depending on where you get it

[-] CleoTheWizard@lemmy.world 21 points 11 months ago

I feel like what you did and the reaction you had to what you did is common. And yet, I don’t feel like it’s harmful unless other people see it. But this conversation is about to leave men’s heads and end up in public discourse where I have no doubt it will create moral or ethical panic.

A lot of the technology concerns around AI are old concerns about things we’ve had access to for decades. It’s just easier to do this stuff now. I think it’s kind of pointless to try to stop or prevent this stuff from happening. We should mostly focus on the harms and how to prevent them.

[-] azertyfun@sh.itjust.works 15 points 11 months ago

I've seen ads for these apps on porn websites. That ain't right.

Any moron can buy a match and a gallon of gasoline, freely and legally, and that's a good thing. But I would hope that anyone advertising and/or selling Arson Kits™ online would be jailed. Of course this will not stop determined arsonists, but repression might deter morons, inventive psychopaths, and overly impulsive people (especially teenagers!) from actually going through with a criminal act. Not all of them. But some/most of them. And that's already a huge win.

[-] KairuByte@lemmy.dbzer0.com 9 points 11 months ago* (last edited 11 months ago)

I mean, you’ve been able to do a cursory search and get dozens of “celeb lookalike” porn for many years now. “Scarjo goes bareback” isn’t hard to find, but that ain’t Scarjo in the video. How is this different?

Edit: To be clear, it’s scummy as all fuck, but still.

[-] shuzuko@midwest.social 14 points 11 months ago

This is different because, to a certain extent, people in the public eye can expect, anticipate, and react to or suppress this kind of thing. They have managers and PR people who can help them handle it in a way that doesn't negatively affect them. Billy's 13-year-old classmate Stacy doesn't have those resources, and now he can do the same thing to her. It's on a very different level of harm.

this post was submitted on 08 Dec 2023
401 points (93.3% liked)
