[-] rustyfish@lemmy.world 77 points 1 month ago

“Many girls were completely terrified and had tremendous anxiety attacks because they were suffering this in silence,” she told Reuters at the time. “They felt bad and were afraid to tell and be blamed for it.”

WTF?!

[-] sigmaklimgrindset@sopuli.xyz 55 points 1 month ago

Spain is a pretty Catholic country, and even if religious attendance is dropping off, the ingrained beliefs can still remain. Madonna/Whore dichotomy still is very prevalent in certain parts of society there.

[-] cyberpunk007@lemmy.ca 43 points 1 month ago

Psychology 101.

[-] todd_bonzalez@lemm.ee 36 points 1 month ago

Welcome to Christianity.

If a man sexually exploits a woman, it's the woman's fault for leading him astray.

This is how women are treated in deeply Christian communities.

Those women fear stepping forward to report assault or abuse because there are many in their community that will condemn them for it.

[-] CybranM@feddit.nu 18 points 1 month ago

Welcome to religion at large

[-] SnokenKeekaGuard@lemmy.dbzer0.com 45 points 1 month ago

I read the headline and said oh come on. One paragraph in and that turned to what in the absolute fuck.

[-] Zak@lemmy.world 118 points 1 month ago

Are you surprised by teenage boys making fake nudes of girls in their school? I'm surprised by how few of these cases have made the news.

I don't think there's any way to put this cat back in the bag. We should probably work on teaching boys not to be horrible.

[-] cyberpunk007@lemmy.ca 36 points 1 month ago

I'm not sure you can teach boys not to be horny teenagers 😜

[-] DarkThoughts@fedia.io 62 points 1 month ago

Being horny is one thing, sharing this stuff another. If whoever did the fake would've kept it to themselves, then nobody would've even known. The headline still is ass and typical "AI" hysteria though.

[-] Zak@lemmy.world 61 points 1 month ago

Having been a teenage boy myself, I wouldn't dream of trying.

But I knew it wasn't OK to climb a tree with binoculars to try to catch a glimpse of the girl next door changing clothes, and I knew it wasn't OK to touch people without their consent. I knew people who did things like that were peeping toms and rapists. I believed peeping toms and rapists would be socially ostracized and legally punished more harshly than they often are in reality.

Making and sharing deepfakes of real people without their consent belongs on the same spectrum.

[-] MagicShel@programming.dev 20 points 1 month ago

We do eventually grow up at least

... into horny men

... but hopefully with a little more empathy and propriety.

[-] pennomi@lemmy.world 30 points 1 month ago

There are always two paths to take - take away all of humanity’s tools or aggressively police people who abuse them. No matter the tool (AI, computers, guns, cars, hydraulic presses) there will be somebody who abuses it, and for society to function properly we have to do something about the delinquent minority of society.

[-] ThePantser@lemmy.world 16 points 1 month ago

No matter the tool (AI, computers, guns, cars, hydraulic presses) there will be somebody who abuses it,

Hydraulic press channel guy offended you somehow? I'm missing something here.

[-] pennomi@lemmy.world 21 points 1 month ago

No, just an example. But if you’ve ever noticed the giant list of safety warnings on industrial machinery, you should know that every single one of those rules was written in blood.

[-] Emperor@feddit.uk 13 points 1 month ago

Sometimes other bodily fluids.

[-] devfuuu@lemmy.world 6 points 1 month ago

The machines need to be oiled somehow.

[-] superminerJG@lemmy.world 1 points 1 month ago

🤨 vine boom

[-] 0x0@programming.dev 3 points 1 month ago

Either Darwin awards or assholes, most likely. Those warnings are written due to fear of lawsuit.

[-] hendrik@palaver.p3x.de 1 points 4 weeks ago* (last edited 4 weeks ago)

However, this tool doesn't have any safety warnings written on it. The app they used specifically caters to use cases like this. It's advertised for immoral use, and technology to estimate age from pictures has existed for something like 10 years, yet they deliberately chose to let their tool generate pictures of girls around 13 years old. In the tool analogy, that's like knowingly selling a jigsaw that misses well-established safety standards and is likely to injure someone. And it's debatable whether it was made to cut wood at all, or just to injure people.
The rest fits too: no company address, located in some country where they can't be prosecuted... They're well aware of their app's use case.

[-] Ookami38@sh.itjust.works 8 points 1 month ago

I don't think they're offended. I think they're saying that a tool is a tool. A gun or AI is only dangerous if misused, like a hydraulic press.

We can't go around removing the tools because some people will abuse them. Any tool can kill someone.

[-] catloaf@lemm.ee 6 points 1 month ago

We could also do a better job of teaching people from childhood not to be assholes.

[-] afraid_of_zombies@lemmy.world 2 points 1 month ago

Guns do not belong in the list. Guns are weapons, not tools. Don't bother posting some random edge case that accounts for approximately 0.000001% of use. This is a basic category error.

Governments should make rules banning and/or regulating weapons.

[-] pennomi@lemmy.world 11 points 1 month ago

Weapons are tools, by strict definition, and there are legitimate uses for them. Besides, my point was that they should be regulated. In fact, because they are less generally useful than constructive tools, they should be regulated far MORE strictly.

[-] IllNess@infosec.pub 39 points 1 month ago

They are releasing stories like this to promote the new law that requires adults to log in to porn sites and limits their use of them.

[-] autotldr@lemmings.world 16 points 1 month ago

This is the best summary I could come up with:


A court in south-west Spain has sentenced 15 schoolchildren to a year’s probation for creating and spreading AI-generated images of their female peers in a case that prompted a debate on the harmful and abusive uses of deepfake technology.

Police began investigating the matter last year after parents in the Extremaduran town of Almendralejo reported that faked naked pictures of their daughters were being circulated on WhatsApp groups.

Each of the defendants was handed a year’s probation and ordered to attend classes on gender and equality awareness, and on the “responsible use of technology”.

Under Spanish law minors under 14 cannot be charged but their cases are sent to child protection services, which can force them to take part in rehabilitation courses.

In an interview with the Guardian five months ago, the mother of one of the victims recalled her shock and disbelief when her daughter showed her one of the images.

“Beyond this particular trial, these facts should make us reflect on the need to educate people about equality between men and women,” the association told the online newspaper ElDiario.es.


The original article contains 431 words, the summary contains 181 words. Saved 58%. I'm a bot and I'm open source!

[-] Zeratul@lemmus.org 3 points 1 month ago

What does this have to do with the equality of men and women? Girls are more at risk of this kind of abuse? That's a good point, but it's not brought up here. This comment is trying to make something political that simply isn't. Not that gender equality should be political in the first place.

[-] Sensitivezombie@lemmy.zip 6 points 1 month ago

Why not also go after these software companies for allowing such images to be generated in the first place, i.e. allowing AI-generated nude images to be created from uploaded photos of real people?

[-] Duamerthrax@lemmy.world 4 points 1 month ago

How? How would you make an algorithm that correctly identifies what nude bodies look like? Tumblr couldn't differentiate between nudes and sand dunes back when they enforced their new policies.

[-] KairuByte@lemmy.dbzer0.com 2 points 1 month ago

This sounds great, but it’s one of those things that is infinitely easier to say than do. You’re essentially asking for one of two things: Manual human intervention for every single image uploaded, or “the perfect image recognition system.” And honestly, the first is fraught with its own issues, and the second does not exist.

[-] autonomoususer@lemmy.world 3 points 1 month ago

Next they'll ban E2EE and libre software.

this post was submitted on 11 Jul 2024
233 points (96.8% liked)