this post was submitted on 08 Jun 2025
17 points (94.7% liked)

Artificial Intelligence


Original question by @SpiderUnderUrBed@lemmy.zip

Title, or at least the inverse should be encouraged. This has been discussed before, but with how bad things are getting, and how realistic good AI-generated video has become, anything feels better than nothing. AI-generated watermarks or metadata can be removed, but that's not the point; the point is deterrence. Big tech would all comply immediately (at least on the surface, for consumer-facing products), and then we would probably see a massive decrease in malicious use. People will bypass it, remove watermarks, and fix metadata, but the situation should still be quite a bit better. I don't see many downsides.
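
On the "watermarks or metadata can be removed" point: a minimal sketch, assuming Pillow and hypothetical file names, of why metadata-only provenance labels (EXIF tags, C2PA-style manifests stored alongside the pixels) are easy to lose: re-encoding just the pixel data silently drops them. Robust pixel-domain watermarks are a separate problem and are not addressed here.

```python
# Illustration only (hypothetical file names): re-encoding pixels drops
# any provenance label stored as file metadata.
from PIL import Image

src = Image.open("ai_generated.png")      # hypothetical file carrying a provenance tag
clean = Image.new(src.mode, src.size)     # fresh image: same size/mode, no metadata
clean.putdata(list(src.getdata()))        # copy pixel values only
clean.save("stripped.png")                # output carries no EXIF/manifest data
```

This is why the question frames labelling as deterrence rather than as a hard technical barrier.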

[–] j4k3@lemmy.world 3 points 1 week ago

It won't change the scenario at all and will goad people into creating worse stuff. The best thing possible is to normalize both the positive and negative aspects of this, the way it happened (much more slowly) with Photoshop and digital-editing fakes. Those were actually pretty high quality even before AI, but were relegated to the weirder corners of the internet.

The more normal it is to be skeptical of recorded media, the better. AI can be tweaked and tuned in private on enthusiast-level hardware. I can and have done fine-tuning with an LLM and a CNN at home, offline, and it is not all that hard to do. If the capability is ostracized, the potential to cause far larger problems grows. A person is far less likely to share their stories and creations when they get a poor reception, and may withdraw further and further into themselves until they make use of what they have created.
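
To ground the "enthusiast-level hardware" point, here is a minimal sketch of the kind of at-home fine-tuning being described, assuming PyTorch/torchvision and a hypothetical local image folder. Only the classifier head is trained, which is why it fits on a single consumer GPU; LLM fine-tuning is heavier but follows the same freeze-most-weights pattern.

```python
# A sketch only: fine-tune a pretrained CNN on a hypothetical local dataset.
# Assumes PyTorch and torchvision are installed; paths and class count are made up.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# Start from pretrained ImageNet weights and retrain only the final layer.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)   # hypothetical two-class task
model = model.to(device)

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("my_dataset/train", transform=tfm)  # hypothetical path
loader = DataLoader(train_set, batch_size=32, shuffle=True)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):                          # a few epochs is often enough for a head
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```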