this post was submitted on 23 Mar 2025
1241 points (98.2% liked)

Technology

[–] umbrella@lemmy.ml 1 points 6 days ago

AI excels at some specific tasks. The chatbots they push on us are a gimmick rn.

[–] Heliumfart@sh.itjust.works 1 points 6 days ago

Reminds me of "biotech is Godzilla". Sepultura version of course

[–] aramis87@fedia.io 148 points 1 week ago (20 children)

The biggest problem with AI is that they're illegally harvesting everything they can possibly get their hands on to feed it, they're forcing it into places where people have explicitly said they don't want it, and they're sucking up massive amounts of energy and water to create it, undoing everyone else's progress in reducing energy use and raising prices for everyone else at the same time.

Oh, and it also hallucinates.

[–] BlameTheAntifa@lemmy.world 1 points 2 days ago* (last edited 2 days ago)

In a Venn Diagram, I think your “illegally harvesting” complaint is a circle fully inside the “owned by the same few people” circle. AI could have been an open, community-driven endeavor, but now it’s just mega-rich corporations stealing from everyone else. I guess that’s true of literally everything, not just AI, but you get my point.

[–] pennomi@lemmy.world 28 points 1 week ago (3 children)

Eh I’m fine with the illegal harvesting of data. It forces the courts to revisit the question of what copyright really is and hopefully erodes the stranglehold that copyright has on modern society.

Let the companies fight each other over whether it’s okay to pirate every video on YouTube. I’m waiting.

[–] catloaf@lemm.ee 72 points 1 week ago (1 children)

So far, the result seems to be "it's okay when they do it"

[–] Electricblush@lemmy.world 33 points 1 week ago* (last edited 1 week ago) (1 children)

I would agree with you if the same companies challenging copyright (which protects the intellectual and creative work of "normies") were not also aggressively wielding copyright against the very people they are stealing from.

With the amount of corporate power tightly integrated with governmental bodies in the US (and now with DOGE dismantling oversight), I fear that whatever comes out of this is that humans own nothing and corporations own everything. The death of free, independent thought and creativity.

Everything you do, say and create is instantly marketable, sellable by the major corporations and you get nothing in return.

The world needs something a lot more drastic than a copyright reform at this point.

[–] naught@sh.itjust.works 12 points 1 week ago (3 children)

AI scrapers illegally harvesting data are destroying smaller and open source projects. Copyright law is not the only victim

https://thelibre.news/foss-infrastructure-is-under-attack-by-ai-companies/

[–] wewbull@feddit.uk 13 points 1 week ago

Oh, and it also hallucinates.

Oh, and people believe the hallucinations.

[–] riskable@programming.dev 11 points 1 week ago* (last edited 1 week ago) (10 children)

They're not illegally harvesting anything. Copyright law is all about distribution. As much as everyone loves to think that copying something without permission is breaking the law, the truth is that it isn't. It's only when you distribute said copy that you're breaking the law (aka violating copyright).

All those old school notices (e.g. "FBI Warning") are 100% bullshit. Same for the warning the NFL spits out before games. You absolutely can record it! You just can't share it (or show it to more than a handful of people but that's a different set of laws regarding broadcasting).

I download AI (image generation) models all the time. They range in size from 2GB to 12GB. You cannot fit the petabytes of data they used to train the model into that space. No compression algorithm is that good.

The same is true for LLMs, RVC (audio models), and similar models/checkpoints. I mean, think about it: if AI models were illegally distributing millions of copyrighted works to end users, they'd have to be including it all in those files somehow.

Instead of thinking of an AI model like a collection of copyrighted works think of it more like a rough sketch of a mashup of copyrighted works. Like if you asked a person to make a Godzilla-themed My Little Pony and what you got was that person's interpretation of what Godzilla combined with MLP would look like. Every artist would draw it differently. Every author would describe it differently. Every voice actor would voice it differently.

Those differences are the equivalent of the random seed provided to AI models. If you throw something at a random number generator enough times you could--in theory--get the works of Shakespeare. Especially if you ask it to write something just like Shakespeare. However, that doesn't mean the AI model literally copied his works. It's just making its best guess (it's literally guessing! That's how they work!).
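The size argument above is easy to sanity-check with back-of-envelope arithmetic. A sketch, assuming an illustrative ~10 PB training corpus (the exact corpus size is not public):

```python
# Could a 12 GB checkpoint be a literal copy of its training set?
training_bytes = 10 * 1024**5  # assumed ~10 PB corpus (illustrative figure)
model_bytes = 12 * 1024**3     # a large 12 GB image-generation checkpoint

# The lossless compression ratio that "the model contains the data" would require:
ratio = training_bytes / model_bytes
print(f"required compression ratio: ~{ratio:,.0f}:1")

# General-purpose compressors typically manage single- or low-double-digit
# ratios on mixed data, so the weights cannot be a verbatim archive.
```

Even shaving the corpus estimate down by a couple of orders of magnitude leaves a ratio far beyond anything lossless compression can do.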

[–] natecox@programming.dev 10 points 1 week ago (3 children)

The problem with being like… super pedantic about definitions, is that you often miss the forest for the trees.

Illegal or not, it seems pretty obvious to me that people saying "illegal" in this thread and others probably mean "unethical"… which is pretty clearly true.

[–] kibiz0r@midwest.social 34 points 1 week ago (4 children)

Idk if it’s the biggest problem, but it’s probably top three.

Other problems could include:

  • Power usage
  • Adding noise to our communication channels
  • AGI fears if you buy that (I don’t personally)
[–] pennomi@lemmy.world 18 points 1 week ago (1 children)

Dead Internet theory has never been a bigger threat. I believe that’s the number one danger - endless quantities of advertising and spam shoved down our throats from every possible direction.

[–] ElPussyKangaroo@lemmy.world 23 points 1 week ago

Truer words have never been said.

[–] MyOpinion@lemm.ee 20 points 1 week ago (6 children)

The problem with AI is that it pirates everyone's work and then repackages it as its own, enriching people who did not create the copyrighted work.

[–] lobut@lemmy.ca 23 points 1 week ago

I mean, it's our work; the result should belong to the people.

[–] DarkCloud@lemmy.world 20 points 1 week ago* (last edited 1 week ago) (7 children)

Like Sam Altman who invests in Prospera, a private "Start-up City" in Honduras where the board of directors pick and choose which laws apply to them!

The switch to Techno-Feudalism is progressing far too much for my liking.

[–] Grimy@lemmy.world 18 points 1 week ago (1 children)

AI has a vibrant open source scene and is definitely not owned by a few people.

A lot of the data to train it is only owned by a few people though. It is record companies and publishing houses winning their lawsuits that will lead to dystopia. It's a shame to see so many actually cheering them on.

[–] futatorius@lemm.ee 17 points 1 week ago (2 children)

Two intrinsic problems with the current implementations of AI are that they are insanely resource-intensive and require huge training sets. Neither of those is directly a problem of ownership or control, though both favor larger players with more money.

[–] frezik@midwest.social 1 points 6 days ago

If gigantic amounts of capital weren't available, then the focus would be on improving the models so they don't need GPU farms running off nuclear reactors plus the sum total of all posts on the Internet ever.

[–] finitebanjo@lemmy.world 10 points 1 week ago* (last edited 1 week ago) (1 children)

And a third intrinsic problem is that the current models, even with unlimited training data, have been shown never to approach human language capability, per papers from OpenAI in 2020 and DeepMind in 2022, plus a Stanford paper which proposes that AI has no emergent behavior, only convergent behavior.

So yeah. Lots of problems.
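The DeepMind 2022 result referenced above (the Chinchilla scaling law, Hoffmann et al.) can be sketched numerically. The constants below are the paper's fitted values; the point is that the loss has an irreducible floor no data budget can remove:

```python
# Chinchilla scaling law: L(N, D) = E + A/N^alpha + B/D^beta
# Fitted constants from Hoffmann et al. (2022).
E, A, B, alpha, beta = 1.69, 406.4, 410.7, 0.34, 0.28

def loss(params, tokens):
    """Predicted training loss for N parameters trained on D tokens."""
    return E + A / params**alpha + B / tokens**beta

# Even with (effectively) infinite training data, loss floors at E + A/N^alpha:
for tokens in (1e12, 1e15, float("inf")):
    print(f"70B params, {tokens:.0e} tokens -> loss {loss(70e9, tokens):.3f}")
# The irreducible term E never goes away, regardless of the data budget.
```

In other words, under this fitted law, scaling data alone shows diminishing returns toward a hard floor, which is the gist of the "infinite training data won't get there" claim.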

[–] AbsoluteChicagoDog@lemm.ee 16 points 1 week ago

Same as always. There is no technology capitalism can't corrupt.

[–] RadicalEagle@lemmy.world 16 points 1 week ago (2 children)

I’d say the biggest problem with AI is that it’s being treated as a tool to displace workers, but there is no system in place to make sure that the “value” (I’m not convinced commercial AI has done anything valuable) created by AI is redistributed to the workers it has displaced.

[–] protist@mander.xyz 15 points 1 week ago

Welcome to every technological advancement ever applied to the workforce

[–] WrenFeathers@lemmy.world 11 points 1 week ago (1 children)

The biggest problem with AI is the damage it’s doing to human culture.

[–] captain_aggravated@sh.itjust.works 11 points 1 week ago (3 children)

For some reason the megacorps have got LLMs on the brain, and they're the worst "AI" I've seen. There are other types of AI that are actually impressive, but the "writes a thing that looks like it might be the answer" machine is way less useful than they think it is.

[–] PostiveNoise@kbin.melroy.org 11 points 1 week ago (6 children)

Either the article editing was horrible, or Eno is wildly uninformed about the world. Creating AIs is NOT the same as social media. You can't blame a hammer for some evil person using it to hit someone in the head, and there is more to 'hammers' than just assaulting people.

[–] umbraroze@lemmy.world 11 points 1 week ago (1 children)

The AI business is owned by a tiny group of technobros who have no concern for what they have to do to get the results they want ("fuck the copyright, especially fuck the natural resources"), who want to be personally seen as the saviours of humanity (despite not being the ones who invented and implemented the actual tech), and, like all bigwig biz boys, they want all the money.

I don't have problems with AI tech in principle, but I hate the current business direction and what the AI business encourages people to do and use the tech for.

[–] interdimensionalmeme@lemmy.ml 1 points 6 days ago

Well, I'm on board for "fuck intellectual property." If OpenAI doesn't publish the weights, then all their datacenters get visited by the killdozer.

[–] max_dryzen@mander.xyz 11 points 1 week ago (1 children)

The government likes concentrated ownership because then it has only a few phone calls to make if it wants its bidding done (be it censorship, manipulation, partisan political chicanery, etc.)

[–] Grandwolf319@sh.itjust.works 10 points 1 week ago (1 children)

The biggest problem with AI is that it’s the brute-force solution to complex problems.

Instead of trying to figure out what’s the most power efficient algorithm to do artificial analysis, they just threw more data and power at it.

Besides the fact of how often it’s wrong, by definition it won’t ever be as accurate or as efficient as doing actual thinking.

It’s the solution you come up with the last day before the project is due because you know it will technically pass and you’ll get a C.

[–] TheMightyCat@lemm.ee 10 points 1 week ago (11 children)

No?

Anyone can run an AI, even on the weakest hardware; there are plenty of small open models for this.

Training an AI requires very strong hardware; however, this is not an impossible hurdle, as the models on Hugging Face show.
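A rough sketch of why weak hardware suffices for inference: the memory footprint of a model is roughly its parameter count times the bytes per weight, which quantization shrinks. Figures below are illustrative; real footprints vary with architecture and runtime overhead:

```python
# Approximate RAM/VRAM needed just to hold a model's weights,
# at common quantization bit-widths (activation/KV-cache overhead excluded).
def footprint_gb(params_billion, bits_per_weight):
    return params_billion * 1e9 * bits_per_weight / 8 / 1024**3

for name, params in [("1B model", 1), ("3B model", 3), ("7B model", 7)]:
    for bits in (16, 8, 4):
        print(f"{name} @ {bits}-bit: {footprint_gb(params, bits):.1f} GB")
```

A 1B-parameter model at 4-bit quantization fits in well under 1 GB, which is why small open models run fine on old laptops and even phones, while training the same model still demands clusters of high-end accelerators.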
