[–] MineDayOff@hexbear.net 1 points 6 days ago (4 children)

I suppose it depends, but how the f*** does a machine have bias?

[–] hello_hello@hexbear.net 3 points 5 days ago

GenAI (TM) is the Social Darwinism of computer science.

[–] NephewAlphaBravo@hexbear.net 3 points 6 days ago* (last edited 6 days ago)

computers do what we tell them to do, and exactly what we tell them to do: no more, no less. they don't intuit their way through bad directions or misspellings to figure out what we actually intended. if we misspeak and tell them to do something stupid, they'll happily do the stupid thing

so yeah, the short answer is that any bias on the part of the programmer, even an unintentional one, will be reflected in the machine's output
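(a minimal sketch of the point above, using a made-up toy "model" that does nothing but count label frequencies; nothing here is a real system. whatever bias is in the labels comes straight back out of the predictions:)

```python
from collections import Counter

# Suppose the labeler (perhaps without noticing) marks every example
# mentioning group "a" as negative. The model has no opinion of its own;
# it just counts what it was given.
training_data = [
    ("a", "negative"), ("a", "negative"), ("a", "negative"),
    ("b", "positive"), ("b", "positive"),
]

counts = {}
for group, label in training_data:
    counts.setdefault(group, Counter())[label] += 1

def predict(group):
    # Predict whichever label co-occurred with the group most often:
    # exactly what it was told, no more and no less.
    return counts[group].most_common(1)[0][0]

print(predict("a"))  # "negative" -- the labeler's bias, faithfully echoed
print(predict("b"))  # "positive"
```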

[–] ThermonuclearEgg@hexbear.net 3 points 6 days ago

It's pretty easy. To exaggerate the point: suppose I trained one of these models exclusively on photos of white people and exclusively on racist comments. Do you think it would respond appropriately to the existence of black people?

Spoiler alert: no. That's how we got Tay, Microsoft's chatbot that turned into a Nazi, and Google Photos labeling black people as gorillas
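(a hedged sketch of that failure mode, assuming a hypothetical 1-nearest-neighbor "photo tagger" and invented feature values, not any real classifier: when the training set never includes darker-skinned people, the nearest known category wins, however offensive:)

```python
# Toy 1-nearest-neighbor tagger. The single made-up feature happens to
# correlate with skin tone; every value and label below is invented.
training = [
    (0.90, "person"), (0.85, "person"),    # only light-skinned "person" photos
    (0.20, "gorilla"), (0.25, "gorilla"),  # animal photos happen to be darker
]

def classify(x):
    # Pick the label of the closest training example -- that's all it knows.
    return min(training, key=lambda t: abs(t[0] - x))[1]

# A darker-skinned person the model never saw during training:
print(classify(0.3))  # "gorilla" -- the gap in the data becomes a slur in the output
```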

[–] miz@hexbear.net 1 points 6 days ago

garbage in, garbage out