this post was submitted on 15 Feb 2026
1354 points (99.6% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.


link to archived Reddit thread; original post removed/deleted

[–] mr_sunburn@lemmy.ml 84 points 19 hours ago* (last edited 18 hours ago) (7 children)

I raised this as a concern at the corporate role I work in when an AI tool that was being distributed and encouraged for use showed two hallucinated data points that were cited in a large group setting. I happened to know my area well, the data was not just marginally wrong but way off, and I was able to quickly check the figures. I corrected it in the room after verifying on my laptop, and the reaction was a sort of harmless "whoops." The rest of the presentation continued without any apparent acknowledgement that the remaining figures should be checked too.

When I approached the head of the team that constructed the tool after the meeting and shared the inaccuracies and my concerns, he told me that he'd rather have more data fluency through the ease of the tool and that inaccuracies were acceptable because of the convenience and widespread usage.

I suspect stories like this are happening across my industry. Meanwhile, the company put out a press release about our AI efforts (literally using Gemini's Gem tool and custom ChatGPTs seeded with Google Drive) as something investors should be very excited about.

[–] squaresinger@lemmy.world 74 points 18 hours ago (1 children)

When I approached the head of the team that constructed the tool after the meeting and shared the inaccuracies and my concerns, he told me that he’d rather have more data fluency through the ease of the tool and that inaccuracies were acceptable because of the convenience and widespread usage.

"I prefer more data that's completely made up over less data that is actually accurate."

This tells you everything you need to know about your company's marketing and data analysis department and the whole corporate leadership.

Potemkin leadership.

[–] whoisearth@lemmy.ca 23 points 18 hours ago (1 children)

Honestly this is not a new problem and is a further expression of the larger problem.

"Leadership" becomes removed from the day-to-day operations that run the organization, and by nature the "cream" that rises tends to be sycophantic. It's our internal biases at work, so it's no fault of any individual.

Humanity is its own worst enemy lol

[–] squaresinger@lemmy.world 17 points 18 hours ago

It's not a new problem; it's been the case for a long time. But this is a good illustration of it.

Everyone in a company has their own goals, from the lowly actual worker who just wants to pay the bills and spend as little effort on it as possible, to departments which want to justify their useless existence, to leadership who mainly wants to look good towards the investors to get a nice bonus.

That some companies end up actually making products that ship and that people want to use is more of an unintended side effect than the intended purpose of anyone's work.

[–] altasshet@lemmy.ca 24 points 18 hours ago (2 children)

That makes no sense. The inaccuracies are even less acceptable with widespread use!

[–] sp3ctr4l@lemmy.dbzer0.com 9 points 16 hours ago* (last edited 16 hours ago)

You're thinking like a person who values accurate information more than feeling some kind of 'cool' and 'trendy' because now you can vibe code and we are a forward thinking company that embraces new paradigms and synergizes our expectations with the potential reality our market disrupting innovations could bring.

... sorry, I lapsed back into corpo / had a stroke.

[–] BlameTheAntifa@lemmy.world 17 points 18 hours ago (1 children)

It’s technological astrology. We’re doomed.

[–] CancerMancer@sh.itjust.works 6 points 12 hours ago

You need to know the words to properly wake the machine spirit

[–] chiliedogg@lemmy.world 11 points 17 hours ago (4 children)

The board room is more concerned with the presentation than the data, because presentations make sales.

What a lot of people fail to understand is that for the C-Suite, the product isn't what's being manufactured, or the service being sold. The product is the stock, and anything that makes the number go up in the short term is good.

Lots of them have fiduciary duties, meaning they're legally prohibited from doing anything that doesn't maximize the value of the stock from moment to moment.

[–] Sprocketfree@sh.itjust.works 19 points 16 hours ago (1 children)

Someone please show me the criminal lawsuit against the CEO that made the moral decision and the stock went down! I'm so sick of the term fiduciary duty being used as a bullshit shield for bad behavior. When Tesla stock tanked because musk threw a Nazi salute, where were the fiduciary duty people!?

[–] aesthelete@lemmy.world 8 points 17 hours ago* (last edited 17 hours ago) (1 children)

Lots of them have fiduciary duties, meaning they’re legally prohibited from doing anything that doesn’t maximize the value of the stock from moment to moment.

Overall, I agree with you that stock price is their motivation, but the notion of shareholder supremacy binding their hands and preventing them from doing things they otherwise want to do is incorrect. For one, they aren't actually mandated to do this by law, and for another, even if they were -- which to reiterate, they aren't -- just about any action they take on any single issue can be portrayed as an attempt to maximize company value.

https://pluralistic.net/2024/09/18/falsifiability/#figleaves-not-rubrics

[–] AntEater@discuss.tchncs.de 1 points 15 hours ago (2 children)

No, not illegal, but they can be sued by shareholders for failing to maximize value.

[–] WoodScientist@lemmy.world 2 points 9 hours ago

Not really, no. This is mostly a myth. Unless the executives are deliberately causing the company to lose money, they really can't be sued based on this fiduciary duty to shareholders. They have to act in the shareholders' best interest, but "shareholder interest" is entirely up to interpretation. For example, it's perfectly fine to say, "we're going to lose money over the next five years because we believe it will ensure maximum profits over the long term." In order to sue a CEO for failing to protect shareholders, they would have to be doing something deliberately and undeniably against shareholder interest. Like if they embezzle money into their own bank account, or if they hold a Joker-style literal money burning.

If it were that easy to sue executives for violating their fiduciary duty to shareholders, golden parachutes and inflated executive compensation packages wouldn't exist. But good luck suing a CEO because he's paid too much. He can just claim in court that his compensation will ensure the company attracts the best talent to perform the best they can.

Executives are given wide latitude in how they define the best financial interest of shareholders. Shareholders ultimately do have the ability to remove executives from their positions. This is supposed to be the default way of dealing with incompetent executives. As shareholders already possess the ability to fire a CEO at any time, there is a very high bar to clear before shareholders can also sue executives. It's generally assumed if they really are doing that bad a job, you should just fire them.

[–] aesthelete@lemmy.world 5 points 14 hours ago* (last edited 14 hours ago) (1 children)

Sure, but since it's an unfalsifiable proposition, good luck proving it in court for any specific action.

[–] AntEater@discuss.tchncs.de 2 points 11 hours ago* (last edited 11 hours ago) (1 children)

Apparently, it does happen: https://tempusfugitlaw.com/real-life-breach-of-fiduciary-duty-case-examples-outcomes/

Particularly of note is the decision around AA's ESG investments.

[–] aesthelete@lemmy.world 1 points 9 hours ago

I think this is mixing things up a bit. At least some of the cases there were fraud based.

[–] jj4211@lemmy.world 9 points 17 hours ago (1 children)

Further, as you hinted, the long term is not their problem. They get a bump, cash in a few million dollars' worth of RSUs, and either saddle the next guy with the fallout or, if they haven't left yet, say "whoopsie, but I can blame the LLM, and I was just following industry best practices at the time." Either way, they have enough to never even pretend to work another day of their life, even ignoring previous grifts, and they'll go on and do the same thing to some other company when they bail or the company falls over.

[–] cogman@lemmy.world 9 points 17 hours ago

At the moment, nothing will be done. There's no way the current SEC chair will give a fuck about this sort of stuff.

But assuming a competent chair ever gets in charge, I expect a shitshow of lawsuits. It really doesn't matter that "the LLM did it"; lying on those mandatory reports can lead to big fines.

[–] Snowclone@lemmy.world 7 points 17 hours ago

It's why capitalism is over. They do not care about making a profit at all; they only care about the stock. There is only one outcome to this approach, and that's slowly dissolving the company until it fails, because you're willing to saw your legs off for a small spike in quarterly earnings. You eventually run out of legs to saw off.

[–] Buddahriffic@lemmy.world 6 points 16 hours ago

Sounds like the people who are realistic about AI are going to end up having a huge advantage over people who use it naively.

Take statistics: there are plenty of tools out there that can handle them perfectly accurately; you just don't want an LLM doing the "analysis," because the neural network isn't built for that. Consider how often our own neural networks get addicted to gambling, despite not even being fully specialized for processing language. An LLM might not get caught up in the gambler's fallacy, but that's more on account of being too simple than being smarter.

I wonder if this will break the trust in MBAs. LLMs are deceptively incompetent, and from the sound of this comment and other things I've seen, that deception works well enough that their ego about being involved in the tool's development clashes with the experts telling them it's not as useful as it seems.

[–] vivalapivo@lemmy.today 7 points 17 hours ago

Coming from science to industry taught me one thing: numbers (and rationality as a whole) serve only one goal, and that goal is to persuade others: colleagues, investors, regulators.

In this broken sense, that head of the team is right: hallucinations are acceptable as long as supervisors believe the output.

[–] ScoffingLizard@lemmy.dbzer0.com 1 points 16 hours ago

You should have asked what would happen if the figures were wrong, let them make an excuse, and then let them eat shit later. AI is taking our jobs. Never interrupt an enemy making a mistake.