this post was submitted on 01 Mar 2026
67 points (92.4% liked)

Fuck AI

American companies are spending enormous sums to develop high-performing AI models. Distillation attacks are attempting to maliciously extract them — and nobody is doing much to stop it.

[–] bitteroldcoot@piefed.social 41 points 1 day ago (4 children)

I worked with computers for about 30 years, and in retirement I've been testing AI for fun. I've yet to figure out what the point of them is. They lie, manipulate users, and censor information. Their prose is overly verbose and their code sucks. What's the point...

You know, as I was typing the first paragraph I realized the point. They are really good at controlling and manipulating stupid people. They are the new Facebook and twitter. How depressing.

[–] unmagical@lemmy.ml 15 points 1 day ago (2 children)

They seem great till you ask them about something you know. Somehow people fail to extrapolate out that the failures they see in their field of expertise are actually there across all subject matters.

[–] moopet@sh.itjust.works 1 point 23 hours ago

I find the same with human-written articles. Like New Scientist, for example. When I was young I liked reading it, right up until I started reading articles on topics I knew well. They were all misleading shite. So I naturally assume that everything else I read associated with that magazine is also shite.

[–] very_well_lost@lemmy.world 10 points 1 day ago

Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray's case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the "wet streets cause rain" stories. Paper's full of them.

In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.

[–] Strider@lemmy.world 6 points 1 day ago

Well, the point is using humongous amounts of energy, cutting resources from everything else and creating a huge money funnel.

It's the most effective hype yet.

[–] 13igTyme@piefed.social 4 points 1 day ago (1 children)

I work for a company that uses machine learning to make census and discharge predictions for hospitals. It's only a tool, and it works to help, not replace. We're also working on having it read unstructured notes. I'm incredibly sceptical of AI, and we test the shit out of it to make sure it's accurate.

[–] bitteroldcoot@piefed.social 5 points 1 day ago (1 children)

"Reading unstructured notes" — and if it screws up, someone dies? I have doctors who want AI to transcribe what they say. I refused to sign the permission form.

[–] 13igTyme@piefed.social 3 points 1 day ago

The software is only used to help identify barriers for patients who are currently being discharged. A person isn't going to die while discharging home and waiting on DME.

[–] kboos1@lemmy.world 3 points 1 day ago

The only thing I have found useful about AI is its ability to quickly fill in documents with slop to make it seem like I spent more time and effort on them. Usually I put the document together with the major points and framework, then give it to AI to slop it up and format it. Then I proof it and send it out. It's also good for note taking and transcripts.

Other than that, it seems like it's just another form of control, because now it can search data and make decisions quickly and cheaply. This means that things that weren't worth making time for in the past can simply be handed to AI to track. In fact, my company is playing around with using AI to track our progress on projects so that the PMs don't have to interact with engineers directly. I would also bet that it will be used to assess performance in future annual performance reviews.

Companies are also hoping to get rid of the employees who perform the menial tasks that support staff handle, along with anyone doing work they believe doesn't require specialized skills or talents.