This post was submitted on 18 Apr 2025
26 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] spankmonkey@lemmy.world 13 points 1 day ago* (last edited 1 day ago) (1 children)

Why would the steps be literal when everything else is bullshit? Obviously the reasoning steps are AI slop too.

[–] dgerard@awful.systems 7 points 1 day ago (1 children)
[–] Soyweiser@awful.systems 7 points 1 day ago

The paper clipping is nigh! Repent Harlequins

[–] paraphrand@lemmy.world 11 points 1 day ago (1 children)

It’s bullshitting. That’s the word. Bullshitting is saying things without a care for how true they are.

[–] Saledovil@sh.itjust.works 6 points 20 hours ago (2 children)

The word "bullshitting" implies a clarity of purpose I don't want to attribute to AI.

[–] Soyweiser@awful.systems 2 points 9 hours ago

Yeah, that is why people called it confabulating, not bullshitting.

[–] antifuchs@awful.systems 2 points 11 hours ago

It’s kind of a distinction without much discriminatory power: LLMs are a tool created to ease the task of bullshitting, used by bullshitters to produce bullshit.

[–] diz@awful.systems 4 points 1 day ago* (last edited 1 day ago)

It re-consumes its own bullshit, and the bullshit it does print is the bullshit it also fed itself; it’s not lying about that. Of course, it is also always re-consuming the initial prompt, so the end bullshit isn’t necessarily as far removed from the question as the length would indicate.
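As a rough sketch of the mechanism being described here (the `model.next_token` / `model.eos_token` names are hypothetical stand-ins, not any particular library's API): each generated token is appended to the context and re-consumed alongside the original prompt on the next step.

```python
# Toy autoregressive loop: on every step the model re-consumes the prompt
# plus everything it has already emitted. `model` is a hypothetical object.
def generate(model, prompt_tokens, max_new_tokens=256):
    context = list(prompt_tokens)        # the initial prompt never leaves the context
    for _ in range(max_new_tokens):
        tok = model.next_token(context)  # conditioned on prompt + its own prior output
        context.append(tok)              # fed back in on the next iteration
        if tok == model.eos_token:
            break
    return context[len(prompt_tokens):]  # the "reasoning" tokens plus the answer
```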

Where it gets deceptive is when it knows an answer to the problem, but it constructs some bullshit for the purpose of making you believe that it solved the problem on its own. The only way to tell the difference is to ask it something simpler that it doesn’t know the answer to, and watch it bullshit in circles or bullshit its way to an incorrect answer.