this post was submitted on 07 Apr 2025
30 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

See our twin at Reddit

founded 2 years ago

Apparently DOGE isn’t killing enough people (literally or metaphorically)

[–] froztbyte@awful.systems 18 points 1 week ago (7 children)

aww, poor baby, he didn't consider the risk of incentives around an avaricious power-hungry narcissist! whoops! what an utterly embarrassing and predictable mistake to make

(get fucked moldbug)

[–] istewart@awful.systems 12 points 1 week ago (4 children)

He will never stop to reflect that his "philosophy," such as it is, is explicitly tailored for avaricious power-hungry narcissists, soooooo

[–] froztbyte@awful.systems 14 points 1 week ago (3 children)

"what if my CEO god-king decides not to follow my plan" is a thesis even the worst startup founders have muddled through and I find it repeatedly funny as fuck that this mediocre monster of a man is hitting it

maybe it's indicative of other things? maybe he believed so hard in the plan that he misjudged felon's ability? maybe felon just conned him easily? extremely possible on multiple fronts, and still just as funny

imagine how fucking frustrated the little shitgoblin must be. it makes my angry heart flutter!

[–] bitofhope@awful.systems 14 points 1 week ago (1 children)

It's weird how you can always find autocracy supporters in every era, despite the overwhelmingly strong and incredibly obvious counterargument: "what if the autocrat wants to do something you don't like?"

I'm reminded of an old essay from Siskind that tried to break down the different approaches to disagreement as either "conflict theory" (different people want different, mutually incompatible things) or "mistake theory" (we all want the same basic thing but disagree about how to get it). Given the general Silicon Valley milieu's (and YudRat's specifically) affinity for "mistake theory," I think the susceptibility to authoritarianism and fascism fits remarkably well. After all, if we all want the same basic thing, the only way the autocrat could do something we don't like is if they were wrong. So we just have to find a reasonable enough autocrat and give them absolute power, at which point they can magically solve all problems. See also the singularity God AI nonsense.

If I had my wish, this wouldn't just remind people that authoritarians can be (and are) evil or incompetent, but also that the structure isn't actually more "efficient": whatever delays the democratic process introduces are dwarfed by the inevitable difficulties of just trying to do anything at the scale of any modern state, much less the sheer scale of the USA.
