this post was submitted on 03 Apr 2025
21 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

Thinking about how the arsing fuck to explain the rationalists to normal people - especially as they are now a loud public problem along multiple dimensions.

The problem is that it's all deep in the weeds. Every part of it is "it can't be that stupid, you must be explaining it wrong."

With bitcoin, I have, over the years, simplified it to being a story of crooks and con men. The correct answer to "what is a blockchain and how does it work" is "it's a way to move money around out of the sight of regulators" and maybe "so it's for crooks and con men, and a small number of sincere libertarians" and don't even talk about cryptography or technology.

I dunno what the one-sentence explanation is of this shit.

"The purpose of LessWrong rationality is for Yudkowsky to live forever as an emulation running on the mind of the AI God" is completely true, is the purpose of the whole thing, and is also WTF.

Maybe that and "so he started what turned into a cult and a series of cults"? At this point I'm piling up the absurdities again.

The Behind The Bastards approach to all these guys has been "wow these guys are all so wacky haha and also they're evil."

How would you first approach explaining this shit past "it can't be that stupid, you must be explaining it wrong"?

[also posted in sneer classic]

all 29 comments
istewart@awful.systems 4 points 6 hours ago

I've been contemplating this, and I agree with most everyone else about leaning heavily into the cult angle and explaining it as a mutant hybrid between Scientology-style UFO religions and Christian dispensationalist Book of Revelation eschatology. The latter may be especially useful in explaining it to USians. My mom (who works in an SV-adjacent job) sent me this Vanity Fair article the other day about Garry Tan grifting his way into non-denominational prosperity gospel Christianity: https://www.vanityfair.com/news/story/christianity-was-borderline-illegal-in-silicon-valley-now-its-the-new-religion She was wondering if it was "just another fad for these people," and I had to explain no, not really, it is because their AI bullshit is so outlandish that some of them feel the need to pivot back towards something more mainstream to keep growing their following.

I also prefer to highlight Kurzweil's obsession with perpetual exponential growth curves as a central point. That's often what I start with when I'm explaining it all to somebody. It provides the foundation for the bullshit towers that Yudkowsky and friends have erected. And I also think that long-term, the historiography of this stuff will lean more heavily on Kurzweil as a source than Yudkowsky, because Kurzweil is better-organized and professionally published. It'll most likely be the main source in the lower-division undergraduate/AP high school history texts that highlight this stuff as a background trend in the 2010s/2020s. Right now, we live in the peak days of the LessWrong bullshit volcano plume, but ultimately, it will probably be interpreted by the specialized upper-division texts that grow out of people's PhD theses.


blakestacey@awful.systems 5 points 6 hours ago

And I also think that long-term, the historiography of this stuff will lean more heavily on Kurzweil as a source than Yudkowsky, because Kurzweil is better-organized and professionally published.

That is interesting to think about. (Something feels almost defiant about imagining a future that has history books and PhD theses.) My own feeling is that Yudkowsky brought something much more overtly and directly culty. Kurzweil's vibe in The Age of Spiritual Machines and such was, as I recall, "This is what the scientists say, and this is why that implies the Singularity." By contrast, Yudkowsky was saying, "The scientists are insufficiently Rational to accept the truth, so listen to me instead. Academia bad, blog posts good." He brought a more toxic variation, something that emotionally resonated with burnout-trending Gifted Kids in a way that Kurzweil's silly little graphs did not. There was no Rationality as self-help angle in Kurzweil, no mass of text whose sheer bulk helped to establish an elect group of the saved.

istewart@awful.systems 4 points 3 hours ago

Yes, Kurzweil desperately trying to create some kind of a scientific argument, as well as people with university affiliations like Singer and MacAskill pushing EA, are what give this stuff institutional strength. Yudkowsky and LW are by no means less influential, but they're at best a student club that only aspires to be a proper curriculum. It's surely no coincidence that they're anchored in Berkeley, adjacent to the university's famous student-led DeCal program.

FWIW, my capsule summary of TPOT/"post-rationalists" is that they're people who thought that advanced degrees and/or adjacency to VC money would yield more remuneration and influence than they actually did. Equally burned out, just further along the same path.

maol@awful.systems 5 points 8 hours ago

I think starting with Sam Bankman-Fried is a solid idea. Relatively informed members of the general public a) know who that guy is, and b) know that he made some really poor decisions. He doesn't have the Silicon Valley mystique that attaches itself to some other adherents, so fewer people will think "well, that guy is really smart, why would he be in a cult?" Then you can go back and explain EA and LessWrong and Yudkowsky's role in all of this.

cstross@wandering.shop 11 points 23 hours ago

@dgerard TESCREAL is structurally an evangelical a-theist religion descended from the 19th century Russian Orthodox christianity of Nikolai Fyodorovitch Fyodorov and in my new book I will—

Nah, forget the book, I've got the attention of the POTUS's ketamine-addled fractional-trillionaire shitposter, get in the car, losers! (Throws brake pedal at puzzled pedestrian)

JohnBierce@awful.systems 3 points 20 hours ago

The car's a Tesla, isn't it?

cstross@wandering.shop 4 points 18 hours ago
JohnBierce@awful.systems 3 points 14 hours ago
V0ldek@awful.systems 8 points 23 hours ago

It's eugenics but as a religious cult for reactionaries

Yud is that creepy nerd from your middle school who wrote disturbing fan fiction, but it wasn't just a phase and now he has the aforementioned cult

YourNetworkIsHaunted@awful.systems 8 points 1 day ago

So the primary doctrine is basically tech bros rewriting standard millenarian christianity from mythic fantasy into science fiction. But it seems like the founder wants to be a silicon valley influencer more than he wants to be a proper cult leader, meaning that some of the people who take this shit seriously have accumulated absurd amounts of money and power and occasionally the more deranged subgroups will spin off into a proper cult with everything that entails -- including, now, being involved in multiple homicides!

zogwarg@awful.systems 6 points 1 day ago
blakestacey@awful.systems 18 points 1 day ago

I'm trying to imagine how a John Oliver sketch would introduce them. "The kind of nerds who make you think the jocks in '80s movies had a reasonable point got together and sold 'science' and 'rational thinking' as self-help, without truly understanding either, and it got very culty."

nev@bananachips.club 11 points 1 day ago

@dgerard "carrying coal to Newcastle, for philosophy"

"I wouldn't say there's such a thing as reading *too* much science fiction, but there is such a thing as not reading enough stuff that *isn't* science fiction"

"a cargo cult based on Harry Potter fanfiction; a useful but by no means universally superior way of doing statistics; and the science fiction trope about being in a computer simulation"

"an object lesson in the value of a good old liberal education"

mountainriver@awful.systems 14 points 1 day ago

I usually go with "Scientology for the 21st century". For most people that registers as just "weird cult", which is close enough.

For those who are into weird cults, you get questions about Xenu and such, and can answer "No, they're not into Xenu; instead they want to build their god. Out of chatbots." And so on. If they're interested in weird cult shit, and have already accepted that we're talking about weird cults, the weirdness isn't a problem. If not, it stops at "Scientology for the 21st century".

bitofhope@awful.systems 16 points 1 day ago

I don't think Yud is that hard to explain. He's a science fiction fanboy who never let go of his adolescent delusions of grandeur. He was never successfully disabused of the notion that he's always the smartest person in the room, and he didn't pursue a high school, let alone college, education that might have given him the expertise to recognize just how difficult his goal is. Blud thinks he's gonna create a superhumanly intelligent machine when he struggles with basic programming tasks.

He's kinda comparable to Elon Musk in a way. Brain uploading and superhuman AI are sort of in the same "cool sci fi tech" category as Mars colonization, brain implants and vactrain gadgetbahns. It's easy to forget that not too many years ago the public's perception of Musk was very different. A lot of people saw him as a cool Tony Stark figure who was finally going to give us our damn flying cars.

Yudkowsky is sometimes good at knowing just a bit more about things than his audience and making it seem like he knows a lot more than he does. The first time I started reading HPMoR, I thought the author was an actual theoretical physicist or something, and when the story said I could learn everything Harry knows for free on this LessWrong site, I thought I could learn what it means for something to be "implied by the form of the quantum Hamiltonian" or what those "timeless formulations of quantum mechanics" were about. Instead it was just poorly paced essays on bog-standard logical fallacies and cognitive biases, explained using their weird homegrown terminology.

Also, it's really easy to be convinced of a thing when you really want to believe in it. I know personally some very smart and worldly people who have been way too impressed by ChatGPT. Convincing people in the San Francisco Bay Area that you're about to invent Star Trek technology is basically the national pastime there.

His fantasies of becoming immortal through having a God AI simulate his mind forever aren't the weird part. Any imaginative 15 year old computer nerd can have those fantasies. The weird parts are that he never grew out of those fantasies and that he managed to make some rich and influential contacts while holding on to his chuunibyō delusions.

Anyone can become a cult leader through the power of buying into your own hype and infinite thielbux.

dashdsrdash@awful.systems 9 points 1 day ago

Convincing people in the San Francisco Bay Area that you're about to invent Star Trek technology is basically the national pastime there.

Ding! Ding! Ding! Upvote.

AllNewTypeFace@leminal.space 14 points 1 day ago

The latest in a chain of cults, after Mormonism, the Victorian-era spiritualist fad, Scientology and new-age “quantum” woo, each using the trappings of the exciting scientific/technological ideas of their time to sell the usual proposition (a totalising belief system that answers* all questions).

dashdsrdash@awful.systems 10 points 1 day ago

"Rationalism" is to normal logical thinking what blindfolded multi-board speed chess is to tic-tac-toe: you can only see in retrospect how anyone could get there from here. The things which occupy a Rationalist's mind are completely divorced from ordinary concerns like ethics. Nobody would or could have predicted this quantity or quality of lunacy.

BlueMonday1984@awful.systems 10 points 1 day ago

Here's my first shot at it:

"Imagine if the stereotypical high-school nerd became a supervillain."

Architeuthis@awful.systems 8 points 1 day ago

Imagine insecure smart people yes-anding each other into believing siskind and yud are profound thinkers.

Architeuthis@awful.systems 8 points 1 day ago

It's pick-me objectivism, only more overtly culty the closer you are to it irl. Imagine scientology if it was organized around AI doomerism and naive utilitarianism while posing as a get-smart-quick scheme.

Its main function (besides getting the early adopters laid) is to provide court philosophers for the technofeudalist billionaire class, while grooming talented young techies into a wide variety of extremist thought, both old and new, mostly by fostering contempt for established epistemological authority in the same way QAnons insist people do their own research, i.e. as a euphemism for only paying attention to ingroup-approved influencers.

It seems to have both a sexual harassment and a suicide problem, with a lot of irresponsible scientific racism and drug abuse in the mix.

Architeuthis@awful.systems 5 points 1 day ago

Wish I'd found a non-clunky way to work "cult incubator" into that.

jaschop@awful.systems 7 points 1 day ago

I didn't come up with this simile, but it might fit:

It's like a fleshed out version of a 12 year old thinking "everything would be great if I was in charge, because I'm smart and people are dumb"

Something about people who are too impressed with their own smarts and who swap pet theories that make them feel smart.

Al0neStar@lemmy.world 5 points 1 day ago

A few years ago an article by gwern hit the front page of Hacker News, and idlewords' comments there do a good job of explaining everything wrong with the rationalist mindset.

bitofhope@awful.systems 7 points 1 day ago

Those comments are tight, but really the problem with trying to explain any of this to laypeople isn't exposing how wrong it is. The hard part is making any sense of it.

Like if I told you Donald Trump has connections with a cult that believes grandmothers are a species of raspberry, whose goal is turning Denmark into cheese, and oh, a splinter group of theirs just murdered a police officer. That will just raise more questions than it answers. How the hell did they come to believe that? Why would anyone want that? And then I have to choose between looking like a loony conspiracy theorist, doing an impromptu lecture, or just deciding you actually probably don't want to know.

o7___o7@awful.systems 7 points 1 day ago

How would you first approach explaining this shit past “it can’t be that stupid, you must be explaining it wrong”?

This is the question of the moment, isn't it?

I have no answers, but I can say thanks for being a light in the dumbness.

swlabr@awful.systems 6 points 1 day ago

Honestly, a couple hours over some beers might be the only way.