https://nonesense.substack.com/p/lesswrong-house-style

Given that they are imbeciles given, occasionally, to dangerous ideas, I think it’s worth taking a moment now and then to beat them up. This is another such moment.

[-] SubArcticTundra@lemmy.ml 3 points 1 day ago

I'm out of the loop: what is lesswrong and why is it cringe?

[-] Soyweiser@awful.systems 9 points 15 hours ago

RationalWiki (not affiliated with LW rationalists, the opposite actually; OP is a mod there) has a page on it: https://rationalwiki.org/wiki/Less_wrong

[-] SubArcticTundra@lemmy.ml 1 points 24 minutes ago

Ok, RationalWiki actually seems like a really useful resource for reading up on which sexy new movements are bullshit and which aren't

[-] captainlezbian@lemmy.world 5 points 13 hours ago

That sounds like a religion insisting it isn’t one

[-] AnarchistArtificer@slrpnk.net 1 points 2 hours ago* (last edited 2 hours ago)

They do seem to worship Bayes

Edit: I want to qualify that I'm a big fan of Bayes' theorem; in my field, there's some awesome stuff being done with Bayesian models that would be impossible with frequentist statistics. Any scorn in my comment is directed at the religious fervour that LW directs at Bayesian statistics, not at the stats themselves.

I say this to emphasise that LWers aren't cringe for being super enthusiastic about maths. It's the everything else that makes them cringe
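
For anyone who hasn't seen it, the theorem itself is one line of arithmetic. A toy sketch (numbers made up purely for illustration) of the kind of posterior update all the fuss is about:

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
# Hypothetical example: a screening test for a rare condition.
prior = 0.01           # P(H): 1% base rate in the population
sensitivity = 0.95     # P(E|H): test flags true cases 95% of the time
false_positive = 0.05  # P(E|not H): test wrongly flags 5% of healthy cases

# P(E) via the law of total probability
evidence = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / evidence  # P(H|E)

print(f"P(condition | positive test) = {posterior:.1%}")  # ~16.1%, not 95%
```

The arithmetic is trivial and uncontroversial; the fervour is about treating this one update rule as a complete epistemology.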

[-] Soyweiser@awful.systems 6 points 9 hours ago

I think it's a little more complicated, though I'm one of the few saying this, so it isn't a common take. I think it isn't directly a cult/religion itself; to steal the language of Silicon Valley, it's a cult incubator. Reading these things and holding these beliefs about AGI and rationality makes you more susceptible to joining or starting cult-like groups. The LessWrong article "Every Cause Wants to Be a Cult" doesn't help, for example, and neither does it when they speak highly of the methods of Scientology. The various spinoffs, and how many of these groups act cult-like and use cult-like tactics, are what make me think this.

So it is worse in a way.

[-] Architeuthis@awful.systems 5 points 8 hours ago

There's also the communal living, the workplace polyamory along with the prominence of the consensual non-consensual kink, the tithing of the bulk of your earnings and the extreme goals-justify-the-means moralising, the emphasis on psychedelics and prescription amphetamines, and so on and so forth.

Meaning, while calling them a cult incubator is actually really insightful and well put, I have a feeling that the closer you get to TESCREAL epicenters like the SFB, the more explicitly culty things start to get.

[-] Soyweiser@awful.systems 4 points 8 hours ago* (last edited 8 hours ago)

Yeah, but TESCREAL is a name we give them; they organise themselves into different groups (which do fit the term, yes). They each take up different parts of the TESCREAL bundle, but it all ends up in culty behaviour, just a different cult each time.

Btw, see also the love bombing with Quantum Scott. There were also the weird LW people who ended up protesting other LW people in a crazy way (didn't it involve robes or something? I don't recall much). Or calling Scottstar the rightful caliph when Yud was posting less.

So my point is more that they morph into different cults, and I wonder how much they use this lack of a singular cult to claim they aren't a cult. Or whatever rot13ed word they used for cult.

E: not that all this really matters in the grand scheme of things, just a personal hangup.

[-] sailor_sega_saturn@awful.systems 6 points 7 hours ago* (last edited 7 hours ago)

whatever rot13ed word they used for cult.

It's impossible to read a post here without going down some weird internet rabbit hole, isn't it? This is totally off topic, but I was reading the comments on this old phyg post, and one of the comments said (seemingly seriously):

It's true that lots of Utilitarianisms have corner cases where they support action that would normally be considered awful. But most of them involve highly hypothetical scenarios that seldom happen, such as convicting an innocent man to please a mob.

And I'm just thinking: riight, highly hypothetical.
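
Side note for anyone who missed the joke: "phyg" is just "cult" passed through ROT13, the shift-each-letter-by-13 encoding that LWers apparently adopted to avoid writing the word outright. A quick sketch:

```python
import codecs

# ROT13 shifts each letter 13 places; since 13 + 13 = 26,
# applying it twice round-trips back to the original.
print(codecs.encode("cult", "rot13"))  # -> phyg
print(codecs.encode("phyg", "rot13"))  # -> cult
```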

[-] istewart@awful.systems 6 points 10 hours ago

It is a peculiar sort of faith movement, where the central devotional practice is wandering around pulling made-up probability estimates out of one's ass

[-] froztbyte@awful.systems 2 points 8 hours ago

and then posting walls of text about them not merely burying the lede but quite fully conspiring to eliminate the evidence and all witnesses in the same go, as a starting condition

[-] Architeuthis@awful.systems 11 points 17 hours ago* (last edited 17 hours ago)

It's complicated.

It's basically a forum created to venerate the works and ideas of that guy who, in the first wave of LLM hype, had an editorial published in TIME calling for a worldwide moratorium on AI research and GPU sales, to be enforced with unilateral airstrikes. Its core audience got there by being groomed by one of the most obnoxious Harry Potter fanfictions ever written, by said guy.

Their function these days tends to be to provide an ideological backbone of bad scifi justifications for deregulation and the billionaire takeover of the state, which among other things has made them hugely influential in the AI space.

They are also communicating vessels with Effective Altruism.

If this piques your interest, check the links in the sidebar.

[-] SubArcticTundra@lemmy.ml 3 points 14 hours ago

They are also communicating vessels with Effective Altruism.

I have a basic understanding of what EA is but what do you mean by communicating vessels?

[-] Architeuthis@awful.systems 5 points 10 hours ago* (last edited 9 hours ago)

EA started as an offshoot of LessWrong, and LW-style rationalism is still the main gateway into EA, as it's pushed relentlessly in those circles; EA in turn contributes vast amounts of money back into LW goals. The air-strikes-against-datacenters guy is basically bankrolled by Effective Altruism, and he's also the reason EA considers magic AIs (so-called Artificial Super Intelligences) by far the most important risk to humanity's existence; they consider climate change mostly survivable and thus of far less importance, for instance.

Needless to say, LLM peddlers loved that (when they aren't already LW/EA or adjacent themselves, like the previous OpenAI administrative board before Altman and Microsoft took over). Edit: also the founders of Anthropic.

Basically you can't discuss one without referencing the other.

[-] zbyte64@awful.systems 10 points 23 hours ago

They're basically fanboys of whatever cult has most recently come out of Silicon Valley.
