It's the Guardian, but it's still a good read. All of Sneerclub's favorite people were involved.

Last weekend, Lighthaven was the venue for the Manifest 2024 conference, which, according to the website, is “hosted by Manifold and Manifund”. Manifold is a startup that runs a prediction market – a forecasting method that was the ostensible topic of the conference.

Prediction markets are a long-held enthusiasm in the EA and rationalism subcultures, and billed guests included personalities like Scott Siskind, AKA Scott Alexander, founder of Slate Star Codex; misogynistic George Mason University economist Robin Hanson; and Eliezer Yudkowsky, founder of the Machine Intelligence Research Institute (Miri).

Billed speakers from the broader tech world included the Substack co-founder Chris Best and Ben Mann, co-founder of AI startup Anthropic. Alongside these guests, however, were advertised a range of more extreme figures.

One, Jonathan Anomaly, published a paper in 2018 entitled Defending Eugenics, which called for a “non-coercive” or “liberal eugenics” to “increase the prevalence of traits that promote individual and social welfare”. The publication triggered an open letter of protest by Australian academics to the journal that published the paper, and protests at the University of Pennsylvania when he commenced working there in 2019. (Anomaly now works at a private institution in Quito, Ecuador, and claims on his website that US universities have been “ideologically captured”.)

Another, Razib Khan, saw his contract as a New York Times opinion writer abruptly withdrawn just one day after his appointment had been announced, following a Gawker report that highlighted his contributions to outlets including the paleoconservative Taki’s Magazine and anti-immigrant website VDare.

The Michigan State University professor Stephen Hsu, another billed guest, resigned as vice-president of research there in 2020 after protests by the MSU Graduate Employees Union and the MSU student association accusing Hsu of promoting scientific racism.

Brian Chau, executive director of the “effective accelerationist” non-profit Alliance for the Future (AFF), was another billed guest. A report last month catalogued Chau’s long history of racist and sexist online commentary, including false claims about George Floyd, and the claim that the US is a “Black supremacist” country. “Effective accelerationists” argue that human problems are best solved by unrestricted technological development.

Another advertised guest, Michael Lai, is emblematic of tech’s new willingness to intervene in Bay Area politics. Lai, an entrepreneur, was one of a slate of “Democrats for Change” candidates who seized control of the powerful Democratic County Central Committee from progressives, who had previously dominated the body that confers endorsements on candidates for local office.

[-] dgerard@awful.systems 25 points 5 months ago* (last edited 5 months ago)

this is Lightcone, hosts of the totally not race science convention, falling afoul of the FTX bankruptcy

I’m not quoted in the story, but I did supply a pile of background for it. Authors are Jason Wilson and Ali Winston, who spend a lot of time chasing neo-Nazis for the Guardian US.

original URL: https://www.theguardian.com/technology/article/2024/jun/16/sam-bankman-fried-ftx-eugenics-scientific-racism

[-] sailor_sega_saturn@awful.systems 19 points 5 months ago

We offer cozy nooks with firepits, discussion rooms with endless whiteboards, and up to 44 bedrooms (with up to 80 beds).

Not a cult.

Lighthaven is a space dedicated to hosting events and programs that help people think better and to improve humanity's long-term trajectory.

Definitely not a cult.

Humanity's future could be vast, spanning billions of flourishing galaxies, reaching far into our future light cone [1]. However, it seems humanity might never achieve this; we might not even survive the century. To increase our odds, we build services and infrastructure for people who are helping humanity navigate this crucial period.

[1]: Or even more than the light cone, depending on how the acausal trade stuff works out.

Have we mentioned how very much not a cult we are?

[-] dgerard@awful.systems 13 points 5 months ago* (last edited 5 months ago)

The services and infrastructure: hosting a web forum

edit: sorry, three web forums

[-] sue_me_please@awful.systems 5 points 5 months ago

Thank you for doing god's work

[-] mawhrin@awful.systems 4 points 5 months ago* (last edited 5 months ago)

that was noticed by the gobshites and they're not happy about it, i think the tracingwoodgrains person really dislikes you:

I respect that and agree that those comments cross a line that should not be crossed. I'm sympathetic to the value of red lines and taboos, and I regularly put active effort into defending the sentiment that racism is bad and should be condemned (though I am extremely cautious about tabooing people as a whole based on specific bad sentiments).

It's more complicated for me here because as mentioned above, I find Hanania's commentary on other topics unusually valuable and think I have had valuable, worthwhile interactions with him such that I am glad for opportunities to do so.

More than that, I am conscious that many who most eagerly pursue the taboo, including the writers of the Guardian article and people like David Gerard who provided background for it openly despise you, me, and others in these spheres, and given taboo-crafting power would craft a set of norms emphatically disagreeable to me. I think parts of the EA community have themselves shown some susceptibility to similar impulses, throwing people like Nick Bostrom under the bus to do so. That post in particular actively made me more wary of EA spaces and left me wondering who else would be skewered.

The individual who wrote that post no longer works at CEA but openly demands that EA cut ties with the entire rationalist community. I like you and broadly trust your own instincts here, even where we might disagree about where to draw specific lines, but I am extremely wary of yielding norm-setting power to people who treat my approach (engaging seriously with anyone) as worthy of suspicion and condemnation, and I think when they succeed in setting the frame, it works against a lot of the rationalist and rationalist-adjacent community norms I value.

(i find it symptomatic, but not at all surprising that the person who criticised bostrom is not with the movement anymore, but scientific racists and hbd-curious fuckers like tracing… are.)

[-] bcdavid@hachyderm.io 6 points 5 months ago

@mawhrin @dgerard It never ceases to amaze me how anyone can read the word vomit these people fling into the world and think it's good writing.

[-] mawhrin@awful.systems 5 points 5 months ago

it's the kind of very dense cult jargon that you stop noticing only when you're ears-deep in the cult.

[-] dgerard@awful.systems 4 points 5 months ago* (last edited 5 months ago)

I can't work out a search to tell me for sure, but I do believe that's the first link to nu-sneerclub from anywhere on the three rationalist fora

given taboo-crafting power would craft a set of norms emphatically disagreeable to me. I think parts of the EA community have themselves shown some susceptibility to similar impulses, throwing people like Nick Bostrom under the bus to do so.

i'm sure there's a reading of this string of TW conspicuously avoiding saying the specific thing he's talking about that isn't "TW considers the racism a load-bearing feature", and he'll clarify this any time now

[-] gerikson@awful.systems 5 points 5 months ago

I can’t work out a search to tell me for sure, but I do believe that’s the first link to nu-sneerclub from anywhere on the three rationalist fora

Senpais have noticed us!

[-] Evinceo@awful.systems 3 points 5 months ago

Ain't lightcone the ones who funded the effective charity that was a husband, a wife, two employees and a brother in law who fucked an employee, angering the wife? I seem to remember her writing a long tirade about how hot tub meetings and travel photos proved that working conditions at the charity were very good, and there's nothing inappropriate about any of the above.

[-] dgerard@awful.systems 6 points 5 months ago
[-] Evinceo@awful.systems 3 points 5 months ago

Ah, I think I was confused because Ben Pace was investigating nonlinear under the auspices of Lightcone.

this post was submitted on 16 Jun 2024
80 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]
