this post was submitted on 26 Dec 2025

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]


It's almost the end of the year, so most US nonprofits that want to remain nonprofits have filed Form 990 for 2024, including some run by our dear friends. This is a mandatory financial report.

  • Lightcone Infrastructure is here. They operate LessWrong and the Lighthaven campus in Berkeley but list no physical assets; someone on Reddit says that they let fellow travelers like Scott Alexander use their old rented office for free. "We are a registered 501(c)3 and are IMO the best bet you have for converting money into good futures for humanity." They also published a book and website with common-sense, data-based advice for Democratic Party leaders which I am sure fills a gap in the literature. https://decidingtowin.org/
  • CFAR is here. They seem to own the campus in Berkeley ("Land, buildings, and equipment ... less depreciation; $22,026,042"). I don't know what else they do since they stopped teaching rationality workshops in 2016 or so and pivoted to worrying about building Colossus.
  • MIRI is here. They pay Yud ($599,970 in 2024!) and after failing to publish much research on how to build Friend Computer they pivoted to arguing that Friend Computer might not be our friend. Edit: they had about $16 million in mostly financial assets (cash, investments, etc.) at end of year but spent $6.5m against $1.5m of revenue in 2024. Since 2021 they have been consuming a $25 million donation they received that year.
  • BEMC Foundation is here. This husband-and-wife organization gives about $2 million/year each to Vox Future Perfect and GiveWell out of tens of millions of dollars in capital.
  • The Clear Fund (GiveWell) is here. They have the biggest wad of cash and the highest cashflow.
  • Edit: Open Philanthropy (now Coefficient Giving) is here (they have two sister organizations). David Gerard says they are mainly a way for Dustin Moskovitz, the co-founder of Facebook, to organize donations, like the Gates, Carnegie, and Rockefeller foundations. They used to fund Lightcone.
  • Edit: Animal Charity Evaluators is here. They have funded Vox Future Perfect (in 2020-2021) and the longtermist kind of animal welfare ("if humans eating pigs is bad, isn't whales eating krill worse?").
  • Edit: Survival and Flourishing Fund does not seem to be a charity. A Lightcone staffer says that SFF funds Lightcone, but SFF says it just connects applicants to donors and evaluates grant applications. So who exactly is providing the money? Sometimes it's Jaan Tallinn of Skype and Kazaa.
  • Centre for Effective Altruism is mostly British but has a US wing since March 2025 https://projects.propublica.org/nonprofits/organizations/333737390
  • Edit: Giving What We Can seems like a mainstream "bednets and deworming pills" type of charity.
  • Edit: GiveDirectly Inc is an excellent idea in principle (give money to poor people overseas and let them figure out how best to use it), but their auditor flagged them for material noncompliance and a material weakness in internal controls. The mistakes don't seem sinister (they classified $39 million of donations as conditional rather than unconditional, i.e. with more restrictions than they actually had). GiveDirectly, Giving What We Can, and GiveWell are all much better funded than the core LessWrong organizations.

Since CFAR seems to own Lighthaven, it's curious that Lightcone head Oliver Habryka threatens to sell it if Lightcone shuts down. One might almost imagine that the boundaries between all these organizations are not as clear as the org charts suggest. SFGate says that it cost $16.5 million plus renovations:

Who are these owners? The property belongs to a limited liability company called Lightcone Rose Garden, which appears to be a stand-in for the nonprofit Center for Applied Rationality and its project, Lightcone Infrastructure. Both of these organizations list the address, 2740 Telegraph Ave., as their home on public filings. They’ve renovated the inn, named it Lighthaven, and now use it to host events, often related to the organizations’ work in cognitive science, artificial intelligence safety and “longtermism.”

Habryka was boasting about the campus in 2024 and said that Lightcone budgeted $6.25 million on renovating the campus that year. It also seems odd for a nonprofit to spend money renovating a property that belongs to another nonprofit.

On LessWrong Habryka also mentions "a property we (Lightcone) own right next to Lighthaven, which is worth around $1M." Lightcone's 2024 paperwork listed the only assets as cash and accounts receivable. So either they are passing around assets like the last plastic cup at a frat party, or they bought this recently while the dispute with the trustees was ongoing, or Habryka does not know what his organization actually owns.

The California end seems to be burning money, as many movements with apocalyptic messages and inexperienced managers do. Revenue was significantly less than expenses, and CFAR's assets are close to its liabilities. CFAR/Lightcone do not have the $4.9 million in liquid assets which the FTX trustees want back, and they claim their escrow company lost another $1 million of FTX's money.

[–] dgerard@awful.systems 12 points 1 day ago* (last edited 1 day ago) (2 children)

I've noted on Reddit sneerclub previously that this is not an outrageous sort of amount for a Bay Area nonprofit to pay a C-level or a high level specialist. We might look and go "he's a high level specialist in dumb nonsense", but it's not a facially outrageous number. Those region 2 DVDs aren't going to buy themselves, you know.

[–] Architeuthis@awful.systems 6 points 22 hours ago* (last edited 22 hours ago) (2 children)

Still, it merits pointing out that this explicitly isn't happening because the private sector is clamoring to get some of that EY expertise on nothing the moment he's available, but because MIRI is for all intents and purposes a gravy train for a small set of mutual acquaintances who occasionally have a board meeting to decide how much they'll get paid that year.

transcription

The way it actually works is that I'm on the critical path for our organizational mission, and paying me less would require me to do things that take up time and energy in order to get by with a smaller income. Then, assuming all goes well, future intergalactic civilizations would look back and think this was incredibly stupid; in much the same way that letting billions of person-containing brains rot in graves, and humanity allocating less than a million dollars per year to the Singularity Institute, would predictably look pretty stupid in retrospect. At Singularity Institute board meetings we at least try not to do things which will predictably make future intergalactic civilizations think we were being willfully stupid. That's all there is to it, and no more.

This is from back when MIRI, then Singularity Institute, was paying him like $120K/y -- https://www.lesswrong.com/posts/qqhdj3W3vSfB5E9ss/siai-an-examination?commentId=4wo4bD9kkA22K5exH#4wo4bD9kkA22K5exH

[–] blakestacey@awful.systems 8 points 21 hours ago (1 children)

A downvoted reply:

That is rather peculiar reasoning to hear from you. You seem to be acting with a level of self-importance that would only be justified if there will be some future being that will torture trans-Singularity trans-humans for not having done enough to accelerate the onset of the Singularity.

And that's just stupid.

[–] lagrangeinterpolator@awful.systems 8 points 19 hours ago (1 children)

I went deep into the Yud lore once. A single fluke SAT score served as the basis for Yud's belief in his own world-changing importance. In middle school he took the SAT, scoring 670 verbal and 740 math (out of a maximum 800 each), and the Midwest Talent Search contacted him to tell him that his scores were very high for a middle schooler. Despite his great pains to describe how he tried to be humble about it, he also says that he was in the "99.9998th percentile" and "not only bright but waayy out of the ordinary."

I was in the math contest scene. I have good friends who did well on AP Calculus in middle school, and were skilled enough at contests that they would have easily gotten an 800 on the math SAT if they took it. Even so, there were middle schoolers who were far more skilled than them, and I have seen other people who were far less "talented" in middle school rise to great heights later in life. As it turns out, skills can be developed through practice.

Yud's performance would not even be considered impressive in the math contest community, let alone justify calling him one of the most important people in the world. Perhaps at the time, he didn't know better. But he decided to make this a core part of his self-identity. His life quickly spiraled out of control, starting with him refusing to attend high school.

[–] CinnasVerses@awful.systems 4 points 18 hours ago (1 children)

I want to steer this conversation away from psychology to say that this scene still seems to depend on ten or so wealthy patrons (Thiel, Moskovitz, Buterin, Tallinn, SBF, an anonymous crypto donor, McClave). Owners of social media sites like Twitter and Substack find them amusing too. Without that money and media backing they would just be another bohemian social movement.

[–] blakestacey@awful.systems 3 points 18 hours ago (1 children)

I mean, that's probably more patrons than Dimes Square

[–] CinnasVerses@awful.systems 3 points 16 hours ago

I am blissfully ignorant about social scenes in NYC, just as people in NYC are ignorant of scenes in the city and country where I live.

[–] scruiser@awful.systems 6 points 20 hours ago

this explicitly isn’t happening because the private sector is clamoring to get some of that EY expertise

I mean, Peter Thiel might like him to bend the knee and I'm sure OpenAI/Anthropic would love to have him as a shill, idk if they'd actually pay 600K for it. Also it would be a betrayal of every belief about AI Eliezer claims to have, so in principle it really shouldn't take lucrative compensation to keep him from it.

paying me less would require me to do things that take up time and energy in order to get by with a smaller income

Well... it is an improvement on cults making their members act as the leader's servants/slaves because the leader's time/effort is allegedly so valuable!

[–] Soyweiser@awful.systems 8 points 1 day ago

I'm in the wrong racket