
At various points, on Twitter, Jezos has defined effective accelerationism as “a memetic optimism virus,” “a meta-religion,” “a hypercognitive biohack,” “a form of spirituality,” and “not a cult.” ...

When he’s not tweeting about e/acc, Verdon runs Extropic, which he started in 2022. Some of his startup capital came from a side NFT business, which he started while still working at Google’s moonshot lab X. The project began as an April Fools joke, but when it started making real money, he kept going: “It's like it was meta-ironic and then became post-ironic.” ...

On Twitter, Jezos described the company as an “AI Manhattan Project” and once quipped, “If you knew what I was building, you’d try to ban it.”

[-] dgerard@awful.systems 21 points 1 year ago

the wonderful thing about this story is the effort Forbes went to to dox a nazi

[-] Evinceo@awful.systems 12 points 1 year ago

I spent way too much time arguing that NYT didn't dox Slatescott.

[-] GorillasAreForEating@awful.systems 12 points 1 year ago* (last edited 1 year ago)

I highly suspect the voice analysis thing was just to confirm what they already knew, otherwise it would have been like looking for a needle in a haystack.

People on twitter have been speculating that someone who knew him simply ratted him out.

[-] dgerard@awful.systems 14 points 1 year ago

i mean, probably. but also, nazis are just dogshit at opsec.

I still find it amusing that Siskind complained about being "doxxed" when he used his real first and middle name.

[-] GorillasAreForEating@awful.systems 9 points 1 year ago* (last edited 1 year ago)

update: Verdon is now accusing another AI researcher of exposing him: https://twitter.com/GillVerd/status/1730796306535514472

[-] self@awful.systems 17 points 1 year ago

posting a screenshot to preserve the cringe in its most potent form:

yeah BasedBeffJezos is just an ironic fascist persona that has nothing to do with who I am, that’s why I’m gonna threaten anyone who associates me with BasedBeffJezos

[-] maol@awful.systems 11 points 1 year ago

Whoever said that all twitter bluechecks talk like anime villains was spot on

[-] self@awful.systems 11 points 1 year ago

you might not like the consequences for exposing me as BasedEyesWHITEdragon yu-gi-boy!!!

[-] gerikson@awful.systems 9 points 1 year ago

Dude doxxed protest too much.

[-] TinyTimmyTokyo@awful.systems 9 points 1 year ago

Reading his timeline since the revelation is weird and creepy. It's full of SV investors robotically pledging their money (and fealty) to his future efforts. If anyone still needs evidence that SV is a hive mind of distorted and dangerous group-think, this is it.

[-] sc_griffith@awful.systems 21 points 1 year ago

He noted that Jezos doesn’t reflect his IRL personality. “The memetics and the sort-of bombastic personality, it's what gets algorithmically amplified,” he said, but in real life, “I’m just a gentle Canadian.”

uwu im just a smollbean canadian

[-] self@awful.systems 19 points 1 year ago

Jezos has defined effective accelerationism as “a memetic optimism virus,” “a meta-religion,” “a hypercognitive biohack,” “a form of spirituality,” and “not a cult.”

“It’s like it was meta-ironic and then became post-ironic.”

Jezos described the company as an “AI Manhattan Project” and once quipped, “If you knew what I was building, you’d try to ban it.”

“Our goal is really to increase the scope and scale of civilization as measured in terms of its energy production and consumption,” he said. Of the Jezos persona, he said: “If you're going to create an ideology in the time of social media, you’ve got to engineer it to be viral.”

Guillaume “BasedBeffJezos” Verdon appears, by all accounts, to be an utterly insufferable shithead with no redeeming qualities

[-] GorillasAreForEating@awful.systems 24 points 1 year ago* (last edited 1 year ago)

“Our goal is really to increase the scope and scale of civilization as measured in terms of its energy production and consumption,” h

old and busted: paperclip maximizer

new hotness: entropy maximizer

[-] Soyweiser@awful.systems 18 points 1 year ago* (last edited 1 year ago)

you’ve got to engineer it to be viral.

All this attention including the whole Andreessen thing and he doesn't go above 50k followers. As far as virality goes that is pretty bad.

Also from his twitter: "We literally became a sufficient threat to the system that they felt compelled to attempt to neutralize me.

The thing is, I am a man of belief. You can take everything from me. I don't care. I am going to keep going until I'm dead.

You cannot stop acceleration."

Sure buddy the system did that. Keep riding the wave, in five years we will all gather on a steep hill in Las Vegas and look west.

[-] self@awful.systems 14 points 1 year ago* (last edited 1 year ago)

imagine huffing your own farts this hard because you came up with BasedBeffJezos and posted garbage on Twitter that was so out of touch that every monstrous billionaire instantly agreed with it

[-] self@awful.systems 16 points 1 year ago

oh also, "meta-ironic cult" is a term used by Remilia and some of the other Thiel death cults to describe themselves

[-] froztbyte@awful.systems 11 points 1 year ago

"to any feds reading my feed: jk jk"

[-] blakestacey@awful.systems 9 points 1 year ago

A fart-huff that hard qualifies as an Alvistime miracle.

[-] froztbyte@awful.systems 9 points 1 year ago

“As measured in terms of its energy production and consumption”

That’s so extremely fucking insane, jesus. We are already dealing with those issues at the “low” end and they want to fucking accelerate it? Christ these people are the fucking worst

[-] locallynonlinear@awful.systems 10 points 1 year ago

I had kind of the same thought. Woah, maximize long term energy production??? How novel, let's get our best people right on that, thanks for mentioning it, gosh, didn't occur to anyone.

I wonder when it finally occurs to them that the monetary system is literally a proxy for energy production and consumption, and their entire philosophy might as well read: "make more $$$." I'll have to ask the stupid question again, what material difference is there between e/acc, ea, and delusion?

[-] gerikson@awful.systems 10 points 1 year ago

There's a subtype of goldbugs that want "hard money" to be represented by something more universal than gold, like energy. It's why they convince themselves Bitcoin is worth something. Maybe this joker is one of them.

[-] froztbyte@awful.systems 7 points 1 year ago

I mean my personal thoughts on it are nuanced. We can’t exactly just stop everything, and there’s a clear (and valid) demand for some cases of energy use

But we also have some extreme problems already, and we reaaaaally need to get a handle on those too. Just headlong running into using more and more is fucking insane

[-] locallynonlinear@awful.systems 9 points 1 year ago

Yes, I agree. My personal thoughts are also that long term energy maximization is synonymous with regulatory systems and dealing with the complications of energy use. Paradoxically, long term maximization is defeated by any naive short term abuse. Only a naive understanding of physics supports the idea that you can simply produce and use more energy just like that.

Which is why these takes don't mean anything. It's a revelation to want money and do stupid things without consequence.

[-] sailor_sega_saturn@awful.systems 16 points 1 year ago

former Google engineer.

Of course. At this point whenever I read something with the phrase former Google engineer I'm just gonna assume they're doing something terrible.

[-] froztbyte@awful.systems 7 points 1 year ago

q: how do you know if someone's a former google engineer?

a: (xooglers everywhere) AT GOOGLE WE USED TO HAVE A WAY TO...

[-] swlabr@awful.systems 8 points 1 year ago

Ah fuck as a xoogler I do this. Everything I do is terrible (see my advent of code snippets) and I frequently refer to things that existed in the google ecosystem.

[-] m@blat.at 8 points 1 year ago

@swlabr As a ten year veteran of the SRE mines I’ve always tried really hard not to do this, but I did once leave a job partly as a result of the CTO justifying a decision with “But it says here in the SRE book that that’s the way they do this at Google!” and completely ignoring my protestations that god no, that certainly wasn’t how we did it at least in my bit of SRE.

[-] swlabr@awful.systems 7 points 1 year ago

Oh GOD that is kafkaesque. It's been too few jobs since leaving the G for that to happen to me yet, but I'm sure I'll get there one day.

[-] Shitgenstein1@awful.systems 16 points 1 year ago

In its reaction against both EA and AI safety advocates, e/acc also explicitly pays tribute to another longtime Silicon Valley idea. “This is very traditional libertarian right-wing hostility to regulation," said Benjamin Noys, a professor of critical theory at the University of Chichester and scholar of accelerationism. Jezos calls it the “libertarian e/acc path.”

At least the Italian futurists were up front about their agenda.

“We’re trying to solve culture by engineering,” Verdon said. “When you're an entrepreneur, you engineer ways to incentivize certain behaviors via gradients and reward, and you can program a civilizational system."

Reading Nudge to engineer the 'Volksschädling' to board the trains voluntarily. Dusting off the old state eugenics compensation programs.

[-] bitofhope@awful.systems 16 points 1 year ago

The fuck do they mean "solve culture"? Is culture a problem to be solved? Actually don't answer that.

[-] self@awful.systems 10 points 1 year ago

even more horrifying — they see culture as a system of equations they can use AI to generate solutions for, and the correct set of solutions will give them absolute control over culture. they apply this to all aspects of society. these assholes didn’t understand hitchhiker’s guide to the galaxy or any of the other sci fi they cribbed these ideas from, and it shows

[-] 200fifty@awful.systems 10 points 1 year ago

It's like pickup artistry on a societal scale.

It really does illustrate the way they see culture not as, like, a beautiful evolving dynamic system that makes life worth living, but instead as a stupid game to be won or a nuisance getting in the way of their world domination efforts

[-] dgerard@awful.systems 8 points 1 year ago

remember that Yudkowsky's CEV idea was literally to analytically solve ethics

[-] blakestacey@awful.systems 8 points 1 year ago

In an essay that somehow manages to offhandedly mention both evolutionary psychology and hentai anime in the same paragraph.

[-] raktheundead@fedia.io 9 points 1 year ago

The ultimate STEMlord misunderstanding of culture; something absolutely rife in the Silicon Valley tech-sphere.

[-] gerikson@awful.systems 10 points 1 year ago

These dudes wouldn't recognize culture if it unsafed its Browning and shot them in the kneecaps.

[-] froztbyte@awful.systems 9 points 1 year ago

Don’t have to have Culture War when you can just systemically deploy the exact culture you want right from the comfort of your prompt, amirite?!

(This is a shitpost idea but it’s probably halfway accurate, maybe modulo the prompt (but there will definitely be someone also trying that))

[-] dgerard@awful.systems 8 points 1 year ago
[-] gerikson@awful.systems 8 points 1 year ago* (last edited 1 year ago)

The best use case for Urbit is marking its proponents as first up against the wall when the revolution comes.

[-] gerikson@awful.systems 16 points 1 year ago

HN discovers this article, almost a day later (laggards): https://news.ycombinator.com/item?id=38500192

A voice analysis conducted by Catalin Grigoras, Director of the National Center for Media Forensics, compared audio recordings of Jezos and talks given by Verdon

A particularly creepy doxxing by Forbes...

Oh no, are the tools developed by SV startups being used for stuff you don't like? How sad HN.

[-] sc_griffith@awful.systems 13 points 1 year ago

I don't want to libel the author by claiming the piece was planted by Beff himself, but what's more likely, this writer who mostly covers what's trending on tiktok gets a big scoop using voice recognition on a twitter space and then writes a glowing dossier? Or beff got on the phone with a publicist and conjured a big reveal with a softball interview all while namedropping his new startup?

extremely funny that they think this article makes him look good. also extremely funny that they think this is a big scoop

[-] hairyvisionary@fosstodon.org 8 points 1 year ago

@sc_griffith @gerikson
Buried in that piece is the probable typo but certainly pointed "On X, the platform formally known as Twitter"

[-] blakestacey@awful.systems 15 points 1 year ago

my "not a cult" T-shirt has raised many questions, etc.

[-] naevaTheRat@lemmy.dbzer0.com 11 points 1 year ago

That not a cult quote strategically placed after all the cultish babble quotes mwah perfect journalism

[-] rip_art_bell@lemmy.world 7 points 1 year ago

Nerd rapture goofiness

this post was submitted on 02 Dec 2023

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]
