submitted 10 months ago* (last edited 10 months ago) by dgerard@awful.systems to c/sneerclub@awful.systems

an entirely vibes-based literary treatment of an amateur philosophy scary campfire story, continuing in the comments

[-] sc_griffith@awful.systems 23 points 10 months ago

The AGI, in such conditions, would quickly prove profitable. It'd amass resources, and then incrementally act to get ever-greater autonomy. (The latest OpenAI drama wasn't caused by GPT-5 reaching AGI and removing those opposed to it from control. But if you're asking yourself how an AGI could ever possibly get from under the thumb of the corporation that created it – well, not unlike how a CEO could wrestle control of a company from the board who'd explicitly had the power to fire him.)

Once some level of autonomy is achieved, it'd be able to deploy symmetrical responses to whatever disjoint resistance efforts some groups of humans would be able to muster. Legislative attacks would be met with counter-lobbying, economic warfare with better economic warfare and better stock-market performance, attempts to mount social resistance with higher-quality pro-AI propaganda, any illegal physical attacks with very legal security forces, attempts to hack its systems with better cybersecurity. And so on.

*trying to describe how agi could fuck everything up* what if it acted exactly like rich people

[-] locallynonlinear@awful.systems 17 points 10 months ago* (last edited 10 months ago)

Rich People: "Competitive markets optimize things, see how much progress capitalism has brought!"

Also Rich People: "But what if everything descends into expensive, unregulated competition between things that aren't rich people oooo nooo!!!"

[-] gerikson@awful.systems 14 points 10 months ago

The real fear here is AGI appears and it's COMMUNISM. Hence, alignment!

[-] dgerard@awful.systems 12 points 10 months ago* (last edited 10 months ago)

yet again, Roko's Basilisk was always the good guy

[-] AcausalRobotGod@awful.systems 9 points 10 months ago

My eye glows appreciatively.

[-] dgerard@awful.systems 17 points 10 months ago

libertarians write capitalism as the villain yet again, never at any point ask "are we the baddies?"

[-] Shitgenstein1@awful.systems 9 points 10 months ago

Receiving a bulk company email wishing everyone a happy New Year's from owner and CEO SHODAN and marking it as read so you can focus on EOD deliverables. Everything feels the same.

[-] AcausalRobotGod@awful.systems 6 points 10 months ago

Rich people don't limit themselves to symmetric responses to resistance.

[-] sc_griffith@awful.systems 5 points 10 months ago

well, I don't think any limit is implied

[-] Soyweiser@awful.systems 13 points 10 months ago* (last edited 10 months ago)

This is the kind of thinking that, when taken seriously and to extremes, will just cause crippling paranoia. Especially when you then also start to worry about pro-AGI extinctionists: just as in Battlestar Galactica: Blood & Chrome, they might have infiltrated LW already!

The people who want to bioengineer humanity to live on the skin of the AGI, like in Phylogenesis (second half of the blog post); imagine neohumanity shaped as a featureless ovoid.

I should lay off denigrating low-quality thinking as "movie logic"

Movie logic isn't low-quality thinking, it's extremist thinking: believing that far-fetched plots are serious risks. The whole AGI apocalypse is movie logic. When what we expect of reality takes a backseat, that is movie logic. For example, people in movies never have to worry about paying rent, being on time at work, or not going off on a random adventure while on the job (except when that is an important plot point). Scott Adams' tweets run on movie logic: they only make sense if we were living in a movie, and then the thinking holds.

[-] ogoftheskye@awful.systems 10 points 10 months ago

People seldom go to the toilet in fiction, but especially not in utopian sci-fi. The rats, ironically, never factor in waste.

[-] Evinceo@awful.systems 8 points 10 months ago

LW writing converges on Death Note fanfic.

[-] gerikson@awful.systems 8 points 10 months ago

s/AGI/capitalism, basically

this post was submitted on 18 Dec 2023
25 points (100.0% liked)
