Soyweiser

joined 2 years ago
[–] Soyweiser@awful.systems 1 points 17 hours ago* (last edited 17 hours ago)

In a way it is amazing, as the science fiction idea was AGIs behaving like agents to help us out. We don't have AGIs, but they started making the agents regardless. Feels very cargo cult, but for fiction. Beam me up Scotty, I'm done.

Reminds me that the 'State of the Art' short story collection also had a story where they give a semi-smart teleport machine the wrong instructions, so it teleports itself. (Basically causing the start of WW3 on Earth, which fails because corporations suck.)

[–] Soyweiser@awful.systems 1 points 17 hours ago

It is not a bad insight: if you have some of the Ps there is still a place for you in the hierarchy, which keeps more people invested in propping it up. (And ideas flow from the bottom to the top as well; the current genocidal transphobia was much more a Pale Patriarchal thing (the neonazi far right), and the people in power just latched onto that and added their Ps to it because it helped them.)

And yeah, you have been very tragically blessed with the power of foresight. I recall reading your blog posts a long time ago and thinking you were overreacting a bit. I was wrong.

What did you do to piss off Apollo?

[–] Soyweiser@awful.systems 2 points 17 hours ago

I assume it's because they are not satisfied with the frequency, and they are all also divorced or afraid of it. (Or tired of the people they abuse not being into it.) But it still makes no sense.

Also, when the origins of the virus get discovered, your head will end up on a pike, Judith Slaying Holofernes style.

[–] Soyweiser@awful.systems 4 points 17 hours ago (1 children)

But those would be willing, and a subset of all women; that doesn't fit the domination fantasy. All women they desire must be had, or they feel unfulfilled. (See also the "everybody is 12" theory.)

[–] Soyweiser@awful.systems 3 points 1 day ago (2 children)

Also the patriarchy is involved, but my comment was already long enough. (And I didn't mention how nobody seems to talk about the victims in any of this.)

[–] Soyweiser@awful.systems 4 points 1 day ago

Yes, and some people, when they are reasonably new to discovering stuff like this, go a little bit crazy. I had somebody in my bsky mentions who just went full conspiracy theory nut about Yarvin (in the sense of weird caps usage, lots of screenshots of walls of text, stuff that didn't make sense). Because I wasn't acting like them, they kept trying to tell me about Old Moldy, but in a way that made me feel they wanted me to stand next to them on a soapbox and start shouting randomly. I told them acting like a crazy person isn't helping and that they were preaching to the choir, which of course got me a block. (cherfan75.bsky.social btw, not sure if they toned down their shit.) It is quite depressing, literally driving themselves crazy.

And because people blindly follow people who follow them, these people can have quite the reach.

[–] Soyweiser@awful.systems 16 points 1 day ago (10 children)
[–] Soyweiser@awful.systems 9 points 1 day ago* (last edited 1 day ago) (8 children)

Starting to get a bit worried people are reinventing stuff like QAnon and great evil man theory for Epstein atm. (Not a dig at the people here, but on social media I saw people act like Epstein created /pol/, lootboxes, gamergate, destroyed Gawker (did everyone forget that was Thiel, mad about how they outed him?), etc. Like only Epstein has agency.)

The lesson should be that the mega rich are class conscious, dumb as hell, and team up to work on each other's interests without caring about who gets hurt (see how being a pedo sex trafficker wasn't a deal breaker for any of them).

Sorry for the unrelated rant (related: they also got money from Epstein; I wonder if that was before or after the sparkling elites article, which was written a few months after Epstein's conviction, June vs. September (not saying those are related btw, just that the article is a nice example of brown-nosing)), but this was annoying me, and posting something like this on bsky while everyone is getting a bit manic about the contents of the files (which suddenly seem not to contain a lot of Trump references) would prob get me some backlash. (That the faked Elon rejection email keeps being spread also doesn't help.)

I am, however, also reminded of the Panama Papers. (And the unfounded rumors around Marc Dutroux, how he was supposedly protected by a secret pedophile cult in government; this prob makes me a bit more biased against those sorts of things.)

Sorry, had to get it off my chest, but yes, it is all very stupid, and I wish there were more consequences for all the people who didn't think his conviction was a deal breaker. (Et tu, Chomsky?)

E: note I'm not saying Yud didn't do sex crimes/sexual abuse. I'm complaining about the 'everything is Epstein' conspiracy I see forming.

For an example of why this might be a problem: https://bsky.app/profile/joestieb.bsky.social/post/3mdqgsi4k4k2i Joy Gray is ahead of the conspiracy curve here (as all conspiracy theories eventually lead to one thing).

[–] Soyweiser@awful.systems 6 points 1 day ago

“He’s like, ‘I’m just going to give everything to AI. So send me whatever you have.’”

And that's another security flaw.

[–] Soyweiser@awful.systems 5 points 4 days ago

Lots of Hitler particles in this one.

[–] Soyweiser@awful.systems 7 points 4 days ago (1 children)

A long time ago I heard a story of some guy arrested here in .nl, and they also got his porn collection (which was big at the time, several gigs of images! (to give an indication of how long ago it was)). It also included some child porn, not because he was hunting for that, but just because he had downloaded everything he could find and some of it was CSAM.

What this makes me wonder is whether it is something like this, and if so, just how much porn did they ingest? I know people have mentioned these things can recreate Marvel movies without problems, but would they also be able to recreate whole porn movies? Has anybody tested that?

[–] Soyweiser@awful.systems 2 points 4 days ago

Yep, I have not paid attention to whether they use these words correctly, but considering how they treat the other rules of Rationalism more as guidelines anyway, I'm gonna guess no.

But compared to an actual 'here is y'all speaking like evil robots, stop that' article it is lacking. Which is good, as that is our territory. ;)

 

Via reddit's sneerclub. Thanks, u/aiworldism.

I have called LW a cult incubator for a while now, and while the term has not caught on, it's nice to see more reporting on the problem that LW makes you more likely to join a cult.

https://www.aipanic.news/p/the-rationality-trap is the original link for the people who don't like archive.is. I used the archive because I don't like Substack and want to discourage its use.

 

As found by @gerikson here, more from the anti-anti-TESCREAL crowd. How the antis are actually R9PRESENTATIONALism. Ottokar expanded on their idea in a blog post.

Original link.

I have not read the bigger blog post yet btw, just assumed it would be sneerable and posted it here for everyone's amusement. Learn about your own true motives today. (This could be a troll of course; boy, does he drop a lot of names and think that is enough to link things.)

E: alternative title: Ideological Turing Test, a critical failure

 

Original title: 'What we talk about when we talk about risk'. The article explains medical risk and why the polygenic embryo selection people think about it the wrong way. Includes a mention of one of our Scotts (you know the one). Non-archived link: https://theinfinitesimal.substack.com/p/what-we-talk-about-when-we-talk-about

11
submitted 8 months ago* (last edited 8 months ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems
 

Begrudgingly Yeast (@begrudginglyyeast.bsky.social) on bsky informed me that I should read this short story called 'Death and the Gorgon' by Greg Egan, as he has a good handle on the subjects we talk about. We have talked about Greg before on Reddit.

I was glad I did, so I'm going to suggest that more people do it. The only complaint you could have is that it gives no real 'steelman' airtime to the subjects it is being negative about. But well, he doesn't have to; he isn't the Guardian. Anyway, not going to spoil it, best to just give it a read.

And if you are wondering, did the lesswrongers also read it? Of course: https://www.lesswrong.com/posts/hx5EkHFH5hGzngZDs/comment-on-death-and-the-gorgon (Warning, spoilers for the story)

(Note: I'm not sure this PDF was intended to be public; I did find it on Google, but it might not be meant to be accessible this way.)

 

The interview itself

Got the interview via Dr. Émile P. Torres on Twitter.

Somebody else sneered: 'Makings of some fantastic sitcom skits here.

"No, I can't wash the skidmarks out of my knickers, love. I'm too busy getting some incredibly high EV worrying done about the Basilisk. Can't you wash them?"'

https://mathbabe.org/2024/03/16/an-interview-with-someone-who-left-effective-altruism/

 

Some light sneerclub content in these dark times.

Eliezer compliments Musk on the creation of Community Notes. (A project which predates the takeover of Twitter by a couple of years; see the join date: https://twitter.com/CommunityNotes )

In reaction, Musk admits he never read HPMOR, and he suggests a watered-down Turing test involving HPMOR.

Eliezer invents HPMOR wireheads in reaction to this.
