Soyweiser

joined 2 years ago
[–] Soyweiser@awful.systems 5 points 8 months ago

Cat o' nine tails made out of UTP cables, popular at nerdy BDSM events.

(I'm joking, don't do this.)

[–] Soyweiser@awful.systems 7 points 8 months ago (2 children)

In the Netherlands we had a big cybersecurity grifter (who was invited onto talk shows while anybody with actual cybersecurity experience/knowledge went 'wtf is she talking about'; it was really bizarre), who also claimed all her work was being done in secret and she wasn't allowed to talk about it. So think of that when people make these secret national-security claims. (And remember: saying you have a secret which you aren't allowed to reveal is itself revealing something, so if people say this about quantum they are already a bit sus.)

[–] Soyweiser@awful.systems 2 points 8 months ago

If I can't see it, it isn't real. I don't even deny quantum; I deny you, and myself if no mirror is near.

[–] Soyweiser@awful.systems 6 points 8 months ago* (last edited 8 months ago)

My copy of "The Singularity Is Near" also does that, btw.

(E: Still looking to confirm that this isn't just my copy, or if it is common, but when I'm in a library I never think to look for the book, and I don't think I have ever seen the book anywhere anyway. It is the 'our sole responsibility...' quote; no idea which page, but it was early on in the book. 'Yudnowsky'.)

Image and transcript

Transcript: Our sole responsibility is to produce something smarter than we are; any problems beyond that are not ours to solve....[T]here are no hard problems, only problems that are hard to a certain level of intelligence. Move the smallest bit upwards [in level of intelligence], and some problems will suddenly move from "impossible" to "obvious." Move a substantial degree upwards and all of them will become obvious.

—ELIEZER S. YUDNOWSKY, STARING INTO THE SINGULARITY, 1996

Transcript end.

How little has changed; he has always believed intelligence is magic. Also, lol at the 'smallest bit'. Not totally fair to sneer at this, as he wrote it when he was 17, but oof, being quoted in a book like this will not have been good for Yudkowsky's ego.

[–] Soyweiser@awful.systems 8 points 8 months ago

https://bsky.app/profile/robertdownen.bsky.social/post/3lwwntxygqc2w Thiel doing a neo-nazi thing. For people keeping score.

[–] Soyweiser@awful.systems 6 points 8 months ago* (last edited 8 months ago) (1 children)

Interesting. Wondering if they manage to get further in the process than our government, which seems to restart the process every few years and then either discovers nobody wants to do it for a reasonable price (it being building bigger reactors, not the smaller ones, which iirc from a post here are not likely to work out), or the government falls again over their lies about foreigners and we restart the whole voting cycle. (It is getting really crazy; our merged green/labour party is now being called the dumbest stuff by the big right-wing liberal party (who are not openly far right, just courting it a lot).)

29 October is our new election date. Let's see what the ratio between formation and actually ruling is going to be this time. (Last time it took 223 days for a cabinet to form, and by my calculations they ruled for only 336 days.)

[–] Soyweiser@awful.systems 8 points 8 months ago

You are correct; I'm just thinking they are going to push quantum as the next big thing to drive up stock prices/investments and use it to restart the hopes for AGI. (The LLM method didn't work, so let's talk about quantum and hope that will eventually give us something to latch more capabilities, hope, and stock hype onto.) Just to put my own comment into perspective.

[–] Soyweiser@awful.systems 6 points 8 months ago

Proof that we live in the bad place.

[–] Soyweiser@awful.systems 6 points 8 months ago* (last edited 8 months ago) (2 children)

I think they will just start to make up capabilities; also, with the added capabilities of quantum as a computing paradigm, AGI is back on the menu. Now, thanks to quantum, without all the expensive datacenters and problems. We are gonna put quantum in glasses! VR/augmented-reality quantum AI glasses!

[–] Soyweiser@awful.systems 9 points 8 months ago* (last edited 8 months ago) (1 children)

So, as I have been on a cult-comparison kick lately: how did it work out for those doomsday cults when the world didn't end and they picked a new date? Did they become more radicalized or less? (I'm not sure myself; I'd assume the disappointed people leave, and the rest get worse.)

E: ah: https://slate.com/technology/2011/05/apocalypse-2011-what-happens-to-a-doomsday-cult-when-the-world-doesn-t-end.html

... prophecies, per se, almost never fail. They are instead component parts of a complex and interwoven belief system which tends to be very resilient to challenge from outsiders. While the rest of us might focus on the accuracy of an isolated claim as a test of a group’s legitimacy, those who are part of that group—and already accept its whole theology—may not be troubled by what seems to them like a minor mismatch. A few people might abandon the group, typically the newest or least-committed adherents, but the vast majority experience little cognitive dissonance and so make only minor adjustments to their beliefs. They carry on, often feeling more spiritually enriched as a result.

[–] Soyweiser@awful.systems 9 points 8 months ago

Surely they have proof of the already increased coding capabilities. Because 'increased capabilities' is quite a claim. It isn't just productivity, but capabilities. Can they put a line on the graph where capabilities reach the 'can solve the knapsack problem correctly and fast' point?
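(For context, the knapsack problem is the classic NP-hard optimization task the sneer is gesturing at; a minimal sketch of the standard 0/1 dynamic-programming solution, pseudo-polynomial in the capacity, looks like this. The item values and weights are purely illustrative.)

```python
def knapsack(values, weights, capacity):
    """0/1 knapsack via dynamic programming: best[c] = max value achievable at capacity c."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # Iterate capacities downward so each item is counted at most once.
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Illustrative example: three items (value, weight), capacity 5.
print(knapsack([60, 100, 120], [1, 2, 3], 5))  # → 220 (items with weights 2 and 3)
```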

[–] Soyweiser@awful.systems 8 points 8 months ago* (last edited 8 months ago)

Another sign quantum might be next (posted somewhere else: a lot of quantum companies are failing to hit their numbers), Musk is in on it. When the general public finally sours on Musk and realizes he doesn't have the tech skills people claim he has, the backlash will hit big time.

E: https://bsky.app/profile/dankpasta.bsky.social/post/3lwv4igokas2v

 

As found by @gerikson here, more from the anti-anti-TESCREAL crowd: how the antis are actually REPRESENTATIONALism. Ottokar expanded on their idea in a blog post.

Original link.

I have not read the longer blog post yet, btw; I just assumed it would be sneerable and posted it here for everyone's amusement. Learn about your own true motives today. (This could be a troll, of course; boy, does he drop a lot of names and think that is enough to link things.)

E: alternative title: Ideological Turing Test, a critical failure

 

Original title: 'What we talk about when we talk about risk'. The article explains medical risk and why the polygenic embryo selection people think about it the wrong way. Includes a mention of one of our Scotts (you know the one). Non-archived link: https://theinfinitesimal.substack.com/p/what-we-talk-about-when-we-talk-about

 

Begrudgingly Yeast (@begrudginglyyeast.bsky.social) on Bluesky informed me that I should read the short story 'Death and the Gorgon' by Greg Egan, as he has a good handle on the subjects we talk about. We have talked about Greg before on Reddit.

I was glad I did, so I'm going to suggest that more people do it. The only complaint you could have is that it gives no real 'steelman' airtime to the subjects it is being negative about. But well, he doesn't have to; he isn't The Guardian. Anyway, not going to spoil it; best to just give it a read.

And if you are wondering, did the lesswrongers also read it? Of course: https://www.lesswrong.com/posts/hx5EkHFH5hGzngZDs/comment-on-death-and-the-gorgon (Warning, spoilers for the story)

(Note: I'm not sure this pdf was intended to be public. I did find it on Google, but it might not be meant to be accessible this way.)

 

The interview itself

Got the interview via Dr. Émile P. Torres on twitter

Somebody else sneered: 'Makings of some fantastic sitcom skits here.

"No, I can't wash the skidmarks out of my knickers, love. I'm too busy getting some incredibly high EV worrying done about the Basilisk. Can't you wash them?"'

https://mathbabe.org/2024/03/16/an-interview-with-someone-who-left-effective-altruism/

 

Some light sneerclub content in these dark times.

Eliezer compliments Musk on the creation of Community Notes, a project which predates the takeover of Twitter by a couple of years (see the join date: https://twitter.com/CommunityNotes).

In reaction, Musk admits he never read HPMOR and suggests a watered-down Turing test involving HPMOR.

Eliezer invents HPMOR wireheads in reaction to this.
