[-] pyrex@awful.systems 3 points 4 months ago

Oh, OK. I think all the VC-adjacent people still really believe in crypto, if it helps. They probably also don't believe in it, depending on the room. I think it will come back.

[-] pyrex@awful.systems 3 points 4 months ago

Put me down for "doesn't think it will end." Did crypto end?

[-] pyrex@awful.systems 5 points 4 months ago* (last edited 4 months ago)

Really? Weird. Very different experience.

(Maybe crypto is less deteriorative than business?)

[-] pyrex@awful.systems 5 points 4 months ago* (last edited 4 months ago)

The last time I met a person who had done deeply reprehensible, highly publicized tech fraud (FTX executive) he kind of just came off as a dude, and I liked him.

That kind of makes me feel bad when I think about it.

I haven't met a high-profile fraudster lately, but my first impression of bad guys is usually pretty positive. As far as I can tell, people keep their ambient personalities when they break bad, but they compartmentalize and they develop supermassive appetites for praise. Over the long run this increases their suggestibility, because they have to be more and more gullible to not hate themselves. I think this hollows them out -- when you live a double life for long enough, you kind of stop observing the reality-fiction boundary at all.

Not clear how to stop the cycle. There's just too much money involved for me to dive off the train right now.

[-] pyrex@awful.systems 5 points 4 months ago

I've spent a lot of time trying to write without any intentional exercise of style! I think I've read far too much text generated by people on the psychotic spectrum to actually manage this.

[-] pyrex@awful.systems 5 points 4 months ago

I think most people would see higher performance on general tasks on Adderall. Not sure if this is actually a good reason to put everyone on Adderall.

Side effects can be pretty brutal, although people who abuse caffeine to get the same level of stimulation are probably going to have them a lot worse.

[-] pyrex@awful.systems 4 points 4 months ago* (last edited 4 months ago)

My actual experience is that LLMs seem to basically just become a third arm for people who use them. Google is like that too, but for their target audience, LLMs are even more so.

You don't love your arm, but if someone goes to you like, "Do you mind if I cut your arm off?" of course you say "do not." If someone's like "OK, but like, if I made you choose between your wife and your arm" you'd be like "That's incredibly perverse. I need my arm."

For people who use them, it seems like it really quickly became impossible to exist without them. That's one of the reasons I think they're not going away.

[-] pyrex@awful.systems 4 points 4 months ago* (last edited 4 months ago)

The other day my landlord came over and ranted at me at about 60 decibels for about 10 minutes about the state of my apartment. Then she saw I had The Man Who Was Thursday on my bookshelf and asked "Oh, so you like Chesterton?" She was oddly polite and helpful for the rest of the visit, and only raised my rent by $400/mo.

I read Chesterton when I was like 15 and thought he was brilliant. I grew up a little and started meeting Catholic and Mormon philosophy kids, who were generally weird transhumanists in the same category as otherkin, except with the world's worst aesthetic. (If you're going to fantasize about transcending your physical body, at least fantasize about being a dragon while you're at it.)

It's not surprising to me that Scott Alexander likes him -- I like him too, on the strength of his non-philosophical gifts. He was consistently writing for overtly classist rich people but also for the masses: speaking to the exploiters and the exploited at the same time meant he actually had to innovate new ways of expressing his classism. He had to write out of the internalized classism of his audience more than he had to write out of outright contempt, and he only ventilated his own contempt in very narrow cases where he had made it seem totally defensible to do so.

I kind of came away from him feeling like he was the perfect demagogue for an era that ended -- so his rhetoric is somewhat defanged, but the exact tendencies that made it so marketable to institutions are made even more glaringly obvious. I also kind of came away with the impression that even in stereotypically conservative philosophical traditions like Mormonism, voices like his totally squash out the people who are looking for a radical form of self-expression. People like me exist everywhere, regardless of upbringing -- therefore, this isn't an accident, but a function.

I can't talk about the long-term effects of Scott Alexander yet because they haven't happened, he's not as good at his thing as Chesterton was at his, and our system of media, while deeply flawed, is still more democratic in our time than it was in his.

But I've at least got a vague theory saying that someone like him has to exist in every right-shaped pocket of every universe.

[-] pyrex@awful.systems 5 points 4 months ago* (last edited 4 months ago)

I don't think you sent this to me personally, but it has been sent to me. I still like it quite a bit. I reread it now to make sure of that!

I think your summary (and additional analysis) is pretty accurate. I think I would add a few things:

  • He's not being evil in every post. Some of the posts are OK.
  • [Elizabeth Sandifer observes this.] He tends to compare a bad argument to a very bad argument, and he's usually willing to invite snark or ridicule.

There's a crunchy systemic thing I want to add. I'm sure Elizabeth Sandifer gets this; it's just not rhetorically spotlit in her post --

A lot of people who analyze Scott Alexander have difficulty assigning emotional needs to his readers. Scott Alexander decides to align himself with Gamergate supporters in his feminism post: Gamergate isn't a thing you do when you're in a psychologically normal place.

An old Startup Guy proverb says that you should "sell painkillers, not vitamins" -- you want people to lurch for your thing when they're doing badly because you're the only thing that will actually solve their problem. When people treat Scott Alexander's viewers as if they're smug, psychologically healthy startup twits, they typically take his viewers' engagement with Scott Alexander and make it into this supererogatory thing that his audience could give up or substitute at any time. His influence by this account is vitamin-like.

This makes the tech narcissists seem oddly stronger than normal people, who are totally distorted by their need for approval. We kind of treat them like permanent twisted reflections of normal people and therefore act as if there's no need for funhouse mirrors to distort them. We make the even more fundamental error of treating them like they know who they are.

This is how I think it actually works: the narcissists you meet are not completely different from you. They're not unmoored from ethics or extremely sadistic. They're often extremely ambivalent -- there's a clash of attitudes in their heads that prevents them from taking all the contradictory feelings inside them and reifying them as an actual opinion.

From what I can tell, Scott is actually extremely effective at solving the problem of "temporarily feeling like a horrible person." He's specifically good at performing virtue and kindness when advocating for especially horrible views. He's good at making the thing you wanted to do anyway feel like the difficult last resort in a field of bad options.

I'll admit -- as a person with these traits, this is another place where the basis for my analysis seems completely obvious to me, yet I see an endless dogpile of nerds who seem as if they willfully do not engage with it. I assume they've thought of it, find it convincing on some level, and therefore make a significant effort to repress it. If I'm going to be conceited for a moment, though, this is probably simultaneously expecting too much intelligence and too much conventionally narcissistic behavior from my audience, who are, demographically, the same people who thought Scott was brilliant in the first place.

[-] pyrex@awful.systems 3 points 4 months ago

Actually, as a furry, I'm obligated not to hate this.

[-] pyrex@awful.systems 3 points 4 months ago

Ack, I meant to go around responding to everyone and I missed this one! Hope it was good.

[-] pyrex@awful.systems 4 points 4 months ago* (last edited 4 months ago)

I think LLMs are effective persuaders, not just bias reinforcers.

In situations where social expectations forced them to, I've seen a lot of CEOs temporarily push for visions of the future that I don't find horrifying. A lot of them learned milquetoast pro-queer liberalism because basically all the intelligent people in their social circles adopted some version of that attitude. I think LLMs are helping here -- they generally don't hate trans people and tend to be antiracist, even in a fairly bungling way.

A lot of doofy LessWrong-adjacent bullshit abruptly filtered into my social circle and I think OpenAI somehow caused this to happen. Actually, I don't mind the LessWrong stuff -- they do a lot of interesting experimentation with LLMs and I find their extreme positions interesting when they hold and defend those positions earnestly. But hearing it from people who have absolutely no connection to that made me think "wow, these people are profoundly easily-influenced and do not know where their ideas are coming from."

I do think these particular stances got mainstreamed because they entail basically no economic concessions, but I also do not think CEOs understand this. I think it would be nice if LLMs just started treating, I don't know, Universal Basic Income as this obvious thing that everyone has already started agreeing with.
