Glen Weyl reflects on his previous disparagement of the social scene surrounding this website, and expresses regret at having been too hostile: while he stands by many of his specific criticisms, he now thinks "rationalists" should be seen as more similar to other intellectual communities (which can be allied on some issues if not others), rather than a uniquely nefarious threat. (October 2021, 1300 words)


What on earth caused him to change his mind? I do not understand how he ended up writing so viciously and ungroundedly about us, nor what would lead someone out of that state. It's certainly a positive sign for someone to change their mind so much about a thing like this. The text reads as an honest and thoughtful account of what he thinks he was wrong about. But I did not expect this, and I notice I am confused.

I will try to explain what I know. I'd guess 90% accuracy on individual points, so some of it will be wrong.

Overview: I think Weyl had been going through a process of changing his mind for a year or two. Remmelt and I have both had conversations with him. I imagine there are more conversations and maybe some deep process we can't see.

I've talked to Weyl for an hour or so on Twitter 3 or 4 times. I liked his book and like him personally, so spent some time teasing out his thoughts whenever I thought he was being unfair, e.g. here: https://twitter.com/NathanpmYoung/status/1374308591709138948

IIRC I'd lightly pushed for a while for him to A) talk to some actual rationalists and B) send documents with criticisms to rationalists directly rather than post them as open letters. I think a document posted by Weyl here would get a sober response. I've always felt Weyl is a sincere person who cares about AI risk etc., even if we disagreed. Also I genuinely like him, which makes it easier.

Four months ago, he wrote this https://twitter.com/glenweyl/status/1423686528190980097 

"I have [thought about writing on LessWrong] but I am worried I would get the tone wrong enough that it would be a net harm. @RemmeltE has kindly been trying to mentor me on this.

and later to me https://twitter.com/glenweyl/status/1424366991792513024

"Thanks for being so persistent with me about this. I do genuinely think that you’re basically right that my behavior here has been fundamentally hateful and against my principles, driven by feelings of guilt/shame and counterproductive to my own goals. I hope to have time 

Before going out on paternity leave to post an apology on LessWrong"

To me it felt as if he had a culturally different approach to AI risk than rationalists (he wants to get more people involved, and likes redistributing wealth and power), and also there was maybe some hurt. This led him (in my opinion) to overextend in his criticisms, mingling what I thought was fair commentary with unfair. The article he shared here I thought was unfair and didn't deserve Weyl's support. I guess I hoped he might change his mind, but I was still surprised when it happened (which makes me wonder if there were other things going on). I was particularly surprised by the strength of the first and this subsequent apology.

Some thoughts and suggestions:
- I found the apology article a bit hard to follow - I read it a couple of hours ago and I'm not sure I could explain it now
- Weyl seems to have done exactly what the rationalist part of me would want from him. If anything, it might be too much. I hope people are gracious to him for this. It probably cost him time, emotional energy, pride and possibly the respect of some others.
- I still wonder what led to him being so averse to rationalism in the first place.
- I'd suggest that, if you're interested, you thank him for the apology and talk to him about the subject.

I've struggled to write this accurately and humbly, so apologies if I've overcooked it. Thanks to Neel for suggesting I give my thoughts.
 

Weyl may not be really apologizing here.

It is more a confession and warning than an apology.

This could mean that it's very much an apology, but even more a confession and warning. Given the lack of any other apology-language, like "I'm sorry," I think it instead means that it's not to be read as an apology, even if there's a ruefulness about it. Even if it's an apology, he doesn't explicitly say that it's an apology to us. It could just as much be an apology to his supporters for misdirecting their attention.

He's speaking to his own community, RadicalxChange, at least as much as he's speaking to us. What is he saying?

Weyl thinks Silicon Valley is a villain.

... the technology industry, and especially Silicon Valley (SV), has become the greatest unaccountable concentration of power in the world today and is thus a fundamental threat to self-government. 

He thought that rationalists were soldiers in the SV army, because this sector is overrepresented in rationalism. Now, he realizes that he was mistaken. He dislikes our perspective, thinks we're wrong and self-contradictory, and that we're narrow in our demographics and influences. But he now realizes that we don't whisper in the ear of Elon Musk. He no longer sees us as any more threatening than the many other groups he dislikes, but doesn't bother to attack, such as religious fundamentalists.

Battling SV must be hard. After all, he has to figure out the anatomy of this large, complicated culture, and figure out which bits play an executive role and which bits are mainly just being told what to do. There's not much in the way of hard evidence for him to make that distinction. He had to rely on pattern-matching and associations to identify his targets. He thought we were part of SV's executive function, and now realizes that we're not. Given the enormity of the threat he perceives, he seems to have felt it was best to shoot first and ask questions later.

What's not clear to me is whether he'd resume attacking us if he changed his mind again and believed that we did have more power in SV.

On the one hand, he says "exaggerations of the group’s power and conspiratorial allusions are basically hateful and fundamentally opposed to my belief system." That sounds like "I was wrong to demonize rationalism because demonization is wrong" and a renunciation of the "shoot first, ask questions later" approach.

On the other hand, he says "However, what has changed significantly is my views of the sociological role of the rationalist community within the technology industry." That sounds like "I was wrong to demonize rationalism because rationalism isn't a high-priority demon," and a call to his community to train their firepower on a different target. Given that he's explicitly downplaying or denying an apology, I read this as the main point of his post. He's admitting an embarrassing strategic error to his own soldiers, not apologizing to us for the collateral damage.

Someone's paraphrase of the article: "I actually think they're worse than before, but being mean is bad so I retract that part"

 

Weyl's response: "I didn’t call it an apology for this reason."

https://twitter.com/glenweyl/status/1446337463442575361

Why would we want an apology? Apologies are boring. Updates are interesting!

I didn't say we/I wanted an apology. I was just trying to clarify what he was actually saying.

Every memorable apology I've ever gotten has hailed an update, although sometimes it lags a little bit (e.g. person updates → person spends some time applying the update to all affected beliefs → person apologizes).

 

This mostly holds for apologies I've given as well, excluding a couple where transgression and apology were separated by enough years to make pinning it on a specific update difficult.

As I said above, I struggled to follow the article and now can't be bothered to reread it.

But I agree that he disagrees with his previous conduct.

Feels like "I disagree with you but went about it the wrong way" is something we'd welcome from those who disagree with us, right?

Maybe he was embarrassed by the mistakes he made, like making up 3 different wrong citations for a claim about Audrey Tang despising rationalists (which was also not true), and reflected a bit.

@Ben, I had some conversations with Glen after sharing that blindspots post with him. Happy to call one-on-one about my impressions here: calendly.com/remmelt/30min/

The header font is perfect for "Demonize", maybe "Rationalism" should be changed to Comic Sans for best effect.

While there are obviously close social links between these different contenders (despite implausible claims by defenders of Rationalism to the contrary), and I believe also some important intellectual ones, these linkages might better be understood like those that exist among competitors in a niche sport rather than those among teammates. That is, we may see surprising social overlap between NRxers and Rationalists because Rationalists are afraid of losing their audience to NRxers, not because they sympathize with them.

I do think the charge of feeding on the same audience as various cranky communities and semi-cults is accurate, and at least a bit worrying.

I have no idea who that guy is, and I don't really care, but that sentence was quite a surprise for me. Hands up, who is afraid of losing an audience (whom specifically?) to neoreaction, and what exactly are we doing to prevent such a horrible hypothetical outcome? (Also, why didn't I get the memo? Should I feel offended?)

Too bad the author isn't more specific about what exactly the "niche sport" is that both rationalists and neoreactionaries are supposed to practice, what kind of audience they are competing for, and what specifically rationalists are doing to win the hearts and minds of potential neoreactionaries. It would be much easier to respond to specific accusations.

Let's start with the "niche". The first approximation is "smart contrarians", but that seems too wide. Both groups are an outgroup to woke progressives, although for different (and kinda opposite) reasons: neoreactionaries identify as right-wing, rationalists say that politics is the mindkiller. What else?

What audience are we competing for? I suppose it is the smart contrarians.

What specific things are we doing to attract them, and specifically to prevent them from becoming neoreactionaries instead? (What things, that we are doing now, would we not do in a parallel universe where neoreaction never existed?) The only difference I am aware of is that we enforce the taboo on politics more strongly than we would in a parallel universe where neoreactionaries never tried to promote their politics on Less Wrong. But this answer does not make sense -- if the author hates neoreactionaries, why would he be angry at Less Wrong for not providing a platform for them?

I am out of ideas. The remaining one -- and now I feel like I am making a strawman -- is that the author believes that in the parallel universe all rationalists would be super woke (or whatever his preferred flavor of politics is, I don't know), but in this universe we are not, because we are competing for the non-woke audience against neoreaction. But this is so wrong it is not even funny. So, what else is there?

In the meantime, my assumption is that the author apparently had some very confused ideas about rationalists, then updated somewhat, but still remains quite confused.