This is a fairly high-context conversation between me (Ben) and my friend (Ren). 

Ren used to work at CFAR, and then ~4.5 years ago moved away to join the Monastic Academy in Vermont. Ren had recently visited me and others in the rationalist scene in the Bay, and had written a Facebook post saying she'd heard that people had been gossiping behind her back about whether she'd "gone nuts", which she called unwholesome, adding that she was angry people thought it was acceptable behavior. As someone who had done some amount of that gossiping, I wanted to talk with Ren about it. 

We found a time to have a Zoom call, then I proposed we try talking via written dialogue instead, a format which we both liked quite a bit.

This is a fairly high-context conversation, so I'm sorry if the topic doesn't make sense to a bunch of readers.

Thursday, October 12th

Hello! This is me.

Ben Pace

You can type and submit too.

Ben Pace

Hello! This is Ren.

Unreal

Cmd-enter (or ctrl-enter on Windows) will submit your message to the dialogue.

Bringing in from our spoken conversation: The thing that felt most worrying to me was your point that, if you suspected a friend had fallen into a cult or become crazy, then you would be concerned for them and want to support them, yet your sense was that rationalists were quietly coordinating to distance themselves from you, which was a very uncaring and somewhat hostile thing to do.

Ben Pace

Okay, so, I have two initial hypotheses about why people would treat you uncaringly if they thought that you had gone crazy.

  1. The rationalists are an in-group for people who believe true things at the cost of all else. Insofar as you (in their judgment) are trying to have true beliefs then you'll be supported, but if you appear to no longer be doing that then you're "out". This is not totally dissimilar to how a Christian parish may accept lots of sinners except for the sin of not believing in God / being another religion.
  2. Roughly speaking the rationalist-people are not actually friends, they're a status hierarchy, and within it you are lucky to have 1-3 friends, and if you fall out of the status hierarchy, then you no longer get the benefits of being in the community. In this version of the story you are actually finding out that you didn't have friends.
  3. A third secret thing that I have not yet thought of.

Thoughts on either of these?

Ben Pace

Immediate reactions:

  1. Seems okay. 
  2. Whoa, sad.
  3. Mystery!!
Unreal

As a side note, I want to at least on-principle endorse that it's basically pretty disrespectful and adversarial to discuss whether to oust someone while hiding this from them. I can imagine doing it in highly adversarial situations (e.g. concerns about physical assault, or someone being very powerful and corrupt) but my guess is that in most possible situations it is direct evidence of weakness (moral, character, not sure exactly what) for people to not be willing to talk openly about it if you are friends.

Ben Pace

(I say on-principle because I have some concern that I am not living up to my principle here. Happy to discuss that more at some point here.)

Ben Pace

I think I just forgot that some rationalists don't view themselves as a community. And instead view themselves more as a group of professionals or some kind of ... club. I think it's a weird mix of things, and maybe ... I and they are confused about what they are. 

I do currently have a bad sense of ... like, somehow rats are mixing things and calling it one thing when it suits them, and then using it in the other way when it suits them. But the inconsistency may be ... troublesome. But this is a big tangled mess in my mind, and I'd need to spend more time considering what's going on here.

Unreal

I agree that the use of the word 'community' seems confused. To me the word seems so overloaded in present culture that it's on my personal list of terms to try to always taboo. For instance, for the last 1+ year I have been ~exclusively using the phrase "the EA ecosystem" instead of "the EA community".

Ben Pace

I think part of what might be confusing here: 

If I'm no longer in the ingroup, what's the point of discussing my sanity at all? I've been out of the scene for 4.5 years. I guess I still occasionally comment on LessWrong. But then, man, I'd really ... much rather ... just be told to leave the clubhouse rather than people gossiping about me behind my back. 

I think, from a certain perspective, it leaves better optionality for those people. Clueless me, I keep engaging with people in the community as though I'm still sort of included. But they get to evaluate my 'fit', and decide what to do with me, secretly. And like, they maybe have the power to 'bring me in' if they wanted to or 'leave me out' if they wanted to. 

I understand what this is like from the inside, having been in CFAR. And I get why it might be appealing to 'give the mission' this sort of optionality, cuz there's some interest in 'finding the relevant ppl' 'for the mission'. But there are bad vibes to this too. 

Unreal

I agree that being told openly is better than it happening implicitly and just not getting told about it.

I was about to propose some mistake theories, but I worry I am being far too mistake theory toward everyone involved...

Ben Pace

Do you think you would feel less betrayed if I right now tried out openly making the case that you are crazy and to-be ousted? :P

Ben Pace

(Not that I believe it, but I have some probability on this.)

Ben Pace

So... I have to investigate how I actually feel about this... 

But I think, on principle, I'm against petty gossip regardless of friendship status or closeness to the people. Just based on what I would consider ethical behavior. 

So ... in fact I wouldn't do petty gossip behind people's backs even if they were relative strangers. And so the thing where these people aren't really my friends or close to me... doesn't really change... that it seems inadvisable, to me, to do. 

However........ I don't know if I'd feel angry in the same way. yeah I think i'd just be like "huh that's kinda shitty behavior" 

But I do think it's confusing because... a lot of the rats... I thought were my friends. Actually. And maybe... they don't really think that. 😅

Unreal

I would feel good if you, as my friend, gave me feedback. And like, tried to explain things I've said or done that give you concern, out of a ... desire to help me? Or something - I am failing to be quite precise. 

I would feel pretty good if I received a "i think you actually don't belong here" - if there was a clear conclusion like this. I'd feel somewhat relieved. 

Unreal

That makes sense.

I could switch to that, but I want to first bring up that it wasn't my motive for commenting on the Facebook post. One of the activities I regularly engage in is "public discussion of what the norms are", which I think helps people both not have to make worst-case assumptions about the norms, and also prevent poor norms from coming to be. While I think there was something important you were trying to defend, I disagreed with the norm that it read to me that you were explicitly proposing, which is that "gossip about people having seriously lost touch with reality" should be punished.

I also would like you to not join a cult or go crazy, that would be very sad :(

Ben Pace

So specifically about gossip norms.

To me, the intent matters. The way it is done matters. 

I think this includes professional contexts where people don't know the other people very well. Or don't have particular loyalties to the people, as friends. And maybe that's more the relevant context here. 

I am not interested in the punishment part of this. 

I am interested in 'calling out' behavior that is inadvisable, unethical, or unprofessional. And I guess this would be... a norm reinforcement... but I don't see it as all that punishing to do.

Gossip that is coming from a place of pettiness, or desire to boost oneself up or lower someone else. Gossip that is coming from envy, spite, jealousy, malice, ill-will, etc. Gossip that seeks to divide people / cause disharmony / cause conflict between friends. Gossip that dehumanizes. 

[Edit: I want to elaborate that 'dehumanizes' here means something like "causes people to mistreat that person, including in their own minds."]

Unreal

When you write this, I really don't know what you could mean that isn't incredibly overreaching and damaging:

But I think, on principle, I'm against petty gossip regardless of friendship status or closeness to the people. Just based on what I would consider ethical behavior. 

Lots of private information gets passed around in small circles because it's sensitive or delicate or unfair to share in wider circles. This includes positive and negative information.

Yeah, there's private info you can spread about someone just to embarrass them or to feel superior to them or just so you can get points because you have private and personal info about someone. There's lots of bad motives for sharing info. But also sharing private strongly negative info seems really important to me too, and I don't want a rule of "no saying bad things behind people's backs".

It seems good to point out bad patterns and low motives. I'd be into trying to specify what's going wrong with gossip in this situation, as my current best guess is that you have some legitimate grievances <hopeful-yet-also-sad-emoji>

Ben Pace

Lots of private information gets passed around in small circles because it's sensitive or delicate or unfair to share in wider circles. This includes positive and negative information.

I mean this doesn't immediately read to me as 'petty'. 

Petty would be more like "oh my god did you hear so-and-so did such-and-such??" Like you can kind of tell the vibe isn't... the same as... "hey, i hear you're about to engage with Bob in such-and-such. it seems relevant to tell you about this time Bob did this bad thing that would relevantly impact your decision." 

Unreal

For instance, today I asked Alice whether Bob would make her uncomfortable if I invited him to an event. She said yes, told me a story of their interactions, and showed me some texts, and I updated negatively on Bob's self-awareness and ability to handle conflict (not that he was a malefactor, just unaware and unskilled), and chose not to invite him to the event (which he was a v marginal invitee for, and Bob and I are barely acquaintances). I think this was really good info sharing and I am glad that Alice trusted me to share this info.

Ben Pace

I don't take any issue with the above example about Alice and Bob. 

Unreal

Sorry, I feel like I am being a bit needlessly defensive...

Ben Pace

I mean, gossip is legitimately a very nuanced situation. :/ It's really case by case.

Unreal

To restate my paraphrase: it sounds like the thing that seemed pretty unvirtuous is

People presenting as friends, yet saying a lot of very negative and ousting-like things behind your back.

Can I get a 1-to-10 rating on how well that captures what you feel like has been happening?

Ben Pace

Somewhere between a 3 and a 6

Unreal

Okay. Um, maybe do you wanna type more about what it seems like to you? :)

Ben Pace

I will try to delineate the thing. 

  • Negative or careless or petty gossip about me, without care for me as a person. They were just talking because they could, and it was maybe fun for them or a diversion. To speculate about my sanity. (I admit it is fun to speculate and psychoanalyze about people. So I get that. I just also think it's inadvisable.)
  • Being so-called truth-seekers, but not going to the source of truth directly. Engaging in what seems to me to be somewhat cowardly behavior. Or at least avoidant. It's 'too convenient'. They avoid feeling uncomfortable, get to amuse themselves by talking about someone negatively, and don't receive any of the impact of their actions. 
  • Talking about me negatively, in a way that they would not do when I am in the room. This feels deceptive. Then, not telling me directly about it, which is further unfriendly. Also reads to me as immature. 
Unreal

Note: This is largely speculation about how people are engaging in this type of conversation. It's not like I was there. 

Unreal

This is helpful. I think the 3rd bullet is leaving me with a clearer impression of something that I think is clearly bad behavior. 

(Yep, I am not reading you to say "I definitely know that this happened", but it seems like you've got enough evidence to think it probably did, or that something similar to it might've.)

Ben Pace

I think people saying one thing to your face and a different thing behind your back, is not an honorable way of interacting with people you respect or are friends/allies with. Especially if the thing behind your back is very damaging to you or very negative about you.

Ben Pace

Well I've been IN convos like this in the past. I'll just confess that I've talked like this about ... I feel like most ppl who have worked at CFAR? But I think this kind of convo would often come up about Alan in particular. And I've talked about Bob in ways I would now regret. Probably also Charlie. But I think Alan is the central example here. Because he 'went the way of religion' so to speak, and everyone was confused about it and couldn't help but speculate and speak in patronizing ways about it. 

[Edit: Some names have been anonymized here]

Unreal

While there were attempts to talk directly to Alan about the religion thing, that didn't resolve in a way that was satisfying. And then from then on, ppl would keep talking about Alan in .... 

Pretty much the same way I imagine old Christian ladies talking about someone gay. Like they got lost or something. But with a tone of ... clucking. Tutting. Disappointed, but distancing. "what a shame" 

And something about that reads to me as petty, unnecessary, and like... speech that ought to be avoided. 

Unreal

I also feel like people were not forthright enough with Alan about that.

Well, in retrospect I think things were kind of confusing, and it was pretty unclear what norms would be enforced. I think it would have been good for a bunch of people to say "We're pretty convinced that theism is false, there's a civilizational-level amount of motivated cognition looking for arguments for it, and we need to have a line in the sand around here that says we're not accepting that conclusion". However, that wasn't the case! Anna Salamon continued to hire him for events afterwards. So I think those people weren't in a situation to enforce the boundary.

Ben Pace

Yeah... it's really not cut-and-dry. I obviously don't think the EA/rat scene should oust religious people. lol 

Unreal

I think a bunch of people did lose a lot of respect for him (me included) and it seems likely to me that we weren't up front about this and as such did not act very honorably toward him.

Ah, I do want to note that enough people had written their opinions on the internet and discussed atheism a bunch that I think it was surely quite clear that a lot of people would lose a lot of respect for Alan due to his believing in a god.

(I am unresolved on whether people acted honorably.)

Ben Pace

Well one thing I'm noticing is that... 

Ben gravitates towards norms that seem somewhat legible or able to be reinforced b/c of their general legibility. I think this is actually pretty valuable and good for norms. 

However... the way I judge people's behavior... has a lot to do with the vibe or way that they go about it, and their motivation or intent behind their behavior—which is generally not legible. Like not permissible evidence in a court of law, so to speak. (Although they DO try to discuss and ascertain this in courts of law... actually... and maybe we just need to get real good at this?) 

But anyway ... I do use motive and intent as a measure because of certain perceptual skills I have. I believe I am unusually good at discerning people's motives.

And maybe no one should ask me to write their norms for them. 

And maybe I'm not even in this convo to discuss norms. Or what should be enforced. Or reinforced. Maybe I'm not even at that level here.

I think I'm like... more like... that Facebook post... was me, as a friend, calling for some kind of intervention or calling out some stuff that I'm personally upset about. And calling people to step into a higher integrity. Which I believe is what friends do! 

And I believe the rationalists are my friends.

And maybe they don't. 

And I should maybe figure this out.

Unreal

Some things that feel alive to me:

  • I think gossip is good and want to defend gossip
  • I think acting dishonorably is bad and I want to know how to characterize dishonorable gossiping
  • I like chatting with Ren in a pretty general way
  • I wish I could remove the occasional whitespace/newlines at the end of some of Ren's responses [edit: this has now been solved]
  • I feel like the point about the lack of "caring about your friend" was really worrying to me because I thought it might be true and so I wanted to talk around that some more

Also here's a paragraph I wrote into my notes app because it didn't seem like it had a natural place to fit into the dialogue above:

I guess it seems to me like we’re engaged in the great rationalist project of “legibilize what the local norms should be and then we will all follow the output of this process” which is notably quite different from “try to directly be kind to one another”, though it is something that can be very powerful and people you don’t currently know well can help you out a great deal with by just writing comments :)

This matches up with something you're writing in your box.

Ben Pace

FYI, I think gossip can be quite good, and I do advocate for the wholesome version of gossip. I engage in it myself.

Unreal

For the record, I'm coming from a place where people have had lots of power, and I've been constantly interacting with people who are practicing and demanding norms of not gossiping about stuff you've heard, so my defensiveness is high on this norm.

Ben Pace

I think also: Insofar as you are unhappy with my behavior toward you, I feel honor-bound to step forward and tell you what I did and defend it / apologize!

[Meta: Ren here said in a side-channel that she'd be interested to know what I said.]

Ben Pace

I mentioned your name to someone who was visiting Lighthaven. They asked about you and how you were doing and I recall saying something like "Oh, she's doing well. I mean, I don't see her that often so I'm not confident, but I think she's doing well." Then later on I noticed that I think this person was worried that you had changed for the worse for being part of MAPLE, so I followed up via text.

I feel a bit embarrassed typing it in verbatim. I take this as a sign I should probably do so. Here is what I wrote:

the other day i said [ren] seemed to be doing well to me

to clarify, i am not sure she has not gone crazy

she might've, i'm not close enough to be confident

i'd give it 25%

i mostly meant she seemed more chill

like some anxiety / fakeness had fallen off of her

That's what I sent.

Ben Pace

I think that this is not like an incredible summary of my impression of you and if I thought for half an hour I would say more substantive things, so I am concerned that it will feel a bit superficial / thoughtless.

Ben Pace

I guess I'm interested in your reaction, including whether you feel hurt by it or otherwise.

Ben Pace

OK well while Ben is typing... I'm just gonna, for fun, type the Buddhist norms around speech.

Wait, before that, it's important not to like... assume these are held rigidly, in a deontological way. They are flexible, situation dependent, and more like 'aspirations' than rules. 

They advise:

  • Avoid false speech.
  • Avoid speech that is malicious (designed or intended to cause harm)
  • Avoid speech that divides people (e.g. causes fights to break out among friends or people who were generally harmonious and getting along) 
  • Avoid speech that is harsh 

I think there's a lot of nuance here, esp around the 'harsh' one. But I get a lot out of considering these. 

Unreal

Okay, I've read what you wrote. In general, I have no substantial negative reaction to reading the description above. Although I'm a bit sad about 25%. 

Unreal

But also, it doesn't seem terribly unreasonable or unkind to me!

Unreal

I wonder if I want to know more about what people mean when they say "gone crazy." 

My current sense is that people using this phrase have almost no real model of what they mean. Like it's got a black box feeling to it. 

Cuz they clearly aren't talking about me having a psychic break, being manic, or something.

It's more like... 

something something cult brain or brainwashed or something seemingly more sinister and hard to put a finger on. But it feels kinda creepy or like weird or off. In a way that's hard to point at. 

And this makes me ..... want to frown at it hard. 

Unreal

Okay! I am glad you don't think my texts were dishonorable.

Hmm... I have been thinking a bit about what "crazy" should refer to, I can give my current guess, with the caveat that it is probably wrong.

Ben Pace

So, I think one case I will include is medically insane, like they're seeing ghosts or are paranoid and think the CIA is following them or have multiple personalities in their head, and broadly aren't in touch with reality, and the concern is they might hurt themselves or someone else.

My stab at characterizing this in a way that might generalize is "has robustly false beliefs and is not epistemically open to changing their mind, and because of this they are unable to work with / living with other people".

There's also delusional people who believe that they are Napoleon or something. I think this is the same thing.

I think that religious people are sometimes like this. "Yes, I'd love to open a restaurant with you, but Christ is returning on Tuesday and I need to slaughter the infidels before he returns." I am impugning this person as both likely to hurt people and also probably not open to evidence falsifying their beliefs.

Ben Pace

"Yes, I'd love to open a restaurant with you, but Christ is returning on Tuesday and I need to slaughter the infidels before he returns."

Yeah, I read this as a person who is in fact experiencing a psychic break. And I would say their specific delusion has religious themes but I would not describe this person as a religious person. 

This would be similar to me saying: "I think that rationalists are sometimes like this." And pointing at [redacted] or similar. :P But anyway. Carry on.

Unreal

I think there are weaker cases of confidently believing something false and not being epistemically open to counter-evidence, in a way that is very damaging.

I think many Christian people also had confident beliefs about what their God wanted and would punish people in accordance with it (e.g. physically assault their children) and would say that their belief is based on their personal experience of the Holy Spirit, in a way where I suspect I would not be able to present them with evidence that would change their mind.

Ben Pace

My sense is that another time people describe someone as "acting crazy" is when someone is high on drugs and acting erratically and might be violent. They're not properly in contact with reality and will react to aggressions that aren't there and do things that don't make sense.

Ben Pace

I wish I knew a Christian similar enough to your example to really consider it. 

Hummmmmm. 

I maybe don't spend any time trying to change a Christian's mind. This activity might be illuminating. 

But ... I dunno. My views here are nuanced. 

A phrase that I've heard: "I try not to kick anyone's knees out from under them." 

And I like this as a general principle. Esp if they're not committing violence or harm. 

And most religious people aren't really, as far as I can tell. 

Unreal

I think there's a related thing of being socially unpredictable that scares people and that people like to call crazy. I remember talking to [redacted] once during a circling session, and having a sense that the person appeared normal but was in fact not at all reading the social cues correctly, and I started to feel scared and like the person would maybe read something as a strong betrayal and act out.

[Added: I think that there's an element of "this person is not going to play a consistent social role in the scene" that people often call 'crazy'. I think rationalists sometimes seem crazy because they're going to operate in line with what's true, which is often very different from what the scene demands. What I'm saying is that being crazy and seeming crazy are very similar, and sometimes it's because you've lost touch with reality but other times it's because you're more in-touch with reality.]

Ben Pace

I am open to the story that the person who believes that they are Napoleon or Christ can be accepted into normal society and live a good life. I am here trying to say that they are still delusional even if that works.

Ben Pace

OK but the real question is: When you were like, I put 25% on Ren maybe being crazy. What does that crazy actually refer to? 

Unreal

Oh right. I forgot about that context. I admit my examples are a bit more extreme / alarming. I feel a bit nervous about trying to talk about this politely.

Ben Pace

FTR, I would characterize a lot of rationalists as ... 'religiously so' or 'fundamentalist' in the way they hold their views. As in, ... attached to their specific stories of reality and not open to updating. So in other words, they believe what they believe based on a kind of faith, and not the good kind of faith. And I say this just b/c it sort of mirrors what you said above about religious people or Christians. 

Unreal

It's hard for me to share all of the vibes I've picked up, and I am a bit defensive about them. I am not sure exactly what I mean, but I'll give it a shot at an explicit meaning.

  • Will Renshin confidently believe something that a lot of her actions rest on, which seems to me strongly false, and not be open to changing her mind, and as a result either waste a ton of time/effort or possibly do something harmful?

I think my concerns here are something like "Frame control by a central figure in her local social environment who has a lot of power, such that she does not really question some central tenets of belief".

(I don't mean to necessarily imply that this doesn't happen to any other people and isn't happening to me.)

Examining the case of Nonlinear gave me a bunch more taste for how much of an attractor state there is here. The two women I talked to at length were in a situation where:

The CEO of the company 

  • Had ~all of the financial power (millions in his own bank account, paid out ~very little to any employees)
  • Believed himself to be one of the greatest people alive, such that his brother (also at the company) said that if the CEO ran for President, he had a 10% chance of winning
  • Generally got to win arguments by saying he had thought about the topic more than ~anyone else on Earth
  • They travelled in a small unit with its own social bubble and v few people coming in and out that didn't respect the leader
  • (I could write more)

And at the end they'd taken a bunch of actions that they... couldn't have imagined taking otherwise. Like, they were just like "this makes sense, the boss said it's fine to drive illegally in a foreign country without a license for 2 months".

Returning to the topic at hand, I have seen a bunch of little red flags around MAPLE (which I could go into, e.g. my sense is that people are often suffering from sleep deprivation there, I also recall reading some tale of the leader endorsing some atrocity like it might be better for humanity to go extinct than continue as it is), so I am somewhat concerned about your local social environment.

So that's a key source of my probability estimate.

Ben Pace

To be clear, to me you seem stronger and to have more faith in yourself and to care about right action more, and you seem like a more powerful force in the world for it.

Ben Pace

Hmmm. 

Interesting. 

I take issue with the word 'confidently'. 

I would agree there was an issue if the thing were more like... 

In a way that was attached. Esp in a personal, egoic way. Like If I needed to defend myself and my beliefs. Or I couldn't face alternative realities that went against my sense of things. Like I was flinching against evidence. Or I was personally defending something or afraid to consider the possibility of my view being false. 

But I agree that I hold certain views confidently but I would say that's good, as long as I'm not attached to my views. I'm confident because ... I've personally examined a lot of evidence and investigated it for myself. And verified it etc. But I'm also not afraid of considering alternatives. It just ... the evidence is actually overwhelming in some cases. 

So I've developed a lot of inner confidence. But also, I hold all my views lightly because there's no frame or set of concepts that quite captures reality. And so... in some way... I don't hold that tightly to any particular view or frame? But somehow I still am able to act with confidence? I dunno. 

Unreal

Relatedly, I think I am slightly concerned that you will start saying a bunch of clearly false things at some point. I don't know that it's a super justified fear but I don't think I have seen sufficient evidence to be totally confident it won't happen.

Like, to give an overly concrete example that is probably rude (and not intended to be very accurate to be clear), if at some point you start saying "Well I've realized that beauty is truth and the one way and we all need to follow that path and I'm not going to change my mind about this Ben and also it's affecting all of my behavior and I know that it seems like I'm doing things that are wrong but one day you'll understand why actually this is good" then I'll be like "Oh no, Ren's gone crazy". Probably you won't! I'm pattern matching a bit to what people sometimes say who've gone to religious retreats for ages and changed their lives.

Like in the last few years I saw a person turn religious (which was worrying to me) and then also on a different issue say "I'm never going to stop <holding my position>" and that person had previously been rationalist-adjacent for a while. [Added: And now I am going to hold far stronger personal boundaries around that person and avoid them having much power over me and so on.]

Anyway, this is meant as a pointer to my fears about religious people, not as a very personal analysis.

Ben Pace

I want to reiterate that I don't believe this is the world I'm living in and I'm writing this because I think it's good to share concerns about possible worlds to avoid. I'm not at all saying "Ren is totally like this" but instead I more wrote "What am I afraid of". I like you and like you being in my life :)

Ben Pace

OK! Well reading your sense was helpful. 

I like knowing that. 

I am fairly confident that personally, I, at least, am not going thru the thing you described above (using Nonlinear as an example). 

I ... if anything ... am much more able to stand up to my teacher than before. 

[Edit: I removed some text here going into some specifics about the above line. I didn't feel like publishing that stuff but am open to private dialogue about it, if anyone is curious.]

Unreal

Friday, October 13th

Checking in, I'd like to ask if we can publish this dialogue? I liked the conversation. I've edited a few of my sentences for grammar/wording, anonymized some people, added an intro for context, given it a title, and made two larger additions that I've flagged that look like: [Added: <new stuff>].

Ben Pace

I feel like I never really got to explain my views on gossip in a clear way. I feel dissatisfied about it. Might try to add it in.

Unreal

Oh interesting. I did get a sense of what you thought was bad behavior from some of your bullets 'delineating' it, but sounds like you don't feel you said it very crisply.

Some options:

  1. Spend some time now trying to add it.
  2. Publish it, then tonight/tomorrow write a comment with more thoughts.
  3. Publish it, then if we want to we can come back here and continue the dialogue another day.
  4. F*ck it ship it, even if it could be better.
Ben Pace

(I personally am excited about sharing real conversation dialogues so I am making some request to publish even though we could both add more.)

Ben Pace

Yeah I don't need to wait on publishing. 

Unreal

Okie dokie :)

Ben Pace

Shall I hit publish?

Ben Pace

Aye.

Unreal

Addendum:

Ren's views on Relationship

The relationship or 'line' drawn between individuals is "real" in her ontology, and it can be damaged, repaired, or honored. There's always something there, even between people who've never met before—like on the other side of the world or something. There's no such thing as a total disconnection.

However, one can choose to pretend or act like a relationship is nonexistent. This is a kind of dishonoring of the truth, which is that something exists there, but we can refuse to acknowledge it or take responsibility for it or treat it as real. 

Much of Ren's examination of this phenomenon comes from years of Circling practice and also her many years of living in community. However, this view is corroborated by ancient traditions, including indigenous / Native ones, Buddhist ones, etc. 

An invocation we regularly use at MAPLE is from the Lakota tradition: Mitakuye Oyasin

These ancient traditions and practices teach, in their culture, to honor all life because we're all actually interconnected—everything impacts everything else. This feels trivially true, even from a purely materialistic standpoint. 

In Buddhism, this is often referred to as interdependence. 

In Circling, this principle is referred to as "Commitment to Connection" in the Circling Europe school—where I first learned to start taking this idea seriously. 

When we act in ways that dishonor connection or relationship, it can cause "damage." This "damage" in my view is due to the delusional aspect of it. If we act out of a deluded or wrong view of reality, we cause "damage"—but I see "damage" as "lying to oneself, to others, or to the world." When we act out of accord with truth, this is itself "damage." 

So when I heard that people were gossiping about me behind my back in a way that seemed uncaring about me, this created a felt 'rift' between myself and these unknown gossipers. I became angry, more distrustful, felt hurt, etc. This is a kind of "damage" in my relational world. In fact, in OUR relational world. It hurt the relationships themselves, more than the individuals in those relationships. 

The impacts on me in a physical or material way are a second-order concern, not the primary concern.

The move toward division, disconnection, and disharmony between people I thought I had some trust or bond with... this is what I see as "damaging" and thus "inadvisable." And why I advocate for a norm against speech that creates a wound or fracture, in a relational field. 

The material impacts are real, but they all come downstream from living out of accord with a certain reality—that we are all connected, and all our relationships matter. To harm any of them ultimately harms ourselves and causes us to be deluded. Disconnected FROM reality, rather than connected with reality. 

If I gossip about someone (in an unwholesome way), and they never find out, this still creates the rift, the disconnection, and still damages something real. (This can be observed, just watch how I behave around or to that person after I make secretly harmful comments about them. Or watch how I start treating relationships in general. See if I experience more shame, guilt, need to conceal things, lie, pretend, etc. My behavior starts lining up with the reality I have personally created, through my speech act. This is "damage".)

Of course, gossip is valid in all the ways discussed above. But there are ways to do gossip that don't cause these fractures or rifts and that maintain the reality of the connection, even if the gossip is 'negative'. Friends CAN talk about their friends behind their backs. Even saying negative, unflattering things. This isn't off limits! But are we doing it in a way that honors the relationship, or are we creating a sense that the relationship isn't important or isn't real? 

OK, I feel like I have outlined the most major thing from my philosophy that was missing from the above. 

Unreal

Gossiping in a way that causes people to believe I am 'insane' is damaging in particular because it also discourages people from engaging with me... and instead encourages people to start actively avoiding and shunning me. This removes even the option of a kind of repair, reconciliation, or investigation of the rift caused by this sort of gossip. 

If it were true that I am insane—like y'know, esp in a potentially harmful way—I would advocate for people to warn other people about me. 

There are such people in the world, who in general I would advise you to avoid (unless you are willing to take certain risks). And I would have no problem saying so. 

But such people are, like, more in the category of acting in ways that are really quite harmful and are kind of beyond help or correction, for whatever reason. (As in, many people have made the attempt and failed.) I have people in my life that seem to be in this category, and it is very sad. I still make attempts to reach out to them though, from a distance. 

But currently I do not believe most people should bother making active attempts to avoid me (like, more than the default). The evidence is... as far as I can tell... way more in the other direction—that engaging with me is beneficial for the people I engage with. And people often come to me for all kinds of reasons. And report benefit. And this is more true now than it was a few years ago. 

This might be harder to believe(?), but people come to me for my clarity, my ability to understand things and communicate them well, and my devotion to truth-seeking. I help bring other people more clarity in their lives, in their sense-making, and try to deepen their connection to the world, reality, themselves, etc. 

So to cause people to instead be tempted to avoid me, to shun me, or see me as crazy, seems harmful and disingenuous from where I am standing, given the info I have (versus the info that speculators have, which is almost nothing). 

I advise against casual, careless speculation on my insanity, given basically no direct contact with me. I wouldn't do it to others—for fun, for profit, for personal gain, for egoic validation, or due to my own insecurities or due to entitlement. If I do it to you, I owe you an apology. 

Unreal

31 comments

Ben Pace, honorably quoting aloud a thing he'd previously said about Ren:

the other day i said [ren] seemed to be doing well to me

to clarify, i am not sure she has not gone crazy

she might've, i'm not close enough to be confident

i'd give it 25%

I really don't like this usage of the word "crazy", which IME is fairly common in the bay area rationality community.  This is for several reasons.  The simple to express one is that I really read through like 40% of this dialog thinking (from its title plus early conversational volleys) that people were afraid Ren had gone, like, the kind of "crazy" that acute mania or psychosis or something often is, where a person might lose their ability to do normal  tasks that almost everyone can do, like knowing what year it is or how to get to the store and back safely.  Which was a set of worries I didn't need to have, in this case.  I.e., my simple complaint is that it caused me confusion here.

The harder to express but more heartfelt one, is something like: the word "crazy" is a license to write people off.  When people in wider society use it about those having acute psychiatric crises, they give themselves a license to write off the sense behind the perceptions of like 2% or something of the population.  When the word is instead used about people who are not practicing LW!rationality, including ordinary religious people, it gives a license to write off a much larger chunk of people (~95% of the population?), so one is less apt to seek sense behind their perceptions and actions.  

This sort of writing-off is a thing people can try doing, if they want, but it's a nonstandard move and I want it to be visible as such.  That is, I want people to spell it out more, like: "I think Ren might've stopped being double-plus-sane like all the rest of us are" or "I think Ren might've stopped following the principles of LW!rationality" or something.  (The word "crazy" hides this as though it's the normal-person "dismiss ~2% of the population" move; these other sentences make visible that it's an unusual and more widely dismissive move.)  The reason I want this move to be made visible in this way is partly that I think the outside view on (groups of people who dismiss those who aren't members of the group) is that this practice often leads to various bad things (e.g. increased conformity as group members fear being dubbed out-group; increased blindness to outside perspectives; difficulty collaborating with skilled outsiders), and I want those risks more visible.

(FWIW, I'd have the same response to a group of democrats discussing republicans or Trump-voters as "crazy", and sometimes have.  But IMO bay area rationalists say this sort of thing much more than other groups I've been part of.)

habryka:

Hmm, for what it's worth, my model of Ben here was not talking about the "95% of the population" type of crazy, and I also don't mean that when I use the word crazy. 

I mean the type of crazy that I've seen quite a lot in the extended EA and rationality community, where like, people start believing in demons and try to quarantine their friends from memetic viruses, or start having anxiety attacks about Roko's basilisk all the time, or claim strongly that all life is better off being exterminated because all existence is suffering, or threaten to kill people on the internet, or plan terrorist attacks.

I currently assign around 25% probability to Ren ending up like that, or already being like that. Maybe "crazy" is the wrong word here, but Maple really seems to me like the kind of environment that would produce that kind of craziness, and I am seeing a lot of flags. This is vastly above population baseline for me. 

Maybe Ben disagrees with me on this, in which case that confirms your point, but I guess I am getting a bit of a feeling that Ben didn't want to give as harsh of a comparison in this conversation as my inner model of him thinks is warranted, because it would have produced a bunch of conflict, and I would like to avoid a false model of consensus forming in the comments.

Another cluster I'd throw in: believing in "woo" stuff (e.g. crystal healing, astrology, acupuncture) as anything other than a placebo.  Now, if someone was raised to believe in some of those things, I wouldn't count it heavily against them.  But if they at one point were a hard-nosed skeptic and later got seriously into that stuff, I'd take this as strong evidence that something had gone wrong with their mind.

Not quite sure it's a cluster; I can name just one major case of "prominent rationalist -> woo promoter", though I feel like there might be more. That person I would say went somewhat generally crazy.

though I feel like there might be more

I can definitely think of two obvious examples just offhand, and I know that I’ve noticed more, but haven’t exactly kept track.

Sadly I meant a more complicated thing. Within my claim of 25% on 'crazy', here's a rough distribution of world-states I think I was referring to:

  • 50%: Has joined a new religion and is devoting their ~entire life to it being true and good even though it's IMO pretty obviously false and a waste of time, in a way where I'm like "I guess we're basically going to increasingly drift apart and end ties".
  • 30%: Has joined a new cult that, while seeming positive for many people, does have some unhinged beliefs and unethical norms that will cause a lot of damage. This is more like joining Scientology than like joining Christianity, and could involve things like suicide pacts, becoming a sex cult, or engaging in organized crime / other coordinated unethical action.
  • 20%: Has personally lost the plot (closer to the way habryka describes) and will start taking unpredictable and obviously harmful actions, not in a way especially coordinated with others in her religious group. Giving concrete examples here feels hard and like it will be overly hypothesis-promoting. But basically things that will hurt herself or other people.

I think these are more extreme than the majority of the population (incl. most ordinary religious people). But my guess is that the first bullet is probably unfair to call 'crazy', and instead I should've given 12.5% to the claim. I think it's reasonable to think that it is mean and unfair of me to refer to the first thing as 'crazy', and regret it a bit.

lc:

This is part of the problem though, I don't think all of those things are crazy, and some of them seem to follow from standard LW and EA axioms.

start believing in demons and try to quarantine their friends from memetic viruses...threaten people... plan terrorist attacks

Sure, those are really bad.

start having anxiety attacks about Roko's basilisk all the time

Having anxiety attacks about things is pretty universally unhelpful. But if you're using Roko's basilisk as a shorthand for all of the problems of "AIs carrying out threats", including near term AIs in a multipolar setting, then it seems perfectly reasonable to be anxious about that. Labeling people crazy for being scared is inaccurate if the thing they're fearing is actually real+scary.

claim strongly that all life is better off being exterminated because all existence is suffering

Again, this depends on what you mean. I think if you take the EA worldview seriously then the obvious conclusion is that Earth life up until and including today has been net-negative because of animal suffering. My guess is also that most current paths through AI run an unacceptable risk of humans being completely subjugated by either some tech-government coalition or hitting some awful near miss section of mind space. Do either of those beliefs make me crazy?

Having anxiety attacks about things is pretty universally unhelpful. But if you're using Roko's basilisk as a shorthand for all of the problems of "AIs carrying out threats", then it seems perfectly reasonable to be anxious about that. Labeling people crazy seems misguided if the thing they're fearing is actually scary.

Agree that being anxious is totally fine, but the LW team has to deal with a relatively ongoing stream of people (like 3-4 a year) who really seem to freak out a lot about either Roko's basilisk or quantum immortality/suicide. To be clear, these are interesting ideas, but usually when we deal with these people they are clearly not in a good spot.

Again, this depends on what you mean. I think if you take the EA worldview seriously then the obvious conclusion is that Earth life up until and including today has been net-negative because of animal suffering.

Net-negative I think is quite different from "all existence is suffering". But also, yeah, I do think that the reason why I've encountered a lot of this kind of craziness in the EA/Rationality space is because we discuss a lot of ideas with really big implications, and have a lot of people who take ideas really seriously, which increases the degree to which people do go crazy. 

My guess is you are dealing fine with these ideas, though some people are not, which is sad, but also does mean I just encounter a pretty high density of people I feel justified in calling crazy. 

I think if you take the EA worldview seriously then the obvious conclusion is that Earth life up until and including today has been net-negative because of animal suffering.

Nit: I don't consider "the EA worldview" to have any opinion on animal suffering. But (roughly speaking) I agree you can get this conclusion from the EA worldview plus some other stuff which is also common among EAs.

Unreal:

so okay i'm actually annoyed by a thing... lemme see if i can articulate it. 

  1. I clearly have orders of magnitude more of the relevant evidence to ascertain a claim about MAPLE's chances of producing 'crazy' ppl as you've defined it—and much more even than most MAPLE people (both current and former). 
  2. Plus I have much of the relevant evidence about my own ability to discern the truth (which includes all the feedback I've received, the way people generally treat me, who takes me seriously, how often people seem to want to back away from me or tune me out when I start talking, etc etc). 
  3. A bunch of speculators, with relatively very little evidence about either, come out with very strong takes on both of the above, and don't seem to want to take into account EITHER of the above facts, but instead find it super easy to dismiss any of the evidence that comes from people with the relevant data. Because of said 'possibility they are crazy'. 

And so there is almost no way out of this stupid box; this does not incline me to try to share any evidence I have, and in general, reasonable people advise me against it. And I'm of the same opinion. It's a trap to try.

It is both easy and an attractor for ppl to take anything I say and twist it into more evidence for THEIR biased or speculative ideas, and to take things I say as somehow further evidence that I've just been brainwashed. And then they take me less seriously. Which then further disinclines me to share any of my evidence. And so forth. 

This is not a sane, productive, and working epistemic process? As far as I can tell? 

Literally I was like "I have strong evidence" and Ben's inclination was to say "strong evidence is easy to come by / is everywhere" and links to a relevant LW article, somehow dismissing everything I said previously and might say in the future with one swoop. It effectively shut me down. 

And I'm like.... 

what is this "epistemic process" ya'll are engaged in

[Edit: I misinterpreted Ben's meaning. He was saying the opposite of what I thought he meant. Sorry, Ben. Another case of 'better wrong than vague' for me. 😅]

To me, it looks like [ya'll] are using a potentially endless list of back-pocket heuristics and 'points' to justify what is convenient for you to continue believing. And it really seems like it has a strong emotional / feeling component that is not being owned. 

[edit: you -> ya'll to make it clearer this isn't about Oliver] 

I sense a kind of self-protection or self-preservation thing. Like there's zero chance of getting access to the true Alief in there. That's why this is pointless for me.

Also, a lot of online talk about MAPLE is sooo far from realistic that it would, in fact, make me sound crazy to try to refute it. A totally nonsensical view is actually weirdly hard to counter, esp if the people aren't being very intellectually honest AND the people don't care enough to make an effort or stick through it all the way to the end. 

habryka:

I mean, I am not sure what you want me to do. If I had taken people at their word when I was concerned about them or the organizations they were part of, and just believed them on their answer on whether they will do reckless or dangerous or crazy things in the future, I would have gotten every single one of the cases I know about wrong.

Like, it's not impossible but seems very rare that when I am concerned about the kind of thing I am concerned about here and say "hey I am worried that you will do a crazy thing" that my interlocutor goes "yeah, I totally might do a crazy thing". So the odds ratio on someone saying "trust me, I am fine" is just totally flat, and doesn't help me distinguish between the different worlds.

So would a sane and productive epistemic process just take you at your word? I don't think so, that seems pretty naive to me. But I have trouble reading your comment as asking for anything else. As you yourself say, you haven't given me any additional evidence, and you don't want to go into the details.

I also don't know what things you are claiming about my psychology here. I haven't made many comments on Maple or you, and the ones I have made seem reasonably grounded to me, so I don't know on what basis you are accusing me of "endless list of back-pocket heuristics and points". I don't know how it's convenient for me to think that my friends and allies and various institutions around me tend to do reckless and dangerous things at alarming rates. Indeed, it makes me very sad and I really wish it wasn't so. 

To be clear, I wouldn't particularly dismiss concrete evidence you give about MAPLE being a fine environment to be in. I would be surprised if you e.g. lied about verifiable facts, and would update if you told me about the everyday life there (of course I don't know in which direction I would update, since I would have already updated if I could predict that, but I don't feel like evidence you are giving me is screened off by me being concerned about you going 'crazy' in the relevant ways, though of course I am expecting various forms of filtered evidence which will make the updates a bit messier)

I think if you ask people a question like, "Are you planning on going off and doing something / believing in something crazy?", they will, generally speaking, say "no" to that, and that is roughly more likely the more isomorphic your question is to that, even if you didn't exactly word it that way. My guess is that it was at least heavily implied that you meant "crazy" by the way you worded it.

To be clear, they might have said "yes" (that they will go and do the thing you think is crazy), but I doubt they will internally represent that thing or wanting to do it as "crazy." Thus the answer is probably going to be one of, "no" (as a partial lie, where no indirectly points to the crazy assertion), or "yes" (also as a partial lie, pointing to taking the action).

In practice, people have a very hard time instantiating the status identifier "crazy" on themselves, and I don't think that can be easily dismissed.

I think the utility of the word "crazy" is heavily overestimated by you, given that there are many situations where the word cannot be used the same way by the people relevant to the conversation in which it is used. Words should have the same meaning to the people in the conversation, and since some people using this word are guaranteed to perceive it as hostile and some are not, that causes it to have asymmetrical meaning inherently.

I also think you've brought in too much risk of "throwing stones in a glass house" here. The LW memespace is, in my estimation, full of ideas besides Roko's Basilisk that I would also consider "crazy" in the same sense that I believe you mean it: Wrong ideas which are also harmful and cause a lot of distress.

Pessimism, submitting to failure and defeat, high "p(doom)", both MIRI and CFAR giving up (by considering the problems they wish to solve too inherently difficult, rather than concluding they must be wrong about something), and people being worried that they are "net negative" despite their best intentions, are all (IMO) pretty much the same type of "crazy" that you're worried about.

Our major difference, I believe, is in why we think these wrong ideas persist, and what causes them to be generated in the first place. The ones I've mentioned don't seem to be caused by individuals suddenly going nuts against the grain of their egregore.

I know this is a problem you've mentioned before and consider it both important and unsolved, but I think it would be odd to notice both that it seems to be notably worse in the LW community, but also to only be the result of individuals going crazy on their own (and thus to conclude that the community's overall sanity can be reliably increased by ejecting those people).

By the way, I think "sanity" is a certain type of feature which is considerably "smooth under expectation" which means roughly that if p(person = insane) = 25%, that person should appear to be roughly 25% insane in most interactions. In other words, it's not the kind of probability where they appear to be sane most of the time, but you suspect that they might have gone nuts in some way that's hard to see or they might be hiding it.

The flip side of that is that if they only appear to be, say, 10% crazy in most interactions, then you should lower your assessment of their insanity to basically that much.

That said, I still don't find this feature all that useful; but using it this way is still preferable to treating it as a binary.
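To make that contrast concrete, here is a minimal sketch (in Python, with made-up numbers; the toy model is mine, purely for illustration). Under the "smooth" reading, the craziness you observe per interaction averages out to the person's actual level, so tracking that average is the right way to update; under a binary hidden-state reading, a 25% credence is compatible with almost never observing anything unusual.

```python
import random

# Toy models, purely illustrative; all numbers are made up.

def observe_smooth(level=0.25, noise=0.05):
    """'Smooth under expectation': each interaction shows roughly the
    person's actual craziness level, plus a little noise."""
    return min(1.0, max(0.0, random.gauss(level, noise)))

def observe_binary(p_insane=0.25, p_visible=0.10, noise=0.05):
    """Binary hidden state: with probability p_insane the person 'is crazy',
    but even then it shows up visibly in only a small fraction of interactions."""
    is_insane = random.random() < p_insane
    shows = is_insane and random.random() < p_visible
    return min(1.0, max(0.0, random.gauss(0.9 if shows else 0.0, noise)))

def average(observe, n=10_000):
    return sum(observe() for _ in range(n)) / n

if __name__ == "__main__":
    # Smooth model: the running average tracks the underlying level (~0.25),
    # so consistently seeing ~0.10 is reason to revise the estimate down to ~0.10.
    print("smooth model, average observed craziness:", round(average(observe_smooth), 2))
    # Binary model: the average stays near zero even at 25% credence,
    # so "they seem fine in most interactions" tells you much less.
    print("binary model, average observed craziness:", round(average(observe_binary), 2))
```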

Viliam

I also think you've brought in too much risk of "throwing stones in a glass house" here. The LW memespace is, in my estimation, full of ideas (...) that I would also consider "crazy"

That seems to me like an extra reason to keep "throwing stones". To make clear the line between the kind of "crazy" that rationalists enjoy, and the kind of "crazy" that is the opposite.

As insurance, just in the (hopefully unlikely) case that tomorrow Unreal goes on a shooting spree, I would like to have it in writing - before it happened - that it happened because of ideas that the rationalist community disapproves of.

Otherwise, the first thing everyone will do is say: "see, another rationalist gone crazy". And whatever objection we make afterwards, it will be like "yeah, now that the person is bad PR, everyone says 'comrades, this is not true rationalism, the true rationalism has never been tried', but previously no one saw a problem with them".

(I am exaggerating a lot, of course. Also, this is not a comment on Unreal specifically, just on the value of calling out "crazy" memes, despite being perceived as "crazy" ourselves.)

Unreal

The 'endless list' comment wasn't about you; it was about a more 'general you'. Sorry that wasn't clear. I edited stuff out and then that became unclear.

I mostly wanted to point at something frustrating for me, in the hopes that you or others would, like, get something about my experience here. To show how trapped this process is, on my end.

I don't need you to fix it for me. I don't need you to change. 

I don't need you to take me for my word. You are welcome to write me off, it's your choice. 

I just wanted to show how I am and why. 

I had written a longer comment, illustrating how Oliver was basically committing the thing that I was complaining about and why this is frustrating. 

The shorter version:

His first paragraph is a strawman. I never said 'take me at my word' or anything close. All my previous statements, and anything known about my stances, would point to this being something I would never say, so this seems weirdly disingenuous.

His second paragraph is weirdly flimsy, implying that ppl are mostly using the literal words out of people's mouths to determine whether they're lying (either to others or to themselves). I would be surprised if Oliver actually found Alice and Bob both saying "trust me i'm fine" to be 'totally flat' data, given he probably has to discern deception on a regular basis.

Also I'm not exactly the 'trust me i'm fine' type, and anyone who knows me would know that about me, if they bothered trying to remember. I have both the skill of introspection and the character trait of frankness. I would reveal plenty about my motives, aliefs, the crazier parts of me, etc. So paragraph 2 sounds like a flimsy excuse to be avoidant? 

But the IMPORTANT thing is... I don't want to argue. I wasn't interested in that. I was hoping for something closer to perspective-taking, reconciliation, or reaching more clarity about our relational status. But I get that I was sounding argumentative. I was being openly frustrated and directing that in your general direction. Apologies for creating that tension. 

FTR, the reason I am engaging with LW at all, like right now... 

I'm not that interested in preserving or saving MAPLE's shoddy reputation with you guys. 

But I remain deeply devoted to the rationalists, in my heart. And I'm impacted by what you guys do. A bunch of my close friends are among you. And... you're engaging in this world situation, which impacts all of us. And I care about this group of people in general. I really feel a kinship here I haven't felt anywhere else. I can relax around this group in a way I can't elsewhere. 

I concern myself with your norms, your ethical conduct, etc. I wish well for you, and wish you to do right by yourselves, each other, and the world. The way you conduct yourselves has big implications. Big implications for impacts to me, my friends, the world, the future of the world. 

You've chosen a certain level of global-scale responsibility, and so I'm going to treat you like you're AT THAT LEVEL. The highest possible levels with a very high set of expectations. I hold myself AT LEAST to that high of a standard, to be honest, so it's not hypocritical. 

And you can write me off, totally. No problem. 

But in my culture, friends concern themselves with their friends' conduct. And I see you as friends. More or less. 

If you write me off (and you know me personally), please do me the honor of letting me know. Ideally to my face. If you don't feel you are gonna do that / don't owe me that, then it would help me to know that also. 

Literally I was like "I have strong evidence" and Ben's inclination was to say "strong evidence is easy to come by / is everywhere" and link to a relevant LW article, somehow dismissing everything I said previously and might say in the future in one swoop. It effectively shut me down.

Oh, this is a miscommunication. The thing I was intending to communicate when I linked to that post was that it is indeed plausible that you have observed strong evidence and that your confidence that you are in a healthy environment is accurate. I am saying that I think it is not in-principle odd or questionable to have very confident beliefs. I did not mean this to dismiss your belief, but to say the opposite, that your belief is totally plausible!

Oh, okay, I found that a confusing way to communicate that? But thanks for clarifying. I will update my comment so that it doesn't make you sound like you did something very dismissive. 

I feel embarrassed by this misinterpretation, and the implied state of mind I was in. But I believe it is an honest reflection of something in my state of mind around this subject. Sigh.

Anonymized paraphrase of a question someone asked about me (reported to me later, by the person who was being asked the question): 

I have a prior about people who go off to monasteries sometimes going nuts, is Renshin nuts?

The person being asked responded "nah" and the question-asker was like "cool" 

I think this sort of exchange might be somewhat commonplace or normal in the sphere. 

I personally didn't feel angry, offended, or sad to hear about this exchange, but I don't feel the person asking the question was asking out of concern or care for me, as a person. But rather to get a quick update for their world model or something. And my "taste" about this is a "bad" taste. I don't currently have time to elaborate but may later. 

Thanks for adding this. I felt really hamstrung by not knowing exactly what kind of conversation we were talking about, and this helps a lot.

I think it's legit that this type of conversation feels shitty to the person it is about. Having people talk about you like you're not a person feels awful. If it included someone with whom you had a personal relationship, I think it's legit that this hurts those relationships. Relationships are based on viewing each other as people. And I can see how a lot of generators of this kind of conversation would be bad.

But I think it's pretty important that people be able to do these kinds of checks, for the purpose of updating their world model, without needing to fully boot up personal caring modules as if you were a friend they had an obligation to take care of. There are wholesome generators that would lead to this kind of conversation, and having this kind of conversation is useful for a bunch of wholesome goals.

Which doesn't make it feel any less painful. You're absolutely entitled to feel hurt, and have this affect your relationship with the people who do it. But this isn't (yet) a sufficient argument for "...and therefore people shouldn't have these kinds of conversations".

But I think it's pretty important that people be able to do these kinds of checks, for the purpose of updating their world model, without needing to fully boot up personal caring modules as if you were a friend they had an obligation to take care of. There are wholesome generators that would lead to this kind of conversation, and having this kind of conversation is useful for a bunch of wholesome goals.

There is a chance we don't have a disagreement, and there is a chance we do. 

In brief, to see if there's a crux anywhere in here:

  • Don't need ppl to boot up 'care as a friend' module. 
  • Do believe compassion should be the motivation behind these conversations, even if not friends, where compassion = treats people as real and relationships as real. 
  • So it matters if the convo is like (A) "I care about the world, and doing good in the world, and knowing about Renshin's sanity is about that, at the base. I will use this information for good, not for evil." Ideally the info is relevant to something they're responsible for, so that it's somewhat plausible the info would be useful and beneficial. 
  • Versus (B) "I'm just idly curious about it, but I don't need to know and if it required real effort to know, I wouldn't bother. It doesn't help me or anyone to know it. I just want to eat it like I crave a potato chip. I want satisfaction, stimulation, or to feel 'I'm being productive' even if it's not truly so, and I am entitled to feel that just b/c I want to. I might use the info in a harmful way later, but I don't care. I am not really responsible for info I take in or how I use info." 
  • And I personally think the whole endeavor of modeling the world should be for the (A) motive and not the (B) motive, and that taking in any-and-all information isn't, like, neutral or net-positive by default. People should endeavor to use their intelligence, their models, and their knowledge for good, not for evil or selfish gain or to feed an addiction to feeling a certain way. 
  • I used a lot of 'should' but that doesn't mean I think people should be punished for going against a 'should'. It's more like healthy cultures, imo, reinforce such norms, and unhealthy cultures fail to see or acknowledge the difference between the two sets of actions. 

This was a great reply, very crunchy, I appreciate you spelling out your beliefs so legibly. 

  • Do believe compassion should be the motivation behind these conversations, even if not friends, where compassion = treats people as real and relationships as real. 

I'm confused here because that's not my definition of compassion and the sentence doesn't quite make sense to me if you plug that definition in. 

But I agree those questions should be done treating everyone involved as real and human. I don't believe they need to be done out of concern for the person. I also don't think the question needs to be motivated by any specific concern; desire for good models is enough. It's good if people ultimately use their models to help themselves and others, but I think it's bad to make specific questions or models justify their usefulness before they can be asked. 

Hm, neither of the motives I named includes any specific concern for the person. Or any specific concern at all. Although I do think having a specific concern is a good bonus? Somehow you interpreted what I said as though there needs to be a specific concern.

RE: The bullet point on compassion... maybe just strike that bullet point.  It doesn't really affect the rest of the points. 

It's good if people ultimately use their models to help themselves and others, but I think it's bad to make specific questions or models justify their usefulness before they can be asked. 

I think I get what you're getting at. And I feel in agreement with this sentiment. I don't want well-intentioned people to hamstring themselves. 

I certainly am not claiming ppl should make a model justify its usefulness in a specific way. 

I'm more saying ppl should be responsible for their info-gathering and treat it with a certain weight. Like, a moral responsibility comes with information. So they shouldn't be cavalier about it... but especially they should not delude themselves into believing they have good intentions for info when they do not.

And so to casually ask about Alice's sanity, without taking responsibility for the impact of speech actions and without acknowledging the potential damage to relationships (Alice's or others'), is irresponsible. Even if Alice never hears about this exchange, it can nonetheless cause a bunch of damage, and a person should speak about these things with eyes open to that.

Could you say more on what you mean by "with compassion" and "taking responsibility for the impact of speech actions"?

I'm fine with drilling deeper but I currently don't know where your confusion is. 

I assume we exist in different frames, but it's hard for me to locate your assumptions. 

I don't like meandering in a disagreement without very specific examples to work with. So maybe this is as far as it is reasonable to go for now. 

That makes sense. Let me take a stab at clarifying, but if that doesn't work seems good to stop.

You said

to casually ask about Alice's sanity, without taking responsibility for the impact of speech actions and without acknowledging the potential damage to relationships (Alice's or others'), is irresponsible. Even if Alice never hears about this exchange, it can nonetheless cause a bunch of damage, and a person should speak about these things with eyes open to that

When I read that, my first thought is that before (most?) every question, you want people to think hard and calculate the specific consequences asking that question might have, and ask only if the math comes out strongly positive. They bear personal responsibility for anything in which their question played any causal role. I think that such a policy would be deeply harmful. 

But another thing you could mean is that people who have a policy of asking questions like this should be aware and open about the consequences of their general policies on questions they ask, and have feedback loops that steer themselves towards policies that produce good results on average. That seems good to me. I'm generally in favor of openly acknowledging costs even when they're outweighed by benefits, and I care more that people have good feedback loops than that any one action is optimal. 

I would never have put it as either of these, but the second one is closer. 

For me personally, I try to always have an internal sense of my inner motivation before/during doing things. I don't expect most people do, but I've developed this as a practice, and I am guessing most people can, with some effort or practice. 

I can pretty much generally tell whether my motivation has these qualities: wanting to avoid, wanting to get away with something, craving a sensation, intention to deceive or hide, etc. And when it comes to speech actions, this includes things like "I'm just saying something to say something" or "I just said something off/false/inauthentic" or "I didn't quite mean what I just said or am saying". 

Although, the motivations to really look out for are like "I want someone else to hurt" or "I want to hurt myself" or "I hate" or "I'm doing this out of fear" or "I covet" or "I feel entitled to this / they don't deserve this" or a whole host of things that tend to hide from our conscious minds. Or in IFS terms, we can get 'blended' with these without realizing we're blended, and then act out of them. 

Sometimes, I could be in the middle of asking a question and notice that the initial motivation for asking it wasn't noble or clean, and then by the end of asking the question, I change my inner resolve or motive to be something more noble and clean. This is NOT some kind of verbal sentence like going from "I wanted to just gossip" to "Now I want to do what I can to help." It does not work like that. It's more like changing a martial arts stance. And then I am more properly balanced and landed on my feet, ready to engage more appropriately in the conversation. 

What does it mean to take personal responsibility? 

I mean, for one example, if I later find out something I did caused harm, I would try to 'take responsibility' for that thing in some way. That can include a whole host of possible actions, including just resolving not to do that in the future. Or apologizing. Or fixing a broken thing. 

And for another thing, I try to realize that my actions have consequences and that it's my responsibility to improve my actions. Including getting more clear on the true motives behind my actions. And learning how to do more wholesome actions and fewer unwholesome actions, over time. 

I almost never use a calculating frame to try to think about this. I think that's inadvisable and can drive people onto a dark or deluded path 😅

I 100% agree it's good to cultivate an internal sense of motivation, and move to act from motives more like curiosity and care, and less like prurient gossip and cruelty. I don't necessarily think we can transition by fiat, but I share the goal.

But I strongly reject "I am responsible for mitigating all negative consequences of my actions". If I truthfully accuse someone of a crime and it correctly gets them fired, am I responsible for feeding and housing them? If I truthfully accuse someone of a crime but people overreact, am I responsible for harm caused by overreaction? Given that the benefits of my statement accrue mostly to other people, having me bear the costs seems like a great way to reduce the supply of truthful, useful negative facts being shared in public. 

I agree it's good to acknowledge the consequences, and that this might lead to different actions on the margin. But that's very different than making it a mandate. 

I'll list some benefits of gossiping about people who appear to have gone crazy.  "Knowing to beware of those people" was mentioned, but here are others:

  • If you know what they had been doing prior to going crazy, which seems potentially causally related (e.g. taking certain drugs, being already in a mentally vulnerable state for other reasons, hanging out with certain crazy-ish people, and/or obsessing about certain books or blogs), then you can update your beliefs about what's dangerous to do.  Which can inform your own behavior and possibly that of your friends.
  • If you know how that person behaves currently or in the past, you can update your model of how to estimate a given person's current or future sanity based on their behavior.

I'll note it seems common for people, when they hear that someone died, to want to know how they died, especially if they died young.  This seems obviously evolutionarily useful—learning about the dangers in your environment—and it seems plausibly an evolved desire.  You can replace "went crazy" with "died" above (or with any significantly negative outcome), and most of it applies directly.

Sometimes people want to go off and explore things that seem far away from their in-group, and perhaps are actively disfavored by their in-group. These people don't necessarily know what's going to happen when they do this, and they are very likely completely open to discovering that their in-group was right to distance itself from that thing, but also, maybe not. 

People don't usually go off exploring strange things because they stop caring about what's true. 

But if their in-group sees this as the person "no longer caring about truth-seeking," that is a pretty glaring red flag about that in-group.

Also, the gossip / ousting wouldn't be necessary if someone was already inclined to distance themselves from the group. 

Like, to give an overly concrete example that is probably rude (and not intended to be very accurate to be clear), if at some point you start saying "Well I've realized that beauty is truth and the one way and we all need to follow that path and I'm not going to change my mind about this Ben and also it's affecting all of my behavior and I know that it seems like I'm doing things that are wrong but one day you'll understand why actually this is good" then I'll be like "Oh no, Ren's gone crazy".

"I'm worried that if we let someone go off and try something different, they will suddenly become way less open to changing their mind, and be dead set on thinking they've found the One True Way" seems like something weird to be worried about. (It also seems like something someone who actually was better characterized by this fear would be more likely to say about someone else!) I can see though, if you're someone who tends not to trust themselves, and would rather put most of their trust in some society, institution or in-group, that you would naturally be somewhat worried about someone who wants to swap their authority (the one you've chosen) for another one.  

I sometimes feel a bit awkward when I write these types of criticisms, because they simultaneously seem:

  • Directed at fairly respected, high-level people.
  • Rather straightforwardly simple, intuitively obvious things (from my perspective, but I also know there are others who would see things similarly).
  • Directed at someone who by assumption would disagree, and yet, I feel like the previous point might make these criticisms feel condescending. 

The only time people actually are incentivized to stop caring about the truth is when their in-group actively disfavors it by discouraging exploration. People don't usually stop caring about the truth unilaterally, via purely individual motivations.

(In-groups becoming culty is also a fairly natural process, no matter what the original intent of the in-group was, so the default should be to assume that it has culty aspects, accept that as normal, and then work towards installing mitigations for the harmful aspects of that.)

"I'm worried that if we let someone go off and try something different, they will suddenly become way less open to changing their mind, and be dead set on thinking they've found the One True Way" seems like something weird to be worried about.

This both seems like a totally reasonable concern to have, and also seems to miss many of the concerning elements of the thing it's purportedly summarizing, like, you know, suddenly having totally nonsensical beliefs about the world.

People don’t usually go off exploring strange things because they stop caring about what’s true.

On the contrary, there are certain things which people do, in fact, only “explore” seriously if they’ve… “stopped” is a strong term, but, at least, stopped caring about the truth as much. (Or maybe reveal that they never cared as much as they said?) And then, reliably, after “exploring” those things, their level of caring about the truth drops even more. Precipitously, in fact.

(The stuff being discussed in the OP is definitely, definitely an example of this. Like, very obviously so, to the point that it seems bizarre to me to say this sort of stuff and then go “I wonder why anyone would think I’m crazy”.)