Viliam_Bur comments on Open thread, Oct. 27 - Nov. 2, 2014 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
It is extremely important to find out how to have a successful community without sociopaths.
(In far mode, most people would probably agree with this. But when the first sociopath comes, most people would be like "oh, we can't send this person away just because of X; they also have so many good traits" or "I don't agree with everything they do, but right now we are in a conflict with the enemy tribe, and this person can help us win; they may be an asshole, but they are our asshole". I believe that avoiding these - and maybe many other - failure modes is critical if we ever want to have a Friendly society.)
It seems to me there may be more value in finding out how to have a successful community with sociopaths. So long as the incentives are set up so that they behave properly, who cares what their internal experience is?
(The analogy to Friendly AI is worth considering, though.)
Ok, so start by examining the suspected sociopath's source code. Wait, we have a problem.
What do you mean with the phrase "sociopath"?
A person who's very low on empathy and follows intellectual utility calculations might very well donate money to effective charities and do things that are good for this community, even if the same person fits the profile of what gets clinically diagnosed as sociopathy.
I think this community should be open for non-neurotypical people with low empathy scores provided those people are willing to act decently.
I'd rather avoid going too deeply into definitions here. Sometimes I feel that if a group of rationalists were in a house on fire, they would refuse to leave until someone gave them a very precise definition of what exactly "fire" means, and how it differs on the quantum level from the usual everyday interaction of molecules. Just because I cannot give you a bulletproof definition in a LW comment does not mean the topic is completely meaningless.
Specifically, I am concerned about the type of people who are very low on empathy and whose utility function does not include other people. (So I am not speaking about e.g. people with alexithymia or similar conditions.) Think: professor Quirrell, in real life. Such people do exist.
(I once had a boss like this for a short time, and... well, it's like an experience from a different planet. If I tried to describe it using words, you would probably just round it to the nearest neurotypical behavior, which would completely miss the point. Imagine a superintelligent paperclip maximizer in a human body, and you will probably have a better approximation. Yeah, I can imagine how untrustworthy this sounds. Unfortunately, that also is a part of a typical experience with a sociopath: first, you start doubting even your own senses, because nothing seems to make sense anymore, and you usually need a lot of time afterwards to sort it out, and then it is already too late to do something about it; second, you realize that if you try to describe it to someone else, there is no chance they would believe you unless they already had this type of experience.)
I'd like to agree with the spirit of this. But there is the problem that the sociopath would optimize their "indecent" behavior to make it difficult to prove.
I'm not saying that the topic is meaningless. I'm saying that if you call for discriminating against people with a certain psychological illness, you should know what you are talking about.
The base rate for clinical psychopathy is sometimes cited as 5%. In this community there are plenty of people who don't have a properly working empathy module, probably more than the average in society.
When Eliezer says that, based on typical-mind issues, he feels that everyone who says "I feel your pain" has to be lying, that suggests a lack of a working empathy module. If you read back the April 1st article, you find wording about "finding willing victims for BDSM". The desire to cause other people pain is there. Eliezer also ticks other boxes, such as a high belief in his own importance for the fate of the world, that are typical for clinical psychopathy. Promiscuous sexual behavior is on the checklist for psychopathy, and Eliezer is poly.
I'm not saying that Eliezer clearly falls under the label of clinical psychopathy; I have never interacted with him face to face, and I'm no psychologist. But part of being rational is that you don't ignore patterns that are there. I don't think this community would overall benefit from kicking out people who tick multiple marks on that checklist.
Yvain is smart enough not to gather data on the number of LW members diagnosed with psychopathy when he asks about mental illnesses. I think it's good that way.
If you actually want to do more than just signal that you like people to be friendly and get applause, then it makes a lot of sense to specify which kind of people you want to remove from the community.
I am not an expert on this, but I think the kind of person I have in mind would not bother to look for willing BDSM victims. From their point of view, there are humans all around, and their consent is absolutely irrelevant, so they would optimize for some other criteria instead.
This feels to me like worrying about a vegetarian who eats "soy meat" because it exposes their unconscious meat-eating desire, while there are real carnivores out there.
I am not even sure if "removing a kind of people" is the correct approach. (Fictional evidence says no.) My best guess at this moment would be to create a community where people are more open to each other, so when some person harms another person, they are easily detected, especially if they have a pattern. Which also has a possible problem with false reporting; which maybe also could be solved by noticing patterns.
Speaking about society in general, we have an experience that sociopaths are likely to gain power in different kinds of organizations. It would be naive to expect that rationalist communities would be somehow immune to this; especially if we start "winning" in the real world. Sociopaths have an additional natural advantage that they have more experience dealing with neurotypicals, than neurotypicals have with dealing with sociopaths.
I think someone should at least try to solve this problem, instead of pretending it doesn't exist or couldn't happen to us. Because it's just a question of time.
Human beings frequently like to think of people they don't like or understand as evil. There are various very bad mental habits associated with that.
Academic psychology is a thing. It actually describes how certain people act. It describes how psychopaths act. They aren't just evil; their emotional processes are skewed in systematic ways.
Translated into everyday language, that's: "Rationalists should gossip more about each other." Whether we should follow that maxim is a quite complex topic on its own; if you think it's important, write an article about it and actually address the reasons why people don't like to gossip.
You are not really addressing what I said. It's very likely that we have people in this community who fulfill the criteria of clinical psychopathy. I also remember an account of a person who trusted someone from a LW meetup, a self-declared egoist, too much, and ended up with a bad interaction because they didn't take at face value the openness of a person who said they only care about themselves.
Given your moderator position, do you think that you want to do something to garden but lack power at the moment? Especially dealing with the obvious case? If so, that's a real concern. Probably worth addressing more directly.
Unfortunately, I don't feel qualified enough to write an article about this, nor to analyze the optimal form of gossip. I don't think I have a solution. I just noticed a danger, and general unwillingness to debate it.
Probably the best thing I can do right now is to recommend good books on this topic. That would be:
I admit I do have some problems with moderating (specifically, the reddit database is pure horror, so it takes a lot of time to find anything), but my motivation for writing in this thread comes completely from offline life.
As a leader of my local rationalist community, I was wondering about the things that could happen if the community becomes greater and more successful. Like, if something bad happened within the community, I would feel personally responsible for the people I have invited there by visions of rationality and "winning". (And "something bad" offline can be much worse than mere systematic downvoting.) Especially if we would achieve some kind of power in real life, which is what I hope to do one day. I want to do something better than just bring a lot of enthusiastic people to one place and let the fate decide. I trust myself not to start a cult, and not to abuse others, but that itself is no reason for others to trust me; and also, someone else may replace me (rather easily, since I am not good at coalition politics); or someone may do evil things under my roof, without me even noticing. Having a community of highly intelligent people has the risk that the possible sociopaths, if they come, will likely also be highly intelligent. So, I am thinking about what makes a community safe or unsafe. Because if the community grows large enough, sooner or later problems start happening. I would rather be prepared in advance. Trying to solve the problem ad-hoc would probably totally seem like a personal animosity or joining one faction in an internal conflict.
Can you express what you want to protect against while tabooing words like "bad", "evil", and "abuse"?
In the ideal world we could fully trust all people in our tribe to do nothing bad. Simply because we have known a person for years, we could trust that person to do good.
That's not a rational heuristic. Our world is not structured in a way where the amount of time we have known a person is a good heuristic for the amount of trust we can give that person.
There are a bunch of people I have met in the personal development scene whom I trust very easily, because I know the heuristics that those people use.
If you have someone in your local LW group who tells you that his utility function is to maximize his own utility, and who doesn't have empathy that would make him feel bad when he abuses others, the rational thing is to not trust that person very much.
But if you use that as a criterion for kicking people out, people won't be open about their own beliefs anymore.
In general, trusting people a lot who tick half of the criteria that constitute clinical psychopathy isn't a good idea.
On the other hand, LW is by default inclusive and not structured in a way where it's a good idea to kick out people on such a basis.
Intelligent sociopaths generally don't go around telling people that they're sociopaths (or words to that effect), because that would put others on their guard and make them harder to get things out of. I have heard people saying similar things before, but they've generally been confused teenagers, Internet Tough Guys, and a few people who're just really bad at recognizing their own emotions -- who also aren't the best people to trust, granted, but for different reasons.
I'd be more worried about people who habitually underestimate the empathy of others and don't have obviously poor self-image or other issues to explain it. Most of the sociopaths I've met have had a habit of assuming those they interact with share, to some extent, their own lack of empathy: probably typical-mind fallacy in action.
They usually won't say it in a way that they would predict will put other people on guard. On the other hand, that doesn't mean they don't say it at all.
I can't find the link at the moment, but a while ago someone posted on LW that he shouldn't have trusted another person from a LW meetup who openly said those things and then acted accordingly.
Categorising Internet Tough Guys is hard. Base rates for psychopathy aren't that low, but you are right that not everyone who says those things is a psychopath. Even so, it's a signal for not giving that person full trust.
What do you mean by "harm"? I have to ask because there is a movement (commonly called SJW) pushing an insanely broad definition of "harm". For example, if you've shattered someone's worldview, have you "harmed" him?
Not per se, although there could be some harm in the execution. For example if I decide to follow someone every day from their work screaming at them "Jesus is not real", the problem is with me following them every day, not with the message. Or, if they are at a funeral of their mother and the priest is saying "let's hope we will meet our beloved Jane in heaven with Jesus", that would not be a proper moment to jump and scream "Jesus is not real".
(a) What exactly is the problem? I don't really see a sociopath getting enough power in the community to take over LW as a realistic scenario.
(b) What kind of possible solutions do you think exist?
Steve Sailer's description of Michael Milken:
Is that the sort of description you have in mind?
I really doubt the possibility of conveying this in mere words. I had previous experience with abusive people, I studied psychology, I heard stories from other people... and yet all of this left me completely unprepared; I was confused and helpless like a small child. My only luck was the ability to run away.
If I tried to estimate a sociopathy scale from 0 to 10, in my life I have personally met one person who scores 10, two people somewhere around 2, and most nasty people were somewhere between 0 and 1, usually closer to 0. If I hadn't met that one specific person, I would believe today that the scale only goes from 0 to 2; and if someone tried to describe to me what a 10 looks like, I would say "yeah, yeah, I know exactly what you mean" while having a model of a 2 in my mind. (And who knows; maybe the real scale goes up to 20, or 100. I have no idea.)
Imagine a person who does gaslighting as easily as you do breathing; probably after decades of everyday practice. A person able to look into your eyes and say "2 + 2 = 5" so convincingly they will make you doubt your previous experience and believe you just misunderstood or misremembered something. Then you go away, and after a few days you realize it doesn't make sense. Then you meet them again, and a minute later you feel so ashamed for having suspected them of being wrong, when in fact it was obviously you who were wrong.
If you try to confront them in front of another person and say: "You said yesterday that 2 + 2 = 5", they will either look the other person in the eyes and say "but really, 2 + 2 = 5" and make them believe so, or will look at you and say: "You must be wrong, I have never said that 2 + 2 = 5, you are probably imagining things"; whichever is more convenient for them at the moment. Either way, you will look like a total idiot in front of the third party. A few experiences like this, and it will become obvious to you that after speaking with them, no one would ever believe you contradicting them. (When things get serious, these people seem ready to sue you for libel and deny everything in the most believable way. And they have a lot of money to spend on lawyers.)
This person can play the same game with dozens of people at the same time and not get tired, because for them it's as easy as breathing, there are no emotional blocks to overcome (okay, I cannot prove this last part, but it seems so). They can ruin lives of some of them without hesitation, just because it gives them some small benefit as a side effect. If you only meet them casually, your impression will probably be "this is an awesome person". If you get closer to them, you will start noticing the pattern, and it will scare you like hell.
And unless you have met such a person, it is probably difficult to believe that what I wrote is true and not exaggerated. Which is yet another reason why you would rather believe them than their victim, if the victim tried to get your help. The true description of what really happened just seems fucking unlikely. On the other hand, their story would be exactly what you want to hear.
No, that is completely unlike what I mean. That sounds like some super-nerd.
Your first impression from the person I am trying to describe would be "this is the best person ever". You would have no doubt that anyone who said anything negative about such person must be a horrible liar, probably insane. (But you probably wouldn't hear many negative things, because their victims would easily predict your reaction, and just give up.)
Not a person, but I've had similar experiences dealing with Cthulhu and certain political factions.
Sure, human terms are usually applied to humans. Groups are not humans, and using human terms for them would at best be a metaphor.
On the other hand, for your purpose (keeping LW a successful community), groups that collectively act like a sociopath are just as dangerous as individual sociopaths.
Narcissist Characteristics
I was wondering if this sounds like your abusive boss-- it's mostly a bunch of social habits which could be identified rather quickly.
I think the other half is the more important one: to have a successful community, you need to be willing to be arbitrary and unfair, because you need to kick out some people and cannot afford to wait for a watertight justification before you do.
The best ruler for a community is an incorruptible, bias-free dictator. All you need to do to implement this is to find an incorruptible, bias-free dictator. Then you don't need a watertight justification, because those are used to avoid corruption and bias, and you know you don't have any of that anyway.
I'm not being utopian, I'm giving pragmatic advice based on empirical experience. I think online communities like this one fail more often by allowing bad people to continue being bad (because they feel the need to be scrupulously fair and transparent) than they do by being too authoritarian.
I think I know what you mean. The situations like: "there is 90% probability that something bad happened, but 10% probability that I am just imagining things; should I act now and possibly abuse the power given to me, or should I spend a few more months (how many? I have absolutely no idea) collecting data?"
The thing is from what I've heard the problem isn't so much sociopaths as ideological entryists.
There is also that kinda-important bit about shared values...
How do you even reliably detect sociopaths to begin with? Particularly with online communities where long game false social signaling is easy. The obviously-a-sociopath cases are probably among the more incompetent or obviously damaged and less likely to end up doing long-term damage.
And for any potential social apparatus for detecting and shunning sociopaths you might come up with, how will you keep it from ending up being run by successful long-game signaling sociopaths who will enjoy both maneuvering themselves into a position of political power and passing judgment and ostracism on others?
The problem of sociopaths in corporate settings is a recurring theme in Michael O. Church's writings, but there's also like a million pages of that stuff so I'm not going to try and pick examples.
All cheap detection methods could be fooled easily. It's like with that old meme "if someone is lying to you, they will subconsciously avoid looking into your eyes", which everyone has already heard, so of course today every liar would look into your eyes.
I see two possible angles of attack:
a) Make a correct model of sociopathy. Don't imagine sociopaths to be "like everyone else, only much smarter". They probably have some specific weakness. Design a test they cannot pass, just like a colorblind person cannot pass a color blindness test even if they know exactly how the test works. Require passing the test for all positions of power in your organization.
b) If there is a typical way sociopaths work, design an environment so that this becomes impossible. For example, if it is critical for manipulating people to prevent their communication among each other, create an environment that somehow encourages communication between people who would normally avoid each other. (Yeah, this sounds like reversing stupidity. Needs to be tested.)
I think it's extremely likely that any system for identifying and exiling psychopaths can be co-opted for evil, by psychopaths. I think rules and norms that act against specific behaviors are a lot more robust, and also are less likely to fail or be co-opted by psychopaths, unless the community is extremely small. This is why in cities we rely on laws against murder, rather than laws against psychopathy. Even psychopaths (usually) respond to incentives.
Why is this important?
My goal is to create a rationalist community. A place to meet other people with similar values and "win" together. I want to optimize my life (not just my online quantum physics debating experience). I am thinking strategically about an offline experience here.
Eliezer wrote about how a rationalist community might need to defend itself from an attack of barbarians. In my opinion, sociopaths are an even greater danger, because they are more difficult to detect, and nerds have a lot of blind spots here. We focus on dealing with forces of nature. But in the social world, we must also deal with people, and this is our archetypal weakness.
The typical nerd strategy for solving conflicts is to run away and hide, and to create a community of social outcasts where everything is tolerated, and the whole group is more or less safe because it has such low status that typical bullies avoid it. But at the moment we start "winning", this protective shield is gone, and we do not have any other coping strategy. Just like being rich makes you an attractive target for thieves, being successful (and I hope rationalist groups will become successful in the near future) makes your community a target for people who love to exploit people and get power. And all they need to get inside is to be intelligent and memorize a few LW keywords. Once your group becomes successful, I believe it's just a question of time. (Even a partial success, which for you is merely a first step along a very long way, can already do this.) That will happen much sooner than any "barbarians" would consider you a serious danger.
(I don't want to speak about politics here, but I believe that many political conflicts are so bad because most of the sides have sociopaths as their leaders. It's not just the "affective death spirals", although they also play a large role. But there are people in important positions who don't think about "how to make the world a better place for humans", but rather "how could I most benefit from this conflict". And the conflict often continues and grows because that happens to be the way for those people to profit most. And this seems to happen on all sides, in all movements, as soon as there is some power to be gained. Including movements that ostensibly are against the concept of power. So the other way to ask my question would be: How can a rationalist community get more power, without becoming dominated by people who are willing to sacrifice anything for power? How to have a self-improving Friendly human community? If we manage to have a community that doesn't immediately fall apart, or doesn't become merely a debate club, this seems to me like the next obvious risk.)
How do you come to that conclusion? Simply because you don't agree with their actions? Otherwise, are there trained psychologists who argue that position in detail and try to determine how politicians score on the Hare scale?
Uhm, no. Allow me to quote from my other comment:
I hope it illustrates that my mental model has separate buckets for "people I suspect to be sociopaths" and "people I disagree with".
Diagnosing mental illness based on the kind of second hand information you have about politicians isn't a trivial effort. Especially if you lack the background in psychology.
Are you directing this at LW? Ie. is there a sociopath that you think is bad for our community?
Well, I suspect Eugine Nier may have been one, to give the most obvious example. (Of course there is no way to prove it, there are always alternative explanations, et cetera, et cetera, I know.)
Now, that was online behavior. Imagine the same kind of person in real life. I believe it's just a question of time. Using my limited experience to make predictions: such a person would be rather popular, at least at the beginning, because they would keep using the right words that are tested to evoke a positive response from many lesswrongers.
A "sociopath" is not an alternative label for "someone I don't like". I am not sure what a concise explanation for the sociopath symptom cluster is, but it might be someone who has trouble modeling other agents as "player characters", for whatever reason. A monster, basically. I think it's a bad habit to go around calling people monsters.
I know; I know; I know. This is exactly what makes this topic so frustratingly difficult to explain, and so convenient to ignore.
The thing I am trying to say is that if a real monster came to this community, sufficiently intelligent and saying the right keywords, we would spend all our energy inventing alternative explanations. That although in far mode we admit that the prior probability of a monster is nonzero (I think the base rate is somewhere around 1-4%), in near mode we would always treat it like zero, and any evidence would be explained away. We would congratulate ourselves for being nice, but in reality we are just scared to risk being wrong when we don't have convincing-sounding verbal arguments on our side. (See Geek Social Fallacy #1, but instead of "unpleasant" imagine "hurting people, but only as much as is safe in a given situation".) The only way to notice the existence of the monster is probably if the monster decides to bite you personally in the foot. Then you will realize with horror that now all the other people are going to invent alternative explanations for why that probably didn't happen, because they don't want to risk being wrong in a way that would feel morally wrong to them.
I don't have a good solution here. I am not saying that vigilantism is a good solution, because the only thing the monster needs to draw attention away is to accuse someone else of being a monster, and it is quite likely that the monster will sound more convincing. (Reversed stupidity is not intelligence.) Actually, I believe this happens rather frequently. Whenever there is some kind of a "league against monsters", it is probably a safe bet that there is a monster somewhere at the top. (I am sure there is a TV Tropes page or two about this.)
So, we have a real danger here, but we have no good solution for it. Humans typically cope with such situations by pretending that the danger doesn't exist. I wish we had a better solution.
I can believe that 1% - 4% of people have little or no empathy and possibly some malice in addition. However, I expect that the vast majority of them don't have the intelligence/social skills/energy to become the sort of highly destructive person you describe below.
That's right. The kind of person I described seems like a combination of sociopathy + high intelligence + maybe something else. So it is much less than 1% of the population.
(However, their potential ratio in the rationalist community is probably greater than in the general population, because our community already selects for high intelligence. So, if high intelligence were the only additional factor - which I don't know whether it's true or not - it could again be 1-4% among the wannabe rationalists.)
I would describe that person as a charismatic manipulator. I don't think it requires being a sociopath, though being one helps.
The kind of person you described has extraordinary social skills as well as being highly (?) intelligent, so I think we're relatively safe. :-)
I can hope that people in a rationalist community would be better than average at eventually noticing they're in a mind-warping confusion-and-charisma field, but I'm really hoping we don't get tested on that one.
Returning to the original question ("Where are you right, while most others are wrong? Including people on LW!"), this is exactly the point where my opinion differs from the LW consensus.
For a sufficiently high value of "eventually", I agree. I am worried about what would happen until then.
I'm hoping that this is not the best answer we have. :-(
To what extent is that sort of sociopath dependent on in-person contact?
Thinking about the problem for probably less than five minutes, it seems to me that the challenge is having enough people in the group who are resistant to charisma. Does CFAR or anyone else teach resistance to charisma?
Would noticing when one is confused and writing the details down help?
https://allthetropes.orain.org/wiki/Hired_to_Hunt_Yourself
Why do you suspect so? Gaming ill-defined social rules of an internet forum doesn't look like a symptom of sociopathy to me.
You seem to be stretching the definition too far.
Abusing rules to hurt people is at least weak evidence. Doing it persistently for years, even more so.