Basically, I cannot stand people who will not bow to the Truth.

I have always had this trait, but lately I have noticed that it is getting worse, and that it has consequences. Ironically, the main trigger seems to be the sequences. They gave me a confidence that sometimes frightens me. There are multiple manifestations:

  • Before, I had no problem whatsoever with believers of various religions (as long as they didn't do bad things). I was still acting like an agnostic at that time. Now I tend to think less of them.
  • Before, I tolerated disagreement about some subjects, like the supernatural. Now I loathe any form of epistemic relativism.
  • I now tend to regard anyone who isn't Bayesian as either uneducated or moronic. Same thing about materialist reductionism, only with a slightly lower confidence. (And my inability to convince people of the validity of Occam's Razor doesn't help.)
  • I sound more and more arrogant, and possibly full of myself.
  • My urge to rewire the brain of anyone who won't listen grows stronger.

The closest semi-famous embodiment of this character trait I can think of is Xah Lee. I like much of his writing, but he can be very blunt, sometimes to the point of insult.

Needless to say, I do not endorse all these changes. The problem is, while I know I should calm down, I just can't lose when I'm confident the truth is on my side. I'm not even sure I should. (Note, however, that I'm rather good at losing to evidence.)

So, what do you think? What should I do? Thanks.


The best cure for such prideful attitudes is to ask yourself what you have to show in terms of practical accomplishments and status if you're so much more rational and intellectually advanced than ordinary people. If they are so stupid and delusional as to deserve such intolerance and contempt, then an enlightened and intellectually superior person should be able to run circles around them and easily come out on top, no?

Now, if you actually have extremely high status and extraordinary accomplishments, then I guess you can justify your attitudes of contemptuous superiority. (Although an even higher status is gained by cultivating attitudes of aristocratic generosity and noblesse oblige.) If not, however, and if you're really good at "losing to evidence," as you put it, this consideration should be enough to make your attitudes more humble.

6scientism
I don't think it follows from being "more rational and intellectually advanced" that you would be more accomplished and have higher status. This is especially true if you're surrounded by incompetents. For example, how would a rational person achieve high status if the majority of people making status judgments are irrational? To "run circles around them" by exploiting their foolishness would require such a high level of understanding of human psychology that it far outstrips merely being "more rational and intellectually advanced." It's quite possible (perhaps likely) that greater intelligence and rationality would be a huge detriment in a society of incompetents. This would be true until science progressed to the point where we had a complete enough understanding of psychology to exploit or reform them. There's nothing in rationality that makes a rational person automatically able to understand and exploit the irrational.
[-][anonymous]180

For example, how would a rational person achieve high status if the majority of people making status judgments are irrational?

  1. Analyse their status assessing mechanism.
  2. Find exploits and hacks.
  3. Munchkin ...uh... I mean optimize away.
  4. (profit?)
8Nisan
The kind of "rationality" we're talking about is the kind that lets you win. If I notice that there are people who have more money than me, are happier than me, have better friends and friendships than me, are more able to achieve their goals — who acquired all those things through virtue or behaving a particular way, and not by chance — and if I haven't bothered to determine what those virtues and behaviors are, and whether they tend to actually work, and how I can implement them myself — why, then, I'm not such a hotshot rationalist after all. I'd agree that mere Traditional Rationality may not help one get ahead.
-2scientism
I'm talking about winning. In practical terms, I don't think anyone is going to succeed in making great strides in success or status in general society without acquiring a great deal of knowledge about human psychology, so I doubt that winning will look like social or economic success in the short term. I think when you speak of those "who acquired all those things through virtue or behaving a particular way, and not by chance" you betray a false dichotomy. Those aren't the only two options. The third option is that the game is rigged. We live in a society that intentionally confines "winning" to a small, highly controlled, socially maligned group so that its fruits can be exploited by the larger majority who are unconcerned with such things. It's more than an issue of what individuals do and do not do; our society and its institutions are designed to reward and punish behaviour in a way that's at odds with rationality. Doing well in our society is indeed a product of "behaving in a particular way" in the most general sense of that term, but it is not a matter of anyone doing anything one could simply learn to do or do better. The only way to find success in our situation is by understanding human psychology at a deep level and having a much fuller operational understanding of it than we have now. It would be either a process of extreme reform (i.e., replacing the whole of society) or one of exploitation and subterfuge (essentially treating people as a means to an end).
1Nisan
This sentence makes me think that we're probably talking about entirely different things. I indicated in the grandparent that I consider people who are wealthy and happy and who have good friends to be "winning"; I don't believe such people are maligned; surely the opposite is the case. Perhaps we're talking about different things.
0scientism
Presumably you consider "winning" a self-directed act, so not everybody who is wealthy, happy and has good friends is necessarily a winner; they may also have gotten lucky or been favoured in a rigged game. Furthermore, the majority of people, even the ones who have lived arguably self-directed lives, did not do so in the methodical way we're proposing to do so. Living a good life is, typically, a non-transferable skill. "Winning", as I interpret it, is about creating a transferable skill for achieving such goals. It's about identifying the things you, or somebody like you, need to do to achieve such goals in a systematic way. What you, as a rationalist, would do in order to win is not necessarily the same as what those people you hold as exemplars of the state you hope to achieve have done. People who live good lives aren't maligned, that's true, but people who pursue goals in a systematic, transferable way are maligned. I agree that ultimately the outcome should be to be wealthy, happy, socially successful, etc. I simply disagree on how easy that is. If somebody came to me and told me they'd made great strides in rationality, I wouldn't expect them to be rich and happy and to have the best of friends. I wouldn't expect them to have made great scientific breakthroughs. I'd expect them to have something to show for it, but I'd expect it to be a modest accomplishment, likely only recognised by their peers. I suspect if we took a poll on Less Wrong - if we could agree on who the best rationalists are and tallied up their achievements - those achievements would be in line with my expectations. Why is this? Because most success in our society is much more like, say, becoming a successful politician than becoming a successful athlete. One might suppose that I can become a successful athlete, given the right genes, simply by training hard and being acknowledged for my skill. But to become a successful politician I'd have to pretend to be somebody I'm not almost every waking moment.
3prase
You are right that being more rational doesn't automatically imply the ability to exploit others. But this may miss the point. Loup-vaillant isn't claiming that he is more rational than the others; he's saying that he feels superior because of it. Although rationality and intelligence aren't directly linked to status, the superiority feeling is part of the status game, and feeling superior when one's status is about average can be viewed as a sort of false belief.
1scientism
There's more than one status game though. For example, a high status scientist might nonetheless have a low social status generally, especially if his or her high status is in an esoteric field. Wouldn't it make sense for a rational person to reject the status judgments of the irrational and instead look for status among his peers? We might expect loup-vaillant to have some accomplishments that would set him apart from the irrational masses in the eyes of his peers - that he's not all talk - but I doubt this would set the bar very high. He'd then be free to feel superior to most people.
2prase
It's possible to feel superior to most people because you can recite the Koran by heart, even if you are a homeless beggar. It's possible to feel superior because you can solve a Rubik's cube faster than anybody else. There will always be some peers who will award you high status for unusual accomplishments. If that's what you want, there are hardly any objections to be made. But in my experience, the superiority feeling quickly fades away when I realise that it is based on a status game which the "inferiors" don't wish to participate in.
1gimpf
I did not interpret his article as "I am superior to all", but as "Help, I act as if I am superior to all!". I probably got that totally wrong, though, as I do most of the time.
2loup-vaillant
Unfortunately, those two are related. My acting superior generally comes from a genuine feeling of being right. And it happens often enough to raise alarm bells even I can hear.

Slytherin answer: If you're surrounded by idiots, figure out how idiots work and use them to your advantage; ideally in ways that they don't even recognize. Getting irritated at them for being idiots is like getting irritated at a cat for not being a dog — it's bad instrumental rationality; the irritation doesn't help you accomplish your goals. They may be idiots, and you can't fix that; but you can treat them nicely enough that they won't get in your way and may even be useful to you. Find ways to practice this.

Hufflepuff answer: Sounds like you need the company of other rationalists. Does your area have a LW meetup yet? Meanwhile, try to consider the obstacles, distractions, and other cognitive interference that these other folks might be dealing with. Find ways to sympathize — after all, you're not perfect, either. (And for that matter, if religionists are so wrong, why does going to church make so many of them so happy? They must be right about something.)

Other Hufflepuff answer: Aww. Maybe I should find a way to be nicer to them so that I can help them find their mistakes in ways that don't make them think I dislike them. I wonder if having more accurate beliefs in some areas would actually hurt them...

Gryffindor answer: And that's why you must be strong to help save the world without their help.

Ravenclaw answer: Some people just don't care. If you want to talk about the truth, talk to other truthseekers, not other people. Non-truthseekers can still be fun, but you don't have to talk to them about your beliefs.

3fubarobfusco
Your Huffle-fu is stronger than mine. Seeking a rationalist meetup is Ravenclaw.
4atucker
I really like the idea of answering based on Hogwarts houses.
2nazgulnarsil
It's really a variation on internal families.
2atucker
True, but it has the added advantage of common knowledge as to what the different aspects refer to.
[-]Zed250

Very interesting, because my exposure to LW (and the sequences in particular) had the opposite effect. I'm now better at dealing with others and with stupidity in general.

My slightly exaggerated thought process used to be: "I'm clearly right about this, so I'll just repeat and rephrase my arguments until they figure out they're wrong and I'm right. If they don't understand it they're hopeless and I'll just "flip the bit" on them and move on with my life."

The problem, of course, is that the strategy is ineffective, and using an ineffective strategy again and again is not rational at all. So I would say the correct strategy is to ask yourself: "Given my understanding of the sequences and of human psychology, what line of argumentation is going to be most effective?". In this situation you probably want to leave a line of retreat and you probably want to make an effort to close the inferential gap.

If you're right (in a "facts are on my side" kind of way) you can usually force people to give in but at what cost? Resentment and burned bridges. You might win the battle, but you'll lose the war.

PS: Insulting your opponent, although an u...

0Vladimir_Nesov
It's not always positive-sum (or even often, if you pick random interlocutors). Your time spent arguing can easily be worth more than what the other person gains. Insulting probably doesn't help though.
0CaveJohnson
One of the nice things about being part of the academic establishment is that it's other people's duty to explain things that are already covered. Except when public relations are concerned. *shudders*

Hang out with people who are smarter than you are, so that you get lots of practice being the one who's wrong in an argument.

Remember that when you are right, your goal is not to emit true statements but to cause the other person to believe true ideas. The default implicit model of argument is that if they don't get it at first, you just have to hit harder; try instead to think of convincing someone as navigating a maze or solving a puzzle, a complex and delicate process that may require lots of backtracking if you mess up.

5Vaniver
This seems useless at learning how to deal with people who are wrong, and instead reinforces the "life is an academic debate" meme.
1Pavitra
You shouldn't expect any single strategy to solve all possible problems. The course of action I recommended is, I believe, a good way to fix the problem described in the original post. If you also want to solve this different problem, you will probably have to also take a corresponding different course of action.
0Zetetic
That could be easier said than done if you live in a fairly isolated environment. I would love to find a group of people who are all smarter than me and would want to hang out and debate various topics, but I have no clue where to go. My university doesn't seem to have very many people that I can chat with, and even the people at the graduate mathematics courses I took at one of the neighboring universities were not very stimulating for the most part. I think this is largely due to the quality of the schools, but there is little I can do about that (at least that I know of) for the time being beyond waiting for grad school and chatting online; this might be the only non-specialist community site where I find multiple people who consistently know more than I do.
2Pavitra
Hanging out on the internet can work for this. LW is my personal smarter-than-me hangout of choice. (It would probably be a good idea to list some other candidates, to help maintain the metaphorical biodiversity of the meme pool. Unfortunately, I can't think of any. Suggestions?)

I emotionally/connotationally associate the condition of not thinking clearly with poverty. A person can be born in unfavorable conditions, in which case it might be almost impossible to get into a better situation without substantial help, or it might take a lot of luck, or significant ability.

Since there is already a well-absorbed set of emotional connotations with the condition of poverty (low status, but leaning toward status-agnostic; a burden for others not in this condition; unfair, deserving of compassion and help; theoretical possibility of full recovery) that seems to match what one would wish to associate with people not thinking clearly, you could just transfer these intuitions by associating the categories in your thinking.

We also need a productive charity, to make use of comparative advantage.

[-]Emile160

I'm more tolerant of religion than I was a few years ago, mostly because once I got an idea of all the other ways humans (including myself!) are irrational, singling out those who hold incorrect opinions on something irrelevant like metaphysics is a bit unfair.

Ways humans tend to be irrational: choosing a career based on very little information (the number of well-off teenagers in western countries who know more about the World of Warcraft gameplay system (or equivalent) than about the costs and benefits of the various career paths they could choose is depressing); pretty much any strongly held opinion on politics that isn't backed by some serious scholarship or experience; opinions on what others think of you and how much that matters; opinions on what kinds of things are good and bad; buying unneeded stuff; getting into debt; moral panics; drugs ...

Next to those, does it matter if somebody incorrectly thinks the Bible was divinely inspired, or that we get reincarnated after death, as long as he's being a reasonable, civilized human being (and not a fanatical nut)? That'd be a good reason to ignore their opinions on abstract intellectual subjects, but not a reason to think very harshly of them.

1Morendil
People don't just hold these beliefs about death and the Bible and the bearded guy in the sky (who loves kids dunked in water at birth by priests more than he loves other living things). They also often send their kids to Sunday school to contaminate them with these beliefs, instead of waiting for their kids to grow up enough to adopt the beliefs if they make sense to them.
-1Emile
If sending the kids to Sunday school makes the lives of the kids better than not sending them to Sunday school, then why not? There may be better things to have your kids do on Sunday, but it's probably better than having them watch TV all day. (I've never been to Sunday school, but the people I know who did don't seem particularly worse off.)
6sketerpot
I've been to Sunday school, at several churches, when I was a child. I also "taught" Sunday school when I was a teenager. In all cases, it was a glorified daycare blessed with the halo effect of God: a way to make parents feel virtuous about leaving their kids somewhere for an hour on Sunday while they have coffee and cookies. This was perhaps valuable as parental stress relief, but it wasn't a particularly great thing for the people actually in Sunday school. If anything, it was kind of boring, and got everybody fidgety from being cooped up in a room. So, yeah, if you're looking for things to do with children on a Sunday morning, may I suggest hiking, or reading, or playing somewhere, or anything but Sunday school? It's not horrible, but I would characterize it as intensely meh.
6NancyLebovitz
Training kids to tolerate intensely meh experiences, especially when there's no obvious gain from them, may be unhealthier than is obvious. At least in my case, I think it's done a lot to build a habit of killing time.
3Morendil
It's not clear to me that Sunday school systematically makes kids' lives any better, and the epistemic danger seems real enough. For instance, the guilt-trip nature of the doctrine of "original sin" strikes me as a clear harm when inflicted upon children, who do not have the intellectual resources to receive it critically. It's one thing to tolerate people who choose to have certain beliefs. It's another, more difficult, thing to tolerate people who actively foist these beliefs onto the more vulnerable.
[-]prase150

Well, the effect LW had on me was the opposite. Many arguments have subtle sides which are hidden at first sight, and much of this I have realised reading LW. It can happen that it's me who misses the point, and that's very unpleasant after having argued the point passionately. And even if I am right and the opponent is wrong, I know that the path to the truth isn't usually simple and short. I used to have beliefs which today I see as clearly wrong. I am fairly confident that today I have beliefs which I would find wrong in the future, and which other, better-informed people consider wrong even today. If I don't want to call my past self a moron (I certainly don't) and don't want to be called a moron by the wiser people, I should be quite careful in putting the moron label onto others.

So, what to do if you want to be more tolerant, for example, when you meet a religious believer? My advice is based on things that usually help me:

  • Try to remember that you were effectively an agnostic not long ago, and if your interlocutor deserves to be called a moron for believing in God, you deserved to be called at least a half-moron for being agnostic about the question. Perhaps you wou...
4loup-vaillant
Ouch. I can't. Even reading the whole sequences didn't trigger any feeling of updating. I learned quite a few things, and it all just made sense as a whole. But nowhere did I see something that made me jump "wait, what?", followed by the mandatory Oops. I probably should take ideas I disagree with as seriously as possible. Surely there is one that will change my mind?
3prase
The updating needn't be instant; it can take months or years. For me, it is never an instant change. Not so much "wait, what?!"; it's more like "this can't be true, let's try to find a counter-argument", followed after some time gap by "I can't find a satisfactory counter-argument, so there may be some merit in that". But after that, I am able to see that I no longer hold a belief X which I was ready to defend fiercely a while ago.
0fubarobfusco
Was that "ouch" an oops, or a wince?
1loup-vaillant
A wince. I noticed my failure to update a while ago. (Or at least my failure to notice updating. That doesn't feel likely, but I've seen my mother do it, saying "of course" instead of "oops". It could let me update, which is good, but it wouldn't get rid of the "I've been right all along" feeling, which is bad.)

Can you elaborate on what you mean when you say you regard anyone who isn't Bayesian as moronic? I'm not sure what it means to "be Bayesian".

4BenLowell
Here is an article written for you! What is Bayesianism? My personal struggle is where this differs from "clear-headedness". I think that much of this website is geared towards helping us get closer to the ideal Bayesian, though the connections are not mentioned specifically. Can anyone give an example of where they explicitly used Bayesian reasoning? It makes sense that it is right, but... unlike other things on this website, it doesn't readily transfer into skills or habits. My guess is that having a deeper understanding of Bayesian probability would help with understanding what evidence is and how much confidence should be placed in what. A separate confusion of mine: in Eliezer's explanation of Bayes' theorem, I was able to do the math problems correctly, and so I didn't make whatever the usual mistake was. Because of this, I have knowledge of the right way to solve probability problems (at least if I spend a long time thinking about them), but I never went down the wrong path and got slapped by having an Incorrect Answer. That doesn't mean I won't notice a mistake, but I think that learning things the wrong way helps you understand why they are wrong later. So my confusion is that I am never very confident as to whether I am doing things the "Bayesian way" or not. I've found that the Law of Conservation of Expected Evidence has been the most helpful in understanding the consequences of Bayesian reasoning, beyond solving math problems. Edited for clarity.
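For concreteness, here is a minimal sketch (ours, not from the thread) of the kind of explicit calculation being asked about, using the stock numbers from the mammography problem in Eliezer's explanation: a 1% base rate, an 80% true-positive rate, and a 9.6% false-positive rate. The helper function and its name are purely illustrative.

```python
# Bayes' theorem on the classic mammography numbers:
# P(H) = 0.01, P(E|H) = 0.80, P(E|~H) = 0.096.

def posterior(prior, likelihood, false_positive_rate):
    """P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

p = posterior(prior=0.01, likelihood=0.80, false_positive_rate=0.096)
print(f"P(cancer | positive test) = {p:.3f}")  # ~0.078
```

The point of the exercise is that the posterior, roughly 7.8%, is nowhere near the 80% figure most people intuitively report.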
1Kutta
My awareness of Bayesian reasoning doesn't quite enable me to use it explicitly with success most of the time, or maybe the successes are not vivid and spectacular enough to be noticed, but it does make me aware of Bayes-stupid inferences committed by me and others. Just yesterday my father proclaimed that a certain beggar who tends to frequent our street with a kid or two and claims to be homeless is a liar, because, well, he's not homeless, since he is also often seen in the company of drunkards, and he probably drags around the kids for show and they aren't even his. I asked my dad whether the beggar's claim of homelessness makes him more or less likely to be homeless. He said less likely, but after that he denied that the beggar's failure to claim so would make him more likely to be homeless.
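A minimal sketch of the constraint at work in this anecdote, the Law of Conservation of Expected Evidence: the prior must equal the expectation of the posterior, so if hearing the claim lowers P(homeless), not hearing it must raise it. All numbers below are made up for illustration.

```python
# Conservation of expected evidence: P(H) = P(H|E)P(E) + P(H|~E)P(~E).
# Illustrative numbers only; none come from the anecdote itself.

p_h = 0.30           # prior P(homeless)
p_e = 0.90           # P(beggar claims to be homeless)
p_h_given_e = 0.25   # the father's view: the claim LOWERS P(homeless)

# Solve the identity above for P(H | no claim):
p_h_given_not_e = (p_h - p_h_given_e * p_e) / (1 - p_e)
print(p_h_given_not_e)  # 0.75: silence would have to RAISE P(homeless)
```

One cannot coherently treat both the claim and its absence as evidence against homelessness.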
0Emile
I'm not sure I understand - why would he deny that the beggar's failure to claim so would make him less likely to be homeless? I have trouble imagining how the conversation you're describing went.
0Kutta
Uh, I mixed up a less likely and a more likely. Corrected.
0Emile
In that case: ... the first bit should probably be "He said less likely", in which case what you say makes much more sense.
0jsteinhardt
I personally feel like a deeper understanding of Bayesian probability has mainly just helped me to formalize things that are already obvious (the goal being to replicate what is obvious to humans in a computer, e.g. computer vision, robotics, AI, etc.). There have been few instances where it has actually helped me weigh evidence more effectively. But maybe I am missing some set of practical techniques. Also, I was unable to parse the final paragraph that you wrote, would you mind re-stating it?
0loup-vaillant
I basically mean using probability theory when you deal with your own beliefs, with the understanding that you only have partial (and flawed) information about the world. Understanding what evidence is, and what counts as evidence to you (that last one depends on the relationship between your prior knowledge and the piece of evidence you look at). And most of all, understanding that Occam's Razor (or Solomonoff induction / Kolmogorov complexity) isn't just a fancy trick to force atheism and many-worlds down people's throats. That said, my knowledge still feels flaky. I may be a bit under-educated by my own standards.
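For readers who haven't met the formal version being alluded to here, a rough sketch: in Solomonoff induction a hypothesis h receives prior weight from every program p, on a fixed universal machine U, that outputs it, so shorter descriptions dominate and Occam's Razor falls out of the prior rather than being bolted on.

```latex
% Solomonoff prior: each program p that outputs h contributes 2^{-|p|},
% so M(h) is dominated by the shortest such program, of length K(h).
M(h) = \sum_{p \,:\, U(p) = h} 2^{-|p|} \approx 2^{-K(h)}
```

Here K(h) is the Kolmogorov complexity of h: the length of the shortest program that produces it.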
7jsteinhardt
What does it mean to "use probability theory to deal with your beliefs"? How do you use probability, and how does it change your conclusions?
1Will_Newsome
This question might be worth a discussion post. I constantly use visuospatial and kinesthetic qualia when thinking, which to a non-negligible extent draw on intuitions begotten from understanding the basic concepts of algorithmic probability theory and its relations — information theory, probability theory, computer science, and statistical mechanics. That said, I almost never pull out pen and paper, and when I do pull out pen and paper it's to help structure my Fermi calculations, not to plug numbers into Bayes' theorem. It seems obvious to me both that there are large benefits to having Bayes-influenced intuitions firing all the time and also that there are few benefits to even remembering how to actually write out Bayes' theorem. Edited to separate the following trivial factoid from above less trivial factoids: (Formal use of Bayes is pretty popular among — and abused by — Christian apologists. User:lukeprog would know more about that though.)
1Dorikka
This doesn't seem to belong here. My guess is that you're just inserting a fact for general knowledge because you found it interesting, but it looks like an argument of the form "X does Y, X tends to exhibit low levels of rationality, so don't do Y", which is fallacious. I might remove it for its mind-killing potential.
-1Will_Newsome
I praise your right view, and will edit my comment accordingly.
1jsteinhardt
I would be interested in reading such a post (it seems like it might even be worth a top-level post depending on how much you have to say).
0NancyLebovitz
Could you give an example of using visuospatial and kinesthetic qualia when thinking?
0Will_Newsome
Long-winded reply: I think it's not uncommon for folk to have kinesthetically experienced conceptual aesthetics or decision-making processes. "That doesn't feel quite right" is commonly heard, as is the somewhat-ambiguous "Sorry, I just don't feel like going out tonight". Anyhow, others' apparent confidence in seemingly inelegant ontologies very distinctly activates a lot of my thinking qualia. For example, if I hear a person resignedly accepting a uniform prior over vaguely defined objects in the mathematical universe hypothesis. The really fundamental feeling there is that it just doesn't fit... I picture it in my head as an ocean of improper prior-ness flooding the Earth because some stupid primordial being didn't have enough philosophical aesthetics to realize that the mathematics shouldn't look like that, objectively speaking. And it feels... it just feels wrong. Often an idea or a hypothesis feels grinding, and sometimes it feels awkward, but most of the time things just feel not-right, inharmonious, off-kilter, dukkha. Sorry, very little sleep this week, not particularly coherent. Edit: Not sure if this matters at all, but I think that I wouldn't be able to do clear timeful/timeless reasoning if I didn't have access to those intuitions. I also doubt that I could grok the concepts of statistical mechanics. That said, I really don't understand things like algebra or geometry... it must be something to do with implicit movement; static things just don't work. (Edit: Mixtures and measures, logarithms, symmetric limits, proportionality, physical dimensionality, raw stuff of creation, creation self-similarity, causal fluid, causal structure... it's like crack.) I think that's why I love ambient/timeless control so much; it lets me think about Platonic objects using my flow-structure intuitions, which is cool 'cuz the Forms are so metaphysically appealing. I'm getting an fMRI and doing a whole bunch of cognitive tests soon; maybe that'll give a hint.

Your experience is interesting. I find that while I have started looking on more things as more insane than before, it has made me less argumentative and more tolerant. My thought process is something like, "So many people are so wrong on so much stuff that my trying to help them usually won't make a difference. They never had a chance to become right because they were exposed to all the wrong memes. So I'll stop trying to improve other people's thinking except where I think it might actually work, and then I'll be nice so that I don't lose a rare chance to help somebody." The eventual effect is that I see more of the irrationality around me, but feel less need to do anything about it.

I don't know if trying to emulate my reaction sounds like something you'd want, and I'm even less confident that it will work, but I find that just seeing how other people deal with something can give me ideas about how I should do so.

What should I do?

Step 1: Stop being frustrated with them for not knowing/understanding. Instead, direct your frustration at yourself for not successfully communicating.

Step 2: Come to know that the reason for your failure to communicate is not a lack of mastery over your own arguments. It is a lack of understanding of their arguments. Cultivate the skill of listening. Ask which school of martial arts presents the best metaphor for your current disputation habits. Which school best matches the kind of disputation you would like to achieve?

Step 3: ...

It may very well be that this intolerance of yours has nothing to do with this site. You would have become intolerant anyway, only towards a slightly different set of beliefs.

0loup-vaillant
Quite possible. Another possibility is that I have at last found a tribe I can identify with. Also, reading the sequences didn't trigger updating; I either learned or readily agreed. That may have spoiled me a little.

You can't expect to win a singlehanded fight to protect the entire world from its own stupidity. You need to choose your battles.

Against stupidity, the gods themselves contend in vain.

-- Friedrich Schiller

I know I've believed some pretty stupid stuff that only seemed dumb in retrospect. I've found that keeping this in mind helps one be more tolerant of people believing in stupid things. Would you be intolerant of yourself from two years ago or five years ago? If you had a time machine, how would you treat your past self?

7Vladimir_Nesov
This is at best an intuition transfer tool, given without a reason that justifies its use, and one that happens not to apply to myself. I would easily bite the bullet and say that Nesov_2008 was a crackpot and Nesov_2006 was a shallow naive fool. There seems to be no common reason to tolerate interaction with either of them beyond necessity. (A desire to help, to make better, is a different question. It doesn't need to involve valuing the damaged condition as better than it is.)
3Perplexed
Ah. But would you make the obvious predictions about the opinion Nesov2013 and Nesov2015 will have regarding Nesov2011?
4Vladimir_Nesov
You are being overly cryptic (obvious predictions?). Judgments like this are not relative. I don't think I'm currently in anywhere close to that much trouble, and haven't been since about summer 2010 (2009 was so-so, since I held the known-to-be-confused hypotheses I entertained then at an unjustified level of confidence, and was too quick to make cryptic statements that weren't justified by much more detailed thoughts). I'm currently confused about some important questions in decision theory, but that's the state of my research, and I understand the scope of my confusion well enough.
3Perplexed
My implicit point was this: Nesov2006 probably did not realize that Nesov2006 was a fool, and Nesov2008 probably did not judge himself to be a crackpot. Therefore, a naive extrapolation ("obvious prediction") suggests that Nesov2011 has some cognitive flaws which he doesn't yet recognize; he will recognize them, though, in a few years. JoshuaZ, as I understood him, was suggesting that one improves one's tolerance by enlarging one's prior for the hypothesis that one is currently flawed oneself. He suggests the time-machine thought experiment as a way of doing so. You, as I understood you, claimed that JoshuaZ's self-hack doesn't work on you. I'm still puzzled as to why not.
1Vladimir_Nesov
To a significant extent, both would agree with my judgment. The problem was not so much inability to see the presence of a problem, they just didn't know its nature in enough detail to figure out how to get better. So the situation is not symmetric in the way you thought. See The Modesty Argument for a discussion of why I won't believe myself crazy just because there are all those crazy people around who don't believe themselves crazy.

Perhaps you should solidify in your mind whether you think it's a good thing or a bad thing on net. Come up with ways in which it could be a good thing and ways in which it could be a bad thing. One particular way in which it could be a bad thing is that you dramatically underestimate inferential distance, so it's much harder to actually change people's minds than it feels (there's a reason the sequences are long; those had more design time go into them than whatever you come up with on the fly). This means that if there are any social drawbacks to arguing with people, they can easily outweigh the benefits of improving thought.

1torekp
I'd like to echo jsalvatier's first point, and add one plea in "favor" of intolerance. Namely, tolerate your intolerance. What this means in practice is roughly, instead of thinking "I'm acting/feeling intolerant - I'm a bad person," try "I'm acting/feeling intolerant - let me note the context, and the results so far. Let me think about what to do next." Try some of the alternative, more-tolerant responses suggested by LWers, and note their results too. Keep separate in your mind your thoughts versus emotions versus behavior. You can have intolerant thoughts and tolerant behavior, which might (or might not) give you all the benefits you seek from tolerance. Emotions are sort of a middle ground, since they tend to be harder to keep private, but are often less salient to others than your behavior.

Would it help to do a cost-benefit analysis of being more tolerant vs. the status quo? I've found that the amount of enlightenment that I can give certain people is small enough that I lose more utility through the emotional impact of the argument than I gain through giving them knowledge.

I now tend to regard anyone who isn't Bayesian as either uneducated or moronic. Same thing about materialist reductionism, only with a slightly lower confidence.

To be blunt, that is a bit strange. In my experience you are far more likely to find a materialist reductionist than you are to find a Bayesian. This leads me to think that you might be too withdrawn, which might be causing you to have a poor model of what ideas other people are likely to hold, and adding to your general sense of misanthropy. Of course, I'm generalizing from only a few examples (I ...

0loup-vaillant
My slightly lower confidence doesn't flow from popularity, but from the fact that Bayesianism is more meta than materialist reductionism. Along with the current state of the art in science, it causes my belief in a reductionistic world. Without it, I would be allowed to believe in souls. But it would be harder to abandon Bayesianism if I discovered that we do have immaterial souls. I currently work at a programming shop. I intend to do a thesis soon. I live in France, far from Paris (or I promise I would have gone to that meetup not long ago).
0Zetetic
I suppose I thought it was strange because I was a reductionist long before I knew about Bayesianism; I've always had an interest in science and I always gave scientific theories precedence (though when I was very young this was more out of my recognition that science had the authority on truth than a rational dissection of epistemology). I read A.J. Ayer and Karl Popper before I read Jaynes (unfortunately; I really wish that it had been Jaynes I read first). I'm still an undergraduate and I live in the U.S., so I'm afraid that I can offer little in the way of insight. I could perhaps share my experience, though: I can tell you that I often have similar feelings. I do not live near any meetups and none of my friends share any interest in mathematics, the sciences or rationality. I do have one friend who is very intellectual, but he's a soft-science type who, again, doesn't share any of my specific interests.
[-]djcb20

Suppose you hear someone stating that yesterday it was 7 °C and today it is 14 °C, so it's "twice as warm".

When I hear that, I cringe a bit, but these days (older, wiser, milder) I think the better thing to do is just to smile lightly or something. The "higher-status behavior" is not always to try to "score", but instead to ignore it, unless there is some direct negative effect.
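For what it's worth, the arithmetic behind the cringe: Celsius is an interval scale, so ratios are only meaningful after converting to an absolute scale such as kelvin.

```latex
\frac{T_2}{T_1} = \frac{14 + 273.15}{7 + 273.15}
                = \frac{287.15\ \mathrm{K}}{280.15\ \mathrm{K}} \approx 1.025
```

So the warmer day is about 2.5% warmer in absolute terms, not twice as warm.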

I know exactly how you feel.

As far as I'm concerned, recognizing that I could be that completely oblivious and ignorant person, had I been subjected to a different personal experience from my current one, helps a lot in not thinking significantly less of them.

Actually, I once was that ignorant person. So I try to imagine how someone would have needed to talk to me, in order to convince me of something, back when I was an ignorant superstitious fool myself. It's not easy, that's for sure. Try to thoroughly imagine how you would talk to someone whom you love and re...

Another approach is to contemplate the various virtues that people can have, and consider their relative importance. You might need to do this as a sort of regular meditation.

As an off-the-cuff exercise, how would you sort by importance: rationality, creativity, knowledge, diligence, empathy^1, kindness, honor, and generosity^2? Does how you act correspond to how you answer? If not, make a practice of reminding yourself.

You may also find it useful to enumerate the virtues of the specific people who are annoying you. If you cannot think of any, stop associatin...

I know this sounds snarky, but it's serious: Are you married?

Ideally a life partner will share many of your values, but no two people share all values, and you'll need to respect the ones that differ. (Even if you're both Bayesian, in areas where you have different values/axioms you will not necessarily agree.)

0loup-vaillant
I live with my SO. As far as I can tell, she hasn't completely abandoned belief in belief. She also doesn't seem to accept Occam's Razor (seemingly because it "doesn't interest" her), and uses that to reject many-worlds. Or maybe many-worlds sounds absurd to her, and she only rejects Occam's Razor by contraposition. Anyway, all this has been a source of significant tension, which has now subsided (I hope). The factual disagreements remain, though. Lesson learned: "Thou wilt not convince everyone." As for our values, I haven't noticed any significant divergence yet.

I wonder if it's not a problem with compartmentalization? In many contexts, these issues about truth needn't be in the forefront. In contexts where issues about truth are at the forefront (wherever people are having intellectual discussions or making decisions), it is often more contextually appropriate to be argumentative.

Maybe your concern about intolerance is a warning that you need more interactions of the former type for a better balance in your life. That is, interactions that are social and comfortable and bring back your sense of humor and camaraderie towards other people.

I think I can relate quite a bit. It is absolutely infuriating when someone doesn't even care to try to be rational. I am always having to explain to people why I care about what is true; to me, the question has become like nails on a chalkboard. The thing that has helped me mildly is remembering that most people do not have any education in what it means to be rational; they have not even been introduced to the concept (other than Hollywood rationality, which is almost as irritating). I also remember that at one time I was kind of like them, which makes me tend to try to educate them about it (although I think of it as being a teacher/mentor).

Find some really intolerant people to hang out with. Objectivists would be -f-o-o-d- good. (But that was an interesting idea for a while).

3Dorikka
I'm failing to parse your comment.
6Normal_Anomaly
I think Peterdjones means that Objectivists would be good people for Loup-Vaillant to hang out with, to teach him what it feels like to be subjected to obnoxious argumentation and make him realize on a gut level that it doesn't help and causes needless unhappiness. I suspect from personal experience that it would backfire — I tend to act more like the people I hang out with, and I was a lot more obnoxious when I spent time on Pharyngula before coming here — but YMMV. At least, that makes more sense to me than Peterdjones actually wanting Loup-Vaillant to eat Objectivists.
0NancyLebovitz
I interpret it as "Objectivists would be food" (food is struck out to indicate a mixture of humor and hostility) — a good opportunity to dump anger and feel like you're winning arguments, or should be.
-5SarahNibs