John_Maxwell_IV comments on A Parable On Obsolete Ideologies - Less Wrong

Post author: Yvain | 13 May 2009 10:51PM | 113 points


Comments (272)

You are viewing a single comment's thread.

Comment author: Z_M_Davis 18 May 2009 02:07:00AM 15 points

As soon as you start talking about your "honor as an aspiring rationalist", you're moving from the realm of rationality to ideology.

Well, sure, but the ideological stance is "You should care about rationality." I should think that that's one of the most general and least objectionable ideologies there is.

Like I said, I don't think this question matters and I'm mostly indifferent to what the answer actually is. I'm just trying to protect the people who do care.

But I do care, and I no longer want to be protected from the actual answer. When I say that I speak from experience, it's really true. There's a reason that this issue has me banging out dramatic, gushy, italics-laden paragraphs on the terrible but necessary and righteous burden of relinquishing your cherished beliefs---unlike in the case of, say, theism, in which I'm more inclined to just say, "Yeah, so there's no God; get over it"--although I should probably be more sympathetic.

So, why does it matter? Why can't we just treat the issue with benign neglect, think of ourselves strictly as individuals, and treat other people strictly as individuals? It is such a beautiful ideal--that my works and words should be taken to reflect on myself alone, and that the words and works of other people born to a similar form should not be taken to reflect on me. It's a beautiful ideal, and it seems like it should be possible to swear our loyalty to the general spirit of this ideal, while still recognizing that---

In this world, it's not that simple. In a state of incomplete information (and it is not at all clear to me what it would even mean to have complete information), you have to make probabilistic inferences based on what evidence you do have, and to the extent that there are systematic patterns of cognitive sex and race differences, people are going to update their opinions of others based on sex and race. You can profess that you're not interested in these questions, that you don't know---but just the same, when you see someone acting against type, you're probably going to notice this as unusual, even if you don't explicitly mention it to yourself.

There are those who argue--as I used to argue--that this business about incomplete information, while technically true, is irrelevant for practical purposes, that it's easy to acquire specific information about an individual, which screens off any prior information based on sex and race. And of course it's true, and a good point, and an important point to bear in mind, especially for someone who comes to this issue with antiegalitarian biases, rather than the egalitarian biases that I had. But for someone with initial egalitarian biases, it's important not to use it---as I think I used to use it---as some kind of point scored for the individualist/egalitarian side. Complex empirical questions do not have sides. And to the extent that this is not an empirical issue; to the extent that it's about morality---then there are no points to score.
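The "screens off" claim here is a Bayesian one, and it holds exactly only when the individual evidence is perfectly informative; with noisy evidence, the group base rate still moves the posterior. A minimal sketch of this point, with made-up numbers (the base rates and test accuracies below are assumptions for illustration, not data):

```python
# Toy Bayesian model of "screening off": does individual evidence make
# a group base rate irrelevant? All numbers are invented for illustration.

def posterior(base_rate, p_pass_given_trait, p_pass_given_no_trait, passed=True):
    """P(trait | test result), starting from a group base rate (Bayes' rule)."""
    if passed:
        num = p_pass_given_trait * base_rate
        den = num + p_pass_given_no_trait * (1 - base_rate)
    else:
        num = (1 - p_pass_given_trait) * base_rate
        den = num + (1 - p_pass_given_no_trait) * (1 - base_rate)
    return num / den

# Noisy test (80% sensitivity, 30% false-positive rate): two groups with
# different base rates yield different posteriors after the same result,
# so the base rate is NOT fully screened off.
low = posterior(0.2, 0.8, 0.3)   # base rate 20%
high = posterior(0.5, 0.8, 0.3)  # base rate 50%
print(round(low, 3), round(high, 3))  # 0.4 0.727

# Perfectly informative test: the base rate no longer matters at all --
# the individual evidence fully screens it off.
print(posterior(0.2, 1.0, 0.0), posterior(0.5, 1.0, 0.0))  # 1.0 1.0
```

The gap between the two noisy-test posteriors is exactly the sense in which the screening-off argument is "technically true" yet incomplete: it becomes exact only in the limit of perfect individual information.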

It gets worse---you don't even have anywhere near complete information about yourself. People form egregiously false beliefs about themselves all the time. If you're not ridiculously careful, it's easy to spend your entire life believing that you have an immortal soul, or free will, or that the fate of the light cone depends solely on you and your genius AI project. So information about human nature in general can be useful even on a personal level: it can give you information about yourself that you would never have gotten from mere introspection and naive observation. I know from my readings that if I'm male, I'm more likely to have a heart attack and less likely to get breast cancer than would be the case if I were female, whereas this would not at all be obvious if I didn't read. Why should this be true of physiology, but not psychology? If it turns out that women and men have different brain designs, and I don't have particularly strong evidence that I'm an extreme genetic or developmental anomaly, then I should update my beliefs about myself based on this information, even if it isn't at all obvious from the inside, and even though the fact may offend me and make me want to cry. For someone with a lot of scientific literacy but not as much rationality skill, the inside view is seductive. It's tempting to cry out, "Sure, maybe ordinary men are such-and-this, and normal women are such-and-that, but not me; I'm different, I'm special, I'm an exception; I'm a gloriously androgynous creature of pure information!" But if you actually want to achieve your ideal (like becoming a gloriously androgynous creature of pure information), rather than just having a human's delusion of it, you need to form accurate beliefs about just how far this world is from the ideal, because only true knowledge can help you actively shape reality.

It could very well be that information about human differences could have all sorts of terrible effects if widely or selectively disseminated. Who knows what the masses will do? I must confess that I am often tempted to say that I have no interest in such political questions--that I don't know, that it doesn't matter to me. This attitude probably is not satisfactory for the same sorts of reasons I've listed above. (How does the line go? "You might not care about politics, but politics cares about you"?) But for now, on a collective or political or institutional level, I really don't know: maybe ignorance is bliss. But for the individual aspiring rationalist, the correct course of action is unambiguous: it's better to know than not to know; it's better to make decisions explicitly and with reason than to let your subconscious decide for you and let things take their natural course.

Comment author: John_Maxwell_IV 18 May 2009 03:22:30AM 1 point

One day, I started looking at the people around me, and I began to realize how much they looked like apes. I quickly stopped doing this, because I feared it would cause me to treat them with contempt. And you know what? Doublethink worked. I didn't start treating people with more contempt as a result of purposely avoiding certain knowledge that I knew would cause me to treat people with more contempt.

If you think that my efforts to suppress observations relating the appearance of humans with the appearance of apes were poorly founded, then you have a very instrumentally irrational tendency towards epistemic rationality.

Comment author: CronoDAS 30 November 2011 08:47:34AM 9 points

Maybe the lesson to draw is that you don't respect apes enough?

Comment author: Z_M_Davis 18 May 2009 03:52:08AM 6 points

[...] then you have a very instrumentally irrational tendency towards epistemic rationality.

Guilty as charged! I don't want to win by means of doublethink---it sounds like the sort of thing an ape would do. A gloriously androgynous creature of pure information wins cleanly or not at all. (I think I'm joking somewhat, but my probability distribution on to what extent is very wide.)

Comment author: Strange7 18 December 2011 05:40:30AM 1 point

At some point you will have limited resources, and you will need to decide how much that preference for winning cleanly is worth: how much risk you're willing to take of not winning at all, in exchange for whatever victory you might still get being a clean one.

For example, suppose two people you love are dangling off a cliff. You could grab one of them and pull them to safety immediately, but in doing so there's some chance you would destabilize the loose rocks and cause the other to fall to certain death.

The gloriously androgynous creature of pure information (henceforth GACoPI) that you wish you were cannot prioritize one love over the other in a timely fashion, and will simply babble ineffectual encouragement to both until professional rescuers arrive, during which time there is some chance that somebody's arms will get tired or the cliff will shift due to other factors, killing them both. An ape's understanding of factional politics, on the other hand, has no particular difficulty judging one potential ally or mate as more valuable, and then rigging up a quick coin-flip or something to save face.

Comment author: [deleted] 18 December 2011 08:41:08AM 0 points

The gloriously androgynous creature of pure information (henceforth GACoPI) that you wish you were cannot prioritize one love over the other in a timely fashion, and will simply babble ineffectual encouragement to both until professional rescuers arrive, during which time there is some chance that somebody's arms will get tired or the cliff will shift due to other factors, killing them both.

The grandparent uses "winning cleanly" to mean winning without resorting to doublethink. To go from that to assuming that a GACoPI is unable to enact your proposed ape solution, or to come up with any course of action other than standing around like an idiot, smacks of Straw Vulcanism (WARNING: TVTropes).

Comment author: MarkusRamikin 30 November 2011 09:15:29AM 5 points

Wouldn't your efforts be better directed at clearing up whatever confusion leads you to react with contempt to the similarity to apes?

I can maybe see myself selling out epistemic rationality for an instrumental advantage in some extreme circumstance, but I find abhorrent the idea of selling it so cheaply. It seems to me a rationalist should value their ability to see reality more highly, not give it up at the first sign of inconvenience.

Even on instrumental grounds. Just as theoretical mathematics tends to end up having initially unforeseen practical applications, giving up on epistemic rationality carries the potential for unforeseen instrumental disadvantage.

Comment author: John_Maxwell_IV 01 December 2011 07:33:55AM 0 points

Good point.