John_Maxwell_IV comments on A Parable On Obsolete Ideologies - Less Wrong

113 Post author: Yvain 13 May 2009 10:51PM

Comments (272)

Comment author: cousin_it 15 May 2009 10:47:55AM *  11 points [-]

"Don't matter" sounds to me like a cop-out akin to religion's "retreat to faith". A lot of evidence for absence of systematic variation in ability gets noticed and promoted by SWPL followers, indicating the question's importance to the belief system.

Comment author: John_Maxwell_IV 15 May 2009 02:48:27PM 5 points [-]

If all of science agreed that members of one race really were slightly smarter on average than members of another race, that could be extremely demoralizing for the people of the slightly-dumber race. See stereotype threat. This seems like a high price to pay to eliminate a popular falsehood--I imagine there are other falsehoods that would be easier to eliminate for a smaller price. And unlike religion, there's no long-term benefit associated with its removal.

Comment author: CaveJohnson 17 December 2011 11:29:20PM 7 points [-]

See stereotype threat.

Maybe. Maybe not.

Comment author: thomblake 15 May 2009 04:54:07PM 16 points [-]

It's mentioned below to some extent, but if you start with the assumption that there's no systematic variation between groups, then if you see a statistical difference in how groups are treated, that's evidence of a systematic bias. If it turns out that there is systematic variation between groups, the data can be explained without the bias. It's the difference between "Harvard is racist" and "Harvard accepts bright people regardless of race".

Comment author: John_Maxwell_IV 17 May 2009 03:19:25PM *  1 point [-]

Harvard has two responses to these claims: they can continue getting called racist, or they can unfairly admit minorities who don't deserve to get in. Both of these alternatives are preferable to telling a large group with normal human psychology that they are inferior.

Edit: I tend to think it's OK for Harvard to admit too many minorities because I think having the upper class be made up of people from a variety of backgrounds is valuable. But I could see how this could be a problem in other domains, such as cousin_it's mortgage example.

Comment author: [deleted] 22 February 2011 12:19:13AM *  9 points [-]

You do realize the top universities (and especially the Ivy Leagues) systematically discriminate against Asian students, right?

It's really hard to fight or talk about something like that if one can't just point out that East Asians have higher average IQs than, say, Europeans.

Comment author: cousin_it 15 May 2009 02:57:29PM *  3 points [-]

Science makes a lot of vastly more demoralizing conclusions like Darwinism or the possibility of nuclear weapons. If you really believe what you say you believe, you should focus on debunking those first.

Comment author: hrishimittal 15 May 2009 03:13:52PM 5 points [-]

there's no long-term benefit associated with its removal.

The first step in solving a problem is to recognise it. If I discovered I have cancer I would be demoralised immensely but I'd prefer that and take a shot at recovering rather than die unknowingly.

Denial is not a path to improvement.

Comment author: Z_M_Davis 16 May 2009 06:40:48AM 7 points [-]

Strongly seconded. I speak from experience: when evidence starts mounting for some horrible, nightmarish proposition that you're scared of, it is tempting to tell yourself that even if it were true, it wouldn't really matter, that there would be no benefit to acknowledging it, that you can just go on acting as you've always done, as if nothing's changed. But on your honor as an aspiring rationalist, you must face the pain directly. When you get a hint that this world is not what you thought it was, that you are not what you thought you were--look! And update!---no matter how much it hurts, no matter how much your heart may cry for the memory of the world you thought this was. Do it selfishly, in the name of the world you thought you knew: because once you have updated, once you see this hellish wasteland for what it really is, then you can start to try to patch up what few things you can.

Suppose you really don't like gender roles, and you're quietly worried about something you read about evolutionary psychology. Brushing it all under the rug won't help. Investigate, learn all you can, and then do something. Maybe something drastic, maybe something trivial, but something. Experiment with hormones! Donate a twenty to GenderPAC! Use your initials in your byline! But something, anything other than defaulting to ignorance and letting things take their natural course.

Comment author: John_Maxwell_IV 17 May 2009 03:31:13PM 0 points [-]

As soon as you start talking about your "honor as an aspiring rationalist", you're moving from the realm of rationality to ideology.

Like I said, I don't think this question matters and I'm mostly indifferent to what the answer actually is. I'm just trying to protect the people who do care.

Comment author: Z_M_Davis 18 May 2009 02:07:00AM 15 points [-]

As soon as you start talking about your "honor as an aspiring rationalist", you're moving from the realm of rationality to ideology.

Well, sure, but the ideological stance is "You should care about rationality." I should think that that's one of the most general and least objectionable ideologies there is.

Like I said, I don't think this question matters and I'm mostly indifferent to what the answer actually is. I'm just trying to protect the people who do care.

But I do care, and I no longer want to be protected from the actual answer. When I say that I speak from experience, it's really true. There's a reason that this issue has me banging out dramatic, gushy, italics-laden paragraphs on the terrible but necessary and righteous burden of relinquishing your cherished beliefs---unlike in the case of, say, theism, in which I'm more inclined to just say, "Yeah, so there's no God; get over it"--although I should probably be more sympathetic.

So, why does it matter? Why can't we just treat the issue with benign neglect, think of ourselves as strictly as individuals, and treat other people strictly as individuals? It is such a beautiful ideal--that my works and words should be taken to reflect only on myself alone, and that the words and works of other people born to a similar form should not be taken to reflect on me. It's a beautiful ideal, and it seems like it should be possible to swear our loyalty to the general spirit of this ideal, while still recognizing that---

In this world, it's not that simple. In a state of incomplete information (and it is not at all clear to me what it would even mean to have complete information), you have to make probabilistic inferences based on what evidence you do have, and to the extent that there are systematic patterns of cognitive sex and race differences, people are going to update their opinions of others based on sex and race. You can profess that you're not interested in these questions, that you don't know---but just the same, when you see someone acting against type, you're probably going to notice this as unusual, even if you don't explicitly mention it to yourself.

There are those who argue--as I used to argue--that this business about incomplete information, while technically true, is irrelevant for practical purposes, that it's easy to acquire specific information about an individual, which screens off any prior information based on sex and race. And of course it's true, and a good point, and an important point to bear in mind, especially for someone who comes to this issue with antiegalitarian biases, rather than the egalitarian biases that I did. But for someone with initial egalitarian biases, it's important not to use it---as I think I used to use it---as some kind of point scored for the individualist/egalitarian side. Complex empirical questions do not have sides. And to the extent that this is not an empirical issue; to the extent that it's about morality---then there are no points to score.

It gets worse---you don't even have anywhere near complete information about yourself. People form egregiously false beliefs about themselves all the time. If you're not ridiculously careful, it's easy to spend your entire life believing that you have an immortal soul, or free will, or that the fate of the light cone depends solely on you and your genius AI project. So information about human nature in general can be useful even on a personal level: it can give you information about yourself that you would never have gotten from mere introspection and naive observation. I know from my readings that if I'm male, I'm more likely to have a heart attack and less likely to get breast cancer than would be the case if I were female, whereas this would not at all be obvious if I didn't read. Why should this be true of physiology, but not psychology? If it turns out that women and men have different brain designs, and I don't have particularly strong evidence that I'm an extreme genetic or developmental anomaly, then I should update my beliefs about myself based on this information, even if it isn't at all obvious from the inside, and even though the fact may offend me and make me want to cry. For someone with a lot of scientific literacy but not as much rationality skill, the inside view is seductive. It's tempting to cry out, "Sure, maybe ordinary men are such-and-this, and normal women are such-and-that, but not me; I'm different, I'm special, I'm an exception; I'm a gloriously androgynous creature of pure information!" But if you actually want to achieve your ideal (like becoming a gloriously androgynous creature of pure information), rather than just having a human's delusion of it, you need to form accurate beliefs about just how far this world is from the ideal, because only true knowledge can help you actively shape reality.

It could very well be that information about human differences could have all sorts of terrible effects if widely or selectively disseminated. Who knows what the masses will do? I must confess that I am often tempted to say that I have no interest in such political questions--that I don't know, that it doesn't matter to me. This attitude probably is not satisfactory for the same sorts of reasons I've listed above. (How does the line go? "You might not care about politics, but politics cares about you"?) But for now, on a collective or political or institutional level, I really don't know: maybe ignorance is bliss. But for the individual aspiring rationalist, the correct course of action is unambiguous: it's better to know than not to know; it's better to make decisions explicitly and with reason than to let your subconscious decide for you and for things to take their natural course.

Comment author: steven0461 18 May 2009 09:22:06AM *  16 points [-]

that it's easy to acquire specific information about an individual, which screens off any prior information based on sex and race

I think people may be overapplying the concept of "screening off", though?

If for example RACE -> INTELLIGENCE -> TEST SCORE, then knowing someone's test score does not screen off race for the purpose of predicting intelligence (unless I'm very confused). Knowing test score should still make knowing race less useful, but not because of screening off.

On the other hand, if for example GENDER -> PERSONALITY TRAITS -> RATIONALIST TENDENCIES, then knowing someone's personality traits does screen off gender for the purpose of predicting rationalist tendencies.
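The distinction above can be checked numerically. The sketch below is a hypothetical linear-Gaussian simulation (all variable names and coefficients are made up for illustration, not real data): in a chain PRIOR → TARGET → OBSERVATION, conditioning on the downstream observation cannot block the direct edge into the target, so the prior variable stays informative; but when the observation *is* the middle node, conditioning on it d-separates the prior from the downstream variable.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def partial_corr(x, y, z):
    """Correlation of x and y after linearly regressing z out of both."""
    zc = np.column_stack([z, np.ones_like(z)])
    rx = x - zc @ np.linalg.lstsq(zc, x, rcond=None)[0]
    ry = y - zc @ np.linalg.lstsq(zc, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Chain 1: GROUP -> TRAIT -> SCORE, predicting TRAIT.
# The target sits between the prior variable and the observation,
# so conditioning on SCORE does not screen off GROUP.
group = rng.normal(size=n)
trait = 0.5 * group + rng.normal(size=n)
score = 0.8 * trait + rng.normal(size=n)
print(partial_corr(group, trait, score))   # remains noticeably nonzero

# Chain 2: GROUP -> TRAIT -> OUTCOME, predicting OUTCOME.
# Now the observation is the middle node, so conditioning on TRAIT
# d-separates GROUP from OUTCOME.
outcome = 0.8 * trait + rng.normal(size=n)
print(partial_corr(group, outcome, trait))  # approximately zero
```

This matches steven0461's phrasing: in the first chain, knowing the test score makes knowing race *less useful* (the partial correlation is smaller than the raw one) without screening it off entirely.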

Comment author: conchis 18 May 2009 10:42:47AM *  5 points [-]

I agree with the overall thrust of this, but I do have some specific reservations:

It's tempting to cry out, "Sure, maybe ordinary men are such-and-this, and normal women are such-and-that, but not me; I'm different, I'm special, I'm an exception."

The temptation is real, and it's a temptation we should be wary of. But it's also important to realize that the more variation there is within the groups, the more reasonable it may be to suppose that we are special (of course, we should still have some evidence for this, but the point is that the more within-group variation there is, the weaker that evidence needs to be). It's difficult to know what to do with information about average differences between groups unless one also knows something about within-group variation.
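A rough numerical illustration of this point (the gap sizes below are made up for the sketch, not estimates of any real group difference): when within-group spread dwarfs the gap between group means, group membership tells you very little about any given individual.

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

for gap_in_sds in (0.1, 0.3, 1.0):
    # Chance that a random member of the lower-mean group outscores a
    # random member of the higher-mean group, assuming both groups are
    # unit-variance normal (their difference then has sd sqrt(2)).
    p = norm_cdf(-gap_in_sds / sqrt(2.0))
    print(f"gap = {gap_in_sds:.1f} sd -> lower-group member wins {p:.0%} of the time")
```

With a mean gap of a tenth of a standard deviation, the "lower" group member wins nearly half the time; even a full standard deviation of difference leaves the outcome far from certain, which is conchis's point about needing within-group variation to interpret average differences.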

But for the individual aspiring rationalist, the correct course of action is unambiguous: it's better to know than not to know; it's better to make decisions explicitly and with reason than to let your subconscious decide for you and for things to take their natural course.

Again, I'm very sympathetic to this view, but I think you're overselling the case. If (correctly) believing your group performs poorly causes you to perform worse, then there's a tradeoff between that and the benefits of accurate knowledge. Maybe accurate knowledge is still best, all things considered, but that's not obvious to me.

Beyond the context of this specific debate, the theory of the second-best shows that incremental movements in the direction of perfect rationality will not always improve decision-making. And there's a lot of evidence that "subconscious" decision-making outperforms more explicit reasoning in some situations. Jonah Lehrer's How We Decide, and pretty much all of Gerd Gigerenzer's books make this argument (in addition to the more anecdotal account in Malcolm Gladwell's Blink). I tend to think they oversell the case for intuition somewhat, but it's still pretty clear that blanket claims about the superiority of more information and conscious deliberation are false. The really important question is how we can best make use of the strengths of different decision-strategies.

Comment author: John_Maxwell_IV 18 May 2009 03:22:30AM 1 point [-]

One day, I started looking at the people around me, and I began to realize how much they looked like apes. I quickly stopped doing this, because I feared it would cause me to treat them with contempt. And you know what? Doublethink worked. I didn't start treating people with more contempt as a result of purposely avoiding certain knowledge that I knew would cause me to treat people with more contempt.

If you think that my efforts to suppress observations relating the appearance of humans with the appearance of apes were poorly founded, then you have a very instrumentally irrational tendency towards epistemic rationality.

Comment author: CronoDAS 30 November 2011 08:47:34AM 9 points [-]

Maybe the lesson to draw is that you don't respect apes enough?

Comment author: Z_M_Davis 18 May 2009 03:52:08AM 6 points [-]

[...] then you have a very instrumentally irrational tendency towards epistemic rationality.

Guilty as charged! I don't want to win by means of doublethink---it sounds like the sort of thing an ape would do. A gloriously androgynous creature of pure information wins cleanly or not at all. (I think I'm joking somewhat, but my probability distribution on to what extent is very wide.)

Comment author: Strange7 18 December 2011 05:40:30AM 1 point [-]

At some point you will have limited resources, and you will need to decide how much that preference for winning cleanly is worth. How much risk you're willing to take of not winning at all, in exchange for whatever victory you might still get being a clean one.

For example, say there are two people you love dangling off a cliff. You could grab one of them and pull them to safety immediately, but in doing so there's some chance you could destabilize the loose rocks and cause the other to fall to certain death.

The gloriously androgynous creature of pure information (henceforth GACoPI) that you wish you were cannot prioritize one love over the other in a timely fashion, and will simply babble ineffectual encouragement to both until professional rescuers arrive, during which time there is some chance that somebody's arms will get tired or the cliff will shift due to other factors, killing them both. An ape's understanding of factional politics, on the other hand, has no particular difficulty judging one potential ally or mate as more valuable, and then rigging up a quick coin-flip or something to save face.

Comment author: [deleted] 18 December 2011 08:41:08AM 0 points [-]

The gloriously androgynous creature of pure information (henceforth GACoPI) that you wish you were cannot prioritize one love over the other in a timely fashion, and will simply babble ineffectual encouragement to both until professional rescuers arrive, during which time there is some chance that somebody's arms will get tired or the cliff will shift due to other factors, killing them both.

The grandparent uses "winning cleanly" to mean winning without resorting to doublethink. To go from that to assuming that a GACoPI is unable to enact your proposed ape solution or come up with some other courses of action than standing around like an idiot smacks of Straw Vulcanism (WARNING: TVTropes).

Comment author: MarkusRamikin 30 November 2011 09:15:29AM *  5 points [-]

Wouldn't your efforts be better directed at clearing up whatever confusion leads you to react with contempt to the similarity to apes?

I can maybe see myself selling out epistemic rationality for an instrumental advantage in some extreme circumstance, but I find abhorrent the idea of selling it so cheaply. It seems to me a rationalist should place a higher value on their ability to see reality, not give it up at the first sign of inconvenience.

Even on instrumental grounds: just as theoretical mathematics tends to end up having initially unforeseen practical applications, giving up on epistemic rationality carries the potential of unforeseen instrumental disadvantage.

Comment author: John_Maxwell_IV 01 December 2011 07:33:55AM 0 points [-]

Good point.

Comment author: Multiheaded 29 March 2012 08:08:11PM *  1 point [-]

If it turns out that women and men have different brain designs, and I don't have particularly strong evidence that I'm an extreme genetic or developmental anomaly, then I should update my beliefs about myself based on this information, even if it isn't at all obvious from the inside, and even though the fact may offend me and make me want to cry.

Might be just mind projection on my part, but it seems to me that in those accursed cases, where the various aspects of an "egalitarian" way of thought - values, moral intuitions, ethical rules, actual beliefs about the world, and beliefs about one's beliefs - all get conflated, and the entire system (silly for an AI, but identity-forming for lots of current humans) perceives some statement of fact as a challenge to its entire existence... well, the LW crowd at least would pride themselves on not being personally offended.

If tomorrow it were revealed with high certainty that us Slavs are genetically predisposed against some habits that happen to be crucial for civilized living and running a state nicely, I'd most definitely try to take it in stride. But when something like this stuff is said, I tend to feel sick and uncertain; I don't see a purely consequentialist way out which would leave me ethically unscathed.

Ah, if only it were as easy as identifying the objective state of the world, then trying to act in accordance with (what you've settled on as) your terminal values.* But this would require both more rationality and more compartmentalization than I've seen in humans so far.

  • "You come to amidst the wreckage of your own making. Do you stay there, eyes squeezed shut, afraid to move, hoping to bleed to death? Or do you crawl out, help your loved ones, make sure the fire doesn’t spread, try to fix it?" - Max Payne 2: The Fall of Max Payne
Comment author: conchis 17 May 2009 03:48:43PM *  0 points [-]

Denial is not a path to improvement.

I agree that denial usually seems like a bad idea, but the problem with things like stereotype threat is that they suggest (and more importantly provide evidence) that sometimes it might actually be useful (a path to improvement, even if not necessarily the first-best path).

The trick, presumably, is to distinguish the situations when this will hold from those when it doesn't.

Comment author: John_Maxwell_IV 17 May 2009 03:21:42PM -1 points [-]

Yes, but there is a long-term benefit associated with the removal of your cancer. On the other hand, if you had a blemish on your shoulder, you'd be better off not noticing it.