Strange7 comments on The Hostile Arguer - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
How do you deal with Professor Quirrell's version of the argument? It seems basically correct to me that people who are mostly rational will manage to learn from their experience things which the young would not guess. Do you think that the risk you're being lied to about their experience is what justifies ignoring this argument? I think that's often true, but not always. Perhaps the best answer is that people who are rational enough to justify your trusting their word over known arguments do not generally exist?
It's not that there aren't any people whose unsupported assertion is more trustworthy than an explicit, persuasive-sounding argument for the opposite side, though certainly individuals of such discernment and integrity are rare. The issue is that any person so reliable would also necessarily have enough underlying intelligence to be able to, in any situation not involving implausibly extreme levels of time pressure, construct a better (or at least more contextually specific) argument than "you'll understand when you're older." The only plausible explanation for being so vague is if they not only don't want to tell you, but are further trying not to provide enough keywords for you to look up the real reason yourself.
I'm slightly less cynical; I think they usually do in fact genuinely believe that you'll change your mind and agree with them many years later. The people I've seen this with tend not to be good at putting feelings into words.
By the way, I'd love to see someone steelman the experience argument (but am too lazy to do myself). Anyone up for it?
I'm not saying that someone making the "understand when you're older" argument is being dishonest. They might not even be incorrect. It's just that, if that's the best case they can come up with, even after thinking it over, you're probably better off making your decision on some basis other than their opinion.
In addition, experience should be transferable; in other words, if you think there's some "experience" to be had that would convince me I'm wrong about something, you should be able to convey the details of that experience (as well as why you think it would be convincing) to me directly. Quirrell's conversation with Hermione is an example:
Quirrell didn't just say, "Oh, you'll change your mind later when you experience life a bit more" (although he has done that to Harry sometimes); he actually laid out the details of the argument. That's not to say he was right, but at least his argument is a lot more convincing than just a naked claim that "you'll change your mind with experience".
In other words, any time someone makes the experience argument and is able to do so in a legitimate manner, he/she should also be able to give a fairly concise summary of the experience as well as why it should be convincing to his/her opponent. Ideally, then, no one should make the experience argument at all, because every time someone does so, either (a) the argument is illegitimate or (b) there's a better argument readily available that should be used instead. Because of this, if someone makes the experience argument to me without an accompanying summary, then it immediately tells me that he/she is just using it as a semantic stopsign and is not arguing in good faith. This in turn leads me to get out of the conversation rather quickly, if possible.
Not every experience has a concise summary.
If someone tells me that a complex mathematical proof comes to a certain conclusion, then I often have to decide between trusting their expertise and spending a few semesters studying math to build the underlying understanding needed to follow the proof myself.
Even in the case you described, it should be possible to lay out something to support the argument, even if it takes too long to cover the argument itself. Like, instead of just saying "I know this proof seems counterintuitive right now, but trust me, once you study a bit more, it'll all make sense," the person would do better to say, "Well, I know it sounds absurd that you'd be able to take a single object, disassemble it, and reassemble it to form two of the same object, but in fact it has been proven to be possible given infinite divisibility and something called the Axiom of Choice. If you're not familiar with that, I'd suggest reading a bit about set theory." My credence in the latter case would be much higher than in the former, and it doesn't take long to say, "Okay, this statement has its roots in X, Y, and Z, so to understand it, you'll want to study those." I maintain that it should be possible to give some context for the experience argument no matter what, and that if you don't do so, you're not trying to argue in good faith.
I actually can give you an "intuitive" justification of the Banach-Tarski theorem.
Suppose you have a rigid ball full of air. If you take half the air out and put it into another, identical ball, you now have twice the volume of air, at half the density. However, the points in a mathematical ball are infinitely dense - half of infinity is still infinity, so it turns out that if you do it just right, you can take out "half" of the points from a mathematical ball, put it inside another one, and end up with two balls that are both "completely full" and identical to the original one.
Your explanation suggests the wrong intuition for Banach-Tarski.
It's relatively easy to show that there's a bijection between the points contained in one ball and the points contained in two balls. (Similarly, there is a bijection between the interval [0,1] and the interval [0,2].)
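As a sketch of why the interval case is easy (nothing here is in the original comment; it's just the standard scaling map), one bijection between [0,1] and [0,2] is:

```latex
% Scaling by 2 is a bijection between the intervals:
% injective, since 2x = 2y implies x = y;
% surjective, since any y in [0,2] is hit by x = y/2.
f\colon [0,1] \to [0,2], \qquad f(x) = 2x.
% The same scaling idea gives a pointwise bijection between one
% ball and a larger region containing two disjoint unit balls.
% Mere cardinality equality is therefore cheap; Banach-Tarski is
% about achieving the duplication with finitely many rigid pieces.
```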
The Banach-Tarski theorem proves a harder statement: you can take a unit ball, partition it into finitely many pieces (I think it can be done with five), and then rearrange those pieces, using only translations and rotations, into two unit balls.
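Roughly, and glossing over the construction, the statement of the theorem can be written out as follows (the piece count of five and the 2-plus-3 grouping are the standard formulation):

```latex
% Banach-Tarski: for the closed unit ball B in R^3 there exist a
% partition of B into five pairwise disjoint pieces
B = A_1 \cup A_2 \cup A_3 \cup A_4 \cup A_5
% and isometries g_1, \dots, g_5 of R^3 (rotations composed with
% translations) that rearrange the pieces into two unit balls:
g_1 A_1 \cup g_2 A_2 = B, \qquad g_3 A_3 \cup g_4 A_4 \cup g_5 A_5 = B',
% where B' is a translate of B disjoint from B. The pieces A_i are
% non-measurable (their construction uses the Axiom of Choice),
% which is why no contradiction with volume arises.
```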
(If there's a canonical weird thing about the theorem, it's that we can do this in three dimensions but not in two.)
Agreed; it's not a real justification, it's just something that makes it sound less absurd. (When you look at the theorem a little bit closer, the weird part becomes not that you can make two balls out of one ball, but that you can do it with just translations and rotations. And if you look really, really hard, the weird part becomes that you can't do it with only four pieces.)
This intuitive justification likewise indicates that one should be able to do the Banach-Tarski thing with a 2-dimensional disc rather than a 3-dimensional ball. Unfortunately, that isn't true. (Though it is if you allow area-preserving affine transformations as well as isometries.)
My steelman is this (without having read anything downstairs, so I apologise if there's a better one extant): the world is a complicated place, and we all form beliefs based on the things we think are important in the world; and since we are all horrible reasoners, it's impossible to believe about some things that they are important movers of the world without seeing it actually happen and viscerally feeling it change things.
Cognitive biases in yourself are like this, methinks. Your thought processes really need to be broken down repeatedly for you to be able to start seeing the subtle shifts happening inside you - and anticipating that they happen even when you don't see them (generalising from many examples here, but not nearly enough).
Another difficult tripping point for me was intuitive reasoning. Until I saw people who couldn't articulate their reasoning at all do significantly better than me, I could not possibly believe in it, even while losing to people who told me I over-analysed and talked too much.
I'm slowly coming around on dishonest rhetorical stances, because of the amount of time I've spent trying to convince hostile arguers. Let me soothe the raised hackles of your inner LW-cat by saying that I can't endorse anything like this without finding a Schelling fence,* and am willing to consider anyone who takes such a stance on LW (or in LW-related contexts) evil.
*In fact, based on the world being as it is, I strongly suspect there isn't one.
I'm a professional computer programmer, a field founded on logic and reason. I've been doing it for a while, and am, if I say so myself, pretty good at it.
I still find myself frequently hitting places where the best argument I can make for a particular decision is "this feels intuitively like decision x and decision y, and those were the correct choices in those cases, therefore I think this is the correct decision too." And very often that's right.
My understanding of the evidence is that within specific fields, experts develop intuitions that really can yield better decision-making than conscious reasoning. Logically, doesn't it seem like this would be true of living in general?
Mentioning a similarity to past successful decisions seems like it qualifies as "constructing a more contextually specific argument than 'you'll understand when you're older'".
I guess, but the explicit comparison is usually pretty indefensible. "Isn't this actually more like decision w, where the opposite choice was correct?" would be a natural response, and one I wouldn't have any counterargument for.
Excellent point, thanks a bunch for supplying it. That makes a lot of sense to me.