JoshuaZ comments on Harry Potter and the Methods of Rationality discussion thread - Less Wrong

34 Post author: Unnamed 27 May 2010 12:10AM


Comment author: JoshuaZ 02 June 2010 01:27:05AM *  4 points

I don't exactly fit your set, since I had seen LW before, but there's a good reason I should be included in your sample. Explanation follows: I had read most of the sequences before (and frankly didn't learn that much from them. A handful of cogsci and psych classes along with a fair bit of phil sci gives one a lot of the same material) and had previously read some of Eliezer's fiction. But I hadn't really taken that detailed a look at LW as a whole until HPMoR. That was partially due to a conversation with a friend that went something like this:

Friend: So who is the author of this stuff?

JZ: He's Eliezer Yudkowsky, an all-around very bright guy. He has some ideas about the Singularity that are a bit off.

Friend: What evidence do you have that he's bright and not just a good fiction writer? The one thing you've mentioned is something you disagree with.

JZ: Um, let me get back to you.

Then, while reading, I felt the need to register an account to make a comment, and it has been downhill from there. (I just linked an LW post to a friend, who said that she refused to read it because "I'm not sure I'm willing to let myself -oh god oh god- be sucked into Less Wrong. I have heard it wastes time like tvtropes on crack." I'm not sure if that's a good or a bad thing.)

I've linked HPMoR to a fair number of people, and it seems to be having some impact on some of them. Indeed, it seems to be quite effective at getting through the defense mechanisms that some people have against becoming more rational, because the arguments aren't couched as an obvious attempt to point out what is wrong with their thinking processes. I'm running into concerns about whether linking HPMoR to people without telling them that is ethical.

Comment author: MBlume 07 June 2010 06:27:59AM 7 points

That which can be destroyed by the truth should be.

(On the other hand, Michael Vassar often claims that this quote is as disingenuous as a strong man saying "That which can be destroyed by lions should be.")

Comment author: Blueberry 07 June 2010 06:56:03AM 5 points

(On the other hand, Michael Vassar often claims that this quote is as disingenuous as a strong man saying "That which can be destroyed by lions should be.")

I'm not sure I understand. Lions can destroy any human, no matter how strong, right? Is the implication that truth is a weapon? Or that the only people who support truth are the ones who think they're right? But people frequently think they're right when they're not.

Comment author: RomanDavis 07 June 2010 08:28:59AM *  2 points

If you are rational, you are both more likely to believe things that are true (or less wrong than your competitors') and more able to defend your false beliefs using knowledge of argument and cognitive biases.

Substitute "well-armed" for "strong" if you like.

Comment author: Blueberry 09 June 2010 07:30:52PM 2 points

"Well-armed" makes a little more sense, but I still don't think it's a good analogy. Lions destroy people who aren't well-armed, so it's disingenuous for a well-armed person to say that a fair procedure for deciding who lives is to let the lions attack and see who survives. Truth destroys false ideas, not people, and people frequently don't know in advance which ideas will be destroyed by the truth. People, even rational ones, are often wrong in their predictions, unlike the well-armed man.

A precommitment to letting experiments and truth decide what ideas will survive doesn't stack the deck in your favor, unlike in the lions example. The whole point is that you are willing to take the chance of having your ideas die, as long as the true ideas survive.

Comment author: RomanDavis 09 June 2010 11:17:44PM *  4 points

I think you could say that the truth does destroy people. You can't be the same person once you've really accepted an entirely new, important idea, and rejected an old belief.

When someone says "that which can be destroyed by the truth should be" to a Christian or a white supremacist or a thousand other people defined by a silly idea they take very seriously, they are often asking them to do something a lot scarier than going up against a lion.

If you've already seen the truth and accepted it, the deck is as stacked as it could be. And if you haven't, but are making your bet rationally while the other person is not, then you've still got a much better chance.

Comment author: JoshuaZ 07 June 2010 01:09:46PM 1 point

That which can be destroyed by the truth should be.

And what if that destruction itself requires withholding information? In most contexts, I'm pretty sure most people here would agree that reasoning of the form "I know I'm right, but they'll be more likely not to believe the truth if I don't tell them X" is not good rational behavior.