ata comments on Harry Potter and the Methods of Rationality discussion thread - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (866)
Reply to this comment if you found LW through Harry Potter and the Methods of Rationality!
A survey for anyone who cares to respond (edit: specifically for people who did find LW through HPMoR):
I don't exactly fit your set since I had seen LW before, but there's good reason I should be included in your sample. Explanation follows: I had read most of the Sequences before (and frankly didn't learn that much from them; a handful of cogsci and psych classes, along with a fair bit of philosophy of science, gives one a lot of the same material) and had previously read some of Eliezer's fiction. But I hadn't really taken that detailed a look at LW as a whole until HPMoR. That was partially due to a conversation with a friend that went something like
Friend: So who is the author of this stuff?
JZ: He's Eliezer Yudkowsky, an all-around very bright guy. He has some ideas about the Singularity that are a bit off.
Friend: What evidence do you have that he's bright and not just a good fiction writer? The one thing you've mentioned is something you disagree with.
JZ: Um, let me get back to you.
Then when reading I felt a need to register an account to make a comment, and it has been downhill from there. (I just linked an LW post to a friend, who said she refused to read it because "I'm not sure I'm willing to let myself -oh god oh god- be sucked into Less Wrong. I have heard it wastes time like TVTropes on crack." I'm not sure if that's a good or a bad thing.)
I've linked HPMoR to a fair number of people, and it seems to be having some impact on some of them. Indeed, it seems to be quite effective at getting through the defense mechanisms some people have against becoming more rational, because the arguments aren't couched as an obvious attempt to point out what is wrong with their thinking processes. I'm running into concerns about whether linking HPMoR to people without telling them that is ethical.
That which can be destroyed by the truth should be.
(On the other hand, Michael Vassar often claims that this quote is as disingenuous as a strong man saying "That which can be destroyed by lions should be.")
I'm not sure I understand. Lions can destroy any human, no matter how strong, right? Is the implication that truth is a weapon? Or that the only people who support truth are the ones who think they're right? But people frequently think they're right when they're not.
If you are rational, you are both already more likely to believe things that are true (or less wrong than your competitors') and more able to defend your false beliefs using knowledge of argument and cognitive biases.
Substitute "well-armed" for "strong" if you like.
"Well-armed" makes a little more sense, but I still don't think it's a good analogy. Lions destroy people who aren't well-armed, so it's disingenuous for a well-armed person to say that a fair procedure for deciding who lives is to let the lions attack and see who survives. Truth destroys false ideas, not people, and people frequently don't know in advance which ideas will be destroyed by the truth. People, even rational ones, are often wrong in their predictions, unlike the well-armed man.
A precommitment to letting experiments and truth decide what ideas will survive doesn't stack the deck in your favor, unlike in the lions example. The whole point is that you are willing to take the chance of having your ideas die, as long as the true ideas survive.
I think you could say that the truth does destroy people. You can't be the same person once you've really accepted an entirely new, important idea, and rejected an old belief.
When someone says "that which can be destroyed by the truth should be" to a Christian or a white supremacist or a thousand other people defined by the silly idea they take very seriously, you are often asking them to do something a lot scarier than go up against a lion.
If you've already seen the truth and accepted it, the deck is as stacked as it could be. And if you haven't but are otherwise making your bet rationally, while the other is not, then you've still got a lot better chance.
And if that destruction itself requires withholding information? In most contexts, I'm pretty sure most people here would think that something of the form "I know I'm right, but they're more likely to believe the truth if I don't tell them X" is not good rational behavior.
I knew of LW's existence before HPMoR, through the same source that referred me to HPMoR (ESR).
Yes. I went from LW to the OB archives, I created an account to comment on an old post there.
I've been ignoring the Sequences as such, but have been working my way through the OB archives chronologically, which I gather covers the same material.
Hard to answer that question. The cognitive bias stuff is fairly old hat. The timeless-physics stuff is new to me, but isn't really a skill. I'm currently working my way through the metaethics stuff, which I'm not finding particularly convincing but haven't finished thinking about.
One friend, to both HPMoR and the OB archives. Not so much LW per se, which (sorry) seems to have a higher noise:signal ratio than the old stuff.
I've been paying a little bit of attention to recent posts, but not a lot; mostly I've been "time-travelling" through the archives.
I've been responding to posts here and there when I have something to say I don't see in the comments. I do this even though I don't expect anyone is reading old comments (though sometimes they get upvoted or responded to, so it's not a complete vacuum), mostly because I often don't really know what I think about something until I've tried to formulate a response to it.
1: No. Most of the time I was lurking. There's a lot of stuff on LW.
2: Following links, as I would on TVTropes
3: Nothing yet. Eliezer has a distinct way of expressing himself, which is why I enjoy HPMoR, but most of the ideas he is expressing I have heard before.
4: Yes to HPMoR, no to LW.
You're not very rational for a bunch of extreme rationalists, are you? It's only possible to answer this survey if you register for the site, which excludes nearly all possible respondents (there is science on this) and presumably an even greater proportion of those who are uninterested in the ideas on LW. So that's a Big Old Fail.
You really think we've never talked about selection bias here? It is constantly a concern every time we do a survey. This is why ata's questions were directed at those who had registered and not at the entire group that read the fanfiction. If you know of some way we could poll everyone who read the fanfiction without response bias by all means tell us.
Something about us rubbed you the wrong way. Which is fine, things about us rub me the wrong way. But I'd much rather you articulate what that was than go searching for random things to criticize us about just because you want us to be irrational.
What specifically?
Please Elaborate.
Are you asking because you don't know, or because you want to know which ones BohemianCoast noticed?
Most of the world is wrong. Formal education is overrated. The world as we know it may cease within a century. Lots Of Math. Simultaneously mentioning the word quantum and talking about psychology. For that matter, mentioning the word quantum.
Those are just the ones off the top of my head, and I'm not BohemianCoast. But a lot of stuff written here (and in the "Sequences") is true despite setting off nutcase detectors, not without setting them off.
There Isn't That Much Math, Really. And none of the cargo-cultish use of mathy writing as impressive-looking gibberish that tends to mark nutcase stuff. Agree with the rest, though. Oh, and also: The scientific method is poor and needs to be improved. A central notion in physics held by most practicing physicists is fundamentally misguided.
I've seen plenty of nutcase stuff in which the math wasn't gibberish - it was correct as math but was simply window-dressing for the nutcase argument. Sometimes it seems that EY is just using it as garnish for his arguments as well. So I think there is a kernel of truth in what GuySrinivasan said about the mathiness of the site. It fits the pattern.
Which is not to say that EY is a nutcase. Those nutcase detectors may be returning false positives. But that doesn't mean that the nutcase detectors are defective.