Lumifer comments on Marketing Rationality - Less Wrong
(Both agreeing with and refining your position, and directed less to you than the audience):
Personally, I'm at level 21, and I'm trying to raise the rest of you to my level.
Now, before you take that as a serious statement, ask yourself how you feel about that proposition, and how inclined you would be to take anything I said seriously if I actually believed that. Think about to what extent I behave like I -do- believe that, and how that changes the way what I say is perceived.
http://lesswrong.com/lw/m70/visions_and_mirages_the_sunk_cost_dilemma/ <- This post, and pretty much all of my comments on it, had reasonably high upvotes before I revealed what I was up to. Now, I'm not going to say it didn't deserve to get downvoted - I learned a lot from that post that I should have known going into it - but I'd like to point out the fundamental similarities, scaled up a level, between what I did there and typical rationalist "education": "Here's a thing. It was a trick! Look at how easily I tricked you! You should now listen to what I say about how to avoid getting tricked in the future." Worse, cognitive dissonance will make it harder for the tricked person to fix that weakness in the future. As I said, I learned a -lot- from that post; I tried to shove at least four levels of plots and education into it, and instead turned people off with the first or second one. I hope I taught people something, but in retrospect, and far removed from it, I think it was probably a complete and total failure which mostly served to alienate people from the lessons I was attempting to impart.
The first step to making stupid people slightly less stupid is to make them realize the way in which they're stupid in the first place, so that they become willing to fix it. But you can't do that, because, obviously, people really dislike being told they're stupid. There are some issues inherent in approaching other people with the assumption that they're less than you, and that they should accept your help in raising them up: you're asserting a higher status than them. They're going to resent that, and cognitive dissonance is going to make them decide that either you aren't actually better at that thing, or that it isn't important. So if you think that you can make "stupid people slightly less stupid", you're completely incompetent at the task.
But... show them that -you- are stupid, and show them you becoming less stupid, and cognitive dissonance will tell them that they were smarter than you, and that they already knew what you were trying to teach them. That's a huge part of what made the Sequences so successful - riddled throughout them were admissions of Eliezer's own weakness. "This is a mistake I made. This is what I realized. This is how I started to get past that mistake." What made them failures, however, is the way they made those who read them feel Enlightened, like they had just Leveled Up twenty times and were now far above ordinary plebeians. The critical failure of the Sequences is that they didn't teach humility; the lesson you -should- come away from them with is the idea that, however much Less Wrong you've become, you're still deeply, deeply wrong. And that's okay.
Which provokes a dilemma. Everybody who wants to teach rationality to others - because it leveled them up twenty times, and look at all those stupid people falling prey to the non-central fallacy on a constant basis - is completely unsuitable to do so.
So did I succeed? Or did I fail? And why?
The solution to the meta-level confusion (it's turtles all the way down, anyway) is to spend a few years building up an immunity to iocane powder.