daenerys comments on Elevator pitches/responses for rationality / AI - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (68)
One of the most difficult arguments I've had to make is convincing people that they can be more rational. Sometimes people have said that they're simply incapable of assigning numbers and probabilities to beliefs, even though they acknowledge that it's superior for decision making.
I agree that it can be difficult to convince people that they can be more rational. But I think starting new people off with the idea of assigning probabilities to their beliefs is the wrong tactic. It's like trying to get someone who doesn't know how to walk to run a marathon.
What do you think about starting people off with the more accessible ideas on Less Wrong? I can think of things like: the Sunk Costs Fallacy, not arguing things "by definition", and admitting to a certain level of uncertainty. I'm sure you can think of others.
I would bet that pointing people to a more specific idea, like those listed above, would make them more likely to feel like there are actual concepts on LW that they personally can learn and apply. It's sort of like the "Shock Level" theory, but instead it's "Rationality Level":
Rationality Level 0- I don't think being rational is at all a good thing. I believe 100% in my intuitions!
Rationality Level 1- I see how being rational could help me, but I doubt my personal ability to apply these techniques.
Rationality Level 2- I am trying to be rational, but rarely succeed (this is where I would place myself.)
Rationality Level 3- I am pretty good at this whole "rationality" thing!
Rationality Level 4- I Win At Life!
I bet that, with some thought, someone else can come up with a better set of "Rationality Levels".