RobertLumley comments on Elevator pitches/responses for rationality / AI - Less Wrong

17 Post author: lukeprog 02 February 2012 08:35PM


Comment author: RobertLumley 02 February 2012 10:05:12PM 4 points

One of the most difficult arguments I've had to make is convincing people that they can be more rational. Sometimes people have said that they're simply incapable of assigning numbers and probabilities to beliefs, even though they acknowledge that it's superior for decision making.

Comment author: [deleted] 02 February 2012 10:25:22PM 10 points

assigning numbers and probabilities to beliefs

I have never seen this explained accessibly on LW.

Comment author: drethelin 02 February 2012 10:35:35PM 8 points

This. I'm skeptical of almost every numerical probability estimate I hear unless the steps are outlined to me.

Comment author: Postal_Scale 03 February 2012 11:36:39PM 3 points

No joke intended, but how much more skeptical are you, percentage-wise, of numerical probability estimates than vague, natural language probability estimates? Please disguise your intuitive sense of your feelings as a form of math.

Ideally, deliver your answer in a C-3PO voice.

Comment author: drethelin 04 February 2012 01:07:27AM 2 points

40 percent.

Comment author: Giles 13 March 2012 07:44:20PM 0 points

This may be one reason why people are reluctant to assign numbers to beliefs in the first place. People equate numbers with certainty and authority, whereas a probability is just a way of saying how uncertain you are about something.

When giving a number for a subjective probability, I often feel like it should be a two-dimensional quantity: probability and authority. The "authority" figure would be an estimate of "if you disagree with me now but we manage to come to an agreement in the next 5 minutes, what are the chances of me having to update my beliefs versus you?"

Comment author: badger 03 February 2012 01:20:30AM 4 points

Techniques for probability estimates by Yvain is the best we have.

Comment author: [deleted] 02 February 2012 11:00:49PM * 7 points

I agree that it can be difficult to convince people that they can be more rational. But I think starting new people off with the idea of assigning probabilities to their beliefs is the wrong tactic. It's like trying to get someone who doesn't know how to walk to run a marathon.

What do you think about starting people off with the more accessible ideas on Less Wrong? I can think of things like: the Sunk Costs Fallacy, not arguing things "by definition", and admitting to a certain level of uncertainty. I'm sure you can think of others.

I would bet that pointing people to a more specific idea, like those listed above, would make them more likely to feel like there are actual concepts on LW that they personally can learn and apply. It's sort of like the "Shock Level" theory, but instead it's "Rationality Level":

Rationality Level 0 - I don't think being rational is at all a good thing. I believe 100% in my intuitions!
Rationality Level 1 - I see how being rational could help me, but I doubt my personal ability to apply these techniques.
Rationality Level 2 - I am trying to be rational, but rarely succeed. (This is where I would place myself.)
Rationality Level 3 - I am pretty good at this whole "rationality" thing!
Rationality Level 4 - I Win At Life!

I bet that with some thought, someone else can come up with a better set of "Rationality Levels".