
drethelin comments on Elevator pitches/responses for rationality / AI - Less Wrong Discussion

17 Post author: lukeprog 02 February 2012 08:35PM




Comment author: [deleted] 02 February 2012 10:25:22PM 10 points [-]

"assigning numbers and probabilities to beliefs"

I have never seen this explained accessibly on LW.

Comment author: drethelin 02 February 2012 10:35:35PM 8 points [-]

This. I'm skeptical of almost every numerical probability estimate I hear unless the steps are outlined to me.

Comment author: Postal_Scale 03 February 2012 11:36:39PM 3 points [-]

No joke intended, but how much more skeptical are you, percentage-wise, of numerical probability estimates than vague, natural language probability estimates? Please disguise your intuitive sense of your feelings as a form of math.

Ideally, deliver your answer in a C-3PO voice.

Comment author: drethelin 04 February 2012 01:07:27AM 2 points [-]

40 percent.

Comment author: Giles 13 March 2012 07:44:20PM 0 points [-]

This may be one reason why people are reluctant to assign numbers to beliefs in the first place. People equate numbers with certainty and authority, whereas a probability is just a way of saying how uncertain you are about something.

When giving a number for a subjective probability, I often feel like it should be a two-dimensional quantity: probability and authority. The "authority" figure would be an estimate of: "if you disagree with me now but we manage to come to an agreement in the next 5 minutes, what are the chances that you end up updating to my belief rather than me updating to yours?"
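Giles's two-dimensional quantity can be sketched concretely. The following is a minimal illustration, not anything proposed in the thread: the `Belief` class and `expected_posterior` function are hypothetical names, and the combination rule (treat "authority" as the chance the other party updates to my number, and otherwise move to theirs) is one simple way to cash out the idea.

```python
from dataclasses import dataclass

@dataclass
class Belief:
    """A subjective probability paired with an 'authority' score.

    probability: how likely I think the claim is, in [0, 1].
    authority:   my estimate of the chance that, if we disagree and then
                 reconcile within 5 minutes, *you* update to my number
                 rather than me updating to yours, in [0, 1].
    """
    probability: float
    authority: float

def expected_posterior(mine: Belief, yours: Belief) -> float:
    """Rough expected value of my belief after reconciling a disagreement:
    with probability mine.authority I keep my number, otherwise I adopt yours."""
    return mine.authority * mine.probability + (1 - mine.authority) * yours.probability

# Example: I say 0.8 but with low authority (0.3), you say 0.4.
# I should expect to end up most of the way toward your number:
# 0.3 * 0.8 + 0.7 * 0.4 = 0.52
me = Belief(probability=0.8, authority=0.3)
you = Belief(probability=0.4, authority=0.7)
print(round(expected_posterior(me, you), 2))
```

This makes the point of the comment visible: a bare "0.8" hides whether the speaker would defend it or abandon it under five minutes of pushback, and the second number carries exactly that information.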