dlthomas comments on Connecting Your Beliefs (a call for help) - Less Wrong

Post author: lukeprog, 20 November 2011 05:18AM




Comment author: dlthomas 20 November 2011 07:41:15PM, 1 point

What rules are used to decide how much certain evidence can influence my probability estimates?

Bayes' Theorem is precisely that rule.

Comment author: XiXiDu 20 November 2011 08:41:25PM, 5 points

Bayes' Theorem is precisely that rule.

That's not what I meant; I was too vague. It is clear to me how to update on evidence given concrete data, or goats behind doors in game shows. What I meant is how one could possibly update on evidence like IBM Watson's victory at Jeopardy! with regard to risks from AI. Assigning numerical probability estimates to such evidence, which are then used to update the overall probability of risks from AI, seems to me a very shaky affair that might distort the end result: one has merely shifted the use of intuition from judging the outcome itself to judging how strongly the evidence favors that outcome.
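[Editor's note: XiXiDu's worry can be made concrete with a small sketch. All the numbers below are illustrative assumptions, not anyone's actual estimates; the point is that two people with the same prior, applying Bayes' theorem correctly, reach very different posteriors depending on the likelihoods they intuit for the Watson result.]

```python
# Sketch of the point: the update on H = "AI poses a risk" from the
# evidence E = "Watson won at Jeopardy!" depends entirely on the
# intuited likelihoods P(E|H) and P(E|not H), which are themselves
# guesses. All numbers are illustrative assumptions.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' theorem: P(H|E) = P(E|H) P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

prior = 0.10  # assumed shared prior for H

# Same prior, same theorem -- only the intuited likelihoods differ:
strong_reading = posterior(prior, p_e_given_h=0.9, p_e_given_not_h=0.3)
weak_reading = posterior(prior, p_e_given_h=0.5, p_e_given_not_h=0.45)

print(round(strong_reading, 3))  # 0.25
print(round(weak_reading, 3))    # 0.11
```

The mechanics of the update are uncontroversial; the disagreement has just moved into the choice of 0.9 vs. 0.5 for P(E|H), which is exactly where XiXiDu says the intuition now hides.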

Comment author: [deleted] 20 November 2011 09:49:55PM, 4 points

Causal analysis is probably closer to what you're looking for. It displays stability under (small) perturbation of relative probabilities, and it's probably closer to what humans do under the hood than Bayes' theorem. Pearl often observes that humans work with cause and effect with more facility than numerical probabilities.

Comment author: pnrjulius 05 June 2012 04:08:37PM, 0 points

Numerical stability is definitely something we need in our epistemology. If small errors make the whole thing blow up, it's not any good to us, because we know we make small errors all the time.
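[Editor's note: the stability property the two comments above appeal to can be checked directly for a single Bayesian update. The numbers are illustrative assumptions: we perturb the assumed likelihoods by ±0.01 and measure how far the posterior moves.]

```python
# Sketch: a usable update rule should not blow up under small input
# errors. Perturb the assumed likelihoods by +/-1 percentage point and
# check the worst-case movement of the posterior.
# All numbers are illustrative assumptions.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' theorem: P(H|E) = P(E|H) P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

base = posterior(0.10, 0.80, 0.40)

worst_shift = max(
    abs(posterior(0.10, 0.80 + dh, 0.40 + dn) - base)
    for dh in (-0.01, 0.01)
    for dn in (-0.01, 0.01)
)

print(round(base, 3))      # 0.182
print(worst_shift < 0.01)  # True: the small error stays small
```

For well-behaved inputs like these, a 1-point error in the likelihoods moves the posterior by well under a point; the instability XiXiDu worries about comes not from the arithmetic but from the much larger uncertainty in which likelihoods to feed in.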