Decius comments on Rationality Quotes August 2013 - Less Wrong

Post author: Vaniver 02 August 2013 08:59PM


Comment author: Eliezer_Yudkowsky 02 August 2013 09:01:03PM 23 points

One who possesses a maximum-entropy prior is further from the truth than one who possesses an inductive prior riddled with many specific falsehoods and errors. Or more to the point, someone who endorses knowing nothing as a desirable state for fear of accepting falsehoods is further from the truth than somebody who believes many things, some of them false, but tries to pay attention and go on learning.

Comment author: Decius 07 August 2013 06:20:27PM 2 points

In what units does one measure distance from the truth, and in what manner?

Comment author: linkhyrule5 10 August 2013 01:56:22AM 3 points
Comment author: Decius 10 August 2013 02:26:22AM 1 point

That's half of the answer. In what manner does one measure the number of bits of Shannon entropy that a person has?

Comment author: [deleted] 13 August 2013 06:15:19PM 2 points

If you make a numerical statement of your confidence -- P(A) = X, 0 < X < 1 -- measuring the Shannon entropy of that belief is a simple matter of observing the outcome and taking the binary logarithm of your prediction, or of its complement, depending on which came true. With S as the Shannon score: if A, then S = log2(X); if ¬A, then S = log2(1 - X).

The lower the magnitude of the resulting negative real, the better you fared.
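This scoring rule can be sketched in a few lines of Python (the function name and numbers are mine, not the commenter's):

```python
import math

def log_score(confidence: float, outcome: bool) -> float:
    """Binary log score: log2 of the probability assigned to what actually happened."""
    assert 0.0 < confidence < 1.0, "confidence of exactly 0 or 1 would score negative infinity"
    p = confidence if outcome else 1.0 - confidence
    return math.log2(p)

# A 0.9 prediction that comes true loses few bits; one that fails loses many.
log_score(0.9, True)   # ≈ -0.152
log_score(0.9, False)  # ≈ -3.322
log_score(0.5, True)   # = -1.0, a coin-flip guess always costs exactly one bit
```

Note that the score is always negative, so the best a forecaster can do is approach zero from below.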

Comment author: Decius 13 August 2013 08:15:33PM 1 point

That allows a prediction/confidence/belief to be measured. How do you total a person?

Comment author: [deleted] 13 August 2013 11:44:07PM 0 points

Simple: under dubiously ethical and dubiously physically possible conditions, you turn their internal world model into a formal Bayesian network, and for every possible physical and mathematical observation and outcome, do the above calculation. Sum, print, idle.

It's impossible in practice, but it's only like a four-line formal definition.
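The "four-line formal definition" might look something like this sketch, assuming (implausibly) that the whole world model has already been extracted as a list of (confidence, outcome) pairs:

```python
import math

def total_score(beliefs):
    """Sum of binary log scores over every (confidence, outcome) pair
    extracted from the person's world model."""
    return sum(math.log2(p if outcome else 1.0 - p)
               for p, outcome in beliefs)

# Three beliefs: a good call, a bad call, and an honest coin flip.
total_score([(0.9, True), (0.7, False), (0.5, True)])  # ≈ -2.889
```

The hard part, of course, is the extraction step, not the arithmetic.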

Comment author: Decius 14 August 2013 05:40:36AM 2 points

How do you measure someone whose internal world model is not isomorphic to one formal Bayesian network (for example, someone who is completely certain of something)? Should it be the case that someone whose world model contains fewer possible observations has a major advantage in being closer to the truth?

Note also that a perfect Bayesian will score lower than some gamblers using this scheme. Betting everything on black does better than a fair distribution almost half the time.

Comment author: [deleted] 16 August 2013 01:23:35PM 1 point

I am not very certain that humans actually can have an internal belief model that isn't isomorphic to some Bayesian network. Anyone who proclaims to be absolutely certain, I suspect, is in fact not.

Comment author: pragmatist 16 August 2013 09:39:07PM 2 points

How do you account for people falling prey to things like the conjunction fallacy?

Comment author: Decius 17 August 2013 04:05:04AM 1 point

How likely do you believe it is that there exists a human who is absolutely certain of something?

Comment author: Lumifer 16 August 2013 03:09:23PM 1 point

"Anyone who proclaims to be absolutely certain; I suspect that they are in fact not."

Is this a testable assertion? How do you determine whether someone is, in fact, absolutely certain?

It's not unheard of for people to bet their lives on some belief of theirs.