Eliezer_Yudkowsky comments on Rationality Quotes August 2013 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
He who knows nothing is closer to the truth than he whose mind is filled with falsehoods and errors.
-Thomas Jefferson
One who possesses a maximum-entropy prior is further from the truth than one who possesses an inductive prior riddled with many specific falsehoods and errors. Or more to the point, someone who endorses knowing nothing as a desirable state for fear of accepting falsehoods is further from the truth than somebody who believes many things, some of them false, but tries to pay attention and go on learning.
How about "If you know nothing and are willing to learn, you're closer to the truth than someone who's attached to falsehoods"? Even then, I suppose you'd need to throw in something about the speed of learning.
It would seem that the difference of opinion here originates in the definition of further. Someone who knows nothing is further (in the information-theoretic sense) from the truth than someone who believes a falsehood, assuming that the falsehood has at least some basis in reality (even if only an accidental relation), because they must flip more bits of their belief (or lack thereof) to arrive at something resembling truth. On the other hand, in the limited, human, psychological sense, they are closer, because they have no attachments to relinquish, and they will not object to having their state of ignorance lifted from them, as one who believes in falsehoods might object to having their state of delusion destroyed.
Right, I'd take it as a statement on how humans actually think, not how a perfect rationalist thinks. Or maybe how most humans think, since some humans can be unattached to their beliefs.
To me, "filled with falsehoods and errors" translates into more falsehoods than "some". Though I agree it's not a very good quote within the context of LW.
-LessWrong Community
In what units does one measure distance from the truth, and in what manner?
Bits of Shannon entropy.
That's half of the answer. In what manner does one measure the number of bits of Shannon entropy that a person has?
If you make a numerical statement of your confidence -- P(A) = X, 0 < X < 1 -- measuring the Shannon score of that belief is a simple matter of observing the outcome and taking the binary logarithm of the probability you assigned to what actually happened. With S as the score: if A, then S = log2(X); if ¬A, then S = log2(1 - X).
The lower the magnitude of the resulting negative real, the better you fared.
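That scoring rule can be written down directly. This is a minimal sketch, not anything from the thread itself; the function name `log_score` and the example confidences are my own illustration.

```python
from math import log2

def log_score(confidence: float, outcome: bool) -> float:
    """Log score of a stated confidence P(A) = confidence.

    Returns log2 of the probability assigned to what actually
    happened: log2(X) if A occurred, log2(1 - X) otherwise.
    Scores are <= 0; closer to zero is better.
    """
    return log2(confidence) if outcome else log2(1.0 - confidence)

# A 50% confidence costs exactly one bit either way:
even = log_score(0.5, True)           # -1.0
# A 90% confidence costs little if right, a lot if wrong:
right = log_score(0.9, True)          # about -0.152
wrong = log_score(0.9, False)         # about -3.32
```

Note the asymmetry: being confidently wrong is penalized much more heavily than being confidently right is rewarded, which is what makes the rule punish overclaiming.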
That allows a prediction/confidence/belief to be measured. How do you total a person?
Simple: under dubiously ethical (and dubiously physically possible) conditions, you turn their internal world model into a formal Bayesian network, and for every possible physical and mathematical observation and outcome, do the above calculation. Sum, print, idle.
It's impossible in practice, but it's only a four-line formal definition.
How do you measure someone whose internal world model is not isomorphic to one formal Bayesian network (for example, someone who is completely certain of something)? Should it be the case that someone whose world model contains fewer possible observations has a major advantage in being closer to the truth?
Note also that a perfect Bayesian will score lower than some gamblers using this scheme. Betting everything on black does better than a fair distribution almost half the time.
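The roulette point can be checked by simulation. This is my own sketch of the claim, assuming an American wheel (18 black pockets out of 38) and a gambler who states near-certainty in black on every spin; those specifics are not in the comment.

```python
import random
from math import log2

random.seed(0)
p_black = 18 / 38        # American wheel: 18 black of 38 pockets (assumed)
trials = 100_000
gambler_ahead = 0        # spins where the overconfident score beats the fair one

for _ in range(trials):
    black = random.random() < p_black
    # Gambler "bets everything on black" (near-certain prediction):
    gambler = log2(0.999) if black else log2(0.001)
    # Calibrated bettor states the true probability every time:
    honest = log2(p_black) if black else log2(1 - p_black)
    if gambler > honest:
        gambler_ahead += 1

fraction = gambler_ahead / trials   # roughly 18/38, i.e. just under half
```

On any spin that comes up black, log2(0.999) ≈ 0 beats log2(18/38) ≈ -1.08, so the gambler outscores the perfect Bayesian on roughly 47% of individual spins, even though the gambler's expected score is far worse.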
I am not very certain that humans actually can have an internal belief model that isn't isomorphic to some Bayesian network. Anyone who proclaims to be absolutely certain, I suspect, is in fact not.
Maybe it's just where my mind was when I read it but I interpreted the quote as meaning something more like:
"It is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence."