Eliezer_Yudkowsky comments on Rationality Quotes August 2013 - Less Wrong

7 Post author: Vaniver 02 August 2013 08:59PM




Comment author: Ambition 02 August 2013 02:32:30AM 9 points [-]

He who knows nothing is closer to the truth than he whose mind is filled with falsehoods and errors.

-Thomas Jefferson

Comment author: Eliezer_Yudkowsky 02 August 2013 09:01:03PM 23 points [-]

One who possesses a maximum-entropy prior is further from the truth than one who possesses an inductive prior riddled with many specific falsehoods and errors. Or more to the point, someone who endorses knowing nothing as a desirable state for fear of accepting falsehoods is further from the truth than somebody who believes many things, some of them false, but tries to pay attention and go on learning.

Comment author: NancyLebovitz 03 August 2013 12:31:21AM 8 points [-]

How about "If you know nothing and are willing to learn, you're closer to the truth than someone who's attached to falsehoods"? Even then, I suppose you'd need to throw in something about the speed of learning.

Comment author: AndHisHorse 04 August 2013 11:19:15PM 5 points [-]

It would seem that the difference of opinion here originates in the definition of "further". Someone who knows nothing is further (in the information-theoretic sense) from the truth than someone who believes a falsehood, assuming that the falsehood has at least some basis in reality (even if only an accidental relation), because they must flip more bits of their belief (or lack thereof) to arrive at something resembling truth. On the other hand, in the limited, human, psychological sense, they are closer, because they have no attachments to relinquish; they will not object to having their state of ignorance lifted, as one who believes in falsehoods might object to having their state of delusion destroyed.

Comment author: felzix 19 August 2013 06:47:25PM 1 point [-]

Right, I'd take it as a statement on how humans actually think, not how a perfect rationalist thinks. Or maybe how most humans think, since some humans can be unattached to their beliefs.

Comment author: Grant 05 August 2013 07:56:18AM 4 points [-]

To me, "filled with falsehoods and errors" suggests more falsehoods than "some". Though I agree it's not a very good quote within the context of LW.

Comment author: Ambition 03 August 2013 01:18:16AM *  3 points [-]

He who knows nothing is further from the truth than he whose mind is filled with falsehoods and errors, but has the courage to acknowledge them as so.

-LessWrong Community

Comment author: Decius 07 August 2013 06:20:27PM 2 points [-]

In what units does one measure distance from the truth, and in what manner?

Comment author: linkhyrule5 10 August 2013 01:56:22AM 3 points [-]

Comment author: Decius 10 August 2013 02:26:22AM 1 point [-]

That's half of the answer. In what manner does one measure the number of bits of Shannon entropy that a person has?

Comment author: [deleted] 13 August 2013 06:15:19PM 2 points [-]

If you make a numerical statement of your confidence -- P(A) = X, 0 < X < 1 -- measuring the Shannon entropy of that belief is a simple matter of observing the outcome and taking the binary logarithm of your prediction, or of its converse, depending on which came true. With S as the score: if A, then S = log2(X); if ¬A, then S = log2(1 − X).

The lower the magnitude of the resulting negative real, the better you fared.
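The scoring rule described above can be sketched in a few lines of Python (my own illustration of the commenter's formula, not code from the thread; the function name `log_score` is mine):

```python
import math

def log_score(p, outcome):
    """Log score, in bits, of a probabilistic prediction.

    p: stated probability that event A occurs, with 0 < p < 1.
    outcome: True if A actually happened, False otherwise.
    Returns log2(p) if A occurred, log2(1 - p) otherwise.
    The result is always negative; smaller magnitude = better prediction.
    """
    return math.log2(p) if outcome else math.log2(1.0 - p)

# A confident correct prediction costs little; a confident wrong one, a lot.
print(log_score(0.9, True))   # small magnitude: ~ -0.152 bits
print(log_score(0.9, False))  # large magnitude: ~ -3.322 bits
print(log_score(0.5, True))   # maximum-entropy guess always costs exactly 1 bit
```

Note that the restriction 0 < X < 1 matters: a prediction of exactly 0 or 1 that comes out wrong scores negative infinity, which is relevant to Decius's gambling objection further down the thread.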

Comment author: Decius 13 August 2013 08:15:33PM 1 point [-]

That allows a prediction/confidence/belief to be measured. How do you total a person?

Comment author: [deleted] 13 August 2013 11:44:07PM *  0 points [-]

Simple: under conditions of dubious ethics and dubious physical possibility, you turn their internal world model into a formal Bayesian network, and for every possible physical and mathematical observation and outcome, do the above calculation. Sum, print, idle.

It's impossible in practice, but it's only, like, a four-line formal definition.

Comment author: Decius 14 August 2013 05:40:36AM 2 points [-]

How do you measure someone whose internal world model is not isomorphic to one formal Bayesian network (for example, someone who is completely certain of something)? Should it be the case that someone whose world model contains fewer possible observations has a major advantage in being closer to the truth?

Note also that a perfect Bayesian will score lower than some gamblers using this scheme. Betting everything on black does better than a fair distribution almost half the time.
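Decius's roulette point can be checked by simulation (my own sketch; the 18/38 black probability of an American wheel and the 0.999 stand-in for "betting everything" are my assumptions, the latter to avoid the negative-infinity score of a literal certainty):

```python
import math
import random

random.seed(0)
P_BLACK = 18 / 38  # American roulette wheel (assumption for illustration)
trials = 100_000
overconfident_wins = 0

for _ in range(trials):
    black = random.random() < P_BLACK
    # The calibrated predictor states the true odds each spin;
    # the gambler states near-certainty of black every time.
    fair_score = math.log2(P_BLACK if black else 1 - P_BLACK)
    gambler_score = math.log2(0.999 if black else 0.001)
    if gambler_score > fair_score:
        overconfident_wins += 1

print(overconfident_wins / trials)  # should land near 18/38 ≈ 0.474
```

On any single spin the gambler out-scores the calibrated predictor almost half the time, exactly as Decius says, yet the gambler's average score is far worse, because the wrong spins cost ~10 bits each against the calibrated predictor's steady ~1 bit.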

Comment author: [deleted] 16 August 2013 01:23:35PM 1 point [-]

I am not very certain that humans actually can have an internal belief model that isn't isomorphic to some Bayesian network. Anyone who proclaims to be absolutely certain, I suspect, is in fact not.

Comment author: BlueSun 05 August 2013 05:03:07PM 2 points [-]

Maybe it's just where my mind was when I read it, but I interpreted the quote as meaning something more like:

"It is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence."