"Statistical Bias"

13 Eliezer_Yudkowsky 30 March 2007 06:55PM

(Part one in a series on "statistical bias", "inductive bias", and "cognitive bias".)

"Bias" as used in the field of statistics refers to directional error in an estimator.  Statistical bias is error you cannot correct by repeating the experiment many times and averaging together the results.

The famous bias-variance decomposition states that the expected squared error is equal to the squared directional error, or bias, plus the squared random error, or variance.  The law of large numbers says that you can reduce variance, not bias, by repeating the experiment many times and averaging the results.
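The distinction can be made concrete in a few lines. Here is a minimal sketch (the numbers `true_value = 10.0`, `bias = 0.5`, and `noise_sd = 2.0` are invented for illustration): a miscalibrated instrument whose readings are shifted by a constant, plus random noise.

```python
import random

random.seed(0)

true_value = 10.0
bias = 0.5       # directional error: every reading is shifted by +0.5
noise_sd = 2.0   # random error: scatter around the shifted value

def measure():
    """One noisy reading from a miscalibrated instrument."""
    return random.gauss(true_value + bias, noise_sd)

# Averaging more repetitions shrinks the variance...
for n in (1, 100, 10_000):
    estimate = sum(measure() for _ in range(n)) / n
    print(f"n={n:>6}: estimate = {estimate:.3f}")

# ...but the estimates converge to 10.5, not 10.0:
# no amount of averaging removes the directional error.
```

Run it and the estimates settle ever closer to 10.5 as n grows; the remaining half-unit gap from the true value is the bias, and it survives any number of repetitions.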

A Request for Open Problems

25 MrHen 08 May 2009 01:33PM

Open problems are clearly defined problems that have not been solved. In older fields, such as Mathematics, the list is rather intimidating. Rationality, on the other hand, seems to have no list.

While we are all here together to crunch on problems, let's aim higher than thinking of solutions and then finding problems to match them. What things are unsolved questions? Is it reasonable to assume those questions have concrete, absolute answers?

The catch is that these problems cannot be inherently fuzzy problems. "How do I become less wrong?" is not a problem that can be clearly defined. As such, it does not have a concrete, absolute answer. Does Rationality have a set of problems that can be clearly defined? If not, how do we work toward getting our problems clearly defined?

See also: Open problems at LW:Wiki

Reductionism

40 Eliezer_Yudkowsky 16 March 2008 06:26AM

Followup to: How An Algorithm Feels From Inside, Mind Projection Fallacy

Almost one year ago, in April 2007, Matthew C submitted the following suggestion for an Overcoming Bias topic:

"How and why the current reigning philosophical hegemon (reductionistic materialism) is obviously correct [...], while the reigning philosophical viewpoints of all past societies and civilizations are obviously suspect—"

I remember this, because I looked at the request and deemed it legitimate, but I knew I couldn't do that topic until I'd started on the Mind Projection Fallacy sequence, which wouldn't be for a while...

But now it's time to begin addressing this question.  And while I haven't yet come to the "materialism" issue, we can now start on "reductionism".

First, let it be said that I do indeed hold that "reductionism", according to the meaning I will give for that word, is obviously correct; and to perdition with any past civilizations that disagreed.

This seems like a strong statement, at least the first part of it.  General Relativity seems well-supported, yet who knows but that some future physicist may overturn it?

On the other hand, we are never going back to Newtonian mechanics.  The ratchet of science turns, but it does not turn in reverse.  There are cases in scientific history where a theory suffered a wound or two, and then bounced back; but when a theory takes as many arrows through the chest as Newtonian mechanics, it stays dead.

"To hell with what past civilizations thought" seems safe enough, when past civilizations believed in something that has since been falsified and consigned to the trash heap of history.

And reductionism is not so much a positive hypothesis, as the absence of belief—in particular, disbelief in a form of the Mind Projection Fallacy.

Qualitatively Confused

26 Eliezer_Yudkowsky 14 March 2008 05:01PM

Followup to: Probability is in the Mind, The Quotation is not the Referent

I suggest that a primary cause of confusion about the distinction between "belief", "truth", and "reality" is qualitative thinking about beliefs.

Consider the archetypal postmodernist attempt to be clever:

"The Sun goes around the Earth" is true for Hunga Huntergatherer, but "The Earth goes around the Sun" is true for Amara Astronomer!  Different societies have different truths!

No, different societies have different beliefs.  Belief is of a different type than truth; it's like comparing apples and probabilities.

Ah, but there's no difference between the way you use the word 'belief' and the way you use the word 'truth'!  Whether you say, "I believe 'snow is white'", or you say, "'Snow is white' is true", you're expressing exactly the same opinion.

No, these sentences mean quite different things, which is how I can conceive of the possibility that my beliefs are false.

Oh, you claim to conceive it, but you never believe it.  As Wittgenstein said, "If there were a verb meaning 'to believe falsely', it would not have any significant first person, present indicative."

And that's what I mean by putting my finger on qualitative reasoning as the source of the problem.  The dichotomy between belief and disbelief, being binary, is confusingly similar to the dichotomy between truth and untruth.

The Quotation is not the Referent

20 Eliezer_Yudkowsky 13 March 2008 12:53AM

Followup to: The Mind Projection Fallacy, Probability is in the Mind

In classical logic, the operational definition of identity is that whenever 'A=B' is a theorem, you can substitute 'A' for 'B' in any theorem where B appears.  For example, if (2 + 2) = 4 is a theorem, and ((2 + 2) + 3) = 7 is a theorem, then (4 + 3) = 7 is a theorem.

This leads to a problem which is usually phrased in the following terms:  The morning star and the evening star happen to be the same object, the planet Venus.  Suppose John knows that the morning star and evening star are the same object.  Mary, however, believes that the morning star is the god Lucifer, but the evening star is the god Venus.  John believes Mary believes that the morning star is Lucifer. Must John therefore (by substitution) believe that Mary believes that the evening star is Lucifer?

Or here's an even simpler version of the problem.  2 + 2 = 4 is true; it is a theorem that (((2 + 2) = 4) = TRUE).  Fermat's Last Theorem is also true.  So:  I believe 2 + 2 = 4 => I believe TRUE => I believe Fermat's Last Theorem.

Yes, I know this seems obviously wrong.  But imagine someone writing a logical reasoning program using the principle "equal terms can always be substituted", and having exactly this happen.  Now imagine them writing a paper about how to prevent it.  Now imagine someone else disagreeing with their solution.  The argument is still going on.

P'rsnally, I would say that John is committing a type error, like trying to subtract 5 grams from 20 meters.  "The morning star" is not the same type as the morning star, let alone the same thing.  Beliefs are not planets.
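The type distinction can be made concrete with a toy model (the two dictionaries below are a hypothetical illustration, not anyone's actual belief representation): denotation maps names onto the referent, while beliefs attach to the quotations themselves.

```python
# Both names denote the same object: the planet Venus.
venus = object()
denotation = {"the morning star": venus, "the evening star": venus}

# But Mary's beliefs are keyed by the quotation (the name), not the referent.
marys_beliefs = {"the morning star": "the god Lucifer",
                 "the evening star": "the god Venus"}

# Substitution of equals is valid at the level of referents:
assert denotation["the morning star"] is denotation["the evening star"]

# But substituting inside a belief context is a type error -- the
# two quotations carry different belief-contents:
assert marys_beliefs["the morning star"] != marys_beliefs["the evening star"]
```

In this sketch, swapping "the evening star" for "the morning star" is licensed in the first dictionary, where only the referent matters, and not in the second, where the key is the representation itself.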

Probability is in the Mind

60 Eliezer_Yudkowsky 12 March 2008 04:08AM

Followup to: The Mind Projection Fallacy

Yesterday I spoke of the Mind Projection Fallacy, giving the example of the alien monster who carries off a girl in a torn dress for intended ravishing—a mistake which I imputed to the artist's tendency to think that a woman's sexiness is a property of the woman herself, woman.sexiness, rather than something that exists in the mind of an observer, and probably wouldn't exist in an alien mind.

The term "Mind Projection Fallacy" was coined by the late great Bayesian Master, E. T. Jaynes, as part of his long and hard-fought battle against the accursèd frequentists.  Jaynes was of the opinion that probabilities were in the mind, not in the environment—that probabilities express ignorance, states of partial information; and if I am ignorant of a phenomenon, that is a fact about my state of mind, not a fact about the phenomenon.

I cannot do justice to this ancient war in a few words—but the classic example of the argument runs thus:

You have a coin.
The coin is biased.
You don't know which way it's biased or how much it's biased.  Someone just told you, "The coin is biased" and that's all they said.
This is all the information you have, and the only information you have.

You draw the coin forth, flip it, and slap it down.

Now—before you remove your hand and look at the result—are you willing to say that you assign a 0.5 probability to the coin having come up heads?
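The Bayesian answer can be pulled apart from any fact about the coin with a short simulation (the uniform prior here is one arbitrary way to model total, symmetric ignorance of the bias, not a claim about the coin):

```python
import random

random.seed(0)

N = 100_000
heads = 0
for _ in range(N):
    b = random.random()           # the coin IS biased -- you just don't know
                                  # how; your ignorance is symmetric in b.
    heads += random.random() < b  # one flip of a coin with that unknown bias
print(heads / N)                  # ≈ 0.5
```

The frequency of heads comes out at about one half not because of any physical symmetry in the coin (every simulated coin is biased), but because of the symmetry of your ignorance about which way it is biased.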

Mind Projection Fallacy

35 Eliezer_Yudkowsky 11 March 2008 12:29AM

Followup to: How an Algorithm Feels From Inside

In the dawn days of science fiction, alien invaders would occasionally kidnap a girl in a torn dress and carry her off for intended ravishing, as lovingly depicted on many ancient magazine covers.  Oddly enough, the aliens never go after men in torn shirts.

Would a non-humanoid alien, with a different evolutionary history and evolutionary psychology, sexually desire a human female?  It seems rather unlikely.  To put it mildly.

People don't make mistakes like that by deliberately reasoning:  "All possible minds are likely to be wired pretty much the same way, therefore a bug-eyed monster will find human females attractive."  Probably the artist did not even think to ask whether an alien perceives human females as attractive.  Instead, a human female in a torn dress is sexy—inherently so, as an intrinsic property.

They who went astray did not think about the alien's evolutionary history; they focused on the woman's torn dress.  If the dress were not torn, the woman would be less sexy; the alien monster doesn't enter into it.

Righting a Wrong Question

68 Eliezer_Yudkowsky 09 March 2008 01:00PM

Followup to: How an Algorithm Feels from the Inside, Dissolving the Question, Wrong Questions

When you are faced with an unanswerable question—a question to which it seems impossible to even imagine an answer—there is a simple trick which can turn the question solvable.

Compare:

  • "Why do I have free will?"
  • "Why do I think I have free will?"

The nice thing about the second question is that it is guaranteed to have a real answer, whether or not there is any such thing as free will.  Asking "Why do I have free will?" or "Do I have free will?" sends you off thinking about tiny details of the laws of physics, so distant from the macroscopic level that you couldn't begin to see them with the naked eye.  And you're asking "Why is X the case?" where X may not be coherent, let alone the case.

"Why do I think I have free will?", in contrast, is guaranteed answerable.  You do, in fact, believe you have free will.  This belief seems far more solid and graspable than the ephemerality of free will.  And there is, in fact, some nice solid chain of cognitive cause and effect leading up to this belief.

If you've already outgrown free will, choose one of these substitutes:

  • "Why does time move forward instead of backward?" versus "Why do I think time moves forward instead of backward?"
  • "Why was I born as myself rather than someone else?" versus "Why do I think I was born as myself rather than someone else?"
  • "Why am I conscious?" versus "Why do I think I'm conscious?"
  • "Why does reality exist?" versus "Why do I think reality exists?"

Semantic Stopsigns

53 Eliezer_Yudkowsky 24 August 2007 07:29PM

And the child asked:

Q:  Where did this rock come from?
A:  I chipped it off the big boulder, at the center of the village.
Q:  Where did the boulder come from?
A:  It probably rolled off the huge mountain that towers over our village.
Q:  Where did the mountain come from?
A:  The same place as all stone: it is the bones of Ymir, the primordial giant.
Q:  Where did the primordial giant, Ymir, come from?
A:  From the great abyss, Ginnungagap.
Q:  Where did the great abyss, Ginnungagap, come from?
A:  Never ask that question.

Consider the seeming paradox of the First Cause.  Science has traced events back to the Big Bang, but why did the Big Bang happen?  It's all well and good to say that the zero of time begins at the Big Bang—that there is nothing before the Big Bang in the ordinary flow of minutes and hours.  But saying this presumes our physical law, which itself appears highly structured; it calls out for explanation.  Where did the physical laws come from?  You could say that we're all a computer simulation, but then the computer simulation is running on some other world's laws of physics—where did those laws of physics come from?

At this point, some people say, "God!"

The Modesty Argument

26 Eliezer_Yudkowsky 10 December 2006 09:42PM

The Modesty Argument states that when two or more human beings have common knowledge that they disagree about a question of simple fact, they should each adjust their probability estimates in the direction of the others'.  (For example, they might adopt the common mean of their probability distributions.  If we use the logarithmic scoring rule, then the score of the average of a set of probability distributions is better than the average of the scores of the individual distributions, by Jensen's inequality.)

Put more simply:  When you disagree with someone, even after talking over your reasons, the Modesty Argument claims that you should each adjust your probability estimates toward the other's, and keep doing this until you agree.  The Modesty Argument is inspired by Aumann's Agreement Theorem, a very famous and oft-generalized result which shows that genuine Bayesians literally cannot agree to disagree; if genuine Bayesians have common knowledge of their individual probability estimates, they must all have the same probability estimate.  ("Common knowledge" means that I know you disagree, you know I know you disagree, etc.)
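The parenthetical claim about the logarithmic scoring rule is just Jensen's inequality applied to the concave logarithm. A two-forecaster sketch (the probabilities 0.9 and 0.2 are arbitrary illustrative values):

```python
import math

# Two people's probabilities for the outcome that actually occurred.
p1, p2 = 0.9, 0.2
pooled = (p1 + p2) / 2

score_of_pooled = math.log(pooled)                     # score of the averaged forecast
average_of_scores = (math.log(p1) + math.log(p2)) / 2  # average of the two scores

# log is concave, so by Jensen's inequality pooling never scores worse:
assert score_of_pooled >= average_of_scores
print(round(score_of_pooled, 3), round(average_of_scores, 3))
```

The confident-but-wrong forecast drags the average of the scores down sharply (log 0.2 is very negative), while the pooled forecast hedges and takes a much smaller penalty; the inequality holds for any pair of probabilities and any actual outcome.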

I've always been suspicious of the Modesty Argument.  It's been a long-running debate between myself and Robin Hanson.
