Are calibration and rational decisions mutually exclusive? (Part one)

3 Cyan 23 July 2009 05:15AM

I'm planning a two-part sequence with the aim of throwing open the question in the title to the LW commentariat. In this part I’ll briefly go over the concept of calibration of probability distributions and point out a discrepancy between calibration and Bayesian updating.

It's a tenet of rationality that we should seek to be well-calibrated. That is, suppose that we are called on to give interval estimates for a large number of quantities; we give each interval an associated epistemic probability. We declare ourselves well-calibrated if the relative frequency with which the quantities fall within our specified intervals matches our claimed probability. (The Technical Explanation of Technical Explanations discusses calibration in more detail, although it mostly discusses discrete estimands, while here I'm thinking about continuous estimands.)

Frequentists also produce interval estimates, at least when "random" data is available. A frequentist "confidence interval" is really a function from the data and a user-specified confidence level (a number from 0 to 1) to an interval. The confidence interval procedure is "valid" if, in a hypothetical infinite sequence of replications of the experiment, the relative frequency with which the realized intervals contain the estimand equals the confidence level. (Less strictly, we may require "greater than or equal" rather than "equal".) The similarity between valid confidence coverage and well-calibrated epistemic probability intervals is evident.
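To make the coverage criterion concrete, here is a minimal simulation sketch (the parameter values are arbitrary illustrations, not anything from the discussion above) of the textbook z-interval for a normal mean with known variance. Over many replications, the fraction of realized intervals that contain the true mean should approximate the claimed confidence level:

```python
import numpy as np

# Illustrative parameter values chosen arbitrarily.
rng = np.random.default_rng(0)
true_mu, sigma, n = 3.0, 2.0, 25
level = 0.95
z = 1.959963984540054  # 97.5th percentile of the standard normal

reps = 10_000
hits = 0
for _ in range(reps):
    data = rng.normal(true_mu, sigma, size=n)
    xbar = data.mean()
    half_width = z * sigma / np.sqrt(n)
    # Does this realized interval contain the estimand?
    if xbar - half_width <= true_mu <= xbar + half_width:
        hits += 1

print(f"empirical coverage: {hits / reps:.3f}  (nominal: {level})")
```

The empirical coverage converges to the nominal level as the number of replications grows; that agreement is exactly what "valid" means here.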

This similarity suggests an approach for specifying non-informative prior distributions: require that such priors yield posterior intervals that are also valid confidence intervals in the frequentist sense. This "matching prior" program does not succeed in full generality. There are a few special cases of data distributions for which a matching prior exists, but by and large, posterior intervals can at best achieve only asymptotically valid confidence coverage. Furthermore, according to my understanding of the material, if your model of the data-generating process contains more than one scalar parameter, you have to pick one "interest parameter" and be satisfied with good confidence coverage for the marginal posterior intervals for that parameter alone. For approximate matching priors with the highest order of accuracy, a different choice of interest parameter usually implies a different prior.
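One well-known illustration of approximate (rather than exact) matching is the Jeffreys prior for a binomial proportion: the resulting posterior intervals have asymptotically correct coverage, but for finite samples the coverage oscillates around the nominal level rather than hitting it exactly. A sketch (using SciPy; the specific n and p are arbitrary choices, not from the text) that computes the exact frequentist coverage by summing the binomial pmf over all outcomes:

```python
from scipy.stats import beta, binom

# Illustrative values; the qualitative point holds for other n and p.
n, p, level = 50, 0.3, 0.95
tail = (1 - level) / 2

# Under the Jeffreys prior Beta(1/2, 1/2), observing x successes gives
# the posterior Beta(x + 1/2, n - x + 1/2). Exact coverage at the true p
# is the binomial probability of all outcomes x whose central posterior
# interval contains p.
coverage = 0.0
for x in range(n + 1):
    lo = beta.ppf(tail, x + 0.5, n - x + 0.5)
    hi = beta.ppf(1 - tail, x + 0.5, n - x + 0.5)
    if lo <= p <= hi:
        coverage += binom.pmf(x, n, p)

print(f"exact coverage at p={p}: {coverage:.4f}  (nominal: {level})")
```

Running this for a grid of p values shows coverage wobbling near, but not exactly at, 0.95, which is the "asymptotically valid but not exact" behavior described above.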

The upshot is that we have good reason to think that Bayesian posterior intervals will not be perfectly calibrated in general. I have good justifications, I think, for using the Bayesian updating procedure, even if it means the resulting posterior intervals are not as well-calibrated as frequentist confidence intervals. (And I mean good confidence intervals, not the obviously pathological ones.) But my justifications are grounded in an epistemic view of probability, and no committed frequentist would find them as compelling as I do. However, there is an argument for Bayesian posteriors over confidence intervals that even a frequentist would have to credit. That will be the focus of the second part.

Marketing rationalism

10 PhilGoetz 12 April 2009 09:41PM

Suppose you're a protestant, and you want to convince other people to do what the Bible says to do.  Would you persuade them by showing them that the Bible says that they should?

Now suppose you're a rationalist, and you want to convince other people to be rational.  Would you persuade them with a rational argument?

If not, how?

ADDED:  I'm not talking about persuading others who already accept reason as final arbiter to adopt Bayesian principles, or anything like that.  I mean persuading Joe on the street who does whatever feels good, and feels pretty good about that.  Or a doctor of philosophy who believes that truth is relative and reason is a social construct.  Or a Christian who believes that the Bible is God's Word, and things that contradict the Bible must be false.

Christians don't place a whole segment of the population off-limits and say, "These people are unreachable; their paradigms are too different."  They go after everyone.  There is no class of people with whom they are unsuccessful.

Saying that we have to play by a set of self-imposed rules in the competition for the minds of humanity, while our competitors don't, means we will lose.  And isn't rationality about winning?

ADDED:  People are missing the point that the situation is symmetrical for religious evangelists.  For them to step outside of their worldview, and use reason to gain converts, is as epistemically dangerous for them, as it is for us to gain converts using something other than reason.  Contemporary Christians consider themselves on good terms with reason; but if you look back in history, you'll find that many of the famous and influential Christian theologians (starting with Paul) made explicit warnings against the temptation of reason.  The proceedings from Galileo's trial contain some choice bits on the relation between reason and faith.

Using all sorts of persuasive techniques that are not grounded in religious truth, and hence are epistemically repulsive to them and corrosive to their belief system, has proven a winning strategy for all religions.  It's a compromise; but these compromises did not weaken those religions.  They made them stronger.

Spay or Neuter Your Irrationalities

2 Alicorn 10 April 2009 08:08PM

No human person has, so far as I am aware, managed to eradicate all irrationalities from their thinking.  They are unavoidable, and this is particularly distressing when the irrationalities are lurking in your brain like rats in the walls and you don't know what they are.  Of course you don't know what they are - they are irrationalities, and you are a rationalist, so if you had identified them, they would be dying (quickly or slowly, but dying).  It's only natural for someone committed to rationality to want to indiscriminately exterminate the threats to the unattainable goal.

But are they all worth getting rid of?

It is my opinion that they are not: some irrationalities are small and cute and neutered, and can be confined and kept where you can see them, like pet gerbils instead of rats in the walls.

I'll give you an example: I use iTunes for my music organization and listening.  iTunes automatically records the number of times I have listened to each song and displays it.  Within a given playlist, I irrationally believe that all of these numbers have to match: if I have listened to the theme from The Phantom of the Opera exactly fifty-two times, I have to also have listened to "The Music of the Night" exactly fifty-two times, no matter how much I want to listen to the theme on repeat all afternoon.

Does this make any sense?  No, of course not, but it isn't worth my time to get rid of it.  It is small - it affects only a tiny corner of my life, and if it starts to get in the way of my musical preferences, I can cheat it by resetting play counts or fast-forwarding through songs (like I could get around the chore of feeding a gerbil with an automatic food dispenser).  It is "cute" - I can use it as a conversation starter and people generally find it a mildly entertaining quirk, not evidence that I need psychiatric help.  I have it metaphorically neutered - since I make no effort to suppress it, I'm able to recognize the various emotional reactions that satisfying or frustrating this irrational preference creates, and I would notice them if they cropped up anywhere else, where I would deal with them appropriately.  I also don't encourage it to memetically spread to others.  I keep it where I can see it - I make note of when I take actions to satisfy my irrational preference, and acknowledge in so doing that it's my reason and my reason doesn't make much sense.

In short, I treat it like a pet.  If it started being more trouble than it would be to root it out of my brain, I'd go through the necessary desensitization, just as I would get rid of a pet gerbil that bit me or kept me up at night even if this meant two hours each way on the bus to the Humane Society.

Rational Defense of Irrational Beliefs

2 cleonid 12 March 2009 06:48PM

“Everyone complains of his memory, but nobody of his judgment.” This maxim of La Rochefoucauld rings as true today as it did back in the seventeenth century. People tend to overestimate their reasoning abilities even when this overconfidence has a direct monetary cost. For instance, multiple studies have shown that investors who are more confident of their ability to beat the market receive lower returns on their investments. This overconfidence penalty applies even to the supposed experts, such as fund managers.

So what should an expert rationalist do to avoid this overconfidence trap? The apparent answer is that we should rely less on our own reasoning and more on the “wisdom of the crowds”. To a certain extent this is already achieved by social pressure to conform, which acts as an internal policeman in our minds. Yet those of us who deem ourselves not very susceptible to such pressures (overconfidence, here we go again) might need to shift our views even further.

I invite you now to an experiment in how this would work in practice. Quite a few recent posts and comments have spoken with derision about religion and supernatural phenomena in general. Did the authors of these comments fully consider the fact that the existence of God is firmly believed by the majority?  Or that this belief is not restricted to the uneducated but is shared by many famous scientists, including Newton and Einstein?  Would they be willing to shift their views to accommodate the chance that their own reasoning powers are insufficient to reach the right answer?

Let the stone throwing begin.

 

On Things that are Awesome

23 Eliezer_Yudkowsky 24 March 2009 03:24AM

This post, which touched on the allowedness of admiration, started me thinking about the nature of things that are awesome.

The first thing one does in such a situation is generate examples.  And my brain, asked to enumerate things that are awesome, said:  "Douglas Hofstadter, E. T. Jaynes, Greg Egan..."

Upon that initial output of my brain, I had many other thoughts:

(1)  My brain was able to list more than one thing that is awesome.  I am not going to dwell on this, because I think it needless to go around saying, "Douglas Hofstadter is awesome, but E. T. Jaynes is awesome too," as though to deliberately moderate or subtract from the admiration of Hofstadter.  The enjoyment of things that are awesome is an important part of life, and I don't think a healthy mind should have to hold back.  But the more things you know that are awesome, the more there is to enjoy—this doesn't mean you should artificially inflate your estimations of awesomeness, but it does mean that if you can think of only one awesome thing, you must be missing out on a lot of life.  And some awesome things, but not all, are compatible enough with yourself that you can draw upon the awesome—Hofstadter and Jaynes are both like this for me, but Greg Egan is not.  So even leaving aside certain mental health risks from having only one awesome thing—it is both enjoyable, and strengthening, to know of many things that are awesome.

(2)  I can think of many places where I disagree with statements emitted by Douglas Hofstadter and Greg Egan, and even one or two places where I would want to pencil in a correction to Jaynes (his interpretation of quantum mechanics being the most obvious).  In fact, when my brain says "Greg Egan" it is really referring to two novels, Permutation City and Quarantine, which overshadow all his other works in my book.  And when my brain says "Hofstadter" it is referring to Gödel, Escher, Bach with a small side order of some essays in Metamagical Themas.  For most people their truly awesome work is usually only a slice of their total output, from some particular years (I find that scary as hell, by the way).
