timtyler comments on Taking Ideas Seriously - Less Wrong

51 Post author: Will_Newsome 13 August 2010 04:50PM


Comment author: timtyler 25 August 2010 05:13:37PM *  3 points [-]

Instead of trying to get a probability that something is true, you should look for criticisms.

If you were asked to bet on whether it was true or not, then you should assign a probability.

Scientists often do something like that when deciding how to allocate their research funds.

Comment author: [deleted] 25 August 2010 05:26:46PM 0 points [-]

But then we have to develop a quantitative formalism for both beliefs and utilities. Is it really necessary to attack both problems at once?

Comment author: [deleted] 25 August 2010 05:50:57PM 2 points [-]

Human beings don't actually seem to have utility functions; all they really have are "preferences", i.e. a method for choosing between alternatives. But von Neumann and Morgenstern showed that, under some conditions, this is equivalent to having a utility function.

Now Scurfield is saying that human beings, even smart ones like scientists, don't have prior probability distributions; all they really have is a database of claims and criticisms of those claims. Is there any result analogous to von Neumann-Morgenstern that says this is the same thing as having a prior, under some conditions?

Comment author: Perplexed 26 August 2010 12:32:40AM 4 points [-]

Yes. The question has been addressed repeatedly by a variety of people. John Maynard Keynes may have been the first. Notable formulations since his include those of de Finetti, Savage, and Jeffrey (in his online book).

Discovering subjective probabilities is usually done in conjunction with discovering utilities by revealed preferences, because much of the machinery (choices between alternatives, lotteries) is shared between the two problems. People like Jaynes who want a pure epistemology uncontaminated by crass utility considerations have to demand that their "test subjects" adhere to some fairly hard-to-justify consistency rules. But people like de Finetti don't impose arbitrary consistency rules; instead, they prove that inconsistent probability assignments lose money to clever gamblers who construct "Dutch books".
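A minimal numeric sketch of that Dutch-book argument (the beliefs and stakes below are purely hypothetical):

```python
# Hypothetical incoherent beliefs: the agent prices P(rain) and
# P(no rain) so that they sum to more than 1.
p_rain, p_no_rain = 0.6, 0.6

# The agent regards a $1 ticket on an event as fairly priced at
# P(event), so a bookie sells the agent one ticket on each outcome.
cost = p_rain + p_no_rain   # $1.20 paid up front
payout = 1.0                # exactly one ticket pays, whichever way it goes

guaranteed_loss = cost - payout
assert guaranteed_loss > 0  # the agent loses money no matter what happens
```

The same construction works in reverse (the bookie buys the tickets) when the agent's probabilities sum to less than 1, so only coherent assignments are safe.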

Comment author: Cyan 26 August 2010 01:30:34AM 0 points [-]

some fairly hard-to-justify consistency rules

I'd be interested in reading more about your views on this (unless you're referring to Halpern's papers on Cox's theorem).

Comment author: Perplexed 26 August 2010 01:57:55AM 1 point [-]

I'm not even familiar with Halpern's work. The only serious criticism I have seen of the usual consistency rules for subjective probabilities dealt with the "sure-thing principle". I didn't find it particularly convincing.

No, I have no trouble justifying a mathematical argument in favor of this kind of consistency. But not everyone else is all that convinced by mathematics. Their attention can be grabbed, however, by the danger of being taken to the cleaners by professional bookies armed with Dutch books.

One of these days, I will get around to producing a posting on probability, developing it from what I call the "surprisal" of a proposition - the amount, on a scale from zero to positive infinity, by which you would be surprised upon learning that the proposition is true.

  • Prob(X) = 2^(-Surp(X))
  • Surp(coin flip yields heads) = 1 bit
  • Surp(A) + Surp(B|A) = Surp(A&B)

That last formula strikes me as particularly easy to justify (surprisals are additive). Given that and the first formula, you can easily derive Bayes' law. The middle formula simply fixes the scale for surprisals. I suppose we also need the rule that Surp(True) = 0.
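The rules above can be checked numerically; a small sketch (the probabilities used are just illustrative):

```python
import math

def surp(p):
    """Surprisal in bits, defined by Prob(X) = 2^(-Surp(X))."""
    return -math.log2(p)

p_a = 0.5           # e.g. a fair coin flip
p_b_given_a = 0.25
p_ab = p_a * p_b_given_a

# Surp(coin flip yields heads) = 1 bit
assert surp(0.5) == 1.0

# Additivity: Surp(A) + Surp(B|A) = Surp(A&B)
assert math.isclose(surp(p_a) + surp(p_b_given_a), surp(p_ab))

# Surp(True) = 0: a certain proposition carries no surprise
assert surp(1.0) == 0.0

# Rearranging additivity gives Surp(B|A) = Surp(A&B) - Surp(A),
# which is just P(B|A) = P(A&B) / P(A) in surprisal form.
assert math.isclose(2 ** -(surp(p_ab) - surp(p_a)), p_b_given_a)
```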

Comment author: Sniffnoy 26 August 2010 03:33:57AM 0 points [-]

developing it from what I call the "surprisal" of a proposition

Actually "Surprisal" is a pretty standard term, I think.

Comment author: [deleted] 26 August 2010 02:02:41AM 0 points [-]

surprisal

Yudkowsky suggests calling it "absurdity" here

Comment author: Perplexed 26 August 2010 02:29:00AM 1 point [-]

Cool! Saves me the trouble of writing that posting. :)

Absurdity is probably a better name for the concept. Except that it sounds objective, whereas amount of surprise more obviously depends on who is being surprised.

Comment author: [deleted] 26 August 2010 12:43:15AM 0 points [-]

Wild. Is there an exposition of subjective expected utility better than Wikipedia's?

Comment author: Perplexed 26 August 2010 12:47:57AM 1 point [-]

Jeffrey's book, which I already linked, or any good text on game theory. Myerson, for example, or Luce and Raiffa.

Comment author: timtyler 25 August 2010 05:32:41PM 0 points [-]

Agents can reasonably be expected to quantify both beliefs and utilities. How that ability is developed is up to the developer.

Comment author: [deleted] 25 August 2010 05:33:55PM 0 points [-]

People are agents, and they are very bad at quantifying their beliefs and utilities.