Percontations: The Nature of Probability

Source.

Background on Gelman.


What does Eliezer mean by "ideal Bayesian computation reacting to a stream of data (given some prior)"? I think he means that if a person (or AI) invents a new model, he prefers to think of it as having had some prior all along (and as having had various posteriors along the way); that is, ideally there are no new models, and the fact that no actual updating was done before the hypothesis was invented (or promoted to explicit consideration) is merely a matter of approximating his ideal. Did Gelman not understand Eliezer to mean that, or did he disagree that it was a useful perspective?

[Edit: in the final two minutes (@56m), it seems Eliezer made the above explicit, and Gelman responded as if he understood it.]
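If that reading is right, the key fact is that a Bayesian posterior depends only on the prior and the accumulated likelihoods, not on when the updates were computed. A minimal sketch in Python of that point, assuming a discrete hypothesis space (the Bernoulli coin example and all the numbers are mine, not from the diavlog): a hypothesis "invented" after the data arrive can be assigned exactly the posterior it would have had if tracked from the start, just by replaying the stream.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.binomial(1, 0.7, size=50)  # 0/1 outcomes from a p=0.7 process

def log_lik(p, xs):
    """Log-likelihood of Bernoulli(p) for a 0/1 sequence."""
    k = xs.sum()
    return k * np.log(p) + (len(xs) - k) * np.log(1 - p)

hypotheses = np.array([0.3, 0.5, 0.7])
prior = np.array([0.25, 0.50, 0.25])  # the "ideal" prior covers p=0.7 all along

# Batch update: the prior revised against the whole stream at once,
# as if p=0.7 were only "invented" after the data came in.
log_post = np.log(prior) + np.array([log_lik(p, data) for p in hypotheses])

# Sequential update: one datum at a time, as if every hypothesis had been
# tracked from the start. Identical, because likelihoods simply multiply.
log_post_seq = np.log(prior).copy()
for x in data:
    log_post_seq += np.array([log_lik(p, np.array([x])) for p in hypotheses])

post = np.exp(log_post - log_post.max())
post /= post.sum()
post_seq = np.exp(log_post_seq - log_post_seq.max())
post_seq /= post_seq.sum()
assert np.allclose(post, post_seq)  # deferred updating changes nothing
print(dict(zip(hypotheses, post.round(4))))
```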

I was delighted by the method of Robin Hanson's that Eliezer reported: to find the correlation between two variables (or even just the distribution of a single variable in a population), look only at published papers in which the variables of interest (to him) are merely controls for the authors' actual question. In Robin's view, politically significant variables, when they are the focus of a paper, are distorted by dishonesty or publication bias (in some politically correct direction); when they are mere controls, the authors have no stake in how they come out.

It sounds as though it would be more work to find such papers.

One interesting thing in the discussion was the Netflix challenge; unfortunately, they didn't get much into it. Would a simpler method be able to solve it more efficiently?

I saw an interesting argument that a weighted average of the predictions of each system should be expected to win, based on a property of the error metric used: root-mean-square error is convex, so the blend's error is at most the weighted average of the individual systems' errors.
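The argument goes through because prediction errors combine linearly: the blend's error vector is the weighted average of the members' error vectors, so by the triangle inequality for the L2 norm (equivalently, Jensen's inequality applied to the convex squared-error loss), the blend's RMSE is at most the weighted average of the members' RMSEs. A toy numerical check in Python (synthetic data and weights of my own invention, nothing from the actual Netflix entries):

```python
import numpy as np

rng = np.random.default_rng(42)
truth = rng.normal(size=1000)

# Three imperfect "recommender" systems: truth plus independent noise.
preds = [truth + rng.normal(scale=s, size=truth.size) for s in (0.8, 1.0, 1.2)]
weights = np.array([0.5, 0.3, 0.2])  # any nonnegative weights summing to 1

def rmse(p):
    """Root-mean-square error of predictions p against truth."""
    return np.sqrt(np.mean((p - truth) ** 2))

# Errors combine linearly, so the blend's error is the weighted
# average of the members' errors.
blend = sum(w * p for w, p in zip(weights, preds))

member_rmses = np.array([rmse(p) for p in preds])
print("member RMSEs:", member_rmses.round(3))
print("weighted avg of member RMSEs:", np.dot(weights, member_rmses).round(3))
print("blend RMSE:", rmse(blend).round(3))  # never exceeds the weighted average
```

With independent errors the blend typically does much better than the bound, since the noise partially cancels; the convexity argument only guarantees it can never do worse than the weighted average of its members.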