Tyrrell_McAllister comments on What Bayesianism taught me - Less Wrong

Post author: Tyrrell_McAllister | 12 August 2013 06:59AM | 62 points




Comment author: Tyrrell_McAllister | 15 August 2013 09:05:17PM | 0 points

I agree with your points about the value of information. Indeed, as Vaniver said, Bayesianism (i.e., "qualitative Bayes"), together with the idea of expected-utility maximization, makes the importance of VoI especially salient and easy to understand. So I'm a little puzzled by your conclusion that

This is sort of like going on about what Maxwell's equations taught you about painting.

... because your argument leading up to this conclusion seems to me to be steeped in Bayesian thinking through-and-through :). E.g., this:

The expected-utility sums for things such as expenditure of resources have a term for resources kept for future uses, which may be more conditional on evidence, and actions that are less conditioned on evidence than usual ought to lose out to the bulk of possible ways one may act in the future (edit: the ways which you can't explicitly enumerate).

Those are just some of the thresholds that an optimally programmed intelligence (on a physically plausible computer) would apply.
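The value-of-information idea invoked above can be made concrete with a minimal sketch. All the numbers, actions, and states below are illustrative assumptions, not anything from this thread; the point is only the mechanics: VoI is the gap between the best you can do acting on your prior and the (prior-weighted) best you could do if you knew the state.

```python
# Hypothetical two-state, two-action decision problem (all numbers assumed).
p_good = 0.6  # prior probability that the state is "good"

# utility[action][state]
utility = {
    "invest": {"good": 100, "bad": -80},
    "hold":   {"good": 10,  "bad": 10},
}

def expected_utility(action, p):
    """Expected utility of an action given probability p of the good state."""
    return p * utility[action]["good"] + (1 - p) * utility[action]["bad"]

# Best expected utility when acting on the prior alone.
eu_prior = max(expected_utility(a, p_good) for a in utility)

# Expected utility with perfect information: in each state pick the best
# action, then weight each state by its prior probability.
eu_perfect = (p_good * max(utility[a]["good"] for a in utility)
              + (1 - p_good) * max(utility[a]["bad"] for a in utility))

voi = eu_perfect - eu_prior  # value of (perfect) information
print(eu_prior, eu_perfect, voi)
```

With these assumed numbers, acting on the prior yields 28, acting with perfect information yields 64, so perfect information is worth 36 utility units; spending less than that on finding out the state is a good trade.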

Comment author: private_messaging | 15 August 2013 11:22:59PM | 2 points

I'd describe Bayesianism as a belief in the powers of qualitative Bayes.

E.g., you seem to actually believe that taking low-grade evidence into account, and only qualitatively at that, is going to make you form more correct beliefs. It won't. Myths about Zeus are weak evidence for a great many things, a lot of which would themselves be evidence against Zeus.

The informal algebra of "small", "a little", "weak", "strong", "a lot" just doesn't work for the equations involved, and even if you miraculously used actual real numbers behind those labels, you'd still have enormously huge sums over all the things implied by the existence of the myths.
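The point that labels like "weak" must cash out as actual numbers can be sketched in a few lines. The priors and likelihood ratios below are pure assumptions for illustration; the mechanics are just posterior odds = prior odds × likelihood ratio, and the sketch shows that two pieces of evidence both naturally called "weak" can move a posterior by very different amounts.

```python
import math

def update(prior_prob, likelihood_ratio):
    """One Bayes update in log-odds form:
    log O_post = log O_prior + log LR, then convert back to probability."""
    log_odds = math.log(prior_prob / (1 - prior_prob)) + math.log(likelihood_ratio)
    return 1 / (1 + math.exp(-log_odds))

prior = 0.01  # illustrative prior, not a claim about Zeus

# "Weak" evidence with likelihood ratio 1.05: myths are barely more
# likely under the hypothesis than under its negation.
weak = update(prior, 1.05)

# "Weak" evidence with likelihood ratio 3: same qualitative label,
# very different quantitative effect.
less_weak = update(prior, 3.0)

print(weak, less_weak)
```

The first update moves the probability from 0.010 to roughly 0.0105; the second moves it to roughly 0.029. A qualitative algebra that assigns both the word "weak" cannot distinguish these, which is the failure mode described above.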

... because your argument leading up to this conclusion seems to me to be steeped in Bayesian thinking through-and-through :).

Firstly, I'm trying to deal only with things that I am very confident about (computational difficulties), so the inferences are ordinary logic; and secondly, I'm trying to persuade you, so I express that in your ideology.

edit: To summarize. You are accustomed to processing evidence1, and to saying that many things are not evidence1. Bayes taught you that everything is evidence2. You started treating everything as evidence1 because it's the same word. Whereas evidence1 is evidence that is strong enough and unequivocal enough that a number of quite rough but absolutely essential approximations work correctly (so it can be more or less usefully processed), evidence2 is weak and nearly equivocal, all things considered; those approximations will just plain not work on it, while exact solutions are too expensive and very complicated even for simple cases such as my Bob example above.
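One of the "rough but essential approximations" alluded to above is treating pieces of evidence as independent and multiplying their likelihood ratios. A minimal sketch (all numbers assumed for illustration) of how this breaks for weak, correlated evidence: ten retellings of the same myth share almost all their content, so counting each as an independent update inflates a weak signal into a strong-looking one.

```python
def posterior(prior, likelihood_ratios):
    """Chain naive Bayes updates, assuming each likelihood ratio
    comes from an independent piece of evidence."""
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

prior = 0.5  # illustrative prior

# Ten retellings of one myth are near-duplicates: the correct treatment
# counts the shared content roughly once...
correct = posterior(prior, [1.2])

# ...while naively treating them as independent applies the weak ratio
# ten times, i.e. multiplies the odds by 1.2**10 ~ 6.2.
naive = posterior(prior, [1.2] * 10)

print(correct, naive)
```

Here the defensible posterior is about 0.55, while the naive chain yields about 0.86. With one strong, unequivocal observation the same independence shortcut costs little; with many weak, correlated scraps it dominates the answer, which is the evidence1/evidence2 distinction in miniature.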