kilobug comments on An Intuitive Explanation of Solomonoff Induction - Less Wrong

53 Post author: Alex_Altair 11 July 2012 08:05AM


Comment author: kilobug 09 July 2012 09:45:05AM 1 point [-]

Great article (at least the beginning; I haven't finished it yet, but will later on).

There is one thing that unsettles me: « Bayes’ Theorem can tell us how likely a hypothesis is, given evidence (or data, or observations). » Well, no: Bayes' Theorem only tells you how much more or less likely a hypothesis becomes. It requires a prior to give a "full" probability. The article does say a bit later that a prior is required, but not right after, so I fear the sentence may confuse readers who don't know Bayes well enough.

IMHO it would be less confusing to alter the sentence so it makes clear that a prior is required (or that Bayes' Theorem updates a probability, but doesn't give it outright).
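The point about needing a prior can be made concrete with a small sketch (the numbers here are hypothetical, chosen only for illustration): the same likelihoods yield very different posteriors depending on the prior, so Bayes' Theorem alone does not tell you "how likely" the hypothesis is.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(H | E) by Bayes' Theorem: P(H|E) = P(E|H) P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Identical evidence (likelihoods 0.9 vs 0.1), two different priors:
print(posterior(0.5, 0.9, 0.1))   # prior 50% -> posterior 0.9
print(posterior(0.01, 0.9, 0.1))  # prior 1%  -> posterior ~0.083
```

With a 50% prior the evidence makes the hypothesis 90% likely; with a 1% prior the same evidence leaves it around 8%. The likelihood ratio (9:1) is all the theorem supplies on its own.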

Comment author: private_messaging 13 July 2012 08:30:49AM *  2 points [-]

It's especially confusing that the other use of the probability of evidence given a hypothesis is not mentioned.

Suppose you have a non-sniper test with a 10% probability of returning "non-sniper" when the enemy is a sniper, and near-certainty of returning "non-sniper" when the enemy is not a sniper. Lacking a prior, you may not know how likely it is that the enemy is a sniper, but you do know that the policy "run the test 3 times, then run out only if all three indicate non-sniper" has an expected risk of running out into a sniper of 0.1% or less, provided the tests are statistically independent. This is very useful when setting advance policies such as the testing required for an airplane hardware component or a drug, when designing an AI for a computer game, and in many other competitive situations.
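The calculation above can be sketched directly (using the comment's own numbers; independence of the repeated tests is assumed): the chance that all three independent tests wrongly say "non-sniper" when the enemy is a sniper is just the single-test false-negative rate cubed.

```python
# P(test says "non-sniper" | enemy is a sniper) -- from the comment above
p_false_nonsniper = 0.1
n_tests = 3

# P(all n independent tests say "non-sniper" | sniper):
# this bounds the risk of the "run only if all tests pass" policy,
# with no prior over "sniper" required.
risk = p_false_nonsniper ** n_tests
print(risk)  # ~0.001, i.e. 0.1%
```

Note that this is a conditional guarantee (risk given a sniper is present), which is exactly why no prior is needed to set such a policy in advance.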