TobyBartels comments on Bayes' rule =/= Bayesian inference - Less Wrong

37 Post author: neq1 16 September 2010 06:34AM

Comment author: TobyBartels 16 September 2010 06:33:34PM 3 points

Hopefully an AI will be able to get its hands on large amounts of data. Once it has that, it doesn't matter very much what its priors were.
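The point that enough data washes out the prior can be illustrated with conjugate Beta-Bernoulli updating. The priors and flip counts below are invented for illustration: two agents with opposite, fairly confident priors about a coin's bias both end up near the truth after many flips.

```python
def beta_posterior_mean(alpha, beta, heads, tails):
    """Posterior mean of a Beta(alpha, beta) prior for a coin's bias
    after observing `heads` successes and `tails` failures
    (Beta-Bernoulli conjugacy: posterior is Beta(alpha+heads, beta+tails))."""
    return (alpha + heads) / (alpha + beta + heads + tails)

# Two very different priors about the coin's bias (illustrative numbers):
sceptic = (1, 99)    # prior mean 0.01 -- "almost never heads"
believer = (99, 1)   # prior mean 0.99 -- "almost always heads"

# Both then observe 10,000 flips of a fair coin: 5,000 heads, 5,000 tails.
m1 = beta_posterior_mean(*sceptic, 5000, 5000)
m2 = beta_posterior_mean(*believer, 5000, 5000)
# Both posterior means are now within 0.01 of the true bias 0.5.
```

With 100 pseudo-observations of prior weight against 10,000 real ones, the data dominates, which is the sense in which the exact prior stops mattering.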

Comment author: Jonathan_Graehl 16 September 2010 10:30:53PM 1 point

Agreed, but the priors can in principle be strong enough that hypothesis A will always be favored over B no matter how much data you have, even though P(data|B) is orders of magnitude higher than P(data|A).
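In odds form the arithmetic is simple: posterior odds = prior odds × likelihood ratio, so for any *fixed* body of data some prior is extreme enough to outvote it. A minimal sketch with made-up magnitudes (the 10^6 likelihood ratio and the prior odds are purely illustrative):

```python
# log10 of the likelihood ratio P(data|B) / P(data|A):
# here B explains the observed data a million times better than A.
log10_lr = 6.0

def log10_posterior_odds_A(log10_prior_odds_A):
    """log10 posterior odds for A over B, via Bayes' rule in odds form:
    posterior odds (A:B) = prior odds (A:B) * P(data|A)/P(data|B)."""
    return log10_prior_odds_A - log10_lr

# A moderate prior for A (1000:1) is overwhelmed by the data...
assert log10_posterior_odds_A(3.0) < 0   # B now favored
# ...but a sufficiently extreme prior (10^9:1) still keeps A on top.
assert log10_posterior_odds_A(9.0) > 0   # A still favored
```

For i.i.d. data the log-likelihood ratio typically grows without bound, so any finite prior odds are eventually overcome; "always favored no matter how much data" requires either a zero prior on B or a likelihood ratio that stays bounded.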

Comment author: JohnDavidBustard 16 September 2010 06:53:39PM 0 points

Is there a bound on the amount of data needed to correct a prior that is wrong by a given magnitude? Likewise, if the probabilities come from a changing system, I presume the pdf estimates could be consistently inaccurate, since they are always adjusting to events whose local probability is itself changing. Does the Bayesian approach help here, compared with, say, fitting a model to arbitrary samples? Is it, in effect, just one model-fitting strategy among many, no more reasonable than the others?
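On the first question, a back-of-the-envelope bound exists for the idealized i.i.d. case: the prior is overcome once the accumulated log-likelihood ratio exceeds the log of the mistaken prior odds. The function and numbers below are a hypothetical sketch under that idealization, not a general answer for changing systems.

```python
from math import ceil, log

def observations_to_overturn(prior_odds_wrong, per_obs_likelihood_ratio):
    """Rough count of i.i.d. observations needed before the accumulated
    likelihood ratio outweighs mistaken prior odds.

    Assumes each observation favours the true hypothesis by the same
    constant factor -- an idealisation; in reality per-observation
    log-likelihood ratios fluctuate around their expected value
    (the KL divergence between the hypotheses)."""
    return ceil(log(prior_odds_wrong) / log(per_obs_likelihood_ratio))

# Prior odds of a million to one against the truth, with each
# observation favouring the truth 2:1 (illustrative numbers):
n = observations_to_overturn(1e6, 2.0)
# About 20 observations, since log2(1e6) is roughly 19.9.
```

For a nonstationary system this bound no longer applies as stated: if the target keeps moving, the posterior chases it, which is exactly the consistent-lag worry raised above.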