timtyler comments on How to Be Oversurprised - Less Wrong

13 Post author: royf 07 January 2013 04:02AM


Comment author: timtyler 13 January 2013 11:18:28PM

Intelligence requires observations. An intelligent agent needs to strike a delicate balance between the persistence of its internal structure and its susceptibility to new evidence. The optimal balance, a Bayesian update, has been explained many times before, and was shown to be optimal in keeping information about the world.

What's optimal in terms of keeping information about the world is to store your entire sense-data history - plus everything you were born with. That is often extremely wasteful of memory (even with compression) - but it is surely far more compact than keeping track of all the possible hypotheses that might explain your sense data, along with their respective probabilities. I think it is hard to come up with a sense in which that is optimal.

Comment author: IlyaShpitser 13 January 2013 11:57:41PM

You know, non-parametric methods along the lines of k-nearest neighbors basically do this. The cleverness in these methods is finding a log(n) access scheme for your data (a database problem, essentially, rather than a machine learning problem).
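A minimal sketch of the "store everything, query cleverly" idea, restricted to one-dimensional scalar observations (an assumption for illustration - real k-nearest-neighbor libraries use KD-trees or ball trees to get logarithmic access in higher dimensions). The class name and interface here are hypothetical:

```python
import bisect

class NearestNeighbor1D:
    """Stores every observation verbatim; answers nearest-neighbor
    queries in O(log n) by binary search on a sorted list."""

    def __init__(self, observations):
        # Keep the full data history, sorted once up front.
        self.data = sorted(observations)

    def query(self, x):
        # Binary search locates the insertion point in O(log n).
        i = bisect.bisect_left(self.data, x)
        # The nearest neighbor is one of the (at most two) values
        # adjacent to the insertion point.
        candidates = self.data[max(i - 1, 0):i + 1]
        return min(candidates, key=lambda v: abs(v - x))
```

For example, `NearestNeighbor1D([1, 5, 9]).query(6)` returns `5`. The learning step is trivial (just store the data); all the work is in the access scheme, which is the database-style cleverness the comment describes.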