
Houshalter comments on Approximating Solomonoff Induction - Less Wrong Discussion

6 Post author: Houshalter 29 May 2015 12:23PM




Comment author: jacob_cannell 30 May 2015 04:46:10AM 1 point

> The purpose of my writing is to show that they are elegant. Even further, that if you tried to come up with the ideal approximation of SI from first principles, you would just end up with NNs.

Indeed. Although SGD is probably not the optimal approximation of Bayesian inference - for example, it doesn't track uncertainty at all - though that is an active area of research.

Comment author: Houshalter 30 May 2015 12:12:26PM 1 point

I only barely mentioned it in my post, but there are ways of approximating Bayesian inference, like MCMC. And in fact there are MCMC variants that can take advantage of stochastic gradient information, which should make them roughly as efficient as SGD.
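The best-known such variant is stochastic gradient Langevin dynamics (SGLD): an ordinary minibatch SGD step plus Gaussian noise scaled to the step size, which turns the optimizer into an approximate posterior sampler. Here is a minimal sketch on a toy 1-D problem; the model, data, and constants are illustrative, not from the thread.

```python
import math
import random

random.seed(0)

# Toy problem: infer the mean of a Gaussian with known variance 1.
true_mu = 2.0
data = [random.gauss(true_mu, 1.0) for _ in range(1000)]

def sgld(steps=5000, lr=1e-5, batch=10):
    """Stochastic gradient Langevin dynamics on the 1-D posterior over mu."""
    n = len(data)
    mu = 0.0
    samples = []
    for t in range(steps):
        minibatch = random.sample(data, batch)
        # Minibatch estimate of the log-likelihood gradient, rescaled to
        # the full dataset (a flat prior is assumed, so this is also the
        # gradient of the log-posterior).
        grad = (n / batch) * sum(x - mu for x in minibatch)
        # An SGD step plus Gaussian noise with variance lr: the injected
        # noise is what makes the iterates approximate posterior samples.
        mu += 0.5 * lr * grad + random.gauss(0.0, math.sqrt(lr))
        if t > steps // 2:  # discard burn-in
            samples.append(mu)
    return samples

samples = sgld()
posterior_mean = sum(samples) / len(samples)
```

Dropping the noise term recovers plain SGD, which collapses to a point estimate of `mu` instead of a spread of samples - which is exactly the "doesn't track uncertainty" point above.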

There is also a recent paper from DeepMind, "Weight Uncertainty in Neural Networks."
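The core idea of that paper is to keep a mean and standard deviation for each weight rather than a single value, sample weights via the reparameterization trick, and read predictive uncertainty off the spread of sampled outputs. A minimal single-weight sketch of that idea (not the paper's actual code; all numbers are made up for illustration):

```python
import math
import random

random.seed(1)

# One weight with a factorized Gaussian posterior N(mu, sigma^2);
# sigma is parameterized through a softplus so it stays positive.
w_mu = 1.5
w_rho = -2.0
w_sigma = math.log1p(math.exp(w_rho))  # softplus(rho)

def predict(x, n_samples=1000):
    """Monte Carlo prediction: sample weights, then average the outputs."""
    outs = []
    for _ in range(n_samples):
        eps = random.gauss(0.0, 1.0)
        w = w_mu + w_sigma * eps  # reparameterization trick
        outs.append(w * x)
    mean = sum(outs) / n_samples
    var = sum((o - mean) ** 2 for o in outs) / n_samples
    return mean, math.sqrt(var)

pred_mean, pred_std = predict(2.0)
```

In the full method the per-weight `mu` and `rho` are trained by gradient descent on a variational objective, so training looks much like ordinary backprop - which is what makes it a natural bridge between SGD and Bayesian inference.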