
Daniel_Burfoot comments on Open thread, Mar. 9 - Mar. 15, 2015 - Less Wrong Discussion

Post author: MrMind | 09 March 2015 07:48AM | 5 points




Comment author: ShardPhoenix | 09 March 2015 10:20:28AM | 2 points

Hmm, mostly just articles where they get better results with more NN layers or more training examples, both of which are limited by hardware capacity and have seen large gains from things like using GPUs. Current algos still have far fewer "neurons" than the actual brain AFAIK. Plus, in general, faster hardware allows for faster and cheaper experimentation with different algorithms.

I've seen some AI researchers (e.g. Yann LeCun, on Facebook) emphasizing that the fundamental techniques haven't changed much in decades, yet results continue to improve with more computation.

Comment author: Daniel_Burfoot | 10 March 2015 12:05:36AM | 3 points

> Current algos still have far fewer "neurons" than the actual brain AFAIK.

This is not primarily because of limitations in computing power. The binding constraint is the complexity of the model you can train without overfitting, relative to the volume of data you have: a larger data set permits a more complex model.
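A minimal sketch of the point (my illustration, not from the thread, using polynomial regression on synthetic data as a stand-in for any model family): with a small, fixed data set, a higher-capacity model fits the training points better but can generalize worse, so adding capacity beyond what the data supports buys nothing.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_sin(x):
    # Simple underlying function plus observation noise.
    return np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, x.shape)

x_train = np.linspace(0.0, 1.0, 10)    # only 10 training examples
y_train = noisy_sin(x_train)
x_test = np.linspace(0.01, 0.99, 50)   # held-out points from the same range
y_test = noisy_sin(x_test)

def fit_and_score(degree):
    """Fit a polynomial of the given degree; return (train_mse, test_mse)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = float(np.mean((np.polyval(coeffs, x_train) - y_train) ** 2))
    test_mse = float(np.mean((np.polyval(coeffs, x_test) - y_test) ** 2))
    return train_mse, test_mse

simple_train, simple_test = fit_and_score(3)    # modest model capacity
complex_train, complex_test = fit_and_score(9)  # enough parameters to interpolate

# The degree-9 fit passes (nearly) through all 10 training points, so its
# training error is lower than the degree-3 fit's; its held-out error reflects
# the noise it memorized. With more data, the higher-capacity model would be
# the better choice -- which is the comment's point about data, not compute.
```

The degrees and noise level here are arbitrary choices for illustration; the qualitative train-vs-test gap is the point.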