Daniel_Burfoot comments on Open thread, Mar. 9 - Mar. 15, 2015 - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Hmm, mostly just articles where they get better results with more NN layers or more training examples, both of which are limited by hardware capacity and have seen large gains from things like using GPUs. Current algos still have far fewer "neurons" than the actual brain AFAIK. Plus, in general, faster hardware allows for faster and cheaper experimentation with different algorithms.
I've seen some AI researchers (eg Yann Lecun on Facebook) emphasizing that fundamental techniques haven't changed that much in decades, yet results continue to improve with more computation.
This is not primarily because of limitations in computing power. The relevant constraint is the complexity of the model you can train without overfitting, relative to the volume of data you have (a larger data set permits a more complex model).