Comments on "Neural Turing Machines" - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (21)
Some good comments on related work in https://www.reddit.com/r/MachineLearning/comments/2kth6d/googles_secretive_deepmind_startup_unveils_a/
(After reading the paper, I still don't really get how you wire some RAM into a neural network. Maybe it makes more sense to others.)
It's not really RAM but rather a tape (like a doubly linked list): the LSTM controller can't address an arbitrary memory location in logarithmic space/time. They do add multiple read heads at one point, though.
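The tape-vs-RAM distinction in the comment above can be sketched numerically. This is a hedged toy example, not the paper's exact equations: the head is a soft attention weighting over memory rows, a read is a weighted sum of those rows, and location-based addressing moves the weighting one slot at a time (circular shift), which is why the head walks the memory like a tape rather than jumping to arbitrary addresses.

```python
import numpy as np

def shift_weights(w, shift):
    # Location-based addressing step: circularly shift the head's
    # attention weighting by one slot (toy stand-in for the NTM's
    # convolutional shift over weightings).
    return np.roll(w, shift)

def read(memory, w):
    # Read vector = convex combination of memory rows under the
    # attention weighting w (soft read, differentiable).
    return w @ memory

N, M = 8, 4                          # 8 memory slots, each of width 4
memory = np.arange(N * M, dtype=float).reshape(N, M)

w = np.zeros(N)
w[0] = 1.0                           # head sharply focused on slot 0
for step in range(3):                # the head walks along the "tape"
    r = read(memory, w)
    w = shift_weights(w, +1)         # advance one slot per step
```

Because each step only shifts the weighting by one slot, reaching slot k from slot 0 takes k steps of location-based addressing; random access in the RAM sense would require the controller to emit an arbitrary address directly.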
This tape could be compared to the phonological loop and the stream of consciousness.