Umm, it looks like he did not read the book "Perceptrons," because he repeats a lot of misinformation from others who also did not read it.
First, none of the theorems in that book are changed or 'refuted' by the use of back-propagation. This is because almost all of the book is about whether, in various kinds of connectionist networks, there exist any sets of coefficients that enable the net to recognize various kinds of patterns.
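To make that existence question concrete, here is a small sketch of my own (the XOR/parity case is the standard illustration; it is not quoted from the book or from the post above). No coefficients at all let a single linear threshold unit compute the parity of two inputs, whereas AND is easy, and a brute-force search over a coarse grid of coefficients shows the difference:

```python
import itertools

def ltu(w1, w2, b, x1, x2):
    """Single linear threshold unit: output 1 iff w1*x1 + w2*x2 + b > 0."""
    return int(w1 * x1 + w2 * x2 + b > 0)

# Target input/output tables over the four Boolean input pairs.
AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}   # parity of two inputs

def find_coefficients(target, grid):
    """Return the first (w1, w2, b) on the grid that realizes `target`, or None."""
    for w1, w2, b in itertools.product(grid, repeat=3):
        if all(ltu(w1, w2, b, x1, x2) == y for (x1, x2), y in target.items()):
            return (w1, w2, b)
    return None

grid = [i / 2 for i in range(-10, 11)]            # coefficients from -5.0 to 5.0
print("AND:", find_coefficients(AND, grid))       # e.g. (0.5, 0.5, -0.5)
print("XOR:", find_coefficients(XOR, grid))       # None: no such coefficients exist
```

(The grid search only demonstrates the point; the nonexistence for parity follows from linear separability, which is the kind of result the book actually proves.)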
Anyway, because BP is essentially a gradient-climbing (hill-climbing) process, it has all the consequent problems, such as getting stuck on local peaks.
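To illustrate that with a toy example of my own (again, not one from the book or the post): whether you climb a performance surface or descend an error surface, a pure gradient follower settles at whichever local optimum its starting point happens to lead to.

```python
def f(x):
    """A toy non-convex curve with a shallow local minimum and a deeper global one."""
    return x**4 - 3 * x**2 + x

def df(x):
    """Derivative of f."""
    return 4 * x**3 - 6 * x + 1

def follow_gradient(x, lr=0.01, steps=2000):
    """Plain gradient descent on f from a given starting point."""
    for _ in range(steps):
        x -= lr * df(x)
    return x

# Started on the right, descent is trapped near x ~ 1.13 (f ~ -1.07),
# even though the better optimum sits near x ~ -1.30 (f ~ -3.51).
print(follow_gradient(2.0))    # ~  1.13  (shallow local optimum)
print(follow_gradient(-2.0))   # ~ -1.30  (the deeper one)
```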
Those who read the book will see (on page 56) that we did not simply show that the …