CarlShulman comments on Why could you be optimistic that the Singularity is Near? - Less Wrong Discussion

22 Post author: gwern 14 July 2012 11:33PM

Comment author: CarlShulman 16 July 2012 04:19:24PM 4 points

A lot of existing improvement trends would have to suddenly stop, along with the general empirical trend of continued software progress. On many applications we are well short of the performance of biological systems, and those biological systems show large internal variation (e.g. the human IQ distribution) without an abrupt "wall" visible, indicating that machines could go further (as they already have on many problems).

Comment author: private_messaging 16 July 2012 07:34:33PM 1 point

I'm not quite sure software is well short of the performance of biological systems in terms of what it can do with a given number of operations per second. Consider cat image recognition: Google's system has minuscule computing power compared to the human visual cortex, and performs accordingly (badly).

What I suspect, though, is that the greatest advances in speeding up technological progress would come from better algorithms for well-defined problems, like making better transistors. Even humans make breakthroughs there not by doing some verbal "I think, therefore I am" philosophy in their heads, but either by throwing science at the wall and seeing what sticks, or by imagining the problem visually, trying to imitate a non-intelligent simulator. Likewise for automated software development: so much of the thought a human puts into such tasks is really unrelated to the human capacity to see meaning and purpose in life, or to symbol grounding, or to anything else of the kind that makes us fearsome, dangerous survival machines. None of that needs to be built into automated programming software.
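The "throw it at a simulator and keep what sticks" approach described above can be sketched as a toy black-box search. Everything here is invented for illustration: the two-parameter "design" and the `simulate` scoring function are hypothetical stand-ins, not any real transistor model.

```python
import random

def simulate(design):
    """Toy 'simulator': scores a hypothetical design.

    Fictitious objective with a single optimum at (7, 3); a real
    simulator would be a physics or circuit model instead.
    """
    gate_length, doping = design
    return -(gate_length - 7) ** 2 - (doping - 3) ** 2

def random_search(trials=10_000, seed=0):
    """Blind trial-and-error: sample designs, keep the best-scoring one."""
    rng = random.Random(seed)
    best_design, best_score = None, float("-inf")
    for _ in range(trials):
        design = (rng.uniform(1, 20), rng.uniform(0, 10))
        score = simulate(design)  # consult the simulator, no "understanding"
        if score > best_score:
            best_design, best_score = design, score
    return best_design, best_score

best, score = random_search()
```

With enough trials the search lands near the optimum at (7, 3) despite having no model of *why* those values are good, which is the point: progress on well-defined problems can come from search against a simulator rather than verbal reasoning.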