Perplexed comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong

32 Post author: ciphergoth 30 October 2010 09:31AM


Comments (432)


Comment author: Perplexed 31 October 2010 02:39:57AM 4 points

> One of my fundamental contentions is that empathy is a requirement for intelligence beyond a certain point because the consequences of lacking it are too severe to overcome.

Now if you had suggested that intelligence cannot evolve beyond a certain point unless accompanied by empathy ... that would be another matter. I could easily be convinced that a social animal requires empathy almost as much as it requires eyesight, and that non-social animals cannot become very intelligent because they would never develop language.

But I see no reason to think that an evolved intelligence would have empathy for entities with whom it had no social interactions during its evolutionary history. And no a priori reason to expect any kind of empathy at all in an engineered intelligence.

Which brings up an interesting thought. Perhaps human-level AI already exists. But we don't realize it because we have no empathy for AIs.

Comment author: timtyler 31 October 2010 09:21:53AM 1 point

The most likely location for an "unobserved" machine intelligence is probably the NSA's basement.

However, it seems hard to believe that a machine intelligence would need to stay hidden for very long.

Comment author: timtyler 01 November 2010 10:00:42PM 0 points

> But I see no reason to think that an evolved intelligence would have empathy for entities with whom it had no social interactions during its evolutionary history.

MIT's Leonardo? Engineered super-cuteness!
