timtyler comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong

32 Post author: ciphergoth 30 October 2010 09:31AM




Comment author: PhilGoetz 03 November 2010 05:58:12PM 4 points

I agree, but I want "AI ethics" to mean something different from what you probably mean by it. The question is: what sort of ethics do we want our AIs to have?

Paperclipping the universe with humans is still paperclipping.

Comment author: timtyler 04 November 2010 08:36:09AM 0 points

One distinctive feature of the hypothetical "paperclippers" is that they attempt to leave a low-entropy state behind - one which other organisms would normally munch through. Humans don't tend to do that - like most living things, they keep consuming until there is (practically) nothing left, and then move on.

Leaving a low entropy state behind seems like the defining feature of the phenomenon to me. From that perspective, a human civilisation would not really qualify.

Comment author: PhilGoetz 07 November 2010 02:58:05PM * 0 points

It sounds like you're saying humanity is worse than paperclips, if what distinguishes humans from paperclippers is that humans increase entropy more.

Comment author: timtyler 07 November 2010 04:16:13PM -1 points

Only if you adopt the old-fashioned "entropy is bad" mindset.

However, life is a great increaser of entropy - and potentially the greatest such increaser.

If you are against entropy, you are against life - so I figure we are all pro-entropy.