Phil_Goetz4 comments on No Universally Compelling Arguments - Less Wrong

33 Post author: Eliezer_Yudkowsky 26 June 2008 08:29AM

Comment author: Phil_Goetz4 26 June 2008 04:33:04PM 1 point [-]

I agree with Mike Vassar that Eliezer is using the word "mind" too broadly, to mean something like "computable function" rather than a control program by which an agent accomplishes goals in the real world.

The real world places a lot of restrictions on possible minds.

If you posit that this mind is autonomous, and not being looked after by some other mind, that places further restrictions on it.

If you posit that there is a society of such minds evolving over time, or a number of such minds competing for resources, that places still more restrictions on them. By this point, we could say quite a lot about the properties these minds will have. In fact, by this point, it may be the case that the variation in possible minds, for sufficiently intelligent AIs, is smaller than the variation in human minds.