JamesAndrix comments on That Magical Click - Less Wrong

58 Post author: Eliezer_Yudkowsky 20 January 2010 04:35PM




Comment author: denisbider 26 January 2010 04:51:18PM *  0 points [-]

I actually think that all our current ways of thinking, feeling, and going about life would be as antiquated, post-FAI, as a horse buggy on an interstate highway. Once an AI can reforge us into more exalted creatures than we currently are, I'm not sure why anyone would want to continue living (falling in love? having children?) the old-fashioned way. It would be as antiquated as the lifestyle of the Amish.

Comment author: JamesAndrix 26 January 2010 05:30:55PM 1 point [-]

What would we want to be exalted for? So we can more completely appreciate our boredom?

It doesn't make sense to me that we'd get some arbitrary jump in mindpower and then start an optimized advancement. (We might get some immediate patches, but there will be reasons for them.) Why not pump us all the way up to multi-galaxy brains? Then the growth issues are moot.

Either way, if we're abandoning our complex evolved values, then we don't need to be very complex beings at all. If we don't, then I don't expect that even our posthuman values will be satisfied by puppet zombie companions.

Comment author: denisbider 26 January 2010 05:44:36PM *  0 points [-]

Is there some reason to believe our current degree of complexity is optimal?

Why would we want to be reforged as something that suffers boredom, when we can be reforged as something that never experiences a negative feeling at all? Or experiences them just for variety, if that is what one would prefer?

If complexity is such a plus, then why stop at what we are now? Why not make ourselves more complex? Right now we chase after air, water, food, shelter, love, and social status, so why not make things more fun by making us all desire paperclips, too? That would be more complex. Everything we already do now, but with paperclips! Sound fun? :)

Comment author: thomblake 26 January 2010 06:56:54PM 5 points [-]

Possibly relevant: I already desire paperclips.

Comment author: JamesAndrix 26 January 2010 06:43:44PM 3 points [-]

Is there some reason to believe our current degree of complexity is optimal?

I don't think so, at all. Also, you're conflating our complexity with the complexity of our values.

I think that our growth will best start from a point relatively close to where we are now in terms of intelligence. We should grow into Jupiter brains, but that should happen by learning.

I'm not clear on what it is you want to be reforged as, or why. By what measure is post-FAI Dennis better than now-Dennis? By what measure is it still 'Dennis', and why were those features retained?

The complexity of human value is not good by virtue of being complex. Rather, these are the things we value; there happen to be a lot of them, and they are complexly interrelated. Chopping away huge chunks of them and focusing on pleasure alone is probably a bad thing, one we would not want.

It may be the case that the FAI will extrapolate much more complex values, or much simpler ones, but our current values must be the starting point, and our current values are complex.