denisbider comments on That Magical Click - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Yes, if the parents will always be there to take care of you.
We can wirehead children now.
We want them to be more than that.
The only reason we want that is that civilization would collapse without anyone to bear it. If FAI bears it, there is no pressure on anyone.
What does it mean for FAI to bear civilization? It can give us bridges, but if I'm going to spend time with you, you'd better be socialized. A life of obedient catgirls would harm your ability to deal with real humans (or posthumans).
And ignoring that, I don't think that we want to be more than we are just in order to get stuff done.
Both of these are things we do to achieve complex values. Some of the things we want can't be handed to us, and some of those are things we can't achieve if everything that can be handed to us is handed to us.
The companions FAI creates for you don't have to be obedient, nor catgirls. Instead, they can be companions that far exceed the value you can get from socializing with fellow humans or posthumans.
Once there is FAI, the best companion for anyone is FAI.
The only reason you want "complex values" is because your environment has inculcated in you that you want them. The reason your environment has inculcated this in you is because such inculcation is necessary in order to have people who will uphold civilization. Once there is FAI, such inculcation is no longer necessary, and is in fact counter-productive.
How rude can I be to my FAI companion before it starts crying in the corner? How rude will I become if it doesn't? Why didn't it just build the bridge the first time I asked? Then I wouldn't have to yell. Does she mind that I call her 'it'?
Proper companions don't always give you what you want.
Also, even though FAI could create perfectly balanced agents, and even if creating said agents weren't in itself morally reprehensible, I think there is value in interacting with other 'real' humans.
Edit: Ok, this is a big deal:
The fact that a value I have is something evolution gave me is not a reason to abandon that value. Pleasure is also something I want because evolution made me want it.
Right now, I want those complex values, and I'm not going to press a button to self-modify to stop wanting them.
I don't see why creating perfectly balanced agents would be morally reprehensible - nor why, given such agents, there would be value in interacting with other humans - necessarily less suited to each other's progress than the agents would be.
It may well be considered morally reprehensible to communicate with other humans, because it may undermine and slow down the personal development that each human would otherwise benefit from in the company of custom-tailored companions, designed perfectly for one's individual progress.
It may well be morally better for the FAI to make you think that you're communicating with a 'real' human, when in fact you are communicating with an agent specifically designed to provide you with that learning experience.
If these agents are people in a morally significant way, then their needs must be taken into account. FAI can't just create slave beings. It's very difficult for me at this point to say whether it's possible for the FAI to create a being that perfectly meets some human needs, and in turn has all its own needs met just as perfectly. Every new person it creates just adds more complexity to the moral balance. It might be doable, but it might not, and it's a lot more work-thought-energy to do it that way.
If they are not people, if they are some kind of puppet zombie robot, then we will have billions of humans falling in love with puppet zombie robots. Because that is their only option. And having puppet zombie robot children. Maybe that's what FAI will conclude is best, but I doubt it.
I actually think that all our current ways of thinking, feeling and going about life would be as antiquated, post-FAI, as a horse buggy on an interstate highway. Once an AI can reforge us into more exalted creatures than we currently are, I'm not sure why anyone would want to continue living (falling in love? having children?) the old-fashioned way. It would be as antiquated as the lifestyle of the Amish.
Some people want to be Amish. It seems like your statement could just as well be "I'm not sure why anyone would want to be Amish" and I'm not sure that communicates anything useful.
What would we want to be exalted for? So we can more completely appreciate our boredom?
It doesn't make sense to me that we'd get some arbitrary jump in mindpower and then start an optimized advancement. (We might get some immediate patches, but there will be reasons for them.) Why not pump us all the way to multi-galaxy-brains? Then the growth issues are moot.
Either way, if we're abandoning our complex evolved values, then we don't need to be very complex beings at all. If we don't, then I don't expect that even our posthuman values will be satisfied by puppet zombie companions.
This is an extreme statement about everyone's preference, not even your own preference or your own belief about your own preference. One shouldn't jump that far.