JGWeissman comments on That Magical Click - Less Wrong

58 Post author: Eliezer_Yudkowsky 20 January 2010 04:35PM


Comment author: JGWeissman 25 January 2010 08:22:20PM 0 points [-]

I would expect that an FAI would not force us to play games, but would make games available for us to choose to play.

Comment author: Wei_Dai 25 January 2010 08:33:47PM 2 points [-]

It's not that an FAI would force us to play games, but rather there's nothing else to do. All the real problems would have been solved already.

Comment author: JamesAndrix 25 January 2010 08:56:42PM 5 points [-]

That's not necessarily true. We might still have to build a sturdy bridge to cross a river, it's just that nobody dies if we mess up.

Likewise, if one's mind is too advanced for bridge building to be anything but boring, then there will be other, more complex things we would want to build, which the FAI is under no obligation to hand us.

I think we can have a huge set of real problems to solve, even after FAI solves all the needed ones.

Comment author: Wei_Dai 25 January 2010 10:23:19PM 3 points [-]

How is bridge-building not a game when the FAI could just flick a switch and transport you across the river in any number of ways that are much more efficient? When you're building a bridge in that situation, you're not solving the problem of crossing a river, you're just using up resources in order to not be bored.

Comment author: CronoDAS 25 January 2010 10:38:34PM 3 points [-]

Because it refuses to do so?

If you're 16 and your parents refuse to buy something for you (that they could afford without too much trouble) and instead make you go out and earn the money to buy it yourself, was solving the problem of how to get the money "just a game"?

Comment author: denisbider 25 January 2010 10:50:07PM 1 point [-]

Yes, if the parents will always be there to take care of you.

Comment author: JamesAndrix 26 January 2010 12:28:14AM 1 point [-]

We can wirehead children now.

We want them to be more than that.

Comment author: denisbider 26 January 2010 03:33:37AM 0 points [-]

The only reason we want that is that civilization would collapse without anyone to bear it. If FAI bears it, there is no pressure on anyone.

Comment author: JamesAndrix 26 January 2010 04:37:50AM *  2 points [-]

What does it mean for the FAI to bear civilization? It can give us bridges, but if I'm going to spend time with you, you'd better be socialized. A life of obedient catgirls would harm your ability to deal with real humans (or posthumans).

And ignoring that, I don't think we want to be more than we are just in order to get stuff done.
Both of these are things we do to achieve complex values. Some of the things we want can't be handed to us, and some of those are things we can't achieve if everything which can be handed to us is handed to us.

Comment author: denisbider 26 January 2010 12:49:14PM 0 points [-]

The companions FAI creates for you don't have to be obedient, nor catgirls. Instead, they can be companions that far exceed the value you can get from socializing with fellow humans or posthumans.

Once there is FAI, the best companion for anyone is FAI.

The only reason you want "complex values" is because your environment has inculcated in you that you want them. The reason your environment has inculcated this in you is because such inculcation is necessary in order to have people who will uphold civilization. Once there is FAI, such inculcation is no longer necessary, and is in fact counter-productive.

Comment author: Vladimir_Nesov 27 January 2010 10:29:15AM *  1 point [-]

We can wirehead children now.

We want them to be more than that.

The only reason we want that is that civilization would collapse without anyone to bear it. If FAI bears it, there is no pressure on anyone.

This is an extreme statement about everyone's preference, not even your own preference or your own belief about your own preference. One shouldn't jump that far.

Comment author: Vladimir_Nesov 27 January 2010 09:59:47AM 1 point [-]

How is bridge-building not a game when the FAI could just flick a switch and transport you across the river in any number of ways that are much more efficient?

It can't actually do that, because it's not what its preference tells it to do. The same way you can't jump out of the window given you are not suicidal.

Comment author: Wei_Dai 27 January 2010 11:07:23AM 1 point [-]

It can't actually do that, because it's not what its preference tells it to do. The same way you can't jump out of the window given you are not suicidal.

By that reasoning, World of Warcraft is not a game because the admins can't make me level 80 on day 1, because that's not what their preferences tell them to do... Or am I missing your point?

Comment author: Vladimir_Nesov 27 January 2010 12:37:41PM 0 points [-]

I'm attacking a specific argument that "FAI could just flick a switch". Whether it moves your conclusion about the described situation being a game depends on how genuine your argument for it being a game was and on how much you accept my counter-argument.

Comment author: ciphergoth 27 January 2010 01:08:51PM 0 points [-]

Could one of you précis the disagreement in a little more detail and with background? When you and Wei Dai disagree, I'd really like to understand the discussion better, but the discussion it sprang out of doesn't seem all that enlightening - thanks!

Comment author: Wei_Dai 27 January 2010 01:53:53PM 0 points [-]

I originally said that post-FAI, we'd have no real problems to solve, so everything we do would be like playing games, and we'd take a status hit because of that. Nesov allegedly found a way to recast the situation so that we can avoid taking the status hit, but I remain unconvinced. I admit this is one of our more trivial discussions. :)

Comment author: Vladimir_Nesov 29 January 2010 05:54:31PM *  1 point [-]

I originally didn't bother to do so explicitly; I only wrote this reply, which seems not to have been understood. But in light of Eliezer's post about the flow of the argument, I'll recast the structure I see in the last few comments:

Wei: Bridge-building is a game, because FAI could just flick a switch. (Y leads to X having property S; Y="could flick a switch", X="FAI's world", S="is a game")
Vlad: No it couldn't, its preference (for us having to make an effort) makes it impossible for that to happen. (Y doesn't hold for X)
Wei: But there are also games where players don't get free charity. (Z has property S without needing Y)
Vlad: I'm merely saying that Y doesn't hold, so if Y held any weight in the argument that "Y leads to X having property S", then having established not-Y, I've weakened the support for X having property S, and at least refuted the particular argument for X having property S, even if I haven't convincingly argued that X doesn't have property S overall.

Comment author: Kevin 27 January 2010 02:16:21PM *  1 point [-]

Is this a disagreement that is more about the meaning of words than anything else? I think you and Nesov are disagreeing about the meanings of "game" and "real problems" (or maybe "problems"). Both of you defining those terms would help.

Comment author: ciphergoth 27 January 2010 02:39:25PM 0 points [-]

In the short term, I think you are correct. However, in the long term, I'm hoping that the FAI will find a non-disastrous way for us to become superintelligent ourselves, and therefore again be able to participate in solving real problems.

Comment author: JamesAndrix 26 January 2010 12:16:09AM 1 point [-]

When I build a bridge in a game, I get an in-game reward. I don't get easier transport to anywhere. If I neglect to build the bridge or play the game at all, I still get to use all the bridges otherwise available to me. 'Real' bridges are at the top level of reality available to me. Even the simulation hypothesis does not make these bridges a game.

Why do I want to cross the bridge? To not be bored, to find my love, or to meet some other human value. The AI could do that for me too, and cut out the need for transport. If we follow that logic even a short way, it becomes obvious that we don't want the AI doing certain things for us. If there is danger of us being harmed because the FAI could help but won't, it need merely help a little more, getting closer to those things we want to do ourselves. If we're in danger of being harmed by our own laziness, it need only back off. (It might do this at the level of the entire species, for all time, so individuals might be bored or angry or not cross rivers as soon as they would like, but it might instead optimize for everybody moment to moment.)

If there are things we couldn't stand to have a machine do, and couldn't stand for it to not help us with, I think those would be incoherent volitions.