Stuart_Armstrong comments on Siren worlds and the perils of over-optimised search - Less Wrong

27 Post author: Stuart_Armstrong 07 April 2014 11:00AM


Comments (411)

Comment author: PhilosophyTutor 28 April 2014 12:13:45AM * 1 point

It seems, based on your later comments, that the premise that marketing worlds exist relies on there being trade-offs between our specified wants and our unspecified wants, so that a world optimised for our specified wants is highly likely to be lacking in our unspecified ones ("A world with maximal bananas will likely have no apples at all").

I don't think this is necessarily the case. If I only specify that I want low rates of abortion, for example, then I think it highly likely that I'd get a world that also has low rates of STD transmission, unwanted pregnancy, poverty, sexism and religiosity, because they all go together. I think you could specify any one of those variables and almost all of the time you would get all the rest as a package deal, without specifying them.

Of course a malevolent AI could probably deliberately construct a siren world that maximises one of those values and tanks the rest, but such worlds seem highly unlikely to arise organically. The rising tide of education, enlightenment, wealth and egalitarianism lifts most of the important boats all at once, or at least that is how it seems to me.

Comment author: Stuart_Armstrong 28 April 2014 11:45:26AM * 0 points

on there being trade-offs between our specified wants and our unspecified wants

Yes, certainly. That's a problem of optimisation with finite resources. If A is a specified want and B is an unspecified want, then we shouldn't confuse "there are worlds with high A and also high B" with "the world with the highest A will also have high B".
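The distinction here can be made concrete with a toy simulation (my own illustrative sketch, not from the original discussion): assume candidate worlds split a fixed resource budget between a specified want A and an unspecified want B. Many worlds then score well on both at once, yet the single world that maximises A devotes essentially nothing to B.

```python
import random

random.seed(0)
RESOURCES = 100.0  # hypothetical finite resource budget (illustrative assumption)

# Each candidate world splits the budget between the specified want A
# and the unspecified want B, so the two trade off at the margin.
worlds = [(a, RESOURCES - a)
          for a in (random.uniform(0, RESOURCES) for _ in range(10_000))]

# Plenty of worlds score well on both wants simultaneously...
both_high = [w for w in worlds if w[0] > 40 and w[1] > 40]

# ...yet the single world with the highest A leaves almost nothing for B.
best_a = max(worlds, key=lambda w: w[0])

print(f"worlds high in both A and B: {len(both_high)}")
print(f"B in the A-maximising world: {best_a[1]:.2f}")
```

With 10,000 samples the A-maximising world has A very close to the full budget, so its B is near zero even though thousands of other worlds have both A and B above 40; this is the gap between "high-A, high-B worlds exist" and "the highest-A world has high B".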