DanielLC comments on Siren worlds and the perils of over-optimised search - Less Wrong

27 Post author: Stuart_Armstrong 07 April 2014 11:00AM




Comment author: DanielLC 15 April 2014 07:05:15AM 0 points

I think the wording here is kind of odd.

An unconstrained search will not find a siren world, or even a very good world. There are simply too many to consider. The problem is that you're likely to design an AI that finds worlds that you'd like. It may or may not actually show you anything, but you program it to give you what it thinks you'd rate the best. You're essentially programming it to design a siren world. It won't intentionally hide anything dark under there, but it will spend way too much effort on things that make the world look good. It might even end up with dark things hidden, just because they were somehow necessary to make it look that good.

Comment author: Stuart_Armstrong 17 April 2014 11:19:08AM 0 points

It won't intentionally hide anything dark under there, but it will spend way too much effort on things that make the world look good.

That's a marketing world, not a siren world.

Comment author: DanielLC 17 April 2014 06:09:04PM 0 points

What's the difference?

Comment author: Stuart_Armstrong 18 April 2014 08:51:30AM 0 points

Siren worlds are optimised to be bad and hide this fact. Marketing worlds are optimised to appear good, and the badness is an indirect consequence of this.