Michaelos comments on [Link] A Short Film based on Eliezer Yudkowsky's AI Box experiment - Less Wrong Discussion
That's a good question. I checked the protocols at http://yudkowsky.net/singularity/aibox
The box appears to be originally defined as:
However, it also mentions:
That means a Gatekeeper could have said, "I repair your mechanical problem / give you a lightbulb / save humanity, but you're still in the box." I can't argue with that, since the Gatekeeper by default also arbitrates all rule disputes. Now, it also says this:
If I were running this case, a third party could have said, "You allowed Michaelos's depowered AI fragments to escape the box, so you lost," or "Sorry, Michaelos, but being outside the box with no electrical power is not a win condition." I didn't worry about defining all the rules precisely because I primarily wanted to get a feel for the situation in general.
But no one even let the depowered bits out. I had some very cautious gatekeepers.