Upon some reflection, I remembered that Robin has shown that two Bayesians who share the same priors can't agree to disagree once their opinions are common knowledge. So perhaps you can get your wish from an unsafe genie by wishing, "... to run a genie that perfectly shares my goals and prior probabilities."
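To make the shared-priors claim concrete, here is a minimal Python sketch (my illustration, not from the thread) of the Geanakoplos-Polemarchakis (1982) agreement protocol that underlies Aumann's theorem: two Bayesians with a common prior but different private information alternately announce their posterior for an event, each treats the other's announcement as evidence, and the announcements converge to a common value. The state space, event, and partitions are the standard textbook example; names like `PART_A` and `run` are my own.

```python
from fractions import Fraction

# Nine equally likely states; each agent knows only which cell of
# their private partition contains the true state.
STATES = set(range(1, 10))
PRIOR = {w: Fraction(1, 9) for w in STATES}   # common prior (uniform)
EVENT = {3, 4}                                # the event both reason about
PART_A = [{1, 2, 3}, {4, 5, 6}, {7, 8, 9}]
PART_B = [{1, 2, 3, 4}, {5, 6, 7, 8}, {9}]

def cell(partition, state):
    """The partition cell containing `state` (the agent's information)."""
    return next(c for c in partition if state in c)

def posterior(info, event):
    """P(event | info) under the common prior."""
    return (sum(PRIOR[w] for w in info & event)
            / sum(PRIOR[w] for w in info))

def refine(partition, announce):
    """Split each cell by the level sets of the other's announcement:
    hearing the announced value tells you which level set you are in."""
    return [{w for w in c if announce[w] == v}
            for c in partition for v in {announce[w] for w in c}]

def run(true_state, rounds=10):
    part_a, part_b = PART_A, PART_B
    for r in range(1, rounds + 1):
        ann_a = {w: posterior(cell(part_a, w), EVENT) for w in STATES}
        part_b = refine(part_b, ann_a)        # B updates on A's report
        ann_b = {w: posterior(cell(part_b, w), EVENT) for w in STATES}
        part_a = refine(part_a, ann_b)        # A updates on B's report
        pa, pb = ann_a[true_state], ann_b[true_state]
        print(f"round {r}: A announces {pa}, B announces {pb}")
        if pa == pb:
            return

run(true_state=1)
# round 1: A announces 1/3, B announces 1/2
# round 2: A announces 1/3, B announces 1/3   <- agreement
```

Note that the agents never reveal their raw evidence, only their posteriors; with a common prior that is already enough to force agreement, which is why the wish above specifies shared prior probabilities rather than shared observations.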
If you can rigorously define Safety, you've already solved the Safety Problem. This isn't a shortcut.
billy_the_kid
I wish for you to interpret my wishes as I interpret them.
Can anyone find a problem with that?
themusicgod1
A sufficiently powerful genie might make genies that are safe by definition more unsafe, and then your wish could be granted.
Edit (2015), caution: I think this particular comment is harmless in retrospect... but I wouldn't give it much weight.