You all know the rules:
- Please post all quotes separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself.
- Do not quote comments/posts on LW/OB.
- No more than 5 quotes per person per monthly thread, please.
Given the complexity of value, 'evil giant' and 'good giant' should not be weighted equally; if we have no specific information about the morality distribution of giants, then as with any optimization process, 'good' is a much, much smaller target than 'evil' (if we're including apparently-human-hostile indifference).
Unless we believe them to be evolutionarily close to humans, or to have evolved under some selection pressures similar to those that produced morality, etc., in which case we can do a bit better than a complexity prior for moral motivations.
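The "small target" intuition can be made concrete with a toy simulation (my own illustration, not from the quote): suppose a mind's motivations are a random assignment of care/don't-care over some number of independent components of human value, and call it 'good' only if it hits every component. The feature count and the coin-flip model are arbitrary assumptions, just to show the scaling.

```python
import random

# Toy model: a random mind either values or ignores each of N independent
# components of human value. It counts as 'good' only if it values all of
# them; anything less lands in indifferent-or-hostile territory.
random.seed(0)
N_FEATURES = 20      # hypothetical number of independent value components
TRIALS = 100_000

hits = sum(
    all(random.random() < 0.5 for _ in range(N_FEATURES))
    for _ in range(TRIALS)
)
print(f"good minds out of {TRIALS} random minds: {hits}")
```

With 20 coin-flip components the chance of hitting the full target is 2**-20 (about one in a million), so essentially every sampled mind misses; the target shrinks exponentially as components are added.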
(For more on this, check out my new blog, Overcoming Giants.)
Well, if by giants we mean "things that resemble humans but are particularly big", then we should expect some sort of shared evolutionary history, so....