Edit: This is old material. It may be out of date.
I'm talking about the fictional race of House Elves from the Harry Potter universe, first written about by J. K. Rowling and later uplifted in a grand act of fan fiction by Eliezer Yudkowsky. Unless severely mistreated, they enjoy servitude to their masters (or, more accurately, to the current residents of the homes they are bound to). This is also enforced by magical means, since they must follow the letter, if not the spirit, of their master's direct orders.
Overall, treating House Elves the way they would like to be treated seems more or less sensible, and I don't feel like debating this unless people disagree. Changing agents without their consent or knowledge seems obviously wrong, so turning someone into a servant creature seems intuitively wrong as well. I can also understand that many people would mind their descendants being modified in such a fashion; perhaps their disutility is enough to offset the utility of their modified descendants. But how true is this of distant descendants who bear only a passing resemblance to them? I think a helpful reminder of scale is our own self-domestication.
Assuming one created elf-like creatures ex nihilo, not as slightly modified versions of an existing species, why would one not want to bring into existence a mind that values its own existence and benefits you, as long as the act of creation, or that existence itself, does not represent a large enough disutility? This seems somewhat related to an argument Robin Hanson once made: any creature that can pay for its own existence, and would value its own existence, should be created.
I didn't mention this in the many HP fan fiction threads because I want a more general debate on the treatment and creation of such a class of agents.
Edit: Clearly if the species or class contains exceptions there should be ways for them to pursue their differing values.
I just thought about something. Could it be that we implicitly assign negative value to the existence of human-like minds whose utilities are just slightly off in an obvious way? Is part of the aversion to wireheading an uncanny-valley effect?
Thinking of a creature that evolved or was selected to enjoy being used as a beast of labour by caretakers seems more acceptable the more different its mind is from ours (let's say, for the sake of argument, that it has human- or superhuman-level intelligence).
Why is my sympathy tied to this? Is this a case of my neural circuitry being incapable of emulating what it would be like to be such a creature? Or does the failure lie in the first example, since I am trying to use my mind to emulate a mind that, while otherwise similar, has something vital changed that I can't understand?
I'd expect the Less Wrong community to be unusually independent and averse to hierarchies. But much of humanity is accustomed to obedience, and even regards obedience to authority as a positive good. Some societies are more authoritarian than others, but duty, humility, and respect for superiors are commonly praised as virtues, while disobedience, whining, and malingering are considered bad.
Lots and lots of parents devote a lot of effort to raising their children to be obedient and respectful, not for cynical reasons, but out of love.