Comments on Ideal Advisor Theories and Personal CEV - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (34)
What about moral objections to the creation of a multitude of agents for the purposes of evaluation?
http://wiki.lesswrong.com/wiki/Nonperson_predicate (open problem!)
Relevant excerpt from Iain Banks's new Culture novel, The Hydrogen Sonata:
It seems to me that prohibitions on mistreating sims might be the only example of a reasonable moral stricture with no apparent up-side -- it's just avoiding a down-side.
Decent treatment of sentients at your own reality level increases opportunities for cooperation and avoids cycles of revenge, neither of which applies to sims... unless you also have an obligation to let them join your society.
Not necessarily. Think, for example, of the controversy over linking FPS games (the shooter variety, not the one with moving pictures per second) and real-life violence. Now, I'm not advocating such a link here at all, but it is conceivable that how you treat sims carries over, to some extent, to how you treat sentients at your own reality level, no matter how minor the effect. That would yield a potential up-side.
At least in theory, this could be tested. We have the real world example of people who torture sims (something which seems more psychologically indicative to me than first-person shooter games). It might be possible to find out whether they're different from people who play Sim City but don't torture sims, and also whether torturing sims for the fun of it changes people.
Yes, although it would be really, really strange if there were no effect whatsoever -- if there were, in fact, any activity at all that you could engage in long-term without it in some way or form shaping your brain. This is anthropomorphizing, of course; who knows what will or won't affect far-future individuals. Still, we could measure the effect size for current humans, relative to which we could define some threshold at which we'd call the effect non-negligible.
I haven't read THS yet, but I'm surprised that even a civilization written by Banks didn't think that the correct response to finding oneself as a "vengeful god" is to create an afterlife.
They explicitly don't address that: