When I was reading The Seven Biggest Dick Moves in the History of Gaming, I was struck by the number of people who are strongly motivated to cause misery to others [1], apparently for its own sake. I think the default assumption here is that the primary risk to ems is from errors in programming an AI, but cruelty from other ems, from silicon minds closely based on humans but not ems (is there a convenient term for this?), and from just plain organic humans strikes me as extremely likely.
We're talking about a species where a significant number of people feel better when they torture Sims. I don't think torturing Sims is of any moral importance, but it serves as an indicator of what people like to do. I also wonder how good a simulation has to be before torturing it does matter.
I find it hard to imagine a system that makes uploading people easy yet has security good enough that torturing copies wouldn't be feasible, but maybe I'm missing something.
[1] The article was also very funny. I point this out only because I feel a possibly excessive need to reassure readers that I have normal reactions.
Fair enough, as long as you're not presupposing that our value systems -- which are probably better than "minimize pain" -- are unlikely to have strong anti-torture preferences.
As for the other two points: you might have already argued for them somewhere else, but if not, feel free to say more here. It's at least obvious that anti-em-torture is harder to enforce, but are you thinking it's also probably too hard to even know whether a computation creates a person being tortured? Or that our notion of torture is probably confused with respect to ems (and possibly with respect to us animals too)?
If you express the preferences in terms of tradeoffs, it does not seem likely that the preference against the torture of ems will or should be 'strong.'
Both. It seems difficult to define torture (and to decide what tradeoffs are worthwhile), and even if you could define torture, there seems to be no torture-free way to determine whether or not particular code is torturous.