asr comments on an ethical puzzle about brain emulation - Less Wrong Discussion
I picked the torture example because I'm not sure what "John experiences X" really means once you taboo all the confusing terms about personal identity and consciousness -- but I think the moral question is a "territory" question, not a "map" question.
The "all states and only the states of the brain" part confuses me. Suppose we do time-slicing; the computer takes turns simulating John and simulating Richard. That can't be a moral distinction. I suspect it will take some very careful phrasing to find a definition for "all states and only those states" that isn't obviously wrong.
Yeah. After thinking about this for a couple of days, the only firm conclusion I have is that moral intuition doesn't work in these cases. I have a slight worry that thinking too hard about these sorts of hypotheticals will damage my moral intuition for the real-world cases -- but I don't think this is anything more than a baby basilisk at most.
I don't quite understand this. If a given event is not an example of John experiencing torture, then how is the moral status of John experiencing torture relevant?
I wasn't trying to argue that if this condition is not met, then there is no moral difficulty; I was just trying to narrow my initial claim to one I could make with confidence.
If I remove the "and only" clause I open myself up to a wide range of rabbit holes that confuse my intuitions, such as "we generate the GLUT of all possible future experiences John might have, including both torture and a wildly wonderful life".
IME moral intuitions do work in these cases, but they conflict, so it becomes necessary to think carefully about tradeoffs and boundary conditions to come up with a more precise and consistent formulation of those intuitions. That said, changing the intuitions themselves is certainly simpler -- but it has obvious difficulties.