TheOtherDave comments on an ethical puzzle about brain emulation - Less Wrong

Post author: asr 13 December 2013 09:53PM




Comment author: TheOtherDave 15 December 2013 07:53:51PM 1 point

"I picked the torture example, because I'm not sure what 'John experiences X' really means, once you taboo all the confusing terms about personal identity and consciousness" -- but I think the moral question is a "territory" question, not a "map" question.

I don't quite understand this. If a given event is not an example of John experiencing torture, then how is the moral status of John experiencing torture relevant?

The "all states and only the states of the brain" part confuses me.

I wasn't trying to argue that if this condition is not met, then there is no moral difficulty; I was just trying to narrow my initial claim to one I could make with confidence.

If I remove the "and only" clause I open myself up to a wide range of rabbit holes that confuse my intuitions, such as "we generate the GLUT of all possible future experiences John might have, including both torture and a wildly wonderful life".

"the only firm conclusion I have is that moral intuition doesn't work in these cases."

IME moral intuitions do work in these cases, but they conflict, so it becomes necessary to think carefully about tradeoffs and boundary conditions to come up with a more precise and consistent formulation of those intuitions. That said, changing the intuitions themselves is certainly simpler, but has obvious difficulties.