I get the feeling I'm not quite completing a full application of the definition here, but how does this apply to serious, terrible, "let's just imagine they threw you in hell for a few days" suffering? Sure, one can say that what's being imagined is mostly pain, and that the massive overload of a sensor not designed for such environments is really what we're bothered by, but is there a way that the part of this we usually talk about as 'suffering' fits into the attention-allocation narrative? Or are we talking about two different things here?
All the same, I find this fascinating and am going to experiment with it in my daily life. Looking forward to the non-content-focused post.
I agree: I can't really predict my actions reliably. I think I know the morally correct thing to do, but I'm skeptical of my (or anyone's) ability to make reliable predictions about their actions under extreme stress. As I said, I usually use this on people who seem overly confident in the consistency of their morality and their ability to follow it, as well as on people who question the plausibility of the original problem.
But I do recall the response distribution for this question mirroring that for the second trolley problem; far fewer take...
I've used the trolley problem a lot, at first to show off my knowledge of moral philosophy, but later, once I realized anyone who knows any philosophy has already heard it, to shock friends who think they have a perfect and internally consistent moral system worked out. But I add a twist, which I stole from an episode of Radiolab (which got it from the last episode of MASH), that I think makes it a lot more effective: say you're the mother of a baby in a village in Vietnam, and you're hiding with the rest of the village from the Viet Cong. Your baby start...
I immediately thought, "Kill the baby." No hesitation.
I happen to agree with you on morality being fuzzy and inconsistent. I'm definitely not a utilitarian. I don't approve of policies of torture, for example. It's just that the village obviously matters more than a goddamn baby. The trolley problem, being more abstract, is more confusing to me.
Despite all of the concerns raised in lionhearted's post, and everything that's been written on LW about how analytic types have trouble getting along without getting defensive and prickly, I still think I wouldn't see a response like this just about anywhere else on the internet. Karma points to you.
I like this post a lot; it speaks to some of my concerns about this community and about the sorts of people I'd like to surround myself with. As an analytical/systematizing/whatever type (I got a 35 on the test Roko posted a while back; interpret from that what you will), I felt very strange about all of these rhetorical games for most of my life. It was only when I discovered signalling theory in my study of economics that they started to make sense. If I frame my social interactions as signalling problems, the goals and the ways I should achieve them seem to be...
You said pretty much what I was thinking. My (main) motivation for copying myself would be to make sure a version of the matter/energy pattern that wstrinz instantiates remains in the world in the event that one of us gets run over by a bus. If the copy has to stay completely separate from me, I don't really care about it (and I imagine it doesn't really care about me).
As with many uploading/anthropics problems, I find abusing Many Worlds to be a good way to get at this. Does it make me especially happy that there's a huge number of other me's in other universes? Not really. Would I give you anything, time or money, if you could credibly claim to be able to produce another universe with another me in it? Probably not.
Once I understood the theory, my first question was: has this been explained to any delusional patient with a good grasp of probability theory? I know this sort of thing generally doesn't work, but the n=1 experiment you mention is intriguing. What's more often interesting to me is what sorts of things people come up with to dismiss conflicting evidence, since it sits in a strange place between the completely random and the clever lie. If you have a dragon in your garage about something, you tend to give the most plausible excuses, because you know, deep dow...