IlyaShpitser comments on The raw-experience dogma: Dissolving the “qualia” problem - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (340)
A joke: there is in fact an empirical test for p-zombiehood: whether you agree with Dennett or not.
Well, I agree with Dennett, and I'm pretty sure I'm a p-zombie.
I mean, that's the whole point, right? That p-zombies aren't actually any different from real people?
A p-zombie doesn't feel pain; it just says it does, and it goes through the motions of being in pain. Does that sound like you? If we chop off your hand, will you not actually be feeling anything?
When people say that it's conceivable for something to act exactly as if it were in pain without actually feeling pain, they are using the word "feel" in a way that I don't understand or care about. So, sure: I don't feel pain in that sense. That's not going to stop me from complaining about having my hand chopped off!
OK. But you're using "feel" in a sense I don't understand.
As far as I know, to feel is to detect, or perceive, and pain is positive punishment, in the jargon of operant conditioning. So to say "I feel pain" is to say that I detect a stimulus, and process the information in such a way that (all else equal) I will try to avoid similar circumstances in the future. Not being a psychologist, I don't know much more about pain. But (not being a psychologist) I don't need to know more about pain. And I reject the notion that we can, through introspection, know something more about what it "is like" to be in pain.
I believe it's unethical to inflict pain on people (or animals, unnecessarily), because to hold something in a state of pain is to frustrate its goals. I don't think that it is any qualia associated with pain that makes it bad. Indeed, this seems to lead to morally repugnant conclusions. If we could construct a sophisticated intelligence that can learn by operant conditioning, but somehow remove the qualia, does it become OK to subject it to endless punishment?
I don't think we have to argue whether it is the goal-frustration or the pain-quale that is the bad thing. They are both bad. I don't want to have my goals frustrated painlessly, and I don't want to experience pain even in ways that promote my goals, such as being cattle-prodded every time I slip into akrasia.
It would have been helpful to say why you reject it. If you were in a Mary-style experiment, where you studied pain whilst being anaesthetised from birth, would you maintain that personally experiencing pain for the first time would teach you nothing?
Don't you mean that avoiding pain is one of your goals?
It just seems like the default position. Can you give me a reason to take the idea of qualia seriously in the first place?
Yes.
Yes. Because pain hurts.
Yes. My pains hurt. My food tastes. Voices and music sound like something.
Do you go drink the wine or just read the label? Do you go on holiday or just read the brochure?
Um, those are all tautologies, so I'm not sure how to respond. If we define "qualia" as "what it feels like to have a feeling", then, well - that's just a feeling, right? And "qualia" is just a redundant and pretentious word, whose only intelligible purpose is to make a mystery of something that is relatively well understood (e.g., the "hard problem of consciousness"). No?
Erm, sorry for the snark, but seriously: has talk of qualia, as distinct from mere perceptions, ever achieved any useful or even interesting results? Consciousness will continue to be a mystery to people as long as they refuse to accept any answers - as long as they say: "Okay, you've explained everything worth knowing about how I, as an information processing system, perceive and respond to my environment. And you've explained everything worth knowing about how I perceive my own perceptions of my environment, and perceive those perceptions, and so on ad infinitum - but you still haven't explained why it feels like something to have those perceptions."
Ha! That's actually not far off. But it's because I'm a total nerd who tries to eat healthy and avoid unnecessary expenses - not because of how I feel about qualia. I think that happiness should be a consequence of good things happening, not that happiness is a good thing in itself. So I try to avoid doing things (like drugs) that would decouple my feelings from outcomes in the real world. In fact, if I just did whatever I felt like at any given time, I would end up even less outgoing - less adventurous.
If someone offered me a pill that would merely reduce my qualia experience of pain, I would take it, even if the pain still triggered in me an information process that would cause me to try to avoid similar circumstances in the future, and even if it were impossible to tell observationally that I had taken it, except by asking about my qualia of experiencing pain and other such philosophical topics. That is, if I am going to writhe in agony, I would prefer to have my mind do it for me without my having to experience the agony. If I'm going to never touch a hot stove because of one time when I burned myself, I'd prefer to do that without having the memory of the burn. This idea is not malformed, given what we know about the human brain's lack of introspection on its own actions.
In practice it seems that the only reason that it frustrates a person's goals to receive pain is because they have a goal, "I don't want to be in pain." There are certainly reasons that the pain is adaptive, but it certainly seems from the inside like the most objectionable part is the qualia. If the sophisticated intelligence HAS qualia but doesn't have as a goal avoidance of pain, that suggests your ethical system would be OK to subject it to endless punishment (a sentiment with which I may agree).
Morphine is said to have this effect. Some people who have been prescribed it for pain say that they still feel the pain but it doesn't hurt. But it's illegal in most places except for bona fide medical purposes.
I think that split-brain study shows the opposite of what you think it shows. If you observed yourself to be writhing around in agony, then you would conclude that you were experiencing the qualia of pain. Try to imagine what this would actually be like, and think carefully about what "trying to avoid similar circumstances in the future" actually means. You can't sit still, can't think about anything else. You plead with anyone around to help you - put a stop to whatever is causing this - insisting that they should sympathize with you. The more intense the pain gets, the more desperate you become. If not, then you aren't actually in pain (as I define it) because you aren't trying very hard to avoid the stimulus. I'd sympathize with you. Are you saying you wouldn't sympathize with yourself?
BTW, how do you think I'd respond, if subjected to pain and asked about my "qualia"? By this reasoning, is my pain irrelevant?
I think you have the causation backwards. Pain causes a person to acquire the goal of avoiding whatever the source of the pain is, even if they didn't have that goal before. (Think about someone confidently volunteering to be water-boarded to prove a point, only to immediately change his mind when the torture starts.) That's how I just defined pain above. That's all pain is, as far as I know. Of course, in animals, the pain response happens to be associated with a bunch of biological quirks, but we could recognize pain without those minutiae.
Well, you just described an intelligence that doesn't feel pain. So it doesn't make sense to ask whether it would be OK to inflict pain on it. Could you clarify what it would mean to punish something that has no desire to avoid the punishment?
Taken literally, this suggests that you believe all actors really believe they are the character (at least, if they are acting exactly like the character). Since that seems unlikely, I'm not sure what you mean.
If an actor stays in character his entire life, making friends and holding down a job, in character - and if, whenever he seemed to zone out, you could interrupt him at any time to ask what he was thinking about, and he could give a detailed description of the day dream he was having, in character...
Well then I'd say the character is a lot less fictional than the actor. But even if there is an actor - an entirely different person putting on a show - the character is still a real person. This is no different from saying that a person is still a person, even if they're a brain emulation running on a computer. In this case, the actor is the substrate on which the character is running.
So would you say video game characters "feel" pain?
Probably some of them do (I don't play video games). But they aren't even close to being people, so I don't really care.
Would you say a thermostat feels pain when it can't adjust the temperature towards its preferred setting? Otherwise you might have some strange ideas about the complexity of video game characters. There's a very long way to go in internal complexity from a video game character to, say, a bacterium.
I don't think a program has to be very sophisticated to feel pain. But it does have to exhibit some kind of learning. For example:
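The original code block did not survive in this copy of the thread. Here is a minimal Python sketch of the kind of program the next comment describes; the predicate name `X`, the location space, and the weighting scheme are all invented for illustration:

```python
import random

# Hypothetical "pain" signal: fires at locations 0 through 4.
def X(location):
    return location < 5

def run_agent(steps=2000, n_locations=10, seed=0):
    """Wander over locations, learning to avoid those where X fired.

    Each location's selection weight shrinks every time X returns
    True there, so the agent eventually tends to stay away from
    "painful" locations -- a bare-bones positive-punishment learner.
    """
    rng = random.Random(seed)
    pain_count = [0] * n_locations
    visits = [0] * n_locations
    for _ in range(steps):
        # Weight each location by 1 / (1 + recorded pain events there).
        weights = [1.0 / (1 + pain_count[loc]) for loc in range(n_locations)]
        loc = rng.choices(range(n_locations), weights=weights)[0]
        visits[loc] += 1
        if X(loc):
            pain_count[loc] += 1
    return visits

visits = run_agent()
painful_visits = sum(visits[:5])
safe_visits = sum(visits[5:])
```

After enough steps, `safe_visits` dominates `painful_visits`: the painful locations' weights decay toward zero while the safe locations' weights stay at 1, which is the "eventually tends to avoid" behavior described below.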
This program aimlessly wanders over a space of locations, but eventually tends to avoid locations where X has returned True at past times. It seems obvious to me that X is pain, and that this program experiences pain. You might say that the program experiences less pain than we do, because the pain response is so simple. Or you might argue that it experiences pain more intensely, because all it does is implement the pain response. Either position seems valid, but again it's all academic to me, because I don't believe pain or pleasure are good or bad things in themselves.
To answer your question, a thermostat that is blocked from changing the temperature is frustrated, not necessarily in pain. Although changing the setting on a working thermostat may be pain, because it is a stimulus that causes a change in the persistent behavior of a system, directing it to extricate itself from its current situation.
(edit: had trouble with indentation.)
Ding-ding!