Comment author: RobbBB 26 August 2013 08:51:36PM *  1 point [-]

Qualia are the specific qualitative properties of our experience, as accessed by first-person introspection. The raw redness of the red you notice in your field of vision, for instance, as contrasted with the functional state of visually detecting light of wavelength 620–740 nm that an alien building a cognitive model of your behavior might initially construct. (Which is not to say that those two properties are non-identical. But if the two are identical, this must be discovered, not just stipulated.)

There are several popular arguments for the irreducibility of qualia (and, more generally, against the reliability of introspection as a method for directly reading off part of our world's ontology), which has made them controversial posits.

In a conversation about mindstuff, what Eliezer calls 'Reductionism' is what I'd call 'physicalism' — the doctrine that the mental (unlike physicsy stuff) is non-fundamental, that it can in principle be fully explained in non-mental terms. The 'reductionism' I think about (which I'll distinguish by making it minuscule) is the more specific doctrine that mental stuff — in this case, phenomenal properties, qualia — exists and is reducible. So a physicalist has to either eliminate or reduce every mental posit.

Eliminativists might insist that we reject qualia because the evidence for them is strictly first-person and introspective (phenomenological) rather than sensory and in-the-world (scientific). Or an eliminativist might think that we have strong prior grounds for accepting physicalism, but that reductionism is a doomed project, say, because of the Mary's Room argument.

Comment author: Juno_Watt 26 August 2013 09:09:58PM 0 points [-]

I agree with most of this, although I am not sure that the way strawberries taste to me is a posit.

Comment author: ESRogs 26 August 2013 03:07:43PM 1 point [-]

If Charles's qualia have changed, that will be noticeable to Charles -- introspection is hardly necessary, since the external world will look different! But Charles won't report the change.

I don't think I understand what you're saying here; what kind of change could you notice but not report?

Comment author: Juno_Watt 26 August 2013 03:32:38PM *  -1 points [-]

If a change to the way your functionality is implemented alters how your consciousness seems to you, your consciousness will seem different to you. But if your functionality is preserved, you won't be able to report it. You will report that tomatoes are red even if they look grue or bleen to you. (You may also not be able to cognitively access--remember or think about--the change, if that access is part of the preserved functionality. But if your experience changes, you can't fail to experience it.)
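The point about functional preservation can be made concrete with a toy sketch (all class and attribute names here are hypothetical, purely for illustration): two agents encode the same stimulus differently internally, yet produce identical reports, so no behavioural test distinguishes them.

```python
# Toy illustration: two agents whose internal colour encodings differ,
# but whose input-output behaviour (their reports) is identical.

class NormalAgent:
    """Encodes tomato-light internally as 'red'."""
    def perceive(self, stimulus):
        self.internal_state = {"tomato": "red"}[stimulus]
    def report(self):
        return "red"  # the report tracks the stimulus, not the encoding

class InvertedAgent:
    """Encodes the same stimulus internally as 'grue'."""
    def perceive(self, stimulus):
        self.internal_state = {"tomato": "grue"}[stimulus]
    def report(self):
        return "red"  # functionally equivalent report

a, b = NormalAgent(), InvertedAgent()
a.perceive("tomato")
b.perceive("tomato")
assert a.report() == b.report()              # indistinguishable from outside
assert a.internal_state != b.internal_state  # yet different "implementations"
```

The sketch only shows that report-equivalence is compatible with internal difference; it takes no stand on whether the internal difference would amount to a difference in qualia.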

Comment author: FeepingCreature 26 August 2013 02:03:46PM *  2 points [-]

Because the neural substitution preserves functional equivalence, Charles will report the same qualia whether or not he still has them.

Implying that qualia can be removed from a brain while maintaining all internal processes that sum up to cause talk of qualia, without deliberately replacing them with a substitute. In other words, your "qualia" are causally impotent and I'd go so far as to say, meaningless.

Are you sure you read Eliezer's critique of Chalmers? This is exactly the error that Chalmers makes.

It may also help you to read making beliefs pay rent and consider what the notion of qualia actually does for you, if you can imagine a person talking of qualia for the same reason as you while not having any.

Comment author: Juno_Watt 26 August 2013 03:13:11PM *  -1 points [-]

Implying that qualia can be removed from a brain while maintaining all internal processes that sum up to cause talk of qualia, without deliberately replacing them with a substitute. In other words, your "qualia" are causally impotent and I'd go so far as to say, meaningless.

Doesn't follow. Qualia aren't causing Charles's qualia-talk, but that doesn't mean they aren't causing mine. Kidney dialysis machines don't need nephrons, but that doesn't mean nephrons are causally idle in kidneys.

The epiphenomenality argument works for atom-by-atom duplicates, but not in WBE and neural-replacement scenarios. If identity theory is true, qualia have the causal powers of whatever physical properties they are identical to, and changing the physical substrate could remove or change the qualia.

Comment author: asparisi 26 August 2013 02:06:09PM 1 point [-]

Accounting for qualia and starting from qualia are two entirely different things. Saying "X must have qualia" is unhelpful if we cannot determine whether or not a given thing has qualia.

Qualia can perhaps best be described, briefly, as "subjective experience." So what do we mean by 'subjective' and 'experience'?

If by 'subjective' we mean 'unique to the individual position' and by 'experience' we mean 'alters its internal state on the basis of some perception' then qualia aren't that mysterious: a video camera can be described as having qualia if that's what we are talking about. Of course, many philosophers won't be happy with that sort of breakdown. But it isn't clear that they will be happy with any definition of qualia that allows for it to be distinguished.
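Under that deflationary reading, the video-camera case can be made literal with a toy sketch (class and attribute names are hypothetical): 'subjective' as tied to a unique position, 'experience' as internal state altered by perception.

```python
# Toy reading of the deflationary definition above: a camera has a
# unique position ("subjective") and updates internal state from
# perception ("experience"), so it trivially satisfies both clauses.

class Camera:
    def __init__(self, position):
        self.position = position   # unique to this individual
        self.internal_state = None

    def perceive(self, light):
        # internal state altered on the basis of some perception
        self.internal_state = ("recorded", light, self.position)

cam = Camera(position=(0, 0))
cam.perceive("620nm red")
assert cam.internal_state == ("recorded", "620nm red", (0, 0))
```

Which is exactly why philosophers who take qualia seriously will reject this breakdown as missing what they mean.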

If you want it to be something mysterious, then you aren't even defining it. You are just being unhelpful: like if I tell you that you owe me X dollars, without giving you any way of determining X. If you want to break it down into non-mysterious components or conditions, great. What are they? Let me know what you are talking about, and why it should be considered important.

At this point, it's not a matter of ruling anything out as incoherent. It's a matter of trying to figure out what sort of thing we are talking about when we talk about consciousness and seeing how far that label applies. There doesn't appear to be anything inherently biological about what we are talking about when we are talking about consciousness. This could be a mistake, of course: but if so, you have to show it is a mistake and why.

Comment author: Juno_Watt 26 August 2013 02:39:32PM *  -1 points [-]

Accounting for qualia and starting from qualia are two entirely different things. Saying "X must have qualia" is unhelpful if we cannot determine whether or not a given thing has qualia.

We can tell that we have qualia, and our own consciousness is the natural starting point.

"Qualia" can be defined by giving examples: the way anchiovies taste, the way tomatos look, etc.

You are making heavy weather of the indefinability of some aspects of consciousness, but the flipside of that is that we all experience our own consciousness. It is not a mystery to us. So we can substitute "inner ostension" for abstract definition.

There doesn't appear to be anything inherently biological about what we are talking about when we are talking about consciousness.

OTOH, we don't have examples of non-biological consciousness.

Comment author: ESRogs 26 August 2013 03:54:24AM 4 points [-]

Did you read all the way to the dialogue containing this hypothetical?

Albert: "Suppose I replaced all the neurons in your head with tiny robotic artificial neurons that had the same connections, the same local input-output behavior, and analogous internal state and learning rules."

The following discussion seems very relevant indeed.

Comment author: Juno_Watt 26 August 2013 01:07:58PM 1 point [-]

I don't see anything very new here.

Charles: "Uh-uh! Your operation certainly did disturb the true cause of my talking about consciousness. It substituted a different cause in its place, the robots. Now, just because that new cause also happens to be conscious—talks about consciousness for the same generalized reason—doesn't mean it's the same cause that was originally there."

Albert: "But I wouldn't even have to tell you about the robot operation. You wouldn't notice. If you think, going on introspective evidence, that you are in an important sense "the same person" that you were five minutes ago, and I do something to you that doesn't change the introspective evidence available to you, then your conclusion that you are the same person that you were five minutes ago should be equally justified. Doesn't the Generalized Anti-Zombie Principle say that if I do something to you that alters your consciousness, let alone makes you a completely different person, then you ought to notice somehow?"

How does Albert know that Charles's consciousness hasn't changed? It could have changed because of the replacement of protoplasm by silicon. And Charles won't report the change, because of the functional equivalence of the change.

Charles: "Introspection isn't perfect. Lots of stuff goes on inside my brain that I don't notice."

If Charles's qualia have changed, that will be noticeable to Charles -- introspection is hardly necessary, since the external world will look different! But Charles won't report the change. "Introspection" is being used ambiguously here, between what is noticed and what is reported.

Albert: "Yeah, and I can detect the switch flipping! You're detecting something that doesn't make a noticeable difference to the true cause of your talk about consciousness and personal identity. And the proof is, you'll talk just the same way afterward."

Albert's comment is a non sequitur. That the same effect occurs does not prove that the same cause occurs; there can be multiple causes of reports like "I see red". Because the neural substitution preserves functional equivalence, Charles will report the same qualia whether or not he still has them.

Comment author: asparisi 26 August 2013 11:52:45AM 1 point [-]
  1. You've chosen one of the easier aspects of consciousness: self-awareness rather than, e.g., qualia.

I cover this a bit when I talk about awareness, but I find qualia to often be used in such a way as to obscure what consciousness is rather than explicate it. (If I tell you that consciousness requires qualia, but can't tell you how to distinguish things which have qualia from things which do not, along with good reason to believe that this way of distinguishing is legitimate, then rocks could have qualia.)

  1. The "necessarily biological" could be aposteriori nomic necessity, not apriori conceptual necessity, which is the only kind you knock down in your comment.

If the defenders of a biological theory of consciousness want to introduce an empirically testable law to show that consciousness requires biology then I am more than happy to let them test it and get back to us. I don't feel the need to knock it down, since when it comes to a posteriori nomic necessity, we use science to tell whether it is legitimate or not.

Comment author: Juno_Watt 26 August 2013 12:14:35PM -1 points [-]

If we want to understand how consciousness works in humans, we have to account for qualia as part of it. Having an understanding of human consciousness is the best practical basis for deciding whether other entities have consciousness. OTOH, starting by trying to decide which entities have consciousness is unlikely to lead anywhere.

The biological claim can be ruled out if it is incoherent, but not for being unproven, since the functional/computational alternative is also unproven.

Comment author: Eliezer_Yudkowsky 26 August 2013 01:03:57AM 3 points [-]
Comment author: Juno_Watt 26 August 2013 01:24:29AM -1 points [-]

Why? That doesn't argue any point relevant to this discussion.

Comment author: thomblake 01 August 2009 11:24:38PM 2 points [-]

you can't simply say that because we cannot use qualia to predict anything at this point, then you can just ignore qualia

In fact, I can and did. Furthermore, if a hypothesis doesn't predict anything, then it is a meaningless hypothesis; it cannot be tested, and it is not useful even in principle. An explanation that does not suggest a prediction is no explanation at all.

Avoid mysterious answers to mysterious questions

Comment author: Juno_Watt 26 August 2013 01:11:04AM 0 points [-]

"qualia" labels part of the explanandum, not the explanation.

Comment author: nshepperd 26 August 2013 12:21:23AM 10 points [-]

The argument against p-zombies is that the reason for our talk of consciousness is literally our consciousness, and hence there is no reason for a being not otherwise deliberately programmed to reproduce talk about consciousness to do it if it weren't conscious. It is a corollary of this that a zombie, which is physically identical, and therefore not deliberately programmed to imitate talk of consciousness but must still reproduce it, must talk about consciousness for the same reason we do. That is, the zombies must be conscious.

A faithful synaptic-level silicon WBE, if it independently starts talking about consciousness at all, must be talking about it for the same reason as us (i.e. consciousness), since it hasn't been deliberately programmed to fake consciousness-talk. Or else, something extremely unlikely has happened.

Note that supposing that how the synapses are implemented could matter for consciousness, even while the macro-scale behaviour of the brain is identical, is equivalent to supposing that consciousness doesn't actually play any role in our consciousness-talk, since David Chalmers would write just as many papers on the Hard Problem regardless of whether we flipped the "consciousness" bit in every synapse in his brain.

Comment author: Juno_Watt 26 August 2013 01:02:46AM *  -2 points [-]

The argument against p-zombies is that the reason for our talk of consciousness is literally our consciousness, and hence there is no reason for a being not otherwise deliberately programmed to reproduce talk about consciousness to do it if it weren't conscious.

A functional duplicate will talk the same way as whomever it is a duplicate of.

A faithful synaptic-level silicon WBE, if it independently starts talking about it at all, must be talking about it for the same reason as us (i.e. consciousness),

A WBE of a specific person will respond to the same stimuli in the same way as that person. Logically, that will be for the reason that it is a duplicate. Physically, the "reason", or ultimate cause, could be quite different, since the WBE is physically different.

since it hasn't been deliberately programmed to fake consciousness-talk.

It has been programmed to be a functional duplicate of a specific individual.

Or, something extremely unlikely has happened.

Something unlikely to happen naturally has happened. A WBE is an artificial construct which is exactly the same as a person in some ways, and radically different in others.

Note that supposing that how the synapses are implemented could matter for consciousness, even while the macro-scale behaviour of the brain is identical, is equivalent to supposing that consciousness doesn't actually play any role in our consciousness-talk,

Actually it isn't, for reasons that are widely misunderstood: kidney dialysis machines don't need nephrons, but that doesn't mean nephrons are causally idle in kidneys.
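The nephron point can be put as a toy sketch (function names are hypothetical): two mechanisms produce the same output, so matching effects cannot show that either mechanism is causally idle in the system that actually uses it.

```python
# Toy illustration: same effect, different causes. A kidney filters via
# nephron-like units; a dialysis machine filters via a membrane. The
# outputs match, yet nephrons are not causally idle in the kidney.

def kidney(blood):
    # cause: many nephron-like units, each passing non-waste components
    return [component for component in blood if component != "waste"]

def dialysis_machine(blood):
    # cause: a single semi-permeable membrane, no nephrons involved
    return list(filter(lambda component: component != "waste", blood))

sample = ["cell", "waste", "cell", "waste"]
assert kidney(sample) == dialysis_machine(sample)  # same effect...
# ...from different mechanisms, so the match proves nothing about
# whether nephrons (or qualia) do causal work in the original system.
```

The analogy, of course, only shows the inference is invalid in general; it does not itself establish that qualia do causal work.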

Comment author: Eliezer_Yudkowsky 25 August 2013 11:09:50PM 5 points [-]
Comment author: Juno_Watt 25 August 2013 11:23:22PM 5 points [-]

The argument against p-zombies is that there is no physical difference that could explain the difference in consciousness. That does not extend to silicon WBEs or AIs.
