
Edit: the below paragraphs are wrong. See the comments for an explanation.

 

Some people believe that the consciousness currently in one's body is the "same" consciousness as the one that was in one's body in the past and the one that will be in it in the future, but a "different" consciousness from those in other bodies. In this post I dissolve the question.

The question is meaningless because the answer doesn't correspond to any physical state in the universe and in no way influences or is influenced by sensory experiences. If one's consciousness suddenly became a totally different one, we know of no quantum particles that would change. Furthermore, swapping consciousnesses would make no changes to what is perceived. E.g. if one agent perceives p at time t and p' at the next time t+1, and another agent perceives q at time t and q' at time t+1, then if their consciousnesses are "swapped," the percepts would still be identical: p and q will be perceived at time t, and p' and q' will be perceived at t+1. One could argue that the percepts did change because the consciousness-swapping changed what a particular consciousness at time t will perceive at t+1, but that presupposes that a future consciousness will be in some meaningful way the "same" consciousness as the current one! Thus, the statement that two consciousnesses are the same consciousness is meaningless.
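The swap argument can be sketched as a toy model (the percept names and the dictionary structure here are illustrative assumptions, not anything from the post): each body is a mapping from times to percepts, and "swapping consciousnesses" merely relabels which consciousness is assigned to which body.

```python
# Toy model: each body is a mapping from times to percepts.
# "Swapping consciousnesses" relabels which consciousness is "in" which
# body, but the percept observed in each body at each time is unchanged.

# Hypothetical percept streams for two bodies (names are illustrative).
body_A = {0: "p", 1: "p_prime"}
body_B = {0: "q", 1: "q_prime"}

# Two assignments of consciousness labels to bodies: original and swapped.
# The labels "c1"/"c2" are unobservable bookkeeping, not physical state.
assignment = {"c1": body_A, "c2": body_B}
swapped = {"c1": body_B, "c2": body_A}

def percepts_by_body(labeling):
    """Collect what is perceived in each body at each time, ignoring labels."""
    return sorted(tuple(sorted(body.items())) for body in labeling.values())

# The observable record is identical under both labelings:
assert percepts_by_body(assignment) == percepts_by_body(swapped)
```

The point of the sketch is that no observable quantity distinguishes the two labelings; the difference lives entirely in the labels.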

Can you find any flaws in my reasoning?

 

 

23 comments

Useful heuristic: if a lot of smart people find something to be a hard problem, and you don't think it is a hard problem, you should assign most of the probability mass to you missing something.

[-]Shmi 120

Sorry to break it to ya, but you not only failed to dissolve the question, you failed to even formulate it. Consider reading the appropriate sequence and the many posts here discussing the issue to educate yourself on the basics.

To start, the sentence

If one's consciousness suddenly became a totally different one, we know of no quantum particles that would change.

is either false or meaningless, depending on your definitions and assumptions.

Could you link to me the sequence, and explain how I failed to formulate the question?

[-]Val 40

First you have to spend at least a few sentences on how you define "consciousness"; otherwise you risk this discussion shifting into the realm of semantics, as in the classic paradox of the sound of a tree falling in a deserted forest.

I can't really define consciousness; doing so would be akin to trying to define red. The best I can do is point to something that is conscious (such as myself) and say "conscious" and then point to something that seems unlikely to be conscious (such as a rock) and say "not conscious."

The best I can do is point to something that is conscious (such as myself) and say "conscious" and then point to something that seems unlikely to be conscious (such as a rock) and say "not conscious."

That's not enough to be able to meaningfully answer the question "If one's consciousness suddenly became a totally different one, we know of no quantum particles that would change".

Practically, I can distinguish a state where my heartbeat is within my conscious awareness from a state where it isn't. I can also distinguish the state where the heartbeat of another person is within my awareness and the state where it isn't.

I can distinguish between having the person sitting next to me in a lecture within my awareness and not having them within my awareness.

In both cases there are practical consequences — likely consequences that you haven't observed, because you don't have control over the different states and therefore can't experiment with them.

I don't follow. Why are you talking about conscious awareness?

What do you think consciousness is about when it's not about being aware and perceiving something?

Again, I can't really describe consciousness, so I'll give an example. Computers are aware of the user's keystrokes, in the sense that there is some signal in the computer that is formed by keystrokes and processed accordingly, but the computer isn't necessarily conscious.

My body is always aware of its heartbeat the same way that a computer is always aware of the user's keystrokes.

That's not the awareness I'm talking about. I'm talking about conscious awareness, where it's possible to be aware of the heartbeat and possible not to be.

Yes, I think I'm thinking about the same type of consciousness/awareness you are. Where were we going with this?

Yes, I think I'm thinking about the same type of consciousness/awareness you are.

Then why are you speaking about the way a computer is aware about all keystrokes that the keyboard sends? That's not what I'm talking about.

The keystroke example was to demonstrate what I didn't mean when I said consciousness.

Okay, sorry on that part.

[-]Val 30

I'm sorry if I was not clear enough. I meant to ask how you yourself understand the term "consciousness". For example, sound can be defined as a compression wave, or as a sensory experience.

If you define consciousness as in your example, then consciousness should always persist as long as you are considered distinct from a non-living object. However, it is possible to come up with a definition of consciousness under which being asleep would make consciousness non-persistent.

I didn't realize there were multiple definitions of consciousness. Where can I learn what they are?

trying to define red

Red is the color of surfaces emitting/transmitting/scattering plenty of low-energy visible light but little high-energy visible light. Doesn't sound too tricky to me. What am I missing?

Again, I think this is more vocabulary confusion. When you said red, I think you meant the causes of one experiencing what we call "seeing red". When I said red, I meant one's subjective experience of red, as opposed to the subjective experience of blue. If one were changed so that photons of the wavelength that's normally seen as red were instead seen as what's normally seen as blue, there would be a difference in what is subjectively experienced, even though one can't rigorously define it.

[-]Shmi 00

Consider asking a less trivial question, like "Can a machine be conscious?"

And consider reading http://www.scottaaronson.com/blog/?p=1951 and http://edge.org/responses/what-do-you-think-about-machines-that-think

As for a better attempt to dissolve a related question, see http://lesswrong.com/lw/5n9/seeing_red_dissolving_marys_room_and_qualia/

It would be helpful to start this sort of discussion with a working definition of consciousness, as the term has various meanings in various contexts and still carries metaphysical connotations.

If one's consciousness suddenly became a totally different one, we know of no quantum particles that would change.

If we take the viewpoint that consciousness is a function of the brain's internal state, then swapping one consciousness with another should correspond to the appropriate change inside the brain.

Furthermore, swapping consciousnesses would make no changes to what is perceived. E.g. if one agent perceives p and time t and p' at the next time t+1, and another agent perceives q at time t and q' at time t+1, then if their consciousnesses are "swapped," the percepts would still be identical: p and q will be perceived at time t, and p' and q' will be perceived at t+1.

Observing an external event creates a change in the brain state that depends both on the actual event and on the current brain state. Thus, two people can observe the same event and come away with completely different perceptions (politics provides an illustrative example). If a change in consciousness implies a change in brain state, then we would not expect the brain state after the observation to be the same in both cases.

[-][anonymous] 10

If we define consciousness as just "awareness" in general, which is admittedly a vague definition, then consciousness persists as long as we are awake. If your definition includes thought processes as distinct elements of consciousness, then the states do correspond to physical states; they are the physical state of the brain. With a better definition of consciousness we could more easily dissolve the question, but until you do that we don't know what question we're trying to dissolve.

If one's consciousness suddenly became a totally different one, we know of no quantum particles that would change.

Changing someone's personality would cause many changes to the brain, since the functioning of the brain is responsible for it. If by this you mean their "identity" changes rather than any actual measurable physical or mental characteristics, then what do you mean by identity? Does it correspond to anything in reality?

This reminds me of the "why am I me and not somebody else" question, the answer to which is that "you" are the result of a mind growing up in that particular body/environment, and "you" can't have grown up in a different body because a mind growing up in a different body would be a different mind. There's no mystical identity that can transfer between people or through time (which I think is what you were actually trying to prove in your post, and which requires further argument to prove).

Swapping consciousnesses would make no changes to what is perceived.

Again, what do you mean by consciousness; a person's mind or some sort of "identity"? There are entire movies whose premises are based on swapping minds and the very apparent changes that result.

E.g. if one agent perceives p at time t and p' at the next time t+1, and another agent perceives q at time t and q' at time t+1, then if their consciousnesses are "swapped," the percepts would still be identical: p and q will be perceived at time t, and p' and q' will be perceived at t+1.

Then what are we swapping? If I swap subject p's brain with subject q's, then how can I call the consciousness currently in q's body q?

I agree that reasoning along lines similar to this can lead one to reject the concept of numerical identity of consciousness, but one needs to define what they mean by consciousness first. If you are thinking more along the lines of identity rather than consciousness per se, there's a lot of information about the diachronic problem.

You would benefit from reading that post you linked to about dissolving questions. What you seem to be doing is like asking "does free will exist?" and trying to argue that it doesn't. To dissolve the question is to precisely show the thought process that produces the question.

You're right. I tried to show the question was meaningless; I didn't dissolve it. I somehow misread it on my first reading. Oddly, though, Eliezer never explicitly stated in the post what dissolving the question meant.