Comment author: jmmcd 26 August 2010 09:53:59PM 0 points [-]

Your entire reply deals with arguments you wish I had made.

Without coming down anywhere on the issue of continued personal identity being an illusion, OR the issue of a sense of purpose in this scenario, I'm trying to point out a purely logical inconsistency:

If uploading for personal immortality is "pursuing an illusion", then so is living: so you should allow inklesspen to murder you.

The other way around: if you want to accomplish things in the future with your current body, then you should be able to conceive of people wanting to accomplish things in their post-upload future. The continuity with the current self is equally illusory in each case, according to you.

Comment author: daedalus2u 27 August 2010 02:43:49AM 0 points [-]

Inklesspen's argument (which you said you agreed with) was that my belief in a lack of personal identity continuity is incompatible with being unwilling to accept a painless death, and that this constitutes a fatal flaw in my argument.

If there are things you want to accomplish, and you believe the most effective way to accomplish them is by uploading what you believe will be a version of your identity into an electronic gizmo, then all I can say is good luck with that. You are welcome to your beliefs.

In no way does that address Inklesspen's argument that my unwillingness to immediately experience a painless death somehow contradicts or disproves my belief in a lack of personal identity continuity, or constitutes a flaw in my argument. I don't associate my “identity” with my consciousness; I associate my identity with my body, and especially with my brain, though it is coupled to the rest of me. That my consciousness is not the same from day to day is not an issue for me. My body very much is alive and is quite good at doing things. It would be a waste to kill it. That it is not static is actually quite a feature: I can learn and do new things.

I have an actual body with which I can do actual things and with which I am doing actual things. All that can be said about the uploading you want to do is that it is very hypothetical. There might be electronic gizmos in the future that might be able to hold a simulation of an identity that might be able to be extracted from a human brain and that electronic gizmo might then be able to do things.

Your belief that you will accomplish things once a version of your identity is uploaded into an electronic gizmo is about you and your beliefs. It is not in the slightest bit about me or my reasoning that a belief in personal identity continuity is an illusion.

People professing a belief in an actual Heaven where they will receive actual rewards doesn't constitute evidence that such beliefs are not illusory either. Such people are usually unwilling to allow themselves to be killed to reach those rewards sooner. That unwillingness does not prove their beliefs are illusory any more than a willingness to be killed would prove they were non-illusory. The members of the Heaven's Gate group believed they were uploading their identities to some kind of Mother Ship electronic gizmo and they were willing to take cyanide to accelerate the process. Their willingness to take poison does not constitute evidence (to me) that their beliefs were not illusory.

Comment author: jmmcd 26 August 2010 12:27:18PM 0 points [-]

There might be useful things I want to accomplish with my post-upload body and brain. I agree with inklesspen: this is a fatal inconsistency.

Comment author: daedalus2u 26 August 2010 07:17:10PM 0 points [-]

I see this as analogous to what some religious people say when they are unable to conceive of a sense of morality or any code of behavior that does not come from their God.

If you are unable to conceive of a sense of purpose that is not attached to a personal sense of continued personal identity, I am not sure I can convince you otherwise.

But I do not quite understand why you consider my ability to conceive of a sense of purpose, without a personal belief in a continued sense of personal identity, to be a "flaw" in my reasoning.

Are you arguing that because some people "need" a personal sense of continued personal identity that reality "has to" be that way?

People made (and still make) similar arguments about the existence of God.

Comment author: Perplexed 25 August 2010 12:33:11AM 4 points [-]

I'm not convinced it is even necessary. For example, I did not learn that I am conscious by using a consciousness detector. Instead, I was taught that I am conscious. It happened in fifth-grade Spelling class. I recall that I learned both the words "conscious" and "unconscious" that day, and I unlearned the non-word "unconscience".

I sometimes think that philosophers pretend too strenuously that we all work out these things logically as adults, when we all know that the reality is that we are catechised into them as children.

Comment author: daedalus2u 25 August 2010 03:11:59AM 0 points [-]

Perplexed, how do you know you do not have a consciousness detector?

Do you see because you use a light detector? Or because you use your eyes? Or because you learned what the word “see” means?

When you understand spoken language do you use a sound detector? A word detector? Do the parts of your brain that you use to decode sounds into words into language into meaning not do computations on the signals those parts receive from your ears?

The only reason you can think a thought is because there are neural structures that are instantiating that thought. If your neural structures were incapable of instantiating a thought, you would be unable to think that thought.

Many people are unable to think many thoughts. It takes many years to train a brain to be able to think about quantum mechanics. I am unable to think accurately about quantum mechanics. My brain does not have the neural structures to do so. My brain also does not have the neural structures to understand Chinese. If it did, I would be able to understand Chinese, which I cannot do.

There has to be a one-to-one correspondence between the neural structures that instantiate a mental activity and the ability to do that mental activity. The brain is not magic; it is chemistry and physics just like everything else. If a brain can do something it is because it has the structures that can do it.

Why is consciousness different from sight or hearing? If consciousness is something that can be detected, there need to be brain structures doing the detecting. If consciousness is not something that can be detected, then what is it that we are talking about? This is very basic stuff. I am just stating logical identities here. I don't understand where the disagreement is coming from.

Comment author: wedrifid 24 August 2010 04:14:56AM *  2 points [-]

I am talking about minimum requirements, not sufficient requirements.

Those two seem to be the same thing in this context.

If you have a different definition I would be happy to consider it.

No, it's as good as any. Yet the 'any' I've seen are all incomplete. Just be very careful, when you are discussing one element of 'consciousness', to come only to conclusions that require that element and not some part of consciousness that is not included in your definition. For example, I don't consider the above definition to be at all relevant to the Fermi paradox.

Comment author: daedalus2u 24 August 2010 06:15:06PM 1 point [-]

To be a car, a machine must at a minimum have wheels. Wheels are not sufficient to make a machine into a car.

To be conscious, an entity must be self-aware of self-consciousness. To be self-aware of self-consciousness, an entity must have a "self-consciousness detector." A self-consciousness detector requires data and computational resources to do the pattern recognition necessary to detect self-consciousness.

What else consciousness requires I don't know, but I know it must require detection of self-consciousness.

Comment author: wedrifid 24 August 2010 12:18:56AM 3 points [-]

Humans can do this too (emulate another entity such that they think they are that entity); I think that is in essence what Stockholm Syndrome causes. Under severe trauma, following dissociation and depersonalization, the self reforms, but in a pattern that matches, identifies with, and bonds to the perpetrator of the trauma. The traumatized person has attempted to emulate the “green-beard persona” to avoid the death and abuse being perpetrated upon them by the person with the “green beard”.

This doesn't seem to be the natural interpretation. Stockholm Syndrome is more or less the typical outcome of human social politics exaggerated somewhat.

Comment author: daedalus2u 24 August 2010 03:45:31PM 0 points [-]

Is there something wrong with my interpretation of Stockholm Syndrome, other than that it is not the “natural interpretation”? Is it inconsistent with anything known about Stockholm Syndrome, how people interact, or how humans evolved?

Would we consider it surprising if humans did have a mechanism to try and emulate a “green beard” if having a green beard became essential for survival?

We know that some people find many green-beard-type reasons for attacking and even killing other humans. Race, ethnicity, religion, sexual orientation, gender, and so on are all reasons for hating and even killing other humans. How do the victims prevent themselves from being victimized? Usually by obscuring their identity, by attempting to display the “green beard” the absence of which brings attack.

Stockholm Syndrome happens in a short period of time, so it is easier to study than the “poser” habits that occur over a lifetime. Is it fundamentally different, or is it just one point on a spectrum?

Comment author: Perplexed 24 August 2010 12:54:21PM 8 points [-]

I'm not sure he realized they were machines, though.

Comment author: daedalus2u 24 August 2010 01:27:54PM 0 points [-]

Yes, and some people today don't realize that the brain does computations on sensory input in order to accomplish pattern recognition, and that without that computation there is no pattern recognition and no perception. Of anything.

Comment author: Oscar_Cunningham 24 August 2010 09:20:10AM 5 points [-]

Better approximation: Don't write posts about consciousness unless you have read about mysterious answers to mysterious questions, and you've had an insight that makes consciousness seem less mysterious than before.

Comment author: daedalus2u 24 August 2010 01:22:48PM 1 point [-]

I had read Mysterious Answers to Mysterious Questions. I think I do have an explanation that makes consciousness seem less mysterious and which does not introduce any additional mysteries. Unfortunately, I seem to be the only one who appreciates that.

Maybe if I had started out to discuss the computational requirements of the perception of consciousness there would have been less objection. But I don't see any way to differentiate between perception of consciousness and consciousness. I don't think you can have one without the other.

Minimum computation and data requirements for consciousness.

-13 daedalus2u 23 August 2010 11:53PM

Consciousness is a difficult question because it is poorly defined and is the subjective experience of the entity experiencing it. Because an individual experiences their own consciousness directly, that experience is always richer and more compelling than the perception of consciousness in any other entity; your own consciousness always seems more “real” and richer than the would-be consciousness of another entity.

Because the experience of consciousness is subjective, we can never “know for sure” that an entity is actually experiencing consciousness. However, certain computational functions must be accomplished for consciousness to be experienced. I am not attempting to discuss all the computational functions that are necessary; this is just a first step at enumerating some of them and considering the implications.

First, an entity must have a “self detector”: a pattern-recognition computational structure which it uses to recognize its own state of being an entity, and of being the same entity over time. If an entity is unable to recognize itself as an entity, then it can't be conscious that it is an entity. To rephrase Descartes, "I perceive myself to be an entity, therefore I am an entity." It is possible to be an entity and not perceive that one is an entity; this happens in humans, but rarely. Other computational structures may be necessary as well, but without the ability to recognize itself as an entity, an entity cannot be conscious.
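To make the idea concrete, here is a toy sketch (my own illustration, not part of the original post) of the minimal "self detector" described above: an entity that classifies an observed signal as "self" when it matches the pattern of its own past outputs. The class and method names are invented for illustration; real self-recognition would of course involve far richer pattern recognition than a simple membership test.

```python
class Entity:
    """Toy entity with a minimal 'self detector'."""

    def __init__(self, name):
        self.name = name
        self.history = []  # record of this entity's own past outputs

    def act(self):
        # Deterministic toy behavior: the output signal depends only on
        # the entity's identity, standing in for its characteristic pattern.
        signal = sum(ord(c) for c in self.name) % 100
        self.history.append(signal)
        return signal

    def detects_self(self, observed_signal):
        # The 'self detector': does the observed signal match the pattern
        # of my own past outputs?
        return observed_signal in self.history


a = Entity("alice")
b = Entity("bob")
sig_a = a.act()
sig_b = b.act()
print(a.detects_self(sig_a))  # True: recognizes its own output
print(a.detects_self(sig_b))  # False: another entity's output
```

The point of the sketch is only that even this trivial form of self-recognition requires stored data (the history) and computation (the comparison) — the "data and computation resources" the post argues any self detector must have.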

Comment author: gwern 02 August 2010 04:02:46AM 0 points [-]

Yes yes, this is an argument for suicide rates never going to zero - but again, the basic theory that suicide is inversely correlated, even partially, with quality of life would seem to be disproved by this point.

Comment author: daedalus2u 02 August 2010 12:53:21PM 3 points [-]

I think the source of the misconception is that what actually determines “quality of life” is not correlated with things like affluence. People like to believe (pretend?) that it is, and by ever striving for more affluence they feel that they are somehow improving their “quality of life”.

When someone is depressed, their “quality of life” is quite low. That “quality of life” can only be improved by resolving the depression, not by adding the bells and whistles of affluence.

How to resolve depression is not well understood. A large part of the problem is that people who have never experienced depression don't understand what it is, and believe that things like more affluence will resolve it.

Comment author: gwern 01 August 2010 05:27:06PM *  2 points [-]

My counterpoint to the above would be that if suicide rates are such a good metric, then why can they go up with affluence? (I believe this applies not just to wealthy nations (e.g. Japan, Scandinavia) but to individuals as well, though I wouldn't hang my hat on the latter.)

Comment author: daedalus2u 01 August 2010 05:58:07PM 3 points [-]

Suicide rates are a measure of depression, not of how good life is. Depression can hit people even when they otherwise have a very good life.
