Comment author: wedrifid 24 August 2010 04:14:56AM *  2 points [-]

I am talking about minimum requirements, not sufficient requirements.

Those two seem to be the same thing in this context.

If you have a different definition I would be happy to consider it.

No, it's as good as any. Yet the 'any' I've seen are all incomplete. Just be very careful, when you are discussing one element of 'consciousness', to come only to conclusions that require that element and not some part of consciousness that is not included in your definition. For example, I don't consider the above definition to be at all relevant to the Fermi paradox.

Comment author: daedalus2u 24 August 2010 06:15:06PM 1 point [-]

To be a car, a machine must at a minimum have wheels. But wheels are not sufficient to make a machine into a car.

To be conscious, an entity must be aware of its own self-consciousness. To be aware of self-consciousness, an entity must have a "self-consciousness detector". A self-consciousness detector requires data and the computational resources to do the pattern recognition necessary to detect self-consciousness.

What else consciousness requires I don't know, but I know it must require detection of self-consciousness.

Comment author: wedrifid 24 August 2010 12:18:56AM 3 points [-]

Humans can do this too (emulate another entity such that they think they are that entity), I think that is in essence what Stockholm Syndrome causes. Under severe trauma, following dissociation and depersonalization, the self reforms, but in a pattern that matches, identifies with, and bonds to the perpetrator of the trauma. The traumatized person has attempted to emulate the “green-beard persona” to avoid death and abuse being perpetrated upon them by the person with the “green beard”.

This doesn't seem to be the natural interpretation. Stockholm Syndrome is more or less the typical outcome of human social politics exaggerated somewhat.

Comment author: daedalus2u 24 August 2010 03:45:31PM 0 points [-]

Is there something wrong with my interpretation of Stockholm Syndrome other than that it is not the “natural interpretation”? Is it inconsistent with anything known about Stockholm Syndrome, how people interact, or how humans evolved?

Would we consider it surprising if humans did have a mechanism to try to emulate a “green beard” when having a green beard became essential for survival?

We know that some people find many green-beard-type reasons for attacking and even killing other humans. Race, ethnicity, religion, sexual orientation, gender, and so on are all reasons for hating and even killing other humans. How do the victims prevent themselves from being victimized? Usually by obscuring their identity, by attempting to display the “green beard” the absence of which brings attack.

Stockholm Syndrome happens in a short period of time, so it is easier to study than the “poser” habits that occur over a lifetime. Is it fundamentally different, or is it just one point on a spectrum?

Comment author: Perplexed 24 August 2010 12:54:21PM 8 points [-]

I'm not sure he realized they were machines, though.

Comment author: daedalus2u 24 August 2010 01:27:54PM 0 points [-]

Yes, and some people today don't realize that the brain does computations on sensory input in order to accomplish pattern recognition, and without that computation there is no pattern recognition and no perception. Of anything.

Comment author: Oscar_Cunningham 24 August 2010 09:20:10AM 5 points [-]

Better approximation: Don't write posts about consciousness, unless you have read about mysterious answers to mysterious questions, and you've had an insight that makes consciousness seem less mysterious than before.

Comment author: daedalus2u 24 August 2010 01:22:48PM 1 point [-]

I had read Mysterious Answers to Mysterious Questions. I think I do have an explanation that makes consciousness seem less mysterious and which does not introduce any additional mysteries. Unfortunately I seem to be the only one who appreciates that.

Maybe if I had started out to discuss the computational requirements of the perception of consciousness there would have been less objection. But I don't see any way to differentiate between perception of consciousness and consciousness. I don't think you can have one without the other.

Comment author: nawitus 24 August 2010 09:20:54AM 0 points [-]

Consciousness actually means a number of different things, so any one definition will make discussion problematic. There really should be a number of different definitions for qualia/subjective consciousness, empirical consciousness etc.

Comment author: daedalus2u 24 August 2010 01:09:42PM 1 point [-]

nawitus, my post was too long as it is. If I had included multiple discussions of multiple definitions of consciousness and qualia, you would either still be reading it or would have stopped because it was too long.

Comment author: cousin_it 24 August 2010 09:13:53AM *  9 points [-]

The first dubious statement in the post seems to be this:

Because the experience of consciousness is subjective, we can never “know for sure” that an entity is actually experiencing consciousness.

How can you make such a statement about the entire future of science? A couple quotes:

"We may determine their forms, their distances, their bulk and their motions, but we can never know anything about their chemical and mineralogical structure" - Auguste Comte talking about stars in 1835

"Heavier than air flying machines are impossible" - Lord Kelvin, 1895

The second dubious statement comes right after the first:

However there must be certain computational functions that must be accomplished for consciousness to be experienced.

The same question applies: how on Earth do you know that? Where's your evidence? Sharing opinions only gets us so far!

And it just goes downhill from there.

Comment author: daedalus2u 24 August 2010 12:49:03PM 0 points [-]

With all due respect to Lord Kelvin, he personally knew of heavier than air flying machines. We now call them birds. He called them birds too.

Comment author: Oscar_Cunningham 24 August 2010 09:39:51AM *  3 points [-]

EDIT: I realise that you asked us to be gentle, and all I've done is point out flaws. Feel free to ignore me.

You explore many interesting ideas, but none of them are backed up with enough evidence to be convincing. I doubt that anything you've said is correct. The first example of this is this statement:

Because the experience of consciousness is subjective, we can never “know for sure” that an entity is actually experiencing consciousness.

How do you know?

What if tomorrow a biologist worked out what causes consciousness and created a simple scan for it? What evidence do you have that would make you surprised if this happened?

First an entity must have a “self detector”; a pattern-recognition computation structure which it uses to recognize its own state of being an entity and of being the same entity over time.

Why? What is it that actually makes it impossible to have a conscious entity (one that has qualia) that is not self-aware (one that knows some things about itself)?

Recommended reading: http://lesswrong.com/lw/jl/what_is_evidence/

Comment author: daedalus2u 24 August 2010 12:43:28PM 0 points [-]

We can't “know for sure” because consciousness is a subjective experience. The only way you could “know for sure” would be if you simulated an entity and so knew from how you put the simulation together that the entity you were simulating did experience self-consciousness.

So how does this hypothetical biologist calibrate his consciousness scanner? Calibrate it so that he “knows for sure” that it is reading consciousness correctly? His degree of certainty in the output of his consciousness scanner is limited by his degree of certainty in his calibration standards. Even if it worked perfectly.

In order to be aware of something, you need to detect something. To detect something, you need to receive sensory data and then process that data via pattern recognition into a detection or a non-detection.

To detect consciousness your hypothetical biologist needs a “consciousness scanner”. So does any would-be detector of any consciousness. That “consciousness scanner” has to have certain properties whether it is instantiated in electronics or in meat. Those properties include receipt of sufficient data and then pattern recognition on that data to determine a detection or a non-detection. That pattern recognition will be subject to type 1 errors and type 2 errors.
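The limit the comment describes can be sketched quantitatively. The following is a toy illustration (the function name, priors, and error rates are all hypothetical, not from the thread): treating the “consciousness scanner” as a binary classifier, Bayes' rule shows that the biologist's confidence in a positive reading is capped by the scanner's calibrated error rates.

```python
# Toy sketch: any "consciousness scanner" is a binary classifier,
# so its verdicts are only as trustworthy as its error rates allow.
def posterior_conscious(prior, false_positive, false_negative):
    """P(conscious | scanner reads 'conscious'), via Bayes' rule."""
    true_positive = 1.0 - false_negative
    p_positive = true_positive * prior + false_positive * (1.0 - prior)
    return true_positive * prior / p_positive

# Even a well-calibrated scanner never yields certainty: with a 50% prior
# and 5% error rates in each direction, a positive reading gives only 0.95.
p = posterior_conscious(prior=0.5, false_positive=0.05, false_negative=0.05)
print(round(p, 3))  # 0.95
```

Only with impossible zero error rates does the posterior reach 1.0, which is the comment's point about calibration limiting certainty.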

Comment author: wedrifid 24 August 2010 03:17:48AM *  3 points [-]

So, for example, any computer program that has the ability to parse and understand relevant features of its own source code and also happens to have a few 'if' statements in some of the relevant areas.

It may actually exclude certain humans that I would consider conscious. (I believe Yvain mentioned this too.)
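The kind of trivially self-inspecting program wedrifid describes can be sketched in a few lines. This is a toy illustration (the `PROGRAM` snippet and function names are invented for the example); it holds the inspected source as a string to stay self-contained, though a real program could call `inspect.getsource` on itself instead.

```python
import ast

# A tiny program, held as data, that we then inspect for branching.
PROGRAM = """
def respond(stimulus):
    if stimulus == "mirror":   # an 'if' in a "relevant area"
        return "that's me"
    return "something else"
"""

def has_branching(source):
    """Parse source code and report whether it contains any 'if' statement."""
    return any(isinstance(node, ast.If) for node in ast.walk(ast.parse(source)))

print(has_branching(PROGRAM))  # True
```

A program like this "parses relevant features of its own source code and has a few 'if' statements", yet presumably nobody would call it conscious, which is wedrifid's point about the definition being too permissive.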

Comment author: daedalus2u 24 August 2010 03:43:10AM 0 points [-]

I am talking about minimum requirements, not sufficient requirements.

I am not sure what you mean by "understand relevant features of its own source code".

I don't know any humans that I would consider conscious that don't fit the definition of consciousness that I am using. If you have a different definition I would be happy to consider it.

Comment author: Yvain 24 August 2010 01:42:01AM *  9 points [-]

I think you've hit on an important point in asking what dissociation syndromes show about the way the mind processes "selfhood", and you could expand upon that by considering a whole bunch of interesting altered states that seem to correspond to something in the temporal lobe (I can't remember the exact research).

I didn't completely follow the rest of the article. Is "consciousness" even the right term to use here? It has way too many meanings, and some of them aren't what you're talking about here - for example, I don't see why there can't be an entity that has subjective experience but no personal identity or self-knowledge. Consider calling the concept you're looking for "personal identity" instead.

I also take issue with some of the language around continuity of personal identity being an illusion. I agree with you that it probably doesn't correspond to anything in the universe, but it belongs, along with morality, in the category of "things we're not forced to go along with by natural law, but which are built into our goal system, and finding they don't have any objective basis doesn't force us to give them up". I don't think aliens would be philosophically rash enough to stop existing just because of a belief that personal identity is an illusion.

Also, paragraph breaks!

Comment author: daedalus2u 24 August 2010 03:31:15AM 3 points [-]

Yvain, what I mean by illusion is:

perceptions not corresponding to objective reality due to defects in sensory information processing used as the basis for that perception.

Optical illusions are examples of perceptions that don't correspond to reality because of how our nervous system processes light signals. Errors in perception, whether false positives or false negatives, are illusions.

In some of the meditative traditions there is the goal of "losing the self". I have never studied those traditions and don't know much about them. I do know about dissociation from PTSD.

There can be entities that are not self-aware. I think that most animals that don't recognize themselves in a mirror fit in the category of not recognizing themselves as entities. That was not the focus of what I wanted to talk about.

To be self-aware, an entity must have an entity detector that registers “self” upon exposure to certain stimuli.

Some animals do recognize other entities but don't recognize themselves as “self”. They perceive another entity in a mirror, not themselves.
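The "entity detector that registers 'self'" idea can be made concrete with a toy mirror-test sketch. This is purely illustrative (the function, threshold, and command lists are invented): classify an observed entity as "self" when its observed motion tracks one's own motor commands, as a mirror image does.

```python
# Toy "self detector": an observed entity whose motion matches our own
# motor commands is registered as "self"; otherwise it is "other".
def classify_entity(motor_commands, observed_motion, threshold=0.9):
    """Label the entity 'self' if its motion agrees with the commands
    we issued on at least `threshold` of the timesteps."""
    matches = sum(c == o for c, o in zip(motor_commands, observed_motion))
    agreement = matches / len(motor_commands)
    return "self" if agreement >= threshold else "other"

mirror = classify_entity(["left", "up", "left"], ["left", "up", "left"])
stranger = classify_entity(["left", "up", "left"], ["down", "up", "right"])
print(mirror, stranger)  # self other
```

An animal lacking this kind of comparison would perceive the mirror image as another entity, matching the comment's last paragraph.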

Comment author: Jayson_Virissimo 24 August 2010 02:15:26AM 3 points [-]

daedalus2u, taboo "consciousness".

Comment author: daedalus2u 24 August 2010 02:46:36AM *  0 points [-]

[Consciousness] :The subjective state of being self-aware that one is an autonomous entity that can differentially regulate what one is thinking about.
