
XiXiDu comments on Eight questions for computationalists - Less Wrong Discussion

16 Post author: dfranke 13 April 2011 12:46PM


Comments (87)


Comment author: XiXiDu 13 April 2011 02:01:29PM *  12 points [-]

There is too much vagueness involved here. A better question would be whether there is any reason to believe that, even though evolution could create consciousness, we cannot.

No doubt we don't know much about intelligence and consciousness. Do we even know enough to be able to tell that the use of the term "consciousness" makes sense? I don't know. But what I do know is that we know a lot about physics and biological evolution, and that we are physical and an effect of evolution.

We know somewhat less about the relation between evolutionary processes and intelligence, but we do know that there is an important difference between them and that the latter can utilize the former.

Given all that we know, is it reasonable to doubt the possibility that we can create "minds", conscious and intelligent agents? I don't think so.

Comment author: byrnema 13 April 2011 03:56:01PM 6 points [-]

A better question would be whether there is any reason to believe that, even though evolution could create consciousness, we cannot.

Very good point! Even if consciousness does require something mysterious and metaphysical we don't know about, if it's harnessed within us (and robustly passes from parent to child over billions of births), we can harness it elsewhere.

Comment author: Laoch 24 August 2012 05:19:48PM *  0 points [-]

I reject "consciousness is really just computation" if you define computation as the operation of contemporary computers rather than brains, but I wholeheartedly agree that we are physical and an effect of evolution, as is our subjective experience. I just don't think that the mind/consciousness is solely the neural connections of one's brain. Cell metabolism, whole-organism metabolism, and the environment of that organism define the conscious experience also. If it's reduced to a neural net, important factors will most certainly be lost.

Comment author: shminux 24 August 2012 08:01:50PM 0 points [-]

Does this mean that amputees should be less conscious?

Comment author: gwern 24 August 2012 10:36:02PM 3 points [-]

Maybe not with humans, but definitely for octopuses!

(More seriously, depending on how seriously you take embodied cognition, there may be some small loss. I mean, we know that your gut bacteria influence your mood via the nerves to the gut; so there are connections. And once there are connections, it becomes much more plausible that cut connections may decrease consciousness. After a few weeks in a float tank, how conscious would you be? Not very...)

Comment author: shminux 24 August 2012 11:22:26PM 0 points [-]

I'm pretty sure that you agree that none of this means that a human brain in a vat with proper connections to the environment, real or simulated, is inherently less conscious than one attached to a body.

Comment author: gwern 25 August 2012 12:24:00AM 0 points [-]

I don't take embodiment that far, no, but a simulated amputation in a simulation would seem as problematic as a real amputation in the real world, barring extraordinary intervention on the part of the simulation.

Comment author: Laoch 25 August 2012 09:08:49AM *  0 points [-]

No, but subjective conscious experience would definitely change.

Comment author: Dolores1984 24 August 2012 06:06:23PM 0 points [-]

Well, that ought to be testable. If we upload a human and the source of consciousness is lost, they should stop feeling it. Provided they're honest, we can just ask them.

Comment author: Laoch 24 August 2012 07:45:11PM 0 points [-]

That could very well be the case.

Comment author: [deleted] 24 August 2012 06:37:33PM 0 points [-]

Well, you're a p-zombie, you would say that.

Comment author: lessdazed 13 April 2011 02:23:16PM 0 points [-]

Do we even know enough to be able to tell that the use of the term "consciousness" makes sense? I don't know.

Is there a better word than "consciousness" for the explanation for why (I think I) say "I see red" and "I am conscious"? I do (think I) claim those things, so there is a causal explanation.

Comment author: Pfft 14 April 2011 01:01:18AM *  2 points [-]

I think any word would be better than "consciousness"! :) It really is a very confusing term, since it is often used (vaguely) to refer to quite different concepts.

Cognitive scientists often use it to mean something similar to "attention" or as the opposite of "unconscious". This is an "implementation level" view -- it refers to certain mechanisms used by the brain to process information.

Then there is what Ned Block calls "access consciousness", "the phenomenon whereby information in our minds is accessible for verbal report, reasoning, and the control of behavior" (to quote Wikipedia). This is a "functional specification level" view: consciousness is correctly implemented if it lets you accurately describe the world around you or the state of your own mind.

Then finally there's "phenomenal consciousness" or qualia or whatever you want to call it -- the mystical secret sauce.

No doubt these are all interrelated in complicated ways, but it certainly does not help matters to use terminology which further blurs the distinctions. Especially since they are not equally mysterious: the actual implementation in the brain will take a long time to figure out, and as for the qualia it's hard to say even what a successful answer would look like. But at the functional specification level, it seems quite easy to give a (teleological) explanation. That is, it's easy to see that an agent benefits from being able to represent the world (and be able to say "I see a red thing") and to reason about itself ("each time I see a red thing I feel hungry"). So it's not very mysterious that we have mental concepts for "what I'm currently feeling", etc.
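To make the functional-level point concrete, here is a toy sketch of my own (not from the thread, and not a claim about how brains work): an agent whose internal states count as "access-conscious" in Block's sense just insofar as they are available for verbal report, reasoning, and the control of behavior. The class and method names are purely illustrative.

```python
# Toy illustration of "access consciousness" at the functional level:
# information is access-conscious if the agent can report it, reason
# about it, and use it to guide behavior.

class ToyAgent:
    def __init__(self):
        self.percepts = []   # world-model: what the agent currently sees
        self.feelings = []   # self-model: what the agent notices about itself

    def see(self, color):
        self.percepts.append(color)
        # A self-directed regularity the agent can later report:
        # "each time I see a red thing I feel hungry".
        if color == "red":
            self.feelings.append("hungry")

    def report(self):
        # Verbal report: these states are accessible, so the agent can
        # accurately describe the world and its own condition.
        return ([f"I see a {c} thing" for c in self.percepts]
                + [f"I feel {f}" for f in self.feelings])

agent = ToyAgent()
agent.see("red")
agent.see("green")
print(agent.report())
# ['I see a red thing', 'I see a green thing', 'I feel hungry']
```

Nothing here touches qualia, of course; the sketch only shows why the *functional* notion is unmysterious: an agent that can query its own state in this way plainly benefits from it.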