Nisan comments on The two insights of materialism - Less Wrong

Post author: Academian 24 March 2010 02:47PM




Comment author: Nisan 25 March 2010 03:59:13PM 2 points

The former might believe that consciousness arises from particular physical interactions — interactions that might exist in the brain but not in a computer.

Comment author: mattnewport 25 March 2010 04:02:12PM 1 point

Wouldn't such a person believe that you can't fully simulate a person at all with a conventional computer though?

Comment author: Nisan 25 March 2010 04:10:38PM * 2 points

I think Phil Goetz is using the term "simulate" in its computational or mathematical sense: The materialist of the first kind would agree that if you had a pretty good algorithmic model of a brain, you could simulate that model in a computer and it would behave just like the brain. But they would not agree that the simulation had consciousness.

ETA: Correct me if I'm wrong, but a materialist of the first kind would be one who is open to the possibility of p-zombies.

Comment author: bogus 25 March 2010 04:21:10PM * 3 points

ETA: Correct me if I'm wrong, but a materialist of the first kind would be one who is open to the possibility of p-zombies.

No, p-zombies are supposed to be indistinguishable from the real thing. You can tell a simulation of consciousness apart from an actual conscious being, because the simulation runs on a different substrate.

Comment author: Nick_Tarleton 25 March 2010 05:07:58PM * 4 points

Basically, yes. But I think it's worthwhile to distinguish between physically (the original definition), functionally, and behaviorally identical p-zombies, where materialists reject the possibility of the first, and functionalists reject the first and second (each class is obviously a superset of the preceding one).

NB: "Functionally identical" is handwaving, absent some canonical method of figuring out what computation a physical system implements (the conscious-rocks argument).

Comment author: mattnewport 25 March 2010 04:19:29PM 2 points

Do people holding this view who call themselves materialists actually exist? It seems an incoherent position to hold and I can't recall seeing anyone express that belief. It seems very similar to the dualist position that consciousness has some magic property that can't be captured outside of a human brain.

Comment author: Nick_Tarleton 25 March 2010 05:01:11PM * 1 point
Comment author: mattnewport 25 March 2010 05:42:31PM 0 points

As far as I can tell from looking at those links, both Searle and Pearce would deny the possibility of simulating a person with a conventional computer. I understand that position, and while I think it is probably wrong, it is not obviously wrong and could turn out to be true. It seems that this is also Penrose's position.

From the Chinese Room Wikipedia entry for example:

Searle accuses strong AI of dualism, the idea that the mind and the body are made up of different "substances". He writes that "strong AI only makes sense given the dualistic assumption that, where the mind is concerned, the brain doesn't matter." He rejects any form of dualism, writing that "brains cause minds" and that "actual human mental phenomena [are] dependent on actual physical-chemical properties of actual human brains", a position called "biological naturalism" (as opposed to alternatives like behaviourism, functionalism, identity theory and dualism).

From the Pearce link you gave:

Secondly, why is it that, say, an ant colony or the population of China or (I'd argue) a digital computer - with its classical serial architecture and "von Neumann bottleneck" - don't support a unitary consciousness beyond the aggregate consciousness of its individual constituents, whereas a hundred billion (apparently) discrete but functionally interconnected nerve cells of a waking/dreaming vertebrate CNS can generate a unitary experiential field? I'd argue that it's the functionally unique valence properties of the carbon atom that generate the macromolecular structures needed for unitary conscious mind from the primordial quantum minddust.

So I still wonder whether anyone actually believes that you could simulate a human mind with a computer but that it would not be conscious.

Comment author: bogus 25 March 2010 05:54:24PM 2 points

both Searle and Pearce would deny the possibility of simulating a person with a conventional computer.

They would deny that a conventional computer simulation can create subjective experience. However, the Church-Turing thesis implies that if physicalism is true then conscious beings can be simulated. AFAICT, it is only Penrose who would deny this.

Comment author: mattnewport 25 March 2010 06:25:59PM 0 points

Do you mean the Church-Turing-Deutsch principle? It appears to me that Pearce at least in the linked article is making a claim which effectively denies that principle - his claim implies that physics is not computable.

Comment author: bogus 25 March 2010 06:42:51PM 0 points

It appears to me that Pearce at least in the linked article is making a claim which effectively denies that principle - his claim implies that physics is not computable.

Why? Pearce is a physicalist, not a computationalist; he ought to accept the possibility of a computation which is behaviorally identical to consciousness but has no conscious experience.

Comment author: mattnewport 25 March 2010 06:51:18PM 0 points

he ought to accept the possibility of a computation which is behaviorally identical to consciousness but has no conscious experience.

What sense of 'ought' are you using here? That seems like a very odd thing to believe to me. If you think that's what he actually believes you're going to have to point me to some evidence.

Comment author: Nick_Tarleton 25 March 2010 06:08:20PM * 1 point

Basically, what bogus said.

I'm confused about what you mean by "simulating a person". Presumably you don't mean simulating in a way that is conscious/has mental states (since that would make the claim under discussion trivially, uninterestingly inconsistent), so presumably you do mean just simulating the physics/neurology and producing the same behavior. While AFAIK neither explicitly says so in the links, Searle and Pearce both seem to me to believe the latter is possible. (Searle in particular has never, AFAIK, denied that an unconscious Chinese Room would be possible in principle; and by "strong AI" Searle means the possibility of AI with an 'actual mind'/mental states/consciousness, not just generally intelligent behavior.)

Comment author: mattnewport 25 March 2010 06:19:12PM 1 point

so presumably you do mean just simulating the physics/neurology and producing the same behavior.

Yes. Equivalently, is uploading possible with conventional computers?

It seems to me that both Searle and Pearce would answer no to both questions. Pearce in particular seems to be saying that consciousness depends on quantum properties of brains that cannot be simulated by a conventional computer. It appears to me that this is equivalent to a claim that physics is not computable but I'm not totally confident of that equivalence. I have trouble reading any other conclusion from anything in those links. Can you point to a quote that makes you think otherwise?

Comment author: Nick_Tarleton 26 March 2010 01:34:36AM 1 point

It appears to me that this is equivalent to a claim that physics is not computable but I'm not totally confident of that equivalence.

I don't think Pearce or Searle would agree with this, and it sounds like you might be projecting your belief onto them. We already know of philosophers who explicitly endorse the possibility of zombies, so it's not surprising for philosophers to endorse positions that imply the possibility of zombies.

Can you point to a quote that makes you think otherwise?

Afraid not, but I think if they thought physics were uncomputable (in the behavioral-simulation sense) they would say so more explicitly.

Comment author: mattnewport 26 March 2010 04:41:41AM 0 points

I don't think Pearce or Searle would agree with this, and it sounds like you might be projecting your belief onto them.

Way back at the beginning of this thread I was trying to establish whether anybody who calls themselves a materialist actually believes the statement "you can't fully simulate a person without the simulation being conscious" to be false. I still don't feel I have an answer to that question. It seems that bogus might believe that statement to be false but he is frustratingly evasive when it comes to answering any direct questions about what he actually believes. It seems we are not currently in a position to say definitively what Pearce or Searle believe.

The only reason I asked in the first place is that I've tended to assume someone who self-describes as a materialist would also believe that statement to be true. I guess the moral of this thread is that I can't assume that and should ask if I want to know.

Comment author: bogus 25 March 2010 06:26:53PM 0 points

quantum properties of brains that cannot be simulated by a conventional computer.

To the best of our knowledge, any "quantum property" can be simulated by a classical computer with approximately exponential slowdown. Obviously, though, a classical computer is not going to instantiate those quantum properties.
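As an aside, the exponential cost of exact classical simulation is easy to make concrete with a toy state-vector simulator. This is just an illustrative sketch (the function name and qubit count are my own, and it assumes NumPy): storing an n-qubit state takes 2^n complex amplitudes, so memory and time blow up exponentially even though the simulation is exact.

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n):
    """Apply a 2x2 gate to qubit `target` of an n-qubit state vector."""
    state = state.reshape([2] * n)          # expose one axis per qubit
    # Contract the gate with the target qubit's axis, then restore ordering.
    state = np.tensordot(gate, state, axes=([1], [target]))
    state = np.moveaxis(state, 0, target)
    return state.reshape(-1)

n = 20                                        # 2**20 = 1,048,576 amplitudes
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                # start in |00...0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
state = apply_single_qubit_gate(state, H, 0, n)
# The simulation is exact, but storage alone is 2**n complex numbers;
# each added qubit doubles the cost.
```

Twenty qubits already needs about a million amplitudes; fifty would need a petabyte, which is the usual illustration of the exponential slowdown.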

Comment author: mattnewport 25 March 2010 06:30:28PM 2 points

Obviously, a classical computer is not going to instantiate these quantum properties.

Is that obvious?

Comment author: bogus 25 March 2010 04:30:28PM 0 points

I don't know about consciousness, but the position that subjective experience has some magic property is common sense. Materialism is just a reasonable attempt to ground that magic property in the physical world.

Comment author: bogus 25 March 2010 04:08:59PM 0 points

You could fully simulate the person's consciousness. The simulation just won't have any subjective experience, and it might also be very inefficient from a computational perspective. Compare running an executable program natively on a computer vs. running the same program in an interpreted VM.
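The native-vs-VM comparison can be made concrete with a toy sketch (the names `add_native` and `eval_expr` are hypothetical, purely for illustration): the same computation is run once as ordinary code and once through an explicit expression interpreter, producing identical behavior on a different "substrate" with extra overhead.

```python
def add_native(a, b):
    # "Native" path: the computation as ordinary code.
    return a + b

def eval_expr(expr, env):
    # "VM" path: the same computation, driven through an explicit
    # expression interpreter -- same behavior, extra layers of machinery.
    op = expr[0]
    if op == "const":
        return expr[1]
    if op == "var":
        return env[expr[1]]
    if op == "add":
        return eval_expr(expr[1], env) + eval_expr(expr[2], env)
    raise ValueError("unknown op: " + op)

expr = ("add", ("var", "a"), ("var", "b"))
# Both routes compute the same result; only the substrate differs.
```

Whether "same behavior, different substrate" also means "same experience" is, of course, exactly the point under dispute in this thread.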