Eliezer_Yudkowsky comments on Consciousness - Less Wrong

Post author: Mitchell_Porter 08 January 2010 12:18PM


Comment author: Eliezer_Yudkowsky 10 January 2010 01:18:37PM 3 points

What's in the shepherd that's not in the pebbles, exactly?

Let's move to the automated pebble-tracking system where a curtain twitches as the sheep passes, causing a pebble to fall into the bucket (the fabric is called Sensory Modality, from a company called Natural Selections). What is in the shepherd that is not in the automated, curtain-based sheep-tracking system?

Comment author: Mitchell_Porter 16 January 2010 08:10:12AM 1 point

What is in the shepherd that is not in the automated, curtain-based sheep-tracking system?

Do you agree that there is a phenomenon of subjective meaning to be accounted for? The question of meaning does not originate with problems like "why does pebble-tracking work?". It arises because we attribute semantic content both to certain artefacts and to our own mental states.

If we view the number of pebbles as representing the number of sheep, that is possible because of the causal structure, but it actually occurs because of human interpretation. Now if we turn to mental states themselves, do you propose to explain their representational semantics in exactly the same way, by human interpretation, which creates a foundationless circularity; or do you propose to explain the semantics of human thought in some other way, and if so, in what way; or will you deny that human thoughts have a semantics at all?

Comment author: Tyrrell_McAllister 10 January 2010 06:00:17PM * 1 point

Even as a reductionist, I'll point out that the shepherd seems to have something in him that singles out the sheep specifically, as opposed to all other possible referents. The sheep-tracking system, in contrast, could just as well be counting sheep-noses instead of sheep. Or it could be counting sheep-passings: not the sheep themselves, but rather their acts of passing the fabric. It's only when the shepherd is added to the system that the sheep out in the field get specified as the referents of the pebbles.

ETA: To expand a bit: The issue I raise above is basically Quine's indeterminacy of translation problem.

One's initial impulse might be to say that you just need "higher resolution". The idea is that the pebble machine doesn't have a high-enough resolution to differentiate sheep from sheep-passings or sheep-noses, while the shepherd's brain does. This then leads to questions such as: How much resolution is enough to make meaning? Does the machine (without the shepherd) fail to be a referring thing altogether? Or does its "low resolution" just mean that it refers to some big semantic blob that includes sheep, sheep-noses, sheep-passings, etc.?

Personally, I don't think that this is the right approach to take. I think it's better to direct our energy towards resolving our confusion surrounding the concept of a computation.