orthonormal comments on The conscious tape - Less Wrong

Post author: PhilGoetz 16 September 2010 07:55PM




Comment author: orthonormal 17 September 2010 12:10:55AM 2 points

I endorse the first alternative; the intuition at first felt wrong (in a Chinese Room sort of way), but that feeling disappeared when I realized the following:

I was envisioning a tape (call it Tape A) which only recorded some very small end result of Turing Machine A, like the numerical output of a calculation or the move Deep Blue makes. And that seems too "small" somehow to encapsulate consciousness— I felt that I needed the moving Turing machine to make it "live" in all its detail.

But of course, it's trivial to write a different Turing machine which writes on a tape (call it Tape B) the entire history of Machine A's computation (as well as its output), and this indeed has the required richness for me to be comfortable in calling Tape B conscious.
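The construction above is easy to make concrete. Here is a minimal sketch (the machine and all names are my own illustration, not from the thread): "Machine A" is a toy Turing machine that increments a binary number, and a wrapper records every configuration it passes through — state, head position, and full tape — onto a log that plays the role of Tape B.

```python
# Toy illustration: a tiny Turing machine ("Machine A") that increments
# a binary number, with a wrapper that records every configuration --
# state, head position, full tape -- onto a log playing the role of Tape B.

def run_and_log(tape, rules, state="start", blank="_", max_steps=1000):
    """Simulate the machine; return (output, entire computation history)."""
    tape, head = list(tape), 0
    log = [(state, head, "".join(tape))]  # Tape B starts with the initial config
    while state != "halt" and len(log) <= max_steps:
        new_state, write, move = rules[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
        if head < 0:
            tape.insert(0, blank)
            head = 0
        elif head == len(tape):
            tape.append(blank)
        state = new_state
        log.append((state, head, "".join(tape)))  # record the full configuration
    return "".join(tape).strip(blank), log

# Binary increment: scan right to the end of the number, then carry
# back leftward from the least significant bit.
rules = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("inc",   "_", "L"),
    ("inc",   "1"): ("inc",   "0", "L"),
    ("inc",   "0"): ("halt",  "1", "L"),
    ("inc",   "_"): ("halt",  "1", "L"),
}

output, tape_b = run_and_log("1011", rules)
print(output)       # "1100": binary 11 + 1 = 12
print(len(tape_b))  # 9 configurations -- the whole history, not just the answer
```

The point of the sketch is the contrast between the two tapes: `output` is the "small" Tape A (just the result), while `tape_b` holds every intermediate configuration and so has the richness the comment describes.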

Comment author: David_Allen 17 September 2010 12:39:35AM 0 points

But of course, it's trivial to write a different Turing machine which writes on a tape (call it Tape B) the entire history of Machine A's computation (as well as its output), and this indeed has the required richness for me to be comfortable in calling Tape B conscious.

In what context can Tape B be labeled conscious?

A history of consciousness does not seem to me to be the same as consciousness. A full debug trace of a program is simply not the same thing as the original program.

If however you create a Machine C that replays Tape B, I would grant that Machine C reproduces the consciousness of Machine A.

Comment author: orthonormal 17 September 2010 12:50:04AM 1 point

This gets into hairy territory with no clear "conscious"/"not conscious" boundary between a spectrum of different variations, but I'd say that the interpretive framework needed to trace a thought from the log on Tape B is essentially the same as the interpretive framework needed to trace it from the action of Machine A on the start tape. They're isomorphic mathematical objects.

Comment author: David_Allen 17 September 2010 02:12:10AM 0 points

I agree with everything you say here.

I claim that the "interpretive framework" you refer to is essential in the labeling of Tape B as conscious. Without specifying the context, the consciousness of Tape B is unknown.

Comment author: orthonormal 18 September 2010 05:47:15PM 4 points

I claim that the "interpretive framework" you refer to is essential in the labeling of Tape B as conscious. Without specifying the context, the consciousness of Tape B is unknown.

You might be interested in the thought experiment of a so-called "joke interpretation", which maps the random molecular oscillations in (say) a rock onto a conscious mind, and asks what the difference is between this and a more "reasonable" map from a brain to a mind. There's a good discussion of this in Good and Real.

Comment author: David_Allen 20 September 2010 11:36:40PM 0 points

I skimmed the material and see what you mean.

I would restate the thought experiment as follows. A state sequence measured from a rock is used to generate a look-up table that maps from the rock state sequence to a pre-measured consciousness state sequence. This is essentially an encryption of the consciousness state sequence, using the rock state sequence as a one-time pad. The consciousness state sequence can then be regenerated by replaying the rock state sequence through the look-up table. The final question is: is the rock conscious?

In the model I've outlined in my comments, consciousness exists at the level the consciousness abstraction is present. In this case that abstraction is not present at the level of the rock, but only at the level of the system that uses the look-up table, and only for the duration of the sequence. The states measured from the rock are used to generate the consciousness, but they are not the consciousness.

Comment author: bogus 21 September 2010 12:22:34AM 1 point

In this case that abstraction is not present at the level of the rock, but only at the level of the system that uses the look-up table

What is the "system that uses the look-up table"? Do you require a particular kind of physical system in order for consciousness to exist? If not, what if the "system" which replays the sequence is a human with a pen and paper? Does the system truly exhibit the original consciousness sequence, in addition to the human's existing consciousness?

Comment author: David_Allen 21 September 2010 03:01:41AM 1 point

Ah, Chinese room questions.

The system that replays the sequence can be anything, including a human with pen and paper.

Does the system truly exhibit the original consciousness sequence, in addition to the human's existing consciousness?

Yes, assuming that the measured consciousness sequence captured the essential elements of the original consciousness.

To "see" the original consciousness in this system you must adopt the correct context: the context that resolves the consciousness abstraction within the system. From that context you will not see the human; conversely, if you see a human following instructions and making notes, you will not see the consciousness he is generating.

Consider a chess program playing a game against itself. If we glance at the monitor, we see the game as it progresses. If instead we could only examine the quarks that make up the computer, we would be completely blind to the chess-program abstraction.

Comment author: PhilGoetz 17 September 2010 06:06:59PM 1 point

Is a mono-consciousness then impossible?

Comment author: David_Allen 20 September 2010 11:59:57PM 0 points

Thanks for the clarification.

In my comments I have been working from the idea that consciousness is an abstraction. The context in which the consciousness abstraction exists is where consciousness can be found.

So a mono-consciousness would still have a context that supports a consciousness abstraction. I don't see any problem with that. However, the consciousness might be like a feral child: no table manners, and very strange to us.

How about this. If a consciousness tells a joke in a forest where no other consciousness can hear it, is the joke still funny?

Comment author: David_Allen 17 September 2010 07:35:47PM 0 points

I'll need more details. What is a mono-consciousness?

Comment author: PhilGoetz 20 September 2010 03:57:56PM 0 points

I was thinking of a magnetic monopole: a single consciousness that does not interact with any others.