pianoforte611 comments on AALWA: Ask any LessWronger anything - Less Wrong

28 Post author: Will_Newsome 12 January 2014 02:18AM




Comment author: pianoforte611 12 January 2014 02:11:48PM 1 point [-]

I'm guessing you think free will is a trivial problem; what about consciousness? That still baffles me.

Comment author: ThrustVectoring 12 January 2014 02:21:55PM 3 points [-]

The most apt description I've found is something along the lines of "consciousness is what information-processing feels like from the inside."

It's not just about what a brain does, because a simulated brain would still be conscious, despite not being made of neurons. It's about certain kinds of patterns of thought (not the physical neural action, but thought as in operations performed on data). Human brains have it, insects don't, and anything in between is something for actual specialists to discuss. But what it is - the pattern of data processing - isn't all that mysterious.

Comment author: pianoforte611 12 January 2014 09:21:02PM 4 points [-]

Okay, but why does information processing feel like anything at all? There are cognitive processes that are information processing, yet you are not conscious of them.

Comment author: Locaha 12 January 2014 06:30:43PM -1 points [-]

Human brains have it

How do you know?

Comment author: Eugine_Nier 14 January 2014 01:14:51AM 1 point [-]

I know I'm conscious because I experience it. As for everyone else, really, I'm generalizing from one example.

Comment author: Locaha 14 January 2014 07:45:26AM -1 points [-]

I know I'm conscious because I experience it.

So do I, but it doesn't help me to assess the consciousness of others.

Comment author: Alsadius 16 January 2014 07:35:33AM 1 point [-]

Occam's Razor. All these people seem similar to me in so many ways, they're probably similar in this way too, especially if they all say that they are.

Comment author: Locaha 16 January 2014 07:47:06AM -1 points [-]

The little box that claims it experiences consciousness (just like you do) is also similar to you. How do you decide what is similar enough and what is not?

Comment author: Alsadius 16 January 2014 08:07:30AM 1 point [-]

We live in a world effectively devoid of borderline cases. Humans are clearly close enough, since they all act like they're thinking in basically similar fashions, and other species are clearly not. I will have to reconsider this when we encounter non-human intelligences, but for now I have zero data on those, and thus cannot form a meaningful opinion.

Comment author: Locaha 16 January 2014 08:37:17AM -1 points [-]

I suggest you taboo the word "clearly". For example, it is not at all clear to me that a 6-month-old infant experiences consciousness as I do. But if the infant does, then surely an adult chimpanzee does too?

See where it's going?

Comment author: Eugine_Nier 17 January 2014 03:40:32AM *  -1 points [-]

Well, it is possible to make an argument based on the Self-Sampling Assumption that only people who share the rare inherent trait X with me are conscious.

Comment author: Locaha 17 January 2014 08:30:00AM -1 points [-]

Is it a sort of trait the talking box can't possibly have?

Comment author: ThrustVectoring 13 January 2014 12:44:35AM 1 point [-]

I find it awfully suspicious that the vast majority of humans talk about experiencing consciousness. It'd be very strange if they were doing so for no reason, so I think that the human brain has some kind of pattern of thought that causes talking about consciousness.

For brevity, I call that-kind-of-thinking-that-causes-people-to-talk-about-consciousness "consciousness".

Comment author: Locaha 13 January 2014 06:35:46AM *  -1 points [-]

The definition "it has it if it talks about it" is problematic. You can make a very simple machine that talks about experiencing consciousness.

Comment author: ThrustVectoring 13 January 2014 03:31:01PM 2 points [-]

You can make a very simple machine that talks about experiencing consciousness.

And that simple machine does so because it was made to do so by people experiencing consciousness.

Comment author: Locaha 13 January 2014 06:33:37PM -1 points [-]

And that simple machine does so because it was made to do so by people experiencing consciousness.

How do you know?

Comment author: ThrustVectoring 13 January 2014 08:21:18PM 1 point [-]

I find it awfully suspicious that the vast majority of humans talk about experiencing consciousness. It'd be very strange if they were doing so for no reason.

Comment author: gjm 13 January 2014 03:17:40PM 0 points [-]

And if interaction with such machines is the only ground you have for thinking that anything experiences consciousness, I think it would be reasonable to say that "consciousness" is whatever it is that makes those machines talk that way.

In practice, much of our notion of "consciousness" comes from observing our own mental workings, and I think we each have pretty good evidence that other people function quite similarly to ourselves, all of which makes that scenario unlikely to be the one we're actually in.

Comment author: gjm 13 January 2014 03:16:02PM -1 points [-]

How does anyone learn what the term "consciousness" applies to? So far as I can tell, it's universally by observing human beings (who are, so far as anyone can tell, implemented almost entirely in human brains) and most specifically themselves. So it seems that if "consciousness" refers to anything at all, it refers to something human brains -- or at least human beings -- have. (I would say the same thing about "intelligence" and "humanity" and "personhood".)

I suppose it's just barely possible that, e.g., someone might find good evidence that many human beings are actually some kind of puppets controlled from outside the Matrix. In that case we might want to say that some human brains have consciousness but not all. This seems improbable enough -- it seems on a par with discovering that we're in a simulation where the electrical conductivity of copper emerges naturally from the underlying laws, while the electrical conductivity of iron is hacked in case by case by experimenters who are deliberately misleading us about what the laws are -- that I feel perfectly comfortable ignoring the possibility until some actual evidence comes along.