Less Wrong is a community blog devoted to refining the art of human rationality.

Comment author: alicey 10 August 2017 02:56:30AM 0 points [-]

you are missing the concept of blather

Comment author: g_pepper 10 August 2017 04:14:33AM 0 points [-]

The definition of "blather" that I find is:

"talk long-windedly without making very much sense", which does not sound like Thomas's comment.

What definition are you using?

Comment author: alicey 02 August 2017 01:14:10AM 0 points [-]

blather

Comment author: g_pepper 02 August 2017 03:07:01AM 2 points [-]

Thomas's comment seems quite sensible to me.

It seems to me that Dyson's argument was that as temperature falls, so does the energy required for computing, so the point in time at which we run out of available energy to compute diverges - a finite energy store is never exhausted. But Thomas reasonably points out (I think - correct me if I am misrepresenting you, Thomas) that as temperature falls and the energy used for computing falls, so does the speed of computation, and so the total amount of computation that can be performed converges, even if we were to compute forever.
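One toy way to see both halves of that point at once (these are my own illustrative assumptions - geometric cooling, with energy-per-operation and operation rate each proportional to temperature - not Dyson's actual model):

```python
# Toy model: temperature falls geometrically across equal-length steps;
# energy per operation scales with T (Landauer-style assumption) and the
# operation rate also scales with T.
q = 0.5    # temperature ratio between successive steps (assumed)
T0 = 1.0   # initial temperature, arbitrary units

total_energy = 0.0
total_ops = 0.0
T = T0
for _ in range(200):        # 200 steps is effectively "forever" here
    total_ops += T          # ops this step: rate is proportional to T
    total_energy += T * T   # energy this step: rate * (energy/op), so T^2
    T *= q

# Both series are geometric, so both converge:
#   total_ops    -> T0 / (1 - q)     = 2.0
#   total_energy -> T0^2 / (1 - q^2) = 4/3
print(total_ops, total_energy)
```

The energy series converging is Dyson's "never run out of energy" half; the ops series converging is Thomas's "only finitely much computation ever gets done" half. Whether real cooling schedules behave like this geometric toy is exactly what's at issue.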

Also, isn't Thomas correct that Planck's constant puts an absolute minimum on the amount of energy required for computation?

These seem like perfectly reasonable responses to Dyson's comments. What am I missing?

Comment author: drethelin 20 July 2017 05:13:12PM 1 point [-]

You can justify a belief in "Induction works" by induction over your own life.

Comment author: g_pepper 20 July 2017 05:27:05PM *  0 points [-]

Wouldn't that be question begging?

Comment author: tadasdatys 20 July 2017 04:06:50PM 0 points [-]

These things are subjective but not unreal.

Did you mean, "at present subjective"? Because if something is objectively measurable then it is objective. Are these things both subjective and objective? Or will we stop being conscious when we get a better understanding of the brain?

I know that I have first person experiences and I know that I am self-aware via direct experience.

Are those different experiences or different words for the same thing? What would it feel like to be self-aware without having first person experiences or vice versa?

Comment author: g_pepper 20 July 2017 05:12:44PM *  0 points [-]

Did you mean, "at present subjective"? Because if something is objectively measurable then it is objective. Are these things both subjective and objective?

To clarify, consciousness is a subjective experience, or more precisely it is the ability to have (subjective) first person experiences. Beliefs are similarly "in the head of the believer". Whether either of these things will be measurable/detectable by an outside observer in the future is an open question.

Are those different experiences or different words for the same thing? What would it feel like to be self-aware without having first person experiences or vice versa?

Interesting questions. It seems to me that self awareness is a first person experience, so I am doubtful that you could have self awareness without the ability to have first person experiences. I don't think that they are different words for the same thing though - I suspect that there are first-person experiences other than self awareness. I don't see how my argument or yours depends on whether or not first-person experiences and self-awareness are the same; do you ask the questions for any particular reason, or did you just find them to be interesting questions?

Comment author: tadasdatys 20 July 2017 06:09:05AM 0 points [-]

You can't get the answer to either of those things via measurement

What makes you think that? Surely this belief would be a memory and memories are physically stored in the brain, right? Again, there is a difference between difficult and impossible.

self-awareness and first-person experiences

Those sound like synonyms, not in any way more precise than the word "consciousness" itself.

Comment author: g_pepper 20 July 2017 12:27:32PM 0 points [-]

What makes you think that? Surely this belief would be a memory and memories are physically stored in the brain, right?

To clarify: at present you can't obtain a person's beliefs by measurement, just as at present we have no objective test for consciousness in entities with a physiology significantly different from our own. These things are subjective but not unreal.

Those sound like synonyms, not in any way more precise than the word "consciousness" itself.

And yet I know that I have first person experiences and I know that I am self-aware via direct experience. Other people likewise know these things about themselves via direct experience. And it is possible to discuss these things based on that common understanding. So, there is no reason to stop using the word "consciousness".

Comment author: tadasdatys 19 July 2017 01:54:55PM 0 points [-]

it is difficult or impossible for an observer to know whether an entity with a physiology significantly different from the observer's is conscious

There is a big gap between "difficult" and "impossible". If a thing is "difficult to measure", then you're supposed to know in principle what sort of measurement you'd want to do, or what evidence you could in theory find, that would prove or disprove it. If a thing is "impossible to measure", then the thing is likely bullshit.

there is a common understanding of the term "conscious"

What understanding exactly? Besides "I'm conscious" and "rocks aren't conscious", what is it that you understand about consciousness?

Comment author: g_pepper 19 July 2017 08:16:44PM 0 points [-]

If a thing is "impossible to measure", then the thing is likely bullshit.

In the case of consciousness, we are talking about subjective experience. I don't think that the fact that we can't measure it makes it bullshit. For another example, you might wonder whether I have a belief as to whether P=NP, and if so, what that belief is. You can't get the answer to either of those things via measurement, but I don't think that they are bullshit questions (albeit they are not particularly useful questions).

What understanding exactly? Besides "I'm conscious" and "rocks aren't conscious", what is it that you understand about consciousness?

In brief, my understanding of consciousness is that it is the ability to have self-awareness and first-person experiences.

Comment author: tadasdatys 18 July 2017 07:16:23PM 0 points [-]

Let me say it differently. There is a category in your head called "conscious entities". Categories are formed from definitions or by picking some examples and extrapolating (or both). I say category, but it doesn't really have to be hard and binary. I'm saying that "conscious entities" is an extrapolated category. It includes yourself, and it excludes inanimate objects. That's something we all agree on (even "inanimate objects" may be a little shaky).

My point is that this is the whole specification of "conscious entities". There is nothing more to help us decide which objects belong to it, besides wishful thinking. Usually we choose to include all humans or all animals. Some choose to keep themselves as the only member. Others may want to accept plants. It's all arbitrary. You may choose to pick some precise definition, based on something measurable, but that will just be you. You'll be better off using another label for your definition.

Comment author: g_pepper 19 July 2017 12:47:41PM 0 points [-]

That it is difficult or impossible for an observer to know whether an entity with a physiology significantly different from the observer's is conscious is not really in question - pretty much everyone on this thread has said that. It doesn't follow that I should drop the term or "use another label"; there is a common understanding of the term "conscious" that makes it useful even if we can't know whether "X is conscious" is true in many cases.

Comment author: tadasdatys 18 July 2017 03:18:58PM 0 points [-]

Here's what I think happened.

You observed something interesting happening in your brain, you labeled it "consciousness".
You observed that other humans are similar to you both in structure and in behavior, so you deduced that the same interesting thing is happening in their brains, and labeled the humans "conscious".
You observed that a rock is not similar to you in any way, deduced that the same interesting thing is not happening in it, and labeled it "not conscious".
Then you observed a robot, and you asked "is it conscious?". If you asked the full question - "are the things happening in a robot similar to the things happening in my brain?" - it would be obvious that you wouldn't get a yes/no answer. They're similar in some ways and different in others.

Comment author: g_pepper 18 July 2017 04:10:11PM *  0 points [-]

You observed something interesting happening in your brain, you labeled it "consciousness". You observed that other humans are similar to you both in structure and in behavior, so you deduced that the same interesting thing is happening in their brains, and labeled the humans "conscious".

Yes, that sounds about right, with the caveat that I would say that other humans are almost certainly conscious. Obviously there are people (e.g. solipsists) who don't think that conscious minds other than their own exist.

You observed that a rock is not similar to you in any way, deduced that the same interesting thing is not happening in it, and labeled it "not conscious".

That sounds approximately right, albeit it is not just the fact that a rock is dissimilar to me that leads me to believe it to be unconscious. I am open to the possibility that entities very different from myself might be conscious.

Then you observed a robot, and you asked "is it conscious?". If you asked the full question - "are the things happening in a robot similar to the things happening in my brain?" - it would be obvious that you wouldn't get a yes/no answer. They're similar in some ways and different in others.

I'm not sure that "is the robot conscious" is really equivalent to "are the things happening in a robot similar to the things happening in my brain". It could be that some things happening in the robot's brain are similar in some ways to some things happening in my brain, but the specific things that are similar might have little or nothing to do with consciousness. Moreover, even if a robot's brain used mechanisms that are very different from those used by my own brain, this would not mean that the robot is necessarily not conscious. That is what makes the consciousness question difficult - we don't have an objective way of detecting it in others, particularly in others whose physiology differs significantly from our own. Note that this does not make consciousness unreal, however.

I would be willing to answer "no" to the "is the robot conscious" question for any current robot that I have seen or even read about. But that is not to say that no robot will ever be conscious. I do agree that there could be varying degrees of consciousness (rather than a yes/no answer): I suspect that animals have varying degrees of consciousness, with non-human apes at a fairly high degree, ants at a low or zero degree, etc.

I don't see why any of this would lead to the conclusion that consciousness or pain are not real phenomena.

Comment author: tadasdatys 18 July 2017 01:49:14PM 0 points [-]

You say that like it's a good thing.

No, I'm not personally in favor of changing definitions of broken words. It leads to stupid arguments. But people do that.

If you look for consciousness from the outside, you'll find nothing, or you'll find behaviour. That's because consciousness is on the inside, is about subjectivity.

It would be preferable to find consciousness in the real world. Either reflected in behavior or in the physical structure of the brain. I'm under the impression that cousin_it believes you can have the latter without the former. I say you must have both. Are you saying you don't need either? That you could have two physically identical agents, one conscious, the other not?

Comment author: g_pepper 18 July 2017 02:27:14PM 0 points [-]

It would be preferable to find consciousness in the real world.

I find myself to be conscious every day. I don't understand what you find "unreal" about direct experience.

Comment author: tadasdatys 18 July 2017 07:46:07AM 0 points [-]

Seeing red isn't the same as claiming to see red

A record player looping the words "I see red" is very different from how humans see, both internally and behaviorally. A robot which takes a picture, finds the most common pixel color, and if that's red, plays the same "I see red" sound, is still in some ways different, but a lot less so. And if someone wanted to call this second robot conscious, as far as color is concerned, there would be no problem with that.
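The second robot's behavior can be sketched in a few lines (a hypothetical toy, with the picture reduced to a list of color names rather than a real image):

```python
from collections import Counter

def report_color(pixels):
    """Toy 'second robot': find the most common pixel color and
    report seeing red only if red dominates the picture."""
    most_common_color, _count = Counter(pixels).most_common(1)[0]
    if most_common_color == "red":
        return "I see red"
    return None  # this toy robot only talks about red

print(report_color(["red", "red", "blue"]))   # -> I see red
print(report_color(["blue", "blue", "red"]))  # -> None
```

The point of the sketch is only that this robot's report, unlike the record player's, is causally sensitive to what is actually in front of it - which is the respect in which it is "a lot less" different from human seeing.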

You may feel that pain is special, and that if we recognized a robot which says "ouch" when pushed as feeling pain, that would be in some sense bad. But it wouldn't. We already recognize that different agents can have equally valid experiences of pain that aren't equally important to us (e.g. torturing rats vs. humans, or foreigners vs. family). This is not a new problem, and suggesting that some agents have a magical invisible property that makes their experiences important is not a good solution.

Comment author: g_pepper 18 July 2017 12:36:12PM 1 point [-]

You may feel that pain is special, and that if we recognized a robot which says "ouch" when pushed as feeling pain, that would be in some sense bad. But it wouldn't. We already recognize that different agents can have equally valid experiences of pain that aren't equally important to us (e.g. torturing rats vs. humans, or foreigners vs. family).

I don't see how it follows from the fact that foreigners and animals feel pain that it is reasonable to recognize that a robot that is programmed to say "ouch" when pushed feels pain. Can you clarify that inference?

suggesting that some agents have a magical invisible property that makes their experiences important, is not a good solution

I don't see anything magical about consciousness - it is something that is presumably nearly universally held by people, and no one on this thread has suggested a supernatural explanation for it. Just because we don't yet have an objective metric for consciousness in others does not make it magical.
