tingram comments on Rationality Quotes June 2013 - Less Wrong

Post author: Thomas 03 June 2013 03:08AM


Comments (778)


Comment author: khafra 04 June 2013 11:44:13AM 2 points

It just takes some imagination. Hollow out both the Earth and the Moon to reduce their gravitational pull; support the ladder with carbon nanotube filaments; stave off collapse by pushing it around with high-efficiency ion impulse engines; etc.

I agree, though, that philosophers often make too much of the distinction between "logically impossible" and "physically impossible." There's probably no way, even in principle, to hollow out the Earth significantly while retaining its structure; etc.

Comment author: tingram 04 June 2013 01:50:23PM 4 points

I think that often "logically possible" means "possible if you don't think too hard about it". Which is exactly Dennett's point in context: the idea that you are a brain in a vat is only conceivable if you don't think about the computing power that would be necessary for a convincing simulation.

Comment author: ChristianKl 06 June 2013 09:23:09PM 6 points

Which is exactly Dennett's point in context: the idea that you are a brain in a vat is only conceivable if you don't think about the computing power that would be necessary for a convincing simulation.

Dreams can be quite convincing simulations that don't need that much computing power.

The worlds perceived by people who practice astral travel can be quite complex. Complex enough to convince those who engage in the practice that they really are on an astral plane. Does that mean they really are on an astral plane and aren't just imagining it?

Comment author: Caspian 08 June 2013 03:01:17AM 4 points

The way I like to think about it is that convincingness is a 2-place function - a simulation is convincing to a particular mind/brain. If there's a reasonably well-defined interface between the mind and the simulation (e.g. the 5 senses and maybe a couple more) then it's cheating to bypass that interface and make the brain more gullible than normal, for example by introducing chemicals into the vat for that purpose.

From that perspective, dreams are not especially convincing compared to waking experience; rather, dreamers are especially convincible.

Dennett's point seems to be that a lot of computing power would be needed to make a convincing simulation for a mind as clear-thinking as a reader who was awake. Later in the chapter he talks about other types of hallucinations.
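The "2-place function" framing can be made concrete with a toy sketch (the function, its arguments, and the numbers are purely illustrative assumptions, not anything from Dennett):

```python
# 'Convincing' is not a property of a simulation alone; it is a
# relation between a simulation and a particular mind. A toy model:

def convincing(simulation_fidelity, mind_scrutiny):
    """A simulation convinces a mind only if its fidelity
    exceeds the level of scrutiny that mind applies."""
    return simulation_fidelity > mind_scrutiny

dream = 0.3           # low-fidelity simulation
waking_world = 0.99   # high-fidelity input

dreamer = 0.1         # a dreaming brain asks few questions
alert_reader = 0.9    # a clear-thinking, awake reader

print(convincing(dream, dreamer))              # True: dreams convince dreamers
print(convincing(dream, alert_reader))         # False: the same dream fails awake
print(convincing(waking_world, alert_reader))  # True
```

The point of the two arguments: lowering `mind_scrutiny` (a gullible or dreaming brain) does as much work as raising `simulation_fidelity` (an expensive simulation), which is why Caspian calls bypassing the interface "cheating".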

Comment author: ChristianKl 08 June 2013 07:09:12PM 3 points

The way I like to think about it is that convincingness is a 2-place function - a simulation is convincing to a particular mind/brain. If there's a reasonably well-defined interface between the mind and the simulation (e.g. the 5 senses and maybe a couple more)

The 5 senses are brain events; they aren't input channels to the brain. Take taste. How many different tastes of food can you perceive through your sense of taste? More than five. Why? Your brain takes data from your nose, your tongue, and your memory and fits them together into something you perceive as taste.

You have no direct access, through your conscious perception, to the raw data that your nose or tongue sends to your brain.

If someone is open to receiving suggestions, you can give him a hypnotic suggestion that an apple tastes like an orange and then wake him. If he eats the apple, he will tell you that it's an orange. He might even get angry when someone tells him it isn't an orange, because it obviously tastes like one.

it's cheating to bypass that interface and make the brain more gullible than normal, for example by introducing chemicals into the vat for that purpose.

You don't need to introduce any chemicals. Millions of years of evolution have trained brains to have an extremely high prior for believing that they aren't "brains in a vat".

Doubting your own perception is an incredibly hard cognitive task.
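In Bayesian terms, the point about a very high prior can be sketched with a toy calculation: even evidence that strongly favours the vat hypothesis barely moves a posterior anchored by such a prior. The numbers below are illustrative assumptions, not measurements:

```python
# Toy Bayesian update: how an extremely high prior resists evidence.
# prior_vat is the prior probability of being a brain in a vat;
# likelihood_ratio says the observed anomaly is that many times more
# likely under the vat hypothesis than under the real-world hypothesis.

def posterior_vat(prior_vat, likelihood_ratio):
    """Posterior P(vat | evidence) via the odds form of Bayes' theorem."""
    prior_odds = prior_vat / (1 - prior_vat)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Evidence 1000x more likely if you're in a vat, against a 1-in-a-billion prior:
p = posterior_vat(prior_vat=1e-9, likelihood_ratio=1000)
print(p)  # ~1e-6: still overwhelming confidence in NOT being in a vat
```

On these (assumed) numbers, even thousand-to-one evidence leaves the vat hypothesis at about one in a million, which is one way to cash out why doubting one's own perception is so hard.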

There are experiments where an experimenter uses a single electrode to trigger a subject to perform a particular action, like raising his arm. If the experimenter afterwards asks the subject why he raised his arm, the subject makes up a story and believes it. It takes effort for the experimenter to convince the subject that he made up the story and that there was no reason he raised his arm.

Comment author: tingram 07 June 2013 02:07:15AM 0 points

I suggest you read the opening chapter of Consciousness Explained. Someone's posted it online here.

Comment author: ChristianKl 07 June 2013 09:49:54AM 2 points

Dennett quotes no actual scientific paper in that passage, nor does he otherwise show that he really knows what the brain does.

You don't need to provide detailed feedback to the brain. Dennett should be well aware that humans have a blind spot in their eyes and the brain makes up information to fill the blind spot.

It's the same with suggesting to a brain in a vat that it's acting in the real world. The brain makes up the missing information to provide an experience of being in the real world.

To produce a strong hallucination (as I understand him, Dennett equates strong hallucinations with complex hallucinations) you might need a channel through which you can insert information into the brain, but you don't need to provide every detail. The brain makes up the missing details.

Comment author: Leonhart 11 June 2013 09:52:05PM 3 points

Dennett should be well aware that humans have a blind spot in their eyes and the brain makes up information to fill the blind spot.

No, Dennett explicitly denies that the brain makes up information to fill the blind spot. This is central to his thesis. He creates a whole concept called 'figment' to mock this notion.

His position is that nothing within the brain's narrative generators expects, requires, or needs data from the blind spot; hence, in consciousness, the blind spot doesn't exist. No gaps need to be filled in, any more than HJPEV can be aware that Eliezer has removed a line that he might, counterfactually, have spoken.

For a hallucination to be strong does not require it to have great internal complexity. It suffices that the brain happens not to ask too many questions.

Comment author: ChristianKl 12 June 2013 09:04:11AM 1 point

For a hallucination to be strong does not require it to have great internal complexity.

That's a question of how you define "strong". But it seems I read Dennett too charitably on that point. He defines it as:

Another conclusion it seems that we can draw from this is that strong hallucinations are simply impossible! By a strong hallucination I mean a hallucination of an apparently concrete and persisting three-dimensional object in the real world — as contrasted to flashes, geometric distortions, auras, afterimages, fleeting phantom-limb experiences, and other anomalous sensations. A strong hallucination would be, say, a ghost that talked back, that permitted you to touch it, that resisted with a sense of solidity, that cast a shadow, that was visible from any angle so that you might walk around it and see what its back looked like.

Given that definition, Dennett just seems wrong.

He continues saying:

Reports of very strong hallucinations are rare

I know multiple people in real life who report hallucinations of that strength. If you want an online source, the Tulpa forum has plenty of people who manage to have strong hallucinations of tulpas.

The Tulpa way seems to take months or a year. If you have a strongly hypnotically suggestible person, a good hypnotist can create such a hallucination in less than an hour.

Comment author: Leonhart 12 June 2013 09:12:05PM 1 point

I think I must be misreading you. I'm puzzled that you believe this about hallucinations - that it's possible for the brain to devote enough processing power to create a "strong" hallucination in the Dennettian sense - but upthread, you seemed to be saying that dreams did not require such processing power. Dreams are surely the canonical example, for people who believe that whole swaths of world-geometry are actually being modelled, rendered and lit inside of their heads? After all, there is nothing else to be occupying the brain's horsepower; no conflicting signal source.

If I may share with you my own anecdote; when asleep, I often believe myself to be experiencing a fully sensory, qualia-rich environment. But often as I wake, there is an interim moment when I realise - it seems to be revealed - that there never was a dream. There was only a little voice making language-like statements to itself - "now I am over here now I am talking to Bob the scenery is so beautiful how rich my qualia are".

I think Dennett's position is just this; that there never was a dream, only a series of answers to spurious questions, which don't have to be consistent because nothing was awake to demand consistency.

Do you think he's wrong about dreams, too, or are you saying that waking hallucinations are importantly different? I had a quick look at the Tulpa forum and am unimpressed so far. Could you point to any examples you find particularly compelling?

If you have a strongly hypnotically suggestible person, a good hypnotist can create such a hallucination in less than an hour.

Ok, so I flat out don't believe that. If waking consciousness were that unstable, a couple of hours of immersive video gaming would leave me psychotic; and all it would take to see angels would be a mildly well-delivered Latin Mass, rather than weeks of fasting and self-flagellation.

I'll go read about it, though.

Comment author: RobbBB 16 June 2013 06:11:36PM 4 points

there is an interim moment when I realise - it seems to be revealed - that there never was a dream. There was only a little voice making language-like statements to itself

I don't think I've ever had an experience quite like that. I've perhaps had experiences that are transitional between images and propositions -- I'm thinking by visualizing a little story to myself, and the images themselves are seamlessly semantic, like I'm on the inside of a novel and the narration is a deep component of the concrete flow of events. But to my knowledge I've never felt a sudden revelation that my mental images were 'only a little voice making language-like statements to itself', à la Dennett's suggestion that all experiences are just judgments.

Perhaps we're conceptualizing the same experience after-the-fact in different ways. Or perhaps we just have different phenomenologies. A lot of people have suggested (sometimes tongue-in-cheek) that Dennett finds his own wilder hypotheses credible because he has an unusually linguistic, abstract, qualitatively impoverished phenomenology. (Personally, I wouldn't be surprised if that's a little bit true, but I think it's a small factor compared to Dennett's philosophical commitments.)

Comment author: NancyLebovitz 19 June 2013 12:25:10PM 2 points

I've occasionally had dreams where elements have backstories - I just know something about something in my dream, without having any way of having found it out.

Comment author: Juno_Watt 16 June 2013 06:55:58PM 1 point

A lot of people have suggested (sometimes tongue-in-cheek) that Dennett finds his own wilder hypotheses credible because he has an unusually linguistic, abstract, qualitatively impoverished phenomenology.

He is known to be a wine connoisseur. Sidney Shoemaker once asked him why he doesn't just read the label.

Comment author: Kaj_Sotala 19 June 2013 02:38:08PM 0 points

I'm puzzled that you believe this about hallucinations - that it's possible for the brain to devote enough processing power to create a "strong" hallucination in the Dennettian sense - but upthread, you seemed to be saying that dreams did not require such processing power.

It is perfectly consistent to both believe that (some people) can have fully realistic mental imagery, and that (most people's) dreams tend to exhibit sub-realistic mental imagery.

I have one friend who claims to have eidetic mental imagery, and I have no reason to doubt her. Thomas Metzinger discusses in Being No-One the notion of whether the brain can generate fully realistic imagery, and holds that it usually cannot, but notes the existence of eidetic imaginers as an exception to the rule.

Comment author: Leonhart 19 June 2013 09:59:29PM 3 points

Thanks for the cite: sadly, on clicking through, I get a menacing error message in a terrifying language, so evidently you can't share it that way? You are quite right that it's consistent. It's just that it surprised my model, which was saying "if realistic mental imagery is going to happen anywhere, surely it's going to be dreams, that seems obviously the time-of-least-contention-for-visual-workspace."

I'm beginning to wonder whether any useful phenomenology at all survives the Typical Mind Fallacy. Right now, if somebody turned up claiming that their inner monologue was made of butterscotch and unaccountably lapsed into Klingon from three to five PM on weekdays, I'd be all "cool story bro".

Comment author: Juno_Watt 19 June 2013 05:52:17PM 0 points

I think that often "logically possible" means "possible if you don't think too hard about it".

Agreed