JoshuaZ comments on Book Review: The Root of Thought - Less Wrong
No, he's not. Neurons are part of the territory. They are composed of other parts of the territory, which are composed of quarks, spacetime, etc. But that doesn't make a neuron not part of the territory. Just because something is ontologically reducible doesn't mean it isn't part of the territory. It just means that you need to be very careful not to treat it as ontologically fundamental when it isn't.
Fine, substitute "not ontologically fundamental" for "not part of the territory" if you must.
The problem is that most philosophers who care about phenomenology at all would assign at least some ontologically foundational status to it, simply because it is foundational to you and to anyone else with subjective experience. There is a reasonable argument to be made that "the way it feels from the inside" is just as fundamental as the basic physics of how the world works.
This does not imply that the two are necessarily related (for instance, P-zombies or robots can be unconscious yet physically talk about subjective experience). It does mean that Occam's razor should apply to "the way it feels from the inside", which tends to weigh against complex explanations like "configurations of neurons" and in favor of either exotic physics or a spooky superintelligence who can figure out how to run debugger sessions on our physical brains.
Unfortunately, this is close to nonsense. Just because something strikes me as foundational to me doesn't give me any decent reason for thinking it has any such actually foundational status. Humans suck at introspection. We really, really suck at intuiting the differences in how we process things unless things are going drastically wrong. For example, it isn't obvious to most humans that we use different sections of our brains to add and multiply. But there's a lot of evidence for this: fMRI scans show different areas lighting up, with areas corresponding to memory lighting up for multiplication and areas corresponding to reasoning lighting up for addition. Similarly, there are stroke victims who lose the ability to do only one or the other operation. And this is but one example of how humans fail. Relying on human feelings to get an idea of how anything in the world works, especially our own minds, is not a good idea.
I don't follow this logic at all. I'm not completely sure what you are trying to do here, but it sounds suspiciously like the theistic argument that God is a simple hypothesis. Just because I can posit something as a single, irreducible entity does not make that thing simple. (Also, can you expand on what you mean by a spooky superintelligence running debugging sessions, since I can't parse this in any coherent way.)
Small nitpick: I am not talking about what is foundational to the way our world works. I am only making the fairly trite observation that subjective experience/qualia is the only thing we can directly experience; it would be really, really strange if something so basic to us turned out to be dependent on complicated configurations of neurons and glial cells, as naive physicalists suggest.
What this is actually saying is that phenomenology (the stuff we can access by introspection) cannot directly map onto physical areas of the brain of the kind which might get damaged in a stroke. In itself, this is not evidence that humans "suck" at introspection, especially if our consciousness really is a quantum state with $bignum degrees of freedom rather than a classical system with spatially separate subparts.
God is not a simple hypothesis, but "this was affected by an optimization process which cares about X or something like it" is simpler than "this configuration which happens to be near-optimal for X arose by sheer luck". Which is pretty much what one would have to posit in order to explain our subjective experience of the extremely complicated physical systems we call "brains". There are other avenues such as the anthropic principle, but ISTM that at some point one would start to run into circularities.
What else can it depend on? Your original claim was that it has something to do with quantum superpositions, so can you tell how these superpositions are going to explain qualia any better? It seems like you demand that the explanation be a black box without internal structure; this is contrary to what actual explanations are.
The "naive physicalists" don't maintain anything like that. Evolution isn't sheer luck.
Do you question the consensus that you see using your eyes? Because the eye is a blatantly complicated mechanism directly in the middle of one of the direct experiences of the world you stake your theory on.
I'm not questioning the fact that complicated mechanisms are involved in creating your subjective experience; I question the physical description of that subjective experience as an incredibly complicated configuration in the brain. If your qualia are at all real in some sense, they should correspond to something far simpler than that on Occam's razor grounds. Alternately, you might just be a P-zombie. But then you'd have serious problems experiencing how your brain feels from the inside, although your brain would definitely be talking about its internal experiences.
Why aren't you? You just said that "[qualia] should correspond to something far simpler than that". If a (say) visual quale is simple, then why does the human visual system need a complicated mechanism to capture large numbers of photons such that they form a coherent image on a surface coated with photosensitive neurons, which are wired so as to cause large-scale effects on other parts of the neural (and glial) system of the brain, starting with the visual cortex and spreading from there ... all to cause something simple? Light was simple to start with! If you expect things to be simple at the Cartesian theater, the visual system moves the wrong way.
Light is simple, but evolved organisms care very little about the fundamental qualities of light. They care a lot about running efficient computations on various inputs, including the excitation of photosensitive neurons. This is probably why the Cartesian theater feels very much like computation on high-level inputs and outputs, rather than on objectively fundamental things such as wavelengths of light. And the computations which transform low-level data like the excitation of sensory neurons into high-level inputs are probably unconscious because they are qualitatively different from conscious computation.
I would expect optimization for efficiency to be something evolution does - but I am compelled to note that I mentioned "the Cartesian theater" as a reference to Daniel Dennett's Consciousness Explained, where he strenuously refuted the idea of the Cartesian theater. By Dennett's argument - and even when Consciousness Explained came out, he had a lot of research data to work from - the collocation of all sensory data in a single channel, run past some homunculus recording our conscious experience, is unlikely. After all, there already is a data-processing entity right there to collect all the sensory data - the entire brain. So within the brain, it should not be surprising that different conscious experiences are saved to memory from different parts. Particularly since the brain is patently a parallel computer anyway.
Daniel Dennett's "refutation" of the Cartesian theater has been widely criticized. Basically, he relies on perceptual illusions such as discrete motion being perceived as continuous, arguing that there should be a fact of the matter as to whether "the motion in the Cartesian theater" is continuous or not. But phenomenology is far simpler (or more complicated) than that: the fact that we perceive the quale of continuous_motion does not imply that a homunculus somewhere is seeing the object in an intermediate position at each given moment in time. It is a strawman argument.
Well, what is it, then?
Ahhhh, I see now. Subjective experience must be ontologically foundational because it feels foundational, subjectively. This seems oddly... circular.
Configurations of neurons are not complex. They are complicated, but they can still be explained by the same physics as everything else in the world. You are proposing a more complex universe - or possibly a god. The two are equally implausible without supporting evidence.