Perplexed comments on Minimum computation and data requirements for consciousness. - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
It strikes me as bizarre too, particularly here. So, you have to ask yourself whether you are misinterpreting. Maybe they are asking for evidence of something else.
You are asking me to think about topics I usually try to avoid. I believe that most talk about cognition is confused, and doubt that I can do any better. But here goes.
During the evolutionary development of human cognition, we passed through these stages:
So that is my off-the-cuff theory of consciousness. It certainly requires social cognition and it probably requires language. It obviously requires computation. It is relatively useless, but it is the inevitable byproduct of useful things. Ah, but now let us add
Was that an important addition? I don't think so. It is important to recognize volitional agents, epistemic agents, and eventually moral agents, as well as the fact that others act as if we ourselves were also agents of all three kinds. I'm not quite sure why anyone much cares whether we ourselves, or any of the other agents, are also conscious.
To use EY terminology from the sequences, all the useful stuff above is purely about maps. The consciousness stuff is about thinking that maps really match up to territory. But as reductionists, we know that the real matchup between map and territory actually takes place several levels down. So consciousness, like free will, is a mostly harmless illusion to be dissolved, rather than an important phenomenon to be understood.
That probably didn't help you very much, but it helped me to clarify my own thinking.
The self-model theory of subjectivity also suggests:
(7) The ability to explicitly represent the state of our own knowledge, intentions, focus of attention, etc.; the ability to analyse the performance of our own brain and find ways to circumvent its limitations; and the ability to control the brain's resource allocation by learned (vs. evolved) procedures.
An interesting thing about consciousness is that the map is part of the territory it describes, and since the map is itself represented by neuronal connections and activity, it can presumably influence that territory.
Yes, and 1, 2, 3, 4, 5, 6, and 7 all require data and computational resources.
And to compare a map with a territory, one needs a map (i.e. data), a comparator (i.e. a pattern-recognition device), and the computational resources to run the comparison.
When one is thinking about internal states, the map, the territory, and the comparator are all internal. That they are internal does not obviate the need for them.
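The map/comparator/resources point can be made concrete with a toy sketch. Everything here (the dictionaries, the `comparator` function, the example values) is my own illustrative invention, not anything from the post; it just shows that a comparison requires the map as data, a comparator as a device, and computation to run it:

```python
# Toy sketch: comparing a map with a territory needs three separate things.
# All names and values here are hypothetical, for illustration only.

territory = {"sky": "blue", "grass": "green", "snow": "grey"}   # the actual state of affairs
internal_map = {"sky": "blue", "grass": "green", "snow": "white"}  # the agent's model: pure data

def comparator(map_, observations):
    """The pattern-recognition device: report where map and territory disagree."""
    return {key for key in map_ if map_[key] != observations.get(key)}

# Running the comparator is the third requirement: it consumes computation
# proportional to the size of the map being checked.
mismatches = comparator(internal_map, territory)
print(mismatches)  # the map errs about "snow"
```

Note that the same sketch works unchanged if `territory` is itself another internal data structure: making both operands internal removes nothing from the requirement for data, a comparator, and compute.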