Nominull comments on Outline of a lower bound for consciousness - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
So if I understand correctly, your basic claim underlying all of this is that a system can be said not to be conscious if its set of beliefs remains equally valid when you switch the labels on some of the things it has beliefs about. I have a few concerns about this point, which you may have already considered, but which I would like to see addressed explicitly. I will post them as replies to this post.
If I am mischaracterizing your position, please let me know, and then my replies to this post can probably be ignored.
Doesn't this fail independence of irrelevant alternatives? That is to say, couldn't I take a conscious system and augment it with two atoms, then add one fact about each atom such that switching the labels on the two atoms maintains the truth of those facts? It seems to me that in that case, the system would be provably unconscious, which does not accord with my intuition.
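The augmentation in the objection above can be made concrete with a small sketch. Everything here is illustrative: facts are modeled as tuples of a predicate followed by labels, and the `swap` helper, the sample belief sets, and the label names are all hypothetical stand-ins, not anything from the post itself.

```python
def swap(fact_set, x, y):
    """Relabel x <-> y throughout a set of facts (predicate stays fixed)."""
    def sub(fact):
        return tuple(y if e == x else x if e == y else e for e in fact)
    return frozenset(sub(f) for f in fact_set)

# A stand-in for a "conscious" system's beliefs: the labels are
# pinned down asymmetrically, so swapping them changes the set.
core = frozenset({("left_of", "cup", "plate"), ("red", "cup")})
assert swap(core, "cup", "plate") != core

# Now augment with two atoms and one symmetric fact about each,
# as in the objection. Swapping the two new labels leaves the
# whole belief set unchanged, so the criterion as stated would
# classify the augmented system as unconscious.
augmented = core | {("is_atom", "a1"), ("is_atom", "a2")}
assert swap(augmented, "a1", "a2") == augmented
```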
Yes; I mentioned that in the full version. The brain is full of information that we're not conscious of. This is necessarily so when you have regions of the graph of K with low connectivity. A more complete analysis would look for uniquely-grounded subsets of K. For example, it's plausible that infants thrashing their arms around blindly have knowledge in their brains about where their arms are and how to move them, without being conscious of that knowledge, while still being conscious of simpler sensations.
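One plausible reading of "uniquely-grounded subsets of K" can be sketched as a brute-force check: a subset of labels is uniquely grounded if no nontrivial relabeling within that subset preserves the facts restricted to it. This is my own assumed interpretation, and the fact encoding and helper names below are hypothetical:

```python
from itertools import permutations

def relabel(facts, mapping):
    """Apply a label mapping to every fact (predicate stays fixed)."""
    return {(f[0],) + tuple(mapping.get(a, a) for a in f[1:]) for f in facts}

def is_uniquely_grounded(facts, labels):
    """True if only the identity permutation of `labels` preserves
    the facts mentioning only those labels (brute-force check)."""
    sub = {f for f in facts if set(f[1:]) <= set(labels)}
    base = sorted(labels)
    for perm in permutations(base):
        if perm != tuple(base) and relabel(sub, dict(zip(base, perm))) == sub:
            return False
    return True

facts = {("red", "cup"), ("left_of", "cup", "plate"),
         ("is_atom", "a1"), ("is_atom", "a2")}
# The cup/plate subset is pinned down; the two atoms are interchangeable.
assert is_uniquely_grounded(facts, ["cup", "plate"])
assert not is_uniquely_grounded(facts, ["a1", "a2"])
```

On this reading, the two added atoms fail the test while the rest of K passes it, which is why restricting the criterion to uniquely-grounded subsets would dissolve the irrelevant-alternatives objection.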