Max Tegmark has published a preprint of a paper arguing from physical principles that consciousness is “what information processing feels like from the inside,” a position I've previously articulated on LessWrong. It's a very physics-rich paper, but here's the most accessible description I was able to find within it:
If we understood consciousness as a physical phenomenon, we could in principle answer all of these questions [about consciousness] by studying the equations of physics: we could identify all conscious entities in any physical system, and calculate what they would perceive. However, this approach is typically not pursued by physicists, with the argument that we do not understand consciousness well enough.
In this paper, I argue that recent progress in neuroscience has fundamentally changed this situation, and that we physicists can no longer blame neuroscientists for our own lack of progress. I have long contended that consciousness is the way information feels when being processed in certain complex ways, i.e., that it corresponds to certain complex patterns in spacetime that obey the same laws of physics as other complex systems, with no "secret sauce" required.
The whole paper is very rich, and worth a read.
Thanks for posting this.
Do I recall correctly that Gary Drescher also uses the 'what information processing feels like from the inside' view of consciousness, and that Eliezer thought it was at least a good insight?
I've been warming to the idea as a useful insight, but I'm still pretty confused; it feels like there's a real (though possibly quantitative rather than qualitative) difference between myself (obviously 'conscious' for any coherent extrapolated meaning of the term) and my computer (obviously not conscious (to any significant extent?)), which is not accounted for by merely saying that consciousness is the feeling of information processing.
I think the idea of consciousness might really be a confused suite of intuitions arising from real, more fundamental ingredients, including the feeling of information being processed. So maybe it's more like 'the properties of an information processor give rise (possibly in combination with other things) to the things we refer to by "consciousness"'. I'm struggling to think of cases where we can't (at least in principle) taboo consciousness and instead talk about more specific things that we know refer non-confusedly to actual things. And saying 'consciousness is X' seems to take consciousness too seriously as a useful, meaningful, or coherent concept.
(I guess consciousness is often treated as a fundamental ethical consideration that cannot be reduced any further, but I am skeptical of the idea that consciousness is fundamental to ethics per se, and extremely suspicious of ethical considerations that have not been shown reducible to 'selfishness' + game/decision theory.)
I think there's a notable probability for the disjunction: either consciousness is meaningless enough that any attempt to reduce it as far as Tegmark tries is misguided, or consciousness is possible in non-quantum models and Tegmark's approach (even if incomplete or partially incorrect) generalises.
Why do you think your computer is not conscious? It probably has more of a conscious experience than, say, a flatworm or sea urchin. (As byrnema notes, conscious does not necessarily imply self-aware here.)