To whom it may concern:
This thread is for the discussion of Less Wrong topics that have not appeared in recent posts. If a discussion gets unwieldy, celebrate by turning it into a top-level post.
(After the critical success of part II, and the strong box office sales of part III in spite of mixed reviews, will part IV finally see the June Open Thread jump the shark?)
There are a lot of hypotheses floating around.
Mine is:
We have awareness. That is, we observe things in the territory with our senses and include them in our map of the territory. The phenomenon we observe as consciousness is just our ability to include ourselves (our own minds, and some of their inner sensations) in that map.
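To make the picture concrete, here's a toy sketch in Python (all names are mine, invented purely for illustration, not claimed as an actual model of minds): awareness is keeping a map of the territory, and consciousness, on this hypothesis, is just the case where the map contains an entry for the mapper itself.

```python
# Toy illustration of the hypothesis above: "awareness" is keeping a map
# of the territory; "consciousness" is that map containing an entry for
# the mapper itself. All names here are hypothetical.

class Agent:
    def __init__(self):
        self.map = {}  # the agent's model of the territory

    def observe(self, thing, observation):
        """Awareness: include an observed thing in the map."""
        self.map[thing] = observation

    def introspect(self):
        """'Consciousness' on this hypothesis: the agent includes its own
        mind (here, a snapshot of its current map) among the things mapped."""
        self.map["self"] = dict(self.map)


agent = Agent()
agent.observe("tree", "tall, green")
agent.introspect()
print("self" in agent.map)  # True: the map now contains the mapper
```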
Some people think there are things you can only know by experiencing them yourself. In theory, you could run a decent simulation of what it's like to be a bat, but you would still have memories of being human, so mere awareness of the bat's territory wouldn't be enough.
My solution: implant the bat memories into yourself, including the bat's memory of never having had human memories. In theory, this should work.
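Continuing the same toy sketch (again, purely illustrative, a minimal sketch of the thought experiment rather than any real procedure): the implant amounts to swapping out the memory store wholesale, the crucial piece being a meta-memory asserting that the old memories were never there.

```python
# Toy continuation: "implanting" bat memories means replacing the human
# memory store entirely, including the meta-memory that no human memories
# were ever present. Names are hypothetical, chosen for illustration.

human_memories = {"childhood", "language", "being human"}

bat_memories = {
    "echolocation",
    "hanging upside down",
    "never having had human memories",  # the crucial meta-memory
}

memories = bat_memories  # the implant: the old store is never consulted
print("being human" in memories)  # False: no residual human awareness
```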
I hope you don't mean that you're hypothesizing what the word "consciousness" means, but rather that your hypotheses are alternate predictions about physical unknowns or about the future. Which is it?
I'm asking what the definition, the meaning, of the word consciousness is. Hypothesizing what a word means feels like the wrong way to do things. Well, unless we're hypothesizing what other people mean when they say "consciousness". But if we're using the word here at LW we shouldn't need to hypothesize, ...