Mitchell_Porter comments on It's not like anything to be a bat - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
No, it just means that <red> plays a causal role in us, which would be played by something else in a simulation of us.
There's nothing paradoxical about the idea of an unconscious simulation of consciousness. It might be an ominous or a disconcerting idea, but there's no contradiction.
See what I just said to William Sawin about fundamental versus derived causality. These are derived causal relations; really, they are regularities which follow indirectly from large numbers of genuine causal relations. My eccentricity lies in proposing a model where mental states can be fundamental causes and not just derived causes, because the conscious mind is a single fundamental entity - a complex one that, in current language, we might call an entangled quantum system in an algebraically very distinctive state, but still a single entity, in a way that a pile of unentangled atoms would not be.
Being a single entity means that it can enter directly into whatever fundamental causal relations are responsible for physical dynamics. Being that entity, from the inside, means having the sensations, thoughts, and desires that you do have; described mathematically, that will mean that you are an entity in a particular complicated, formally specified state; and physically, the immediate interactions of that entity would be with neighboring parts of the brain. These interactions cause the qualia, and they convey the "will".
That may sound strange, but even if you believe in a mind that is material but non-fundamental, it still has to work like that or else it is causally irrelevant. So when you judge the idea, remember to check whether you're rejecting it for weirdness that your own beliefs already implicitly carry.
So you're taking the existing causal graph, drawing a box around all the interactions that happen inside a brain, and saying that everything inside the box counts as one thing.
That's not simplification, it's just bad accountancy.