Sideways comments on We are not living in a simulation - Less Wrong
It seems like the definition of "physical" used in this article is "existing within physics" (a perfectly reasonable definition). By this definition, phenomena such as qualia, reasoning, and computation are all "physical" and are referred to as such in the article itself.
Brains are physical, and local physics seems Turing-computable. Therefore, every phenomenon that a physical human brain can produce, can be produced by any Turing-complete computer, including human reasoning and qualia.
So to "physically incorporate a human brain" in the sense relative to this article, the simulator does NOT need to include an actual 3-pound blob of neurons exchanging electrochemical signals. It only needs to implement the same computation that a human brain implements.
You're continuing to confuse reasoning about a physical phenomenon with causing a physical phenomenon. By the Church-Turing thesis, which I am in full agreement with, a Turing machine can reason about any physical phenomenon. That does not mean a Turing machine can cause any physical phenomenon. A PC running a program which reasons about Jupiter's gravity cannot cause Jupiter's gravity.
I'm asserting that qualia, reasoning, and other relevant phenomena that a brain produces are computational, and that by computing them, a Turing machine can reproduce them with perfect accuracy. I apologize if this was not clear.
Adding two and two is a computation. An abacus is one substrate on which addition can be performed; a computer is another.
I know what it means to compute "2+2" on an abacus. I know what it means to compute "2+2" on a computer. I know what it means to simulate "2+2 on an abacus" on a computer. I even know what it means to simulate "2+2 on a computer" on an abacus (although I certainly wouldn't want to have to actually do so!). I do not know what it means to simulate "2+2" on a computer.
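The compute-versus-simulate distinction above can be made concrete in code. The following is a minimal Python sketch (the `Abacus` class and its bead representation are illustrative inventions, not anything from the thread): computing "2+2" directly uses the machine's own arithmetic, while simulating "2+2 on an abacus" means modeling the physical substrate and reading the answer off the simulated state.

```python
# Computing "2+2" directly on a computer: the hardware adder does the work.
direct_result = 2 + 2

# Simulating "2+2 on an abacus" on a computer: model the physical substrate
# (beads on rods) and read the answer off the simulated state.
class Abacus:
    def __init__(self, rods=1):
        # Each rod tracks how many beads have been pushed to the active side.
        self.rods = [0] * rods

    def push_beads(self, rod, count):
        # Slide `count` beads on `rod` toward the crossbar.
        self.rods[rod] += count

    def read(self, rod):
        return self.rods[rod]

abacus = Abacus()
abacus.push_beads(0, 2)
abacus.push_beads(0, 2)
simulated_result = abacus.read(0)

# Two different substrates, one abstract computation.
assert direct_result == simulated_result == 4
```

Both paths yield 4, but only the second is a simulation of something physical; the first is just a computation, which is the distinction being drawn here.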
You simulate physical phenomena -- things that actually exist. You compute combinations of formal symbols, which are abstract ideas. 2 and 4 are abstract; they don't exist. To claim that qualia are purely computational is to claim that they don't exist.
"Computation exists within physics" is not equivalent to " "2" exists within physics."
If computation doesn't exist within physics, then we're communicating supernaturally.
If qualia aren't computations embodied in the physical substrate of a mind, then I don't know what they are.
Computation does not exist within physics; it's a linguistic abstraction of things that exist within physics, such as the behavior of a CPU. Similarly, "2" is an abstraction of a pair of apples, a pair of oranges, etc. To say that the actions of one physical medium necessarily have a similar physical effect (the production of qualia) as the actions of another physical medium, just because they abstractly embody the same computation, is analogous to saying that two apples produce the same qualia as two oranges, because they're both "2".
This is my last reply for tonight. I'll return in the morning.
If computation doesn't exist because it's "a linguistic abstraction of things that exist within physics", then CPUs, apples, oranges, qualia, "physical media" and people don't exist; all of those things are also linguistic abstractions of things that exist within physics. Physics is made of things like quarks and leptons, not apples and qualia. I don't think this definition of existence is particularly useful in context.
As to your fruit analogy: two apples do in fact produce the same qualia as two oranges, with respect to number! Obviously color, smell, etc. are different, but in both cases I have the experience of seeing two objects. And if I'm trying to do sums by putting apples or oranges together, substituting one for the other will give the same result. In comparing my brain to a hypothetical simulation of my brain running on a microchip, I would claim a number of differences (weight, moisture content, smell...), but I hold that what makes me me would be present in either one.
See you in the morning! :)
Not quite reductionist enough, actually: physics is made of the rules relating configurations of spacetime, which exist independently of the formal models that give us concepts like "quark" and "lepton". But digging deeper into this linguistic rathole won't clarify my point any further, so I'll drop this line of argument.
If you started perceiving two apples identically to the way you perceive two oranges, without noticing their difference in weight, smell, etc., then you or at least others around you would conclude that you were quite ill. What is your justification for believing that being unable to distinguish between things that are "computationally identical" would leave you any healthier?
If I have in front of me four apples that appear to me to be identical, but a specific two of them consistently are referred to as oranges by sources I normally trust, they are not computationally identical. If everyone perceived them as apples, I doubt I would be seen as ill.
I did a better job of phrasing my question in the edit I made to my original post than I did in my reply to Sideways that you responded to. Are you able to rephrase your response so that it answers the better version of the question? I can't figure out how to do so.
I didn't intend to start a reductionist "race to the bottom," only to point out that minds and computations clearly do exist. "Reducible" and "non-existent" aren't synonyms!
Since you prefer the question in your edit, I'll answer it directly:
Computation is "privileged" only in the sense that computationally identical substitutions leave my mind, preferences, qualia, etc. intact; because those things are themselves computations. If you replaced my brain with a computationally equivalent computer weighing two tons, I would certainly notice a difference and consider myself harmed. But the harm wouldn't have been done to my mind.
I feel like there must be something we've missed, because I'm still not sure where exactly we disagree. I'm pretty sure you don't think that qualia are reified in the brain-- that a surgeon could go in with tongs and pull out a little lump of qualia-- and I think you might even agree with the analogy that brains:hardware::minds:software. So if there's still a disagreement to be had, what is it? If qualia and other mental phenomena are not computational, then what are they?
I do think that qualia are reified in the brain. I do not think that a surgeon could go in with tongs and remove them, any more than he could go in with tongs and remove your recognition of your grandmother.
They're a physical effect caused by the operation of a brain, just as gravity is a physical effect of mass and temperature is a physical effect of molecular motion. See here and here for one reason why I think the computational view falls somewhere in between problematic and not-even-wrong, inclusive.
ETA: The "grandmother cell" might have been a poorly chosen counterexample, since apparently there's some research that sort of actually supports that notion with respect to face recognition. I learned the phrase as identifying a fallacy. Feel free to mentally substitute some other complex idea that is clearly not embodied in any discrete piece of the brain.