Sniffnoy comments on Dreams of AIXI - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (145)
I think I'm finally starting to understand your article. I will probably have to go back and vote it up; it's a worthwhile point.
Do you have the link for that? I think there's an article somewhere, but I can't remember what it's called.
If there isn't one, why do you assume computationalism? I find it stunningly implausible that the mere specification of formal relationships among abstract concepts is sufficient to reify those concepts, i.e., to cause them to actually exist. For me, the very definitions of "concept," "relationship," and "exist" are almost enough to justify an assumption of anti-computationalism. A "concept" is something that might or might not exist; it is merely potential existence. A "relationship" is a set of concepts. I either don't know of or don't understand any of the insights that would suggest that everything that potentially exists and is computed therefore actually exists. Computing, to me, just sounds like a way of manipulating concepts, or, at best, of moving a few bits of matter around, perhaps LED switches or a Turing tape, in accordance with a set of concepts. How could moving LED switches around make things real?
By "real," I mean made of "stuff." I get through a typical day and navigate my ordinary world by assuming that there is a distinction between "stuff" (matter-energy) and "ideas" (ways of arranging the matter-energy in space-time). Obviously, thinking about an idea will tend to form some analog of that idea in the stuff that makes up my brain, and, if my brain were so thorough and precise as to resemble AIXI, the analog might be a very tight analog indeed, but it's still an analog, right? I mean, I don't take you to mean that an AIXI 'brain' would literally form a class-M planet inside its CPU so as to better understand the sentient beings on that planet. The AIXI brain would just be thinking about the ideas that govern the behavior of the sentient beings... and thinking about ideas, even very precisely, doesn't make the ideas real.
I might be missing something here; I'd appreciate it if you could point out the flaw(s) in my logic.
I don't think you can make this distinction meaningful. After all, what's an electron? Just a pattern in the electron field...