
torekp comments on Irrationality Game II - Less Wrong Discussion

13 [deleted] 03 July 2012 06:50PM



Comment author: torekp 08 July 2012 01:04:29AM *  1 point [-]

I understand "computationalism" as referring to the philosophical Computational Theory of the Mind (wiki, Stanford Encyclopedia of Phil.). From the wiki:

Computational theories of mind are often said to require mental representation because 'input' into a computation comes in the form of symbols or representations of other objects. A computer cannot compute an actual object, but must interpret and represent the object in some form and then compute the representation. The computational theory of mind is related to the representational theory of mind in that they both require that mental states are representations. However the two theories differ in that the representational theory claims that all mental states are representations while the computational theory leaves open that certain mental states, such as pain or depression, may not be representational and therefore may not be suitable for a computational treatment.

From the SEP:

representations have both semantic and syntactic properties, and processes of reasoning are performed in ways responsive only to the syntax of the symbols—a type of process that meets a technical definition of ‘computation’

Because computation is about syntax, not semantics, the physical context - embodiment and extension - is irrelevant to computation qua computation. That is what I mean when I say that embodiment and extension are regarded as of no interest. Of course, if a philosopher is less thoroughgoing about computationalism - leaving pains and depression out of it, for example - then embodiment may be of interest for those mental events.
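To make the syntax/semantics point concrete, here is a minimal sketch (my own illustration, not from the thread; the rule table and symbol names are invented) of a computation that transforms symbols purely by their shape. Nothing in the program "knows" what the symbols stand for - swap in any other tokens and the mechanism is unchanged, which is exactly why physical context drops out of computation qua computation:

```python
# Purely syntactic rewrite rules: keys and values are uninterpreted tokens.
# The mapping is hypothetical, chosen only for illustration.
RULES = {
    ("hot", "touch"): "withdraw",
    ("food", "see"): "approach",
}

def step(state, stimulus):
    """Apply a rule keyed only on symbol identity, never on meaning.

    Unmatched pairs leave the state unchanged.
    """
    return RULES.get((state, stimulus), state)

print(step("hot", "touch"))   # -> "withdraw"
print(step("cold", "touch"))  # no rule matches -> "cold"
```

The same table could be relabeled with arbitrary strings and the computation would be identical, which is the sense in which only syntax is doing the work.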

However, your last paragraph throws a monkey wrench into my reasoning, because you raise the possibility of a "computer" whose boundary is drawn to include more territory. All I can say is that this would be unusual, and it seems more straightforward to delineate the syntactic rules of the visual system's edge-detection and blob-detection processes, for example, than those of the whole organism+world system.
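A toy version of the edge-detection case mentioned above (my own sketch; the function name, threshold, and input are invented for illustration) shows what "delineating the syntactic rules" might look like: the rule fires on a numerical difference between adjacent values, blind to what the image depicts:

```python
def detect_edges(pixels, threshold=10):
    """Mark positions where adjacent intensities differ sharply.

    A purely syntactic rule over a 1-D row of numbers: it compares
    magnitudes and says nothing about what, if anything, the pixels depict.
    """
    return [i for i in range(len(pixels) - 1)
            if abs(pixels[i + 1] - pixels[i]) > threshold]

# A dark region meeting a bright region produces one edge, at index 2.
print(detect_edges([0, 0, 0, 100, 100, 100]))  # -> [2]
```

The rule is easy to state in isolation; stating comparable rules for an entire organism-plus-environment system would be far less tractable, which is the contrast being drawn.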

Comment author: magfrump 08 July 2012 04:58:42AM 0 points [-]

I feel like we are talking past each other in a way that I do not know how to pinpoint.

Part of the problem is that I am trying to compare three things--what I believe, the original statement, and the theory of computationalism.

To try to summarize each of these in a sentence:

I believe that the entire universe essentially "is" a computation, and so minds are necessarily PARTS of computations, but these computations involve their environments. The theory of computationalism tries to understand minds as computations, separate from the environment. The OP suggests that computationalism is likely not a very good way of figuring out minds.

1) Do these summaries seem accurate to you? 2) I still can't tell whether my beliefs agree or disagree with either of the other two statements. Is it clearer from an outside perspective?

Comment author: torekp 10 July 2012 02:13:53AM *  0 points [-]

Your summaries look good to me. As compared to your beliefs, standard Computational Theory of Mind is probably neither true nor false, because it's defined in the context of assumptions you reject. Without those assumptions granted, it fails to state a proposition, I think.

Comment author: magfrump 11 July 2012 04:10:16AM 0 points [-]

Without those assumptions granted, it fails to state a proposition

I am constantly surprised and alarmed by how many things end up this way.