PhilGoetz comments on Outline of a lower bound for consciousness - Less Wrong

Post author: PhilGoetz 13 January 2010 05:27AM


Comment author: PhilGoetz 07 February 2010 05:28:52PM 1 point

Short answer: It won't guarantee that, because rats learn most of what they know. The equation I developed turns out to be identical to an equation saying that the amount of information contained in facts and data must be at least as great as the amount of information that it takes to specify the ontology. So any creature that learns its ontology automatically satisfies the equation.
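The relationship described above can be sketched as a simple check. This is only an illustration of the stated inequality; the function and variable names are hypothetical, and the thread never gives the actual equation:

```python
# Minimal sketch of the lower-bound test described above (hypothetical
# names; the actual equation is not reproduced in this thread). The stated
# relationship: the information contained in an agent's facts and data must
# be at least as great as the information needed to specify its ontology.

def lower_bound_test(bits_of_knowledge: float, bits_of_ontology: float) -> str:
    """One-directional test: passing never *proves* consciousness.

    If the inequality holds, the verdict is "I don't know" (the agent
    might be conscious); if it fails, the agent falls below the bound.
    """
    if bits_of_knowledge >= bits_of_ontology:
        return "I don't know"
    return "below the lower bound"

# An agent that *learns* its ontology carries the ontology's information
# inside its knowledge, so the inequality holds automatically:
print(lower_bound_test(100.0, 40.0))  # → I don't know
```

Since a rat learns its ontology, the first branch is always taken, which matches the point below that the answer for a rat is always "I don't know".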

Comment author: CannibalSmith 08 February 2010 11:44:52AM 0 points

... Could we take as the input the most a rat could ever learn?

Comment author: PhilGoetz 08 February 2010 06:46:40PM 0 points

I don't understand the question. It's an inequality, and in cases where the inequality is satisfied, the answer it gives is "I don't know". The answer for a rat will always be "I don't know".

Comment author: CannibalSmith 09 February 2010 07:39:39AM 0 points

I must confess I didn't understand most of what you've said, but did I guess the following right? The equation says that

IF my knowledge is "bigger" than my ontology THEN I might be conscious

And in the case of learning my ontology, it means that my ontology is a subset of my knowledge and thus never bigger than the latter.

Comment author: PhilGoetz 09 February 2010 07:35:41PM 0 points

IF my knowledge is "bigger" than my ontology THEN I might be conscious

Right.

And in the case of learning my ontology, it means that my ontology is a subset of my knowledge and thus never bigger than the latter.

Exactly.