
Nebu comments on Greg Egan disses stand-ins for Overcoming Bias, SIAI in new book - Less Wrong Discussion

35 Post author: Kaj_Sotala 07 October 2010 06:55AM




Comment author: Nebu 24 January 2016 10:28:00PM 1 point

I suspect that if we're willing to say human minds are Turing Complete[1], then we should also be willing to say that an ant's mind is Turing Complete. So when imagining a human with a lot of patience and a very large notebook interacting with a billion-year-old alien, consider an ant with a lot of patience and a very large surface area for recording ant-pheromones, interacting with a human. Consider how likely it is that the human would be interested in telling the ant things it didn't yet know. Consider which topics the human would focus on telling the ant, and whether it might decide to hold back on some topics because it figures the ant isn't ready to understand those concepts yet. Consider whether it's more important for the patience to lie within the ant or within the human.

1: I generally consider human minds to NOT be Turing Complete, because Turing Machines have infinite memory (via their infinite tape), whereas human minds have finite memory (being composed of a finite amount of matter). I guess Egan is working around this via the "very large notebook", which is why I'll let this particular nitpick slide for now.
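The tape-versus-memory distinction in the footnote can be made concrete with a toy simulator. This is just an illustrative sketch (the machine encoding and function names here are my own, not anything from the discussion): the tape is a dict that grows on demand, which is exactly the unbounded resource that a finite physical system, brain or notebook, can only approximate.

```python
# Minimal Turing machine simulator (illustrative sketch).
# The tape is a dict keyed by cell position, so it extends without bound
# as the head moves -- this stands in for the "infinite tape" that a
# finite memory (a brain, or even a very large notebook) can only approximate.
def run_tm(rules, state="A", halt="H", max_steps=10_000):
    tape, pos = {}, 0
    for step in range(max_steps):
        if state == halt:
            return step, sum(tape.values())   # (steps taken, 1s written)
        symbol = tape.get(pos, 0)             # unvisited cells read as 0
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    raise RuntimeError("did not halt within max_steps")

# The 2-state, 2-symbol "busy beaver": writes four 1s, halts after six steps.
BB2 = {
    ("A", 0): (1, "R", "B"),
    ("A", 1): (1, "L", "B"),
    ("B", 0): (1, "L", "A"),
    ("B", 1): (1, "R", "H"),
}

print(run_tm(BB2))  # (6, 4)
```

With the tape capped at a fixed number of cells the same device becomes a (very large) finite-state machine, which is the nitpick the footnote is gesturing at.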