thomblake comments on Thoughts on the Singularity Institute (SI) - Less Wrong

256 Post author: HoldenKarnofsky 11 May 2012 04:31AM




Comment author: thomblake 18 May 2012 04:41:57PM 2 points [-]

Yes, that's possible. It's still possible that you could get a lot done with strategy #2 without being able to make that prediction.

I agree that if two systems have the same inputs and outputs, their internals don't matter much here.

Comment author: TheOtherDave 18 May 2012 05:25:31PM *  0 points [-]

So... when we posit in this discussion a system that lacks a theory of mind in a sense that matters, are we positing a system that cannot make predictions like this one? I assume so, given what you just said, but I want to confirm.

Comment author: thomblake 18 May 2012 06:05:44PM 1 point [-]

Yes, I'd say so. It isn't helpful here to say that a system lacks a theory of mind if it has a mechanism that allows it to make predictions about reported beliefs, intentions, etc.

Comment author: TheOtherDave 18 May 2012 06:12:33PM 0 points [-]

Cool! This was precisely my concern. It sounded an awful lot like y'all were talking about a system that could make such predictions but somehow lacked a theory of mind. Thanks for clarifying.