timtyler comments on Deadlines and AI theory - Less Wrong

2 Post author: Dmytry 23 March 2012 10:57AM


Comment author: timtyler 23 March 2012 01:31:20PM (1 point)

Doesn't this merely sidestep the issue at the meta level? Now the AI needs to modify itself to use pragmatic goals when modifying itself, and so on to any level of recursion, at which point the situation collapses back into Dmytry's concern.

So: the machine distinguishes the act of temporarily swapping an approved pragmatic goal in or out (in order to quickly burn through some instrumental task, for example) from more serious forms of self-modification.
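A minimal sketch of that distinction might look like the following. Everything here is hypothetical: the whitelist, the goal names, and the `Agent` class are illustrative assumptions, not anything proposed in the thread; the point is only that an approved, reversible goal swap can go through a cheap path while any other change to the agent is refused and routed to a heavier review process.

```python
# Hypothetical illustration: approved pragmatic-goal swaps are cheap and
# reversible; any other self-modification is treated as a serious change.

# Hypothetical whitelist of pre-approved instrumental goals.
APPROVED_PRAGMATIC_GOALS = {"route_packets", "sort_inventory"}

class Agent:
    def __init__(self, terminal_goal):
        self.terminal_goal = terminal_goal
        self.active_goal = terminal_goal

    def swap_in(self, goal):
        """Temporarily adopt an approved pragmatic goal."""
        if goal not in APPROVED_PRAGMATIC_GOALS:
            raise PermissionError("not an approved pragmatic goal")
        self.active_goal = goal

    def swap_out(self):
        """Restore the terminal goal once the instrumental task is done."""
        self.active_goal = self.terminal_goal

    def self_modify(self, patch):
        """Anything that is not a whitelisted goal swap counts as serious
        self-modification and is refused by this cheap path."""
        raise NotImplementedError("serious self-modification: needs full review")

agent = Agent("preserve_terminal_values")
agent.swap_in("sort_inventory")  # cheap, reversible swap
agent.swap_out()                 # terminal goal restored
```

This only gestures at the distinction; it does not answer the recursion worry above, since the whitelist and the review path are themselves things a self-modifying agent could alter.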

How can you prove that, over time, the FAI does not degrade into a uFAI, or get pushed onto a less steep logistic growth curve relative to a uFAI?

Eeek! Too much FAI! I do not see how that idea has anything to do with this discussion.