JohnDavidBustard comments on Recommended Reading for Friendly AI Research - Less Wrong

Post author: Vladimir_Nesov 09 October 2010 01:46PM




Comment author: JohnDavidBustard 16 October 2010 03:37:03PM

So, assuming survival is important, a solution that maximises survival plus wireheading would seem to solve that problem. Of course, it may well just delay the inevitable heat-death ending, but if we choose to make that important, then sure, we can optimise for survival as well. I'm not sure that gets around the issue that any solution we produce (with or without optimisation for survival) is merely an elaborate way of satisfying our desires (in this case, including the desire to continue to exist), and thus all FAI solutions are a form of wireheading.