orthonormal comments on Some Thoughts Are Too Dangerous For Brains to Think - Less Wrong

Post author: WrongBot, 13 July 2010 04:44AM




Comment author: orthonormal, 13 July 2010 02:44:05PM, 0 points

This does raise an interesting issue: if I'm a strictly selfish utilitarian, do I not want my utility function to be that which will attain the highest expected utility?

This is a particular form of wireheading; fortunately, for evolutionary reasons we're not able to do very much of it without advanced technology.

Comment author: Vladimir_Nesov, 13 July 2010 06:38:39PM, 1 point

This does raise an interesting issue: if I'm a strictly selfish utilitarian, do I not want my utility function to be that which will attain the highest expected utility?

This is a particular form of wireheading

I'd say it's rather a form of conceptual confusion: you can't change a concept ("change" is itself a "timeful" concept, meaningful only as a property within structures that are processes in the appropriate sense). But it's plausible that creating agents with slightly different explicit preferences will result in a better outcome than, all else equal, giving those agents your own preferences. Of course, you'd probably need to be a superintelligence to correctly make decisions like this, at which point the creation of agents with a given preference might cease to be a natural concept.

Comment author: red75, 13 July 2010 07:48:22PM, 0 points

I am afraid that advanced technology is not necessary: literal wireheading is already possible.