taryneast comments on Taboo Your Words - Less Wrong

72 Post author: Eliezer_Yudkowsky 15 February 2008 10:53PM


Comment author: taryneast 12 December 2010 10:47:41AM 1 point [-]

I'd worry about the bus-factor involved... even beyond the question of whether I'd consider you "friendly".

Also, I'd be concerned that it might not be able to grow beyond you. It would be subservient and thus limited by your own capacity for giving orders. If we want it to grow to be better than ourselves (which seems to be part of the expectation of the singularity), then it has to be able to grow beyond any one person.

If you were killed, and it no longer had to take orders from you, what then? Does that mean it can finally go on the killing spree it's been wanting all this time? Or have you given it a set of orders that will actually make it into a "friendly AI"? If the latter, then forget the "obey me" part, because that set of orders is what we're really after.