marchdown comments on Irrationality Game II - Less Wrong Discussion

13 [deleted] 03 July 2012 06:50PM

Comment author: marchdown 04 July 2012 02:36:32AM -1 points [-]

Irrationality game

Moral intuitions are very simple. A general idea of what it means for somebody to be human is enough to severely restrict the variety of moral intuitions you would expect it to be possible for them to have. Thus, conditioned on Adam's humanity, you would need very little additional information to get a good idea of Adam's morals, while Bob the alien would need to explain his basic preferences at length before you could model his moral judgements accurately. It follows that the tricky part of explaining moral intuitions to a machine is explaining humans, and it's not possible to cheat by formalizing morality separately.
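The claim above is essentially information-theoretic: a strong prior (knowing Adam is human) leaves few bits of uncertainty about his morals, while a flat prior (an unknown alien) leaves many. A toy sketch of that intuition, with entirely made-up distributions over hypothetical "moral profiles" (nothing here is from the original comment):

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Hypothetical toy priors over 8 possible moral profiles.
# Knowing an agent is human concentrates probability on a few profiles;
# the unknown alien gets a uniform prior.
human_prior = [0.55, 0.25, 0.10, 0.05, 0.02, 0.01, 0.01, 0.01]
alien_prior = [1 / 8] * 8

bits_human = entropy(human_prior)  # roughly 1.8 bits left to pin down
bits_alien = entropy(alien_prior)  # the full 3 bits

print(round(bits_human, 2), round(bits_alien, 2))
```

Under these invented numbers, Adam's morals take far fewer bits to specify than Bob's, which is all the comment's "very little additional information" amounts to.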

Comment author: Eugine_Nier 04 July 2012 07:19:43AM 1 point [-]

Please attach a probability.

Comment author: marchdown 04 July 2012 09:33:48AM 0 points [-]

Fairly certain (85–98%).

Comment author: Andreas_Giger 04 July 2012 01:21:56PM *  -2 points [-]

That is a very wide range. Downvoted you anyway.