DefectiveAlgorithm comments on On Terminal Goals and Virtue Ethics - Less Wrong

67 points · Post author: Swimmer963, 18 June 2014 04:00AM


Comments (205)


Comment author: TheAncientGeek 19 June 2014 07:09:33PM -2 points

So you have a thing which is like an axiom in that it can't be explained in more basic terms...

...but is unlike an axiom in that you can ignore its implications where they don't suit: you don't have to savage galaxies to obtain bacon...

...unless you're an AI and it's paperclips instead of bacon, because in that case these axiom-like things actually are axiom-like.

Comment author: DefectiveAlgorithm 19 June 2014 07:45:25PM 2 points

If acquiring bacon was your ONLY terminal goal, then yes, it would be irrational not to do absolutely everything you could to maximize your expected bacon. However, most people have more than just one terminal goal. You seem to be using 'terminal goal' to mean 'a goal more important than any other'. Trouble is, no one else is using it this way.

EDIT: Actually, it seems to me that you're using 'terminal goal' to mean something analogous to a terminal node in a tree search (if you can reach that node, you're done). No one else is using it that way either.
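(The distinction above can be made concrete with a short illustrative sketch, not from the thread itself: the goal names, weights, and functions below are hypothetical. A terminal goal in the sense most people use it is one term among several in a utility function, traded off against the others; a terminal node in a tree search, by contrast, ends deliberation outright.)

```python
# Hypothetical agent with several terminal goals: each is valued for its
# own sake, but none absolutely overrides the others.
terminal_goals = {"bacon": 1.0, "friendship": 2.0, "curiosity": 1.5}

def utility(outcome):
    """Weighted sum over all terminal goals; satisfying one goal does
    not end deliberation, it just contributes to the total."""
    return sum(weight * outcome.get(goal, 0.0)
               for goal, weight in terminal_goals.items())

# Contrast: a terminal node in a tree search stops the search entirely.
def search(node, children, is_terminal):
    """Depth-first search that halts at the first terminal node found."""
    if is_terminal(node):
        return node
    for child in children(node):
        found = search(child, children, is_terminal)
        if found is not None:
            return found
    return None
```

Under this sketch, an outcome with lots of bacon can still lose to one with more friendship, which is exactly the "overridable without being non-terminal" behavior at issue in the thread.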

Comment author: TheAncientGeek 19 June 2014 08:15:03PM -2 points

Feel free to offer the correct definition. But note that you can't define it as overridable, since non-terminal goals are already defined that way.

There is no evidence that people have one or more terminal goals. At the least, you need to offer a definition such that multiple TGs don't collide and are distinguishable from non-TGs.

Comment author: Nornagest 19 June 2014 08:24:16PM 0 points

Where are you getting these requirements from?