MugaSofer comments on The genie knows, but doesn't care - Less Wrong

54 Post author: RobbBB 06 September 2013 06:42AM




Comment author: Juno_Watt 12 September 2013 04:06:29PM -2 points

Those are only 'mistakes' if you value human intentions. A grammatical error is only an error because we value the specific rules of grammar we do; it's not the same sort of thing as a false belief (though it may stem from, or result in, false beliefs).

You will see a grammatical error as a mistake if you value grammar in general, or if you value being right in general.

A self-improving AI needs a goal. A goal of self-improvement alone would work. A goal of getting things right in general would work too, and be much safer, as it would include getting our intentions right as a sub-goal.

Comment author: MugaSofer 12 September 2013 04:18:59PM 1 point

A goal of self-improvement alone would work.

Although since "self-improvement" in this context basically refers to "improving your ability to accomplish goals"...

You will see a grammatical error as a mistake if you value grammar in general, or if you value being right in general.

Stop me if this is a non sequitur, but surely "having accurate beliefs" and "acting on those beliefs in a particular way" are completely different things? I haven't really been following this conversation, though.