
JoshuaFox comments on Superintelligence 10: Instrumentally convergent goals - Less Wrong Discussion

Post author: KatjaGrace 18 November 2014 02:00AM



Comment author: JoshuaFox 18 November 2014 07:54:12AM 0 points

Another way to look at it: subgoals, including the convergent instrumental values, may be offset by other subgoals.

Humans don't usually let any one of their conflicting values override all the others. For example, accumulation of any given resource is moderated by competition from other humans and by diminishing marginal returns on that resource relative to others.
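The diminishing-marginal-returns point can be made concrete with a toy model (the utility functions below are illustrative assumptions, not anything from the comment): an agent with concave utility spreads a fixed budget across resources, while an agent with linear utility in a single resource never saturates and is happy to accumulate it without limit.

```python
import math

def log_utility(a, b):
    # Concave utility: each extra unit of a resource is worth less
    # than the one before (diminishing marginal returns).
    return math.log(1 + a) + math.log(1 + b)

def linear_utility(a, b):
    # Linear utility: no diminishing returns, as with a simple
    # terminal goal that just counts one resource type.
    return a + b

budget = 10  # units to divide between resources A and B

# Exhaustively find the best integer split under the concave utility.
best_log = max(range(budget + 1), key=lambda a: log_utility(a, budget - a))

print(best_log)  # → 5: the concave agent balances its resources
# The linear agent is indifferent between splits, so nothing in its
# utility function discourages piling everything into one resource.
print(linear_utility(budget, 0) == linear_utility(5, 5))  # → True
```

The concave agent's optimum is the balanced split, mirroring the human case; the linear agent's utility gives it no internal reason to stop accumulating a single resource, mirroring the simple-terminal-goal case discussed below.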

On the other hand, for a superintelligence, particularly one with a simple terminal goal, these moderating factors would be less effective. For example, it might have no competitors.