bokov comments on Is Kiryas Joel an Unhappy Place? - Less Wrong

Post author: gwern 23 April 2011 12:08AM




Comment author: bokov 26 September 2013 03:26:42PM 2 points

You know, his scenario of erasing humanity as a byproduct of an optimization process indifferent to human values amounts to the unfriendly-AI scenarios we discuss here, just with the requirement relaxed that the optimization process be sentient.

I wonder if the following is a valid generalization of the specific problem that motivates the MIRI folks:

Our ability to scale up and speed up the achievement of goals has outpaced, or will soon outpace, our ability to find goals we won't regret.