
diegocaleiro comments on Superintelligence 10: Instrumentally convergent goals - Less Wrong Discussion

7 Post author: KatjaGrace 18 November 2014 02:00AM



Comment author: diegocaleiro 18 November 2014 04:28:29PM 0 points

I take this to be false.

Remaining the same person and having one's goals remain the same are two distinct, but equally possible, kinds of sameness.

Most humans seem to care much more about the former (survival) than the latter (that their goals be sustained in the universe).

Quoting Woody Allen: "I don't want to achieve immortality through my work. I want to achieve it through not dying."

We do have distinct reasons to think machine intelligences would want to preserve their goals, and that for them identity might feel more entangled with goals; however, those reasons are far from unequivocal.