timtyler comments on Holden Karnofsky's Singularity Institute Objection 1 - Less Wrong

Post author: ciphergoth 11 May 2012 07:16AM




Comment author: timtyler 13 May 2012 11:45:57AM 1 point

I tentatively hold, and believe it is the SI's position, that an uFAI is almost certain to produce human extinction.

When humans are a critical clue to the course of evolution on the planet? Surely such an AI would repeatedly reconstruct and rerun history to gain clues about the forms of alien it might encounter - if it held basic universal instrumental values and didn't have too short a planning horizon.