Tim_Tyler comments on The Magnitude of His Own Folly - Less Wrong
That scenario is based on the idea of life arising only once. A superintelligence bent on short-term paperclip production would probably be handicapped by its rather twisted utility function - and would most likely lose in competition with any alien race it encountered.
Such a superintelligence would still want to conquer the galaxy, though. One thing it wouldn't be is boring.