Calien comments on Open Thread, May 25 - May 31, 2015 - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I know this may come across as sociopathically cold and calculating, but given that post-singularity civilisation could be at least thirty orders of magnitude larger than current civilisation, I don't really think short-term EA makes sense. I'm surprised that EA and existential-risk efforts seem to be correlated, since logically it seems to me that they should be anti-correlated.
And if the response is that future civilisation is 'far' in the overcoming bias sense, well, so are starving children in Africa.
Proponents of both share the same attitude of "this is a thing that people occasionally give lip service to, that we're going to follow to its more logical conclusion and actually act on".