Tuxedage comments on Quotes on Existential Risk - Less Wrong Discussion

3 Post author: lukeprog 28 April 2012 02:01AM

Comment author: Tuxedage 28 April 2012 02:12:06PM 8 points

There is a saying in heuristics and biases that people do not evaluate events, but descriptions of events - what is called non-extensional reasoning. The extension of humanity's extinction includes the death of yourself, of your friends, of your family, of your loved ones, of your city, of your country, of your political fellows. Yet people who would take great offense at a proposal to wipe the country of Britain from the map, to kill every member of the Democratic Party in the U.S., to turn the city of Paris to glass - who would feel still greater horror on hearing the doctor say that their child had cancer - these people will discuss the extinction of humanity with perfect calm. "Extinction of humanity", as words on paper, appears in fictional novels, or is discussed in philosophy books - it belongs to a different context than the Spanish flu. We evaluate descriptions of events, not extensions of events. The cliché phrase end of the world invokes the magisterium of myth and dream, of prophecy and apocalypse, of novels and movies. The challenge of existential risks to rationality is that, the catastrophes being so huge, people snap into a different mode of thinking. Human deaths are suddenly no longer bad, and detailed predictions suddenly no longer require any expertise, and whether the story is told with a happy ending or a sad ending is a matter of personal taste in stories.

[...]

No more than Albert Szent-Györgyi could multiply the suffering of one human by a hundred million can I truly understand the value of clear thinking about global risks. Scope neglect is the hazard of being a biological human, running on an analog brain; the brain cannot multiply by six billion. And the stakes of existential risk extend beyond even the six billion humans alive today, to all the stars in all the galaxies that humanity and humanity's descendants may some day touch. All that vast potential hinges on our survival here, now, in the days when the realm of humankind is a single planet orbiting a single star. I can't feel our future. All I can do is try to defend it.

-- Eliezer Yudkowsky, Cognitive biases potentially affecting judgment of global risks

I hope I am allowed to quote EY. I personally thought this was a very well-written and beautiful passage.

Comment author: pedanterrific 28 April 2012 06:20:07PM 4 points

First reaction: EY wrote entire long paragraphs without any italics? So I looked up the paper. It should be "The extension of" and "phrase end of the world invokes". Apparently he toned it down.

Comment author: ciphergoth 28 April 2012 03:03:32PM 2 points

This is two quotes; quite a bit of text separates these two paragraphs.