Today's post, Scope Insensitivity, was originally published on 14 May 2007. A summary (taken from the LW wiki):
The human brain can't represent large quantities: an environmental measure that will save 200,000 birds doesn't conjure anywhere near a hundred times the emotional impact and willingness-to-pay of a measure that would save 2,000 birds.
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Third Alternatives for Afterlife-ism, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.
My personal investment thesis, formed after reading The Black Swan and The Big Short, is designed to take advantage of scope insensitivity.
The thesis is: Invest in a 1-10% chance of a 100x-1000x return.
The reasoning is that other investors treat a 0.5% chance much the same as a 10% chance, and treat a potential 10x payoff much the same as a potential 1000x payoff. If the market prices all of these bets at roughly the same level, then bets at the favorable end of both ranges carry far higher expected value than their price reflects, so I'm buying underpriced investments.
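To make the gap concrete, here's a rough expected-value sketch. The corner cases and the simplifying assumption that a losing bet returns nothing are mine, not from the comment:

```python
# Expected-value sketch for the scope-insensitive-pricing thesis.
# Assumption (mine): investors price every bet in these ranges at
# roughly the same level, and a losing bet returns 0x.

def expected_multiple(probability: float, payoff_multiple: float) -> float:
    """Expected return multiple of a long-shot bet that otherwise pays 0x."""
    return probability * payoff_multiple

# Corner cases of the ranges investors allegedly treat as interchangeable:
cases = {
    "worst corner (0.5% chance of 10x)":     expected_multiple(0.005, 10),
    "thesis floor (1% chance of 100x)":      expected_multiple(0.01, 100),
    "thesis ceiling (10% chance of 1000x)":  expected_multiple(0.10, 1000),
}

for label, ev in cases.items():
    print(f"{label}: expected multiple = {ev:g}x")

# Output:
# worst corner (0.5% chance of 10x): expected multiple = 0.05x
# thesis floor (1% chance of 100x): expected multiple = 1x
# thesis ceiling (10% chance of 1000x): expected multiple = 100x
```

If investors really do pay similar prices across this whole region, the expected value still spans a factor of 2000 between the corners, which is the mispricing the thesis tries to capture.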
I currently know of two investments that follow my thesis: