If everyone were to take Landsburg's argument seriously (which would presuppose that all humans are rational), then everyone would donate solely to the SIAI. If everyone donated only to the SIAI, would something like Wikipedia even exist? I suppose the SIAI would have created Wikipedia if it was necessary. I'm just wondering how much important stuff out there was spawned by irrational contributions, and what the world would look like if such contributions had never been made. I'm also not sure how venture-capital growth funding differs from the idea of diversifying one's contributions to charity.
Note that I do not doubt the correctness of Landsburg's math. I'm just not sure it would have worked out given human shortcomings (even if everyone was maximally rational). If nobody were to diversify, and everyone contributed to what seems to be the most rational option given the current data, then being wrong would be a catastrophe. Even maximally rational humans can fail, after all. This likely wouldn't be a problem if everyone contributed to a goal that could be verified rather quickly, but something like the SIAI could eat up the resources of the planet and still turn out to be not even wrong in the end. Since everyone would have concentrated on that one goal (no doubt the most rational choice at the time), might such a counterfactual world have been better off diversifying its contributions? Or would the SIAI have turned into some kind of financial manager allocating those contributions, and thereby become a venture capitalist itself?
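(For context, here is a minimal sketch of the kind of argument Landsburg makes; the notation is my own illustration, not his. Suppose a small donor with budget $B$ splits donations $x_i \ge 0$, with $\sum_i x_i = B$, across charities whose marginal good per dollar $m_i$ is effectively constant at that scale. The total good done is then approximately linear,

$$G(x_1, \dots, x_n) \approx \sum_i m_i x_i,$$

which is maximized at a corner: set $x_{i^*} = B$ for $i^* = \arg\max_i m_i$ and everything else to zero. So a rational small donor gives everything to the single best charity rather than diversifying, because unlike an investor, a donor has no personal risk to hedge.)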
I really do not want to argue about semantics either, but our agreed interpretation makes Niel's statement equivalent to "our visual system is not optimal for non-ancestral environments", which is highly uninteresting. I think Dawkins's laryngeal nerve example is much more interesting in this sense, since it points out that body designs do not come from a sane Creator, at least in some instances (which is enough for his point).
Since we do not live in the ancestral environment now, I think the quotation could just be underlining how we should viscerally know that our brain is going to output sub-optimal crud given certain inputs. Upvoted original.