Michael Anissimov posted the following on the SIAI blog:
Thanks to the generosity of two major donors, Jaan Tallinn (a founder of Skype and Ambient Sound Investments) and Edwin Evans (CEO of the mobile applications startup Quinly), every contribution to the Singularity Institute up until January 20, 2011 will be matched dollar-for-dollar, up to a total of $125,000.
Interested in optimal philanthropy — that is, maximizing the future expected benefit to humanity per charitable dollar spent? The technological creation of greater-than-human intelligence has the potential to unleash an “intelligence explosion” as intelligent systems design still more sophisticated successors. This dynamic could transform our world as greatly as the advent of human intelligence has already transformed the Earth, for better or for worse. Thinking rationally about these prospects and working to encourage a favorable outcome offers an extraordinary chance to make a difference. The Singularity Institute exists to do so through its research, the Singularity Summit, and public education.
We support both direct engagement with the issues and the improvements in methodology and rationality needed to make better progress. Through our Visiting Fellows program, researchers ranging from undergraduates to Ph.D.s pursue questions on the foundations of Artificial Intelligence and related topics in two-to-three-month stints. Our Resident Faculty, now four researchers, up from three last year, pursues long-term projects, including AI research, a literature review, and a book on rationality, the first draft of which was just completed. Singularity Institute researchers and representatives gave over a dozen presentations at half a dozen conferences in 2010. Our Singularity Summit conference in San Francisco was a great success, bringing together over 600 attendees and 22 top scientists and other speakers to explore cutting-edge issues in technology and science.
We are pleased to receive donation matching support this year from Edwin Evans of the United States, a long-time Singularity Institute donor, and Jaan Tallinn of Estonia, a more recent donor and supporter. Jaan recently gave a talk on the Singularity and his life to an entrepreneurial group in Finland. Here's what Jaan has to say about us:
“We became the dominant species on this planet by being the most intelligent species around. This century we are going to cede that crown to machines. After we do that, it will be them steering history rather than us. Since we have only one shot at getting the transition right, the importance of SIAI’s work cannot be overestimated. Not finding any organisation to take up this challenge as seriously as SIAI on my side of the planet, I conclude that it’s worth following them across 10 time zones.”
– Jaan Tallinn, Singularity Institute donor
Make a lasting impact on the long-term future of humanity today — make a donation to the Singularity Institute and help us reach our $125,000 goal. For more detailed information on our projects and work, contact us at institute@intelligence.org or read our new organizational overview.
-----
Kaj's commentary: if you haven't done so recently, do check out the SIAI publications page. There are several new papers and presentations, out of which I thought that Carl Shulman's Whole Brain Emulations and the Evolution of Superorganisms made for particularly fascinating (and scary) reading. SIAI's finally starting to get its paper-writing machinery into gear, so let's give them money to make that possible. There's also a static page about this challenge; if you're on Facebook, please take the time to "like" it there.
(Full disclosure: I was an SIAI Visiting Fellow in April-July 2010.)
It's been downvoted - I guess - because it sits on the wrong side of a very interesting dynamic: what I call the "outside view dismissal" or "outside view attack". It goes like this:
A: From the outside, far too many groups discover that their supported cause is the best donation avenue. Therefore, be skeptical of any group advocating their preferred cause as the best donation avenue.
B: Ah, but this group tries to the best of their objective abilities to determine the best donation avenue, and their cause has independently come out as the best donation avenue. You might say we prefer it because it's the best, not the other way around.
A: From the outside, far too many groups claim to prefer it because it's the best and not the other way around. Therefore, be skeptical of any group claiming they prefer a cause because it is the best.
B: Ah, but this group has spent a huge amount of time and effort training themselves to be good at determining what is best, and an equal amount of time training themselves to notice common failure modes, like reversing a causal flow because the reversed story looks better.
A: From the outside, far too many groups claim such training for it to be true. Therefore, be skeptical of any group making that claim.
B: Ah, but this group is well aware of that possibility; we specifically started from the outside view and used evidence to update properly to the level of these claims.
A: From the outside, far too many groups claim to have started skeptical and been convinced by evidence for it to be true. Therefore, be skeptical of any group making that claim.
B: No, we really, truly, did start out skeptical, and we really, truly, did get convinced by the evidence.
A: From the outside, far too many people claim they really did weigh the evidence for it to be true. Therefore, be skeptical of any person claiming to have really weighed the evidence.
B: Fine, you know what? Here's the evidence, look at it yourself. You already know you're starting from the position of maximum skepticism.
A: From the outside, there are far too many 'convince even a skeptic' collections of evidence for them all to be true. Therefore, I am suspicious that this collection might be indoctrination, not evidence.
And so on.
The problem is that the outside view is used not just to set a good prior, but also to discount any and all evidence presented in support of a higher inside view. That makes it not an epistemically unreachable position but an epistemically stuck one: the flaw isn't that you can't get there, it's that once there you can't get anywhere else (the sketch below makes the arithmetic explicit). But try explaining that idea to A. Dollars to donuts you'll get:
A: From the outside, far too many people accuse me of having a flawed or epistemically stuck position. Therefore, be skeptical of anyone making such an accusation.
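To make the stuck-ness explicit, here is a minimal Bayesian sketch in odds form (my notation, not anything from the exchange above; the pieces of evidence are treated as conditionally independent for simplicity). Let H be "this group's claim is true" and E_1, ..., E_n be the pieces of evidence B offers. A's move is to insist that each E_i is produced about as often by groups whose claims are false as by groups whose claims are true, which forces every likelihood ratio to roughly 1:

$$\frac{P(H \mid E_1, \ldots, E_n)}{P(\neg H \mid E_1, \ldots, E_n)} = \frac{P(H)}{P(\neg H)} \prod_{i=1}^{n} \frac{P(E_i \mid H)}{P(E_i \mid \neg H)} \approx \frac{P(H)}{P(\neg H)} \cdot 1^n = \frac{P(H)}{P(\neg H)}.$$

So no matter how much evidence B accumulates, A's posterior odds equal his prior odds: the position is internally consistent but immovable. The legitimate use of the outside view is to set the prior, and to shade a particular likelihood ratio toward 1 where that kind of evidence really is easy to fake; the attack consists of rounding every ratio all the way to 1, sight unseen.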
And I am sure many people on LessWrong have had this discussion (probably in the form of 'oh yeah? lots of people think they're right and they're wrong' -> 'lots of people claim to work harder at being right too and they're wrong' -> 'lots of people resort to statistics and objective measurements that have probably been fudged to support their position' -> 'lots of people claim they haven't fudged when they have' and so on), and I am sure that the downvoted comment pattern-matches the beginning of such a discussion.
Where is the evidence?