I put together a 'landing page' for the intelligence explosion concept, similar to Nick Bostrom's landing pages for anthropics, the simulation argument, and existential risk. The new website is IntelligenceExplosion.com. You can see I borrowed the CSS from Bostrom's anthropics page and then simplified it.
Just as with the Singularity FAQ, I'll be keeping this website up to date, so please send me corrections or bibliography additions at luke [at] singinst [dot] org.
That is a problem. What do y'all think of the new image?
It doesn't make as much sense without the context of showing the parochial human picture first, and I'm worried that without that context it'll just come across as hyperbole: "The AI will be thiiiiiiiiiiis much smarter than Einstein!!!" It also suggests too strong a connection between recursive self-improvement and a specific level of intelligence.