Eliezer...Are these available? Are they the standard stuff (i.e., "Evidence and Import")?
Yes, and his posts about intelligence explosion on Overcoming Bias, this, this, and unfortunately comments scattered around Less Wrong and various interviews that would take some work to find and gather in one place.
How do you arrive at that conclusion? I'm less skeptical of the cause-specific claim than the organization-specific claim, but it's worth digging deeper into.
Nick Bostrom's book on superintelligence probably provides the best single treatment now, having synthesized most pre-existing work. It is moving toward publication, but you might ask him whether you can read the draft.
having synthesized most pre-existing work
Most pre-existing work? I would've said "having synthesized ~5% of pre-existing work related to superintelligence strategy that has been done at or near MIRI and FHI."
In the past, people like Eliezer Yudkowsky (see 1, 2, 3, 4, and 5) have argued that MIRI has a medium probability of success. What is this probability estimate based on and how is success defined?
I've read standard MIRI literature (like "Evidence and Import" and "Five Theses"), but I may have missed something.
-
(Meta: I don't think this deserves a discussion thread, but I posted this on the open thread and no-one responded, and I think it's important enough to merit a response.)