AFAIK, nothing of the kind is publicly available. The closest thing to it is probably his Intuitive Explanation of Bayes' Theorem; however, Bayes' Theorem is high-school math. (His Cartoon Guide to Löb's Theorem might also be relevant, although they may think it's just more words.) Two relevant quotes by Eliezer:
On some gut level I’m also just embarrassed by the number of compliments I get for my math ability (because I’m a good explainer and can make math things that I do understand seem obvious to other people) as compared to the actual amount of advanced math knowledge that I have (practically none by any real mathematician’s standard).
My current sense of the problems of self-modifying decision theory is that it won’t end up being Deep Math, nothing like the proof of Fermat’s Last Theorem—that 95% of the progress-stopping difficulty will be in figuring out which theorem is true and worth proving, not the proof. (Robin Hanson spends a lot of time usefully discussing which activities are most prestigious in academia, and it would be a Hansonian observation, even though he didn’t say it AFAIK, that complicated proofs are prestigious but it’s much more important to figure out which theorem to prove.)
complicated proofs are prestigious but it’s much more important to figure out which theorem to prove
Viewtifully phrased...
I blew through all of MoR in about 48 hours, and in an attempt to learn more about the science and philosophy that Harry espouses, I've been reading the sequences and Eliezer's posts on Less Wrong. Eliezer has written extensively about AI, rationality, quantum physics, singularity research, etc. I have a question: how correct has he been? Has his interpretation of quantum physics predicted any subsequently-observed phenomena? Has his understanding of cognitive science and technology allowed him to successfully anticipate the progress of AI research, or has he made any significant advances himself? Is he on the record predicting anything, either right or wrong?
Why this is important: when I read something written by Paul Krugman, I know that he has a Nobel Prize in economics, and I know that he has the best track record of any top pundit in the US in terms of making accurate predictions. Meanwhile, I know that Thomas Friedman is an idiot. Based on these track records, I believe things written by Krugman much more than I believe things written by Friedman. But if I hadn't read Friedman's writing from 2002-2006, then I wouldn't know how terribly wrong he has been, and I would be too credulous about his claims.
Similarly, reading Mike Darwin's predictions about the future of medicine was very enlightening. He was wrong about nearly everything. So now I know to distrust claims that he makes about the pace or extent of subsequent medical research.
Has Eliezer offered anything falsifiable, or put his reputation on the line in any way? "If X and Y don't happen by Z, then I have vastly overestimated the pace of AI research, or I don't understand quantum physics as well as I think I do," etc.