I'm looking over the table of contents of Intelligence Explosion Microeconomics, and it doesn't look as though there's any reference to what seems to me the most relevant consideration for an intelligence explosion: returns on AI research. As I previously pointed out, an AGI that was just as "smart" as all the world's AI researchers combined would make AI progress at the same slow rate they do, with no explosion. Having that AI make itself 10% "smarter" (which would take a long time--it's only as smart as the world's AI researchers) would only yield self-improvement progress that was 10% faster. In other words, it'd be exponential, yes, but an exponential like human economic growth, not like a nuclear chain reaction.
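(A minimal sketch of the dynamics being claimed here, in my own notation rather than anything from the paper: if research output is proportional to current capability \(I\), then

\[
\frac{dI}{dt} = kI \quad\Rightarrow\quad I(t) = I_0 e^{kt},
\]

which is ordinary exponential growth. A chain-reaction-style explosion would instead require superlinear returns, \(\frac{dI}{dt} = kI^{\alpha}\) with \(\alpha > 1\), which diverges in finite time:

\[
I(t) = \bigl( I_0^{1-\alpha} - (\alpha - 1)kt \bigr)^{-1/(\alpha - 1)}, \qquad t^{*} = \frac{1}{(\alpha - 1)\,k\,I_0^{\alpha - 1}}.
\]

So the whole question comes down to whether the returns exponent is above or below 1.)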
The empirical finding that the combined brainpower of the world's AI researchers (who are very smart people, according to a reliable source of mine) yields such low returns in terms of new useful AI insights seems to me like it should weigh more heavily than reasoning by analogy from non-AI domains.
(But even given this empirical finding, the question seems hopelessly uncertain to me, and I'm curious what justification anyone would give for updating strongly away from even odds. The most salient observation from my recent PredictionBook experiment is that if a question is interesting enough for me to put it in PredictionBook, then I know less than I think about it and I'm best off giving it 50/50 odds. I suspect this applies to other humans too, e.g. Jonah Sinick expressed a similar sentiment to me the other day. So a priori, the very fact that two smart people, Robin and Eliezer, take opposite sides of an issue should make us reluctant to assign any strong probabilities... I think :P)
Suppose experts' opinions were assigned by flips of a weighted coin, where the weight of the coin is the probability that makes the best use of the available information.
If we go to the first expert and they hold opinion Heads, what do we think the weighting of the coin is? 2/3. But then another expert comes along with opinion Tails, and so our probability goes back to 1/2. Last, we...
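(Filling in the arithmetic, under what I take to be the intended assumption of a uniform prior on the coin's weight \(p\): by Laplace's rule of succession,

\[
p \sim \mathrm{Beta}(1,1), \qquad
\mathbb{E}[p \mid H] = \frac{1+1}{1+2} = \frac{2}{3}, \qquad
\mathbb{E}[p \mid H, T] = \frac{1+1}{2+2} = \frac{1}{2},
\]

so each pair of opposing expert opinions cancels out and leaves us back at even odds.)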
This is a thread where people can ask questions that they would ordinarily feel embarrassed for not knowing the answer to. The previous "stupid" questions thread has reached almost 500 questions in about a month, so I think it's time for a new one.
Also, I have a new "stupid" question.