First, it at least establishes a minimum. If an AI can learn the basics of English in a day, it has that much of a head start over humans. Even if mastering the rest of language takes longer, you can at least cut three years off the training time, and presumably the rest can be learned at a rapid rate as well.
It also establishes that an AI can teach itself specialized skills very rapidly. Today it learns the basics of language, tomorrow the basics of programming, the day after that vision, and then nanotechnology engineering, and so on. This is an ability far beyond what humans can manage, and it would give the AI a huge advantage.
Finally, even if it takes months, that's still FOOM. I don't know exactly where the cutoff point is, but anything that advances at that pace is dangerous. It's very different from the alternative "slow takeoff" scenarios, where AI takes years and years to advance to a superhuman level.
I've been going through the AI-Foom debate, and both sides make sense to me. I intend to continue, but I'm wondering whether there are already insights in LW culture that I could get just by asking for them.
My understanding is as follows:
The difference between a chimp and a human is only about 5 million years of evolution, on the order of a few hundred thousand generations. That's not enough time for many changes.
Eliezer takes this as proof that the difference in brain architecture between the two can't be large. Thus, you could have a chimp-intelligent AI that doesn't do much, and then, with some very small changes, suddenly get a human-intelligent AI and FOOM!
Robin takes the 5-million-year gap as proof that the significant difference between chimps and humans is only partly a matter of brain architecture. Evolution simply hasn't had time to be responsible for most of the relevant difference; most of it must come from somewhere else.
So he concludes that when our ancestors got smart enough for language, culture became a thing. Our species stumbled across various little insights into life, and these got passed on. An increasingly massive base of cultural content, made up of very many small improvements, is largely responsible for the difference between chimps and humans.
Culture let humans assimilate new information much faster than evolution could.
So he concludes that you can get a chimp-level AI, but getting up to human level will take not a very few insights but a very great many, each one slowly improving the machine's intelligence. So no FOOM; it'll be a gradual thing.
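To make the two pictures concrete, here's a minimal toy sketch in Python. Every number in it (the threshold, the growth rates, the step count) is an illustrative assumption of mine, not anything from the debate itself; it just shows how a small threshold effect produces a curve that looks nothing like steady accumulation.

```python
# Toy model contrasting the two takeoff pictures.
# All parameters are illustrative assumptions, not claims from the debate.

def foom_takeoff(steps, threshold=1.0, boost=1.10):
    """Eliezer-style picture: capability creeps up until a small
    architectural change crosses a threshold, after which improvements
    feed back on themselves and compound."""
    capability = 0.5
    history = []
    for _ in range(steps):
        if capability < threshold:
            capability += 0.01   # small tweaks before the threshold
        else:
            capability *= boost  # recursive self-improvement kicks in
        history.append(capability)
    return history

def gradual_takeoff(steps, insight_value=0.01):
    """Robin-style picture: a very large number of small, independent
    improvements (accumulated content), each adding a little."""
    capability = 0.5
    history = []
    for _ in range(steps):
        capability += insight_value  # one more small insight
        history.append(capability)
    return history

if __name__ == "__main__":
    steps = 120
    foom = foom_takeoff(steps)
    slow = gradual_takeoff(steps)
    for t in (0, 40, 80, 119):
        print(f"step {t:3d}: foom={foom[t]:8.2f}  gradual={slow[t]:5.2f}")
```

The point of the sketch is that both models agree improvement happens step by step; the disagreement is over whether any of those steps unlocks compounding returns.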
So I think I've pinned down the question. Is there a commonly known answer, or are there existing insights that bear on it?