Unspecified false assumptions seem too weak to respond to. That page refers to a range of estimates and surveys of the expected time to superintelligence, gathered from various individuals and groups.
WBE seems more likely to me for a simple reason: we can implement a brain without understanding how it works, whereas achieving human-equivalent AGI requires that we know what we are doing.
If we can build a model of a worm brain, we can probably scale it up a billion times without any deep understanding of how it works. That's just one type of shortcut to superintelligence on the path to WBE. In practice there are lots of similar shortcuts - and taking any one of them skips WBE, making it redundant.
You simply don't need to understand how an adult brain works in order to build something with superior functionality. We did not need to understand how birds worked to make a flying machine. We did not need to understand how fish worked to build a submarine. Brains are unlikely to be very much different in that respect.
And despite six decades' worth of research on that topic, I cannot yet see any discernible indication that we are significantly closer to mastering general intelligence than we were when we began the effort.
So: you can't see much progress. However, there evidently has been progress - we now have Watson, Siri, W.A., Netflix, Google, and other systems doing real work - which is a big deal. Machine intelligence is on the commercial slippery slope that will lead to superintelligence - while whole brain emulation simply doesn't work and so has no commercial applications. Its flagship "products" are silly PR stunts.
The Whole Brain Emulation Roadmap is silent about the issue - but its figures generally support the contention that WBE is going to arrive too slowly to have a significant impact.
According to the information available so far, AGI is distantly remote. WBE is not.
That's just a baseless fantasy, IMHO.
> If we can build a model of a worm brain, we can probably scale it up a billion times without any deep understanding of how it works. That's just one type of shortcut to superintelligence on the path to WBE.
Ten million dogs cannot contemplate what Shakespeare meant when he said that a rose, by any other name, would still smell as sweet. Even a billion dogs could not do this. Nor could 3^^^3 nematodes.
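(For readers unfamiliar with the notation: 3^^^3 is Knuth's up-arrow notation. A minimal sketch of the recursion, in Python - the function name is my own choice, not standard - shows why even the two-arrow case is already astronomical:)

```python
def knuth_up(a, n, b):
    """Compute a (up-arrow)^n b in Knuth's up-arrow notation (n arrows)."""
    if n == 1:
        return a ** b          # one arrow is ordinary exponentiation
    if b == 0:
        return 1               # base case of the recursion
    # n arrows iterate the (n-1)-arrow operation b times
    return knuth_up(a, n - 1, knuth_up(a, n, b - 1))

print(knuth_up(3, 1, 3))  # 3^3 = 27
print(knuth_up(3, 2, 3))  # 3^^3 = 3^(3^3) = 7625597484987
# 3^^^3 = 3^^(3^^3): a power tower of 3s roughly 7.6 trillion levels
# high - far too large to ever compute, which is the point of invoking it.
```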
This belief is just plain unintelligent.
> ...You simply don't need to understand how an adult brain works in order to build something with superior functionality.
If you were a utilitarian, then why would you want to risk creating an AGI that had the potential to be an existential risk, when you could eliminate all suffering with the advent of WBE (whole brain emulation) and hence virtual reality (or digital alteration of your source code) and hence utopia? Wouldn't you want to try to prevent AI research and just promote WBE research? Or is it that AGI is more likely to come before WBE and so we should focus our efforts on making sure that the AGI is friendly? Or maybe uploading isn't possible for technological or philosophical reasons (substrate dependence)?
Is there a link to a discussion on this that I'm missing out on?