From a review of Greg Egan's new book, Zendegi:
Egan has always had difficulty portraying characters whose views he disagrees with. They end up seeming like puppets or strawmen, pure mouthpieces for a viewpoint. This causes trouble in another strand of Zendegi, which is a mildly satirical look at transhumanism. Now, you can satirize by nastiness or by mockery, but Egan is too nice for the former, and not accurate enough at mimicry for the latter. It ends up being a bit feeble, and the targets are not likely to be much hurt.
Who are the targets of Egan’s satire? Well, here’s one of them, appealing to Nasim to upload him:
“I’m Nate Caplan.” He offered her his hand, and she shook it. In response to her sustained look of puzzlement he added, “My IQ is one hundred and sixty. I’m in perfect physical and mental health. And I can pay you half a million dollars right now, any way you want it. [...] when you’ve got the bugs ironed out, I want to be the first. When you start recording full synaptic details and scanning whole brains in high resolution—” [...] “You can always reach me through my blog,” he panted. “Overpowering Falsehood dot com, the number one site for rational thinking about the future—”
(We’re supposed, I think, to contrast Caplan’s goal of personal survival with Martin’s goal of bringing up his son.)
“Overpowering Falsehood dot com” is transparently overcomingbias.com, a blog set up by Robin Hanson of the Future of Humanity Institute and Eliezer Yudkowsky of the Singularity Institute for Artificial Intelligence. This is ironic, because Yudkowsky is Egan’s biggest fan: “Permutation City [...] is simply the best science-fiction book ever written”, and Yudkowsky’s thoughts on transhumanism were strongly influenced by Egan: “Diaspora [...] affected my entire train of thought about the Singularity.”
Another transhumanist group is the “Benign Superintelligence Bootstrap Project”—the name references Yudkowsky’s idea of “Friendly AI” and the description references Yudkowsky’s argument that recursive self-optimization could rapidly propel an AI to superintelligence. From Zendegi:
“Their aim is to build an artificial intelligence capable of such exquisite powers of self-analysis that it will design and construct its own successor, which will be armed with superior versions of all the skills the original possessed. The successor will produce a still more proficient third version, and so on, leading to a cascade of exponentially increasing abilities. Once this process is set in motion, within weeks—perhaps within hours—a being of truly God-like powers will emerge.”
Egan portrays the Bootstrap Project as a (possibly self-deluding, it’s not clear) confidence trick. The Project persuades a billionaire to donate his fortune to them in the hope that the “being of truly God-like powers” will grant him immortality come the Singularity. He dies disappointed and the Project “turn[s] five billion dollars into nothing but padded salaries and empty verbiage”.
(Original pointer via Kobayashi; Risto Saarelma found the review. I thought this was worthy of a separate thread.)
When I said that “real time” seems like a big deal, I didn’t mean in terms of the fundamental nature of intelligence; I’m not sure that I even disagree about the whole notebook statement. But given minds of almost exactly the same speed, there is a huge advantage to things like answering a question first in class, bidding first on a contract, designing and carrying out an experiment quickly, etc.
So much so that computation, the one place where we can speed up our thinking, is a gigantic industry that keeps expanding despite paradigm failures and quantum phenomena. People who do things faster are better off in a trade situation, so creating an intelligence that thinks faster would be a huge economic boon.
As for scenarios where speed is necessary that aren’t interactive: if a meteor is heading toward your planet, the faster the timescale of your species’ minds, the more “time” you have to prepare for it. That’s the least contrived scenario I can think of, and it isn’t of huge importance, but it was tangential to my point anyway.