- At some point in the development of AI, there will be a very swift increase in the optimization power of the most powerful AI, moving from a non-dangerous level to a level of superintelligence. (Fast takeoff)
...unless people want it to go slowly. It isn't a law of nature that things will go quickly. It seems likely that a more unified society will be able to progress as slowly as it wants to. There are plenty of proposals to throttle development - via "nannies" or other kinds of safety valve.
Insistence on a rapid takeoff reflects technological determinism: it ignores the sociological factors that shape how fast technology is actually deployed.
IMO, the "rapid takeoff" idea should probably be seen as a fundraising ploy. It's big, scary, and it could conceivably happen - just the kind of thing to stimulate donations.
I prefer this briefer formalization, since it avoids some of the vagueness of "adequate preparations" and makes premise (6) clearer.