At the FHI, we are currently working on a project around whole brain emulations (WBE), or uploads. One important question is whether getting to whole brain emulations first would make subsequent AGI creation
- more or less likely to happen,
- more or less likely to be survivable.
If you have any opinions or ideas on this, please submit them here. No need to present an organised overall argument; we'll be doing that. What would help most is any unusual suggestion we might not have thought of for how WBE would affect AGI.
EDIT: Many thanks to everyone who suggested ideas here, they've been taken under consideration.
There seems a reasonable chance that none of these will FOOM into a negative Singularity before we get hi-fi WBE (e.g., if lo-fi WBE are not smart/sane enough to hide their insanity from human overseers and quickly improve themselves or build powerful AGI), especially if we push the right techs so that lo-fi WBE and hi-fi WBE arrive at nearly the same time.
This argument can't be right and complete, since it makes no reference at all to WBE, which has to be an important strategic consideration. You seem to be answering the question "If we had to push for FAI directly, how should we do it?" which is not what I asked.
This seems to me likely to be very hard without something like a singleton, or a project with a massive lead over its competitors that can take its time and is willing to do so despite the strangeness and difficulty of the problem, competitive pressures, etc.
A boost to neuroimaging goes into the public tech base, accelerating all WBE projects without any special advantage to the safety-oriented. The thought with decision theory is that the combination of its ...