I have encountered the argument that safe brain uploads are as hard as friendly AI. In particular, this is offered as justification for focusing on the development of FAI rather than spending energy trying to make sure WBE (or an alternative based on stronger understanding of the brain) comes first. I don't yet understand/believe these arguments.
I have not seen a careful discussion of these issues anywhere, although I suspect plenty have occurred. My question is: why would I support the SIAI instead of directing my money towards the technology needed to better understand and emulate the human brain?
Suppose human society has some hope of designing FAI. Then I strongly suspect that a community of uploads has at least as good a chance of designing FAI. If I can find humans who are properly motivated, then I can produce uploads who are also motivated to work on the design of FAI. Moreover, if emulated brains eventually outproduce us significantly, then they have a higher chance of designing an FAI before something else kills them. The main remaining question is how safe an upload would be, and how well an upload-initiated singularity is likely to proceed.
There are two factors suggesting the safety of an upload-initiated singularity. First, uploads always run as fast as the available computing substrate, so from an upload's perspective computers never get subjectively faster; this makes it less likely for an upload to accidentally stumble upon (rather than deliberately design) AI. Second, there is hope of controlling the nature of uploads; if rational, intelligent uploads can be responsible for most upload output, then we should expect the probability of a friendly singularity to be correspondingly higher.
The main factor contributing to the risk of an upload-initiated singularity is that uploads, being software, already have access to upload technology. It is possible that uploads will self-modify unsafely, and that doing so may be easier (even in relative terms) than it is for existing humans to develop AI. Is this the crux of the argument against uploads? If so, could someone who has thought through the argument please spell it out in much more detail, or point me to such a spelling out?
Constant,
What you are observing are the effects of relatively small rates of immigration, small enough that all kinds of complex and non-obvious effects are possible in a dynamic and diversified economy, especially since the skills profile of the immigrants is very different from the native population. However, if you kept adding an unlimited number of immigrants to a country at arbitrarily fast rates, including an unlimited number of immigrants skilled at each imaginable profession, the wages of all kinds of labor would indeed plummet. At some point, they would fall all the way down to subsistence, and if you kept adding extra people beyond that, they would fall even further and there would be mass famine.
Remember, we're not talking about a country that accepts an annual number of immigrants equal to 1%, or 5%, or even 10% or 20% of its population. We're talking about a magical world where the number of people can be increased by orders of magnitude overnight, with readily available skills in any work you can imagine. That is what uploads mean, and there's no way you can extrapolate the comparably infinitesimal trends from ordinary human societies to such extremes.
As for the issues of land, housing, and food production, those would be fatal for humans as well. Uploads still require non-zero resources to subsist, but since the marginal cost of copying them is negligible as long as resources remain available, they will be multiplied until they fill all the available resources. A human, by contrast, requires a plot of land to produce his food and another plot of land for lodging (future technology may shrink the former drastically, but not to the level of an upload's requirements, and the latter must remain substantial in any case).
Unless the human owns enough land, he must pay land rent to subsist (directly for the lodging land, and through his food bills for the farming land). But the rent of land must be at least as high as the opportunity cost of forsaking the option to fill it with a vast farm of laboring uploads and reap the profits, which will be many orders of magnitude above what a human can earn. It would be as if there presently existed a creature large enough to fill a whole state, requiring its entire agricultural output to subsist, yet incapable of doing more productive work than a single human. How could such a creature support itself?
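The rent argument above can be made concrete with a toy calculation. Every number below is a hypothetical placeholder chosen purely to illustrate the orders-of-magnitude gap between upload-farm output and human earnings; none is an estimate of any real economy.

```python
# Toy model of the land-rent argument. All parameters are
# hypothetical placeholders, not estimates.

human_wage = 1.0              # a human's annual output, in arbitrary units
uploads_per_plot = 1_000_000  # hypothetical: uploads hostable on the land one human needs
upload_wage_fraction = 0.01   # hypothetical: each upload earns 1% of a human wage

# The landowner's opportunity cost of renting to a human is the output
# of the upload farm that could occupy the same plot. Rent cannot fall
# below this opportunity cost.
rent_floor = uploads_per_plot * upload_wage_fraction * human_wage

# Ratio of the minimum rent to what the human can earn on that plot.
print(rent_floor / human_wage)  # → 10000.0
```

Even with uploads earning only a hundredth of a human wage each, the sheer number of them that fit on a human-sized plot pushes the rent floor thousands of times above human earnings; the qualitative conclusion survives large changes to the placeholder numbers.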
On the one hand you express near certainty about what would happen (wages "would indeed plummet"), and on the other hand you caution about extrapolating from ...