If you were a utilitarian, why would you want to risk creating an AGI that could pose an existential risk, when you could eliminate all suffering through whole brain emulation (WBE), and hence virtual reality (or digital alteration of your own source code), and hence utopia? Wouldn't you want to try to prevent AI research and promote WBE research instead? Or is it that AGI is more likely to come before WBE, so we should focus our efforts on making sure the AGI is friendly? Or maybe uploading isn't possible for technological or philosophical reasons (substrate dependence)?
Is there a link to a discussion on this that I'm missing out on?
You say (emphasis mine):
That's an enormous non sequitur. The resources needed to maintain a utopian virtual reality for a WBE may indeed be infinitesimal compared to those needed to keep a human happy. However, the ease of multiplying WBEs is so great that it would rapidly lead to a Malthusian equilibrium, no matter how small the cost of subsistence per WBE might be.
For an in-depth treatment of this issue, see Robin Hanson's writings on the economics of WBEs. (Just google for "uploads" and "ems" in the archives of Overcoming Bias and Hanson's academic website.)
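To make the intuition concrete, here is a toy sketch (not from Hanson's work; the numbers and the doubling rule are made up purely for illustration). It assumes a fixed resource budget and ems that copy themselves whenever their per-copy income exceeds subsistence; no matter how small the subsistence cost, per-copy income gets driven down toward it.

```python
# Toy Malthusian-equilibrium sketch for ems (illustrative assumptions only):
# a fixed resource budget, and copies that duplicate whenever per-copy
# income comfortably exceeds the cost of running one emulation.

def malthusian_ems(total_resources=1_000_000.0, subsistence_cost=1.0,
                   initial_copies=1, periods=40):
    copies = initial_copies
    for t in range(periods):
        income_per_copy = total_resources / copies
        print(f"period {t:2d}: copies={copies:>8d}, income/copy={income_per_copy:10.2f}")
        # Copies are made as long as there is surplus above subsistence.
        if income_per_copy > 2 * subsistence_cost:
            copies *= 2   # cheap duplication: population doubles
        else:
            break         # surplus exhausted: Malthusian equilibrium

if __name__ == "__main__":
    malthusian_ems()
```

Run it and the per-copy income collapses from 1,000,000 to roughly the subsistence level within a couple dozen doublings; shrinking the subsistence cost only increases the equilibrium population, it doesn't prevent the collapse.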
I'll look into it. What is the motivation for these uploads to multiply? I can understand the human desire to. But even if uploads cannot directly change their source code, it seems pretty likely that they could change their utility function to something a little more logical (utilitarian). If they don't have the desire to copy themselves indefinitely (something humans basically have due to our evolutionary history), doesn't this lower the probability of a population explosion of uploads?