Sure, you cannot rely on spontaneous emergence for anything predictable, as neural network attempts at AGI demonstrate. My point was that if you ignore the chance of something emerging, that something will emerge at the most inopportune moment. I see your original point, though; I'm just not sure it can succeed. My guess is that the best case is some kind of "controlled emergence", where you at least set the parameter space of what might happen.
http://singularity.org/blog/2013/01/30/we-are-now-the-machine-intelligence-research-institute-miri/
As Risto Saarelma pointed out on IRC, "Volcano Lair Doom Institute" would have been cooler, but this is pretty good too. Since the word "Singularity" has pretty much lost its meaning, it's better to have a name that doesn't saddle a newcomer with all kinds of weird associations as their first impression. And "Machine Intelligence Research Institute" is appropriately descriptive while still being general enough.