I'm sure most readers of Less Wrong and Overcoming Bias would consider a (edit: non-FAI) singleton scenario undesirable. (In a singleton scenario, a single political power or individual rules over most of humanity.)
A singleton could arise if a group of people developed Artificial General Intelligence (AGI) with a significant lead over their competitors. The economic advantage of sole possession of AGI technology would give its controllers the opportunity to gain an economic, or even political, monopoly on a relatively short timescale.
This particular risk, as Robin Hanson has pointed out, is less plausible if the "race for AGI" involves many competitors and no competitor can gain too large a lead over the others. This "close race" scenario is more likely if there is an "open-source" attitude in the AGI community. Even if private organizations attempt to maintain exclusive control of their own innovations, one might hope that hackers or internal leaks would release essential breakthroughs before the innovators could pull too far ahead.
Then, supposing AGI is rapidly acquired by many different powers soon after its development, one can further hope that the existence of multiple AGI-equipped organizations with differing goals would prevent any one power from using AGI to gain a monopoly.
This post is concerned with what happens afterwards, when AGI technology is more or less publicly available. Even in this situation, the long-term freedom of humanity is not guaranteed, because disparities in access to computational power could still allow one power to gain a technological lead over the rest of humanity. Such leads are less likely, and perhaps less threatening, in conventional warfare technologies than in breakthroughs in cryptography.
In this information-dependent post-AGI society, any power that managed to take control of a society's computational infrastructure would gain incredible leverage. A military power that could augment its conventional forces with the ability to intercept all of its enemies' communications while protecting its own would enjoy an enormous tactical advantage. In the post-AGI world, the key singleton risk is exclusive access to key-cracking technology.
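To make the scale of a cracking advantage concrete, here is a toy calculation of brute-force search cost. The key sizes and the search rate are illustrative assumptions of mine, not figures from the post; the point is only that exhaustive search doubles in cost with every key bit, so the decisive advantage would come from a mathematical shortcut, not raw hardware:

```python
def brute_force_years(key_bits: int, keys_per_second: float) -> float:
    """Expected years to find a key by exhaustive search.

    On average an attacker must try half the keyspace (2**key_bits / 2).
    """
    seconds = (2 ** key_bits / 2) / keys_per_second
    return seconds / (60 * 60 * 24 * 365)

# Assumed rate: one trillion key trials per second (purely illustrative).
rate = 1e12
for bits in (40, 64, 128):
    print(f"{bits:>3}-bit key: ~{brute_force_years(bits, rate):.3g} years")
```

Against short keys a modest computational edge matters; against 128-bit keys, brute force is hopeless at any plausible rate, which is why the post's concern is with cryptanalytic breakthroughs rather than access to computing power alone.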
Therefore, a long-term plan for avoiding a singleton includes not only measures to promote "open-source" sharing of AGI-relevant technologies, but also "open-source" sharing of cryptographic innovations.
Since any revolutions in cryptography are likely to come from mathematical breakthroughs, a true "open-source" policy for cryptography would include measures to make mathematical knowledge available on an unprecedented scale. A first step toward such a plan might be to encode core mathematical results in an open-source database of formal proofs.
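As a sketch of what an entry in such a database might look like, here is a trivial machine-checkable proof written in Lean (the choice of Lean is my illustration; the post does not name a proof assistant):

```lean
-- A minimal formal proof: addition of natural numbers is commutative.
-- In a shared database, results like this would be checkable by anyone.
theorem add_comm' (m n : Nat) : m + n = n + m :=
  Nat.add_comm m n
```

The value of such a database is that proofs are verified mechanically, so contributions can be trusted and built upon without relying on any single institution's review.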
So you're saying that it would be fine if there were no further fruits of crypto research, but that the danger is that there will be further fruits and they won't be public? What sort of danger do you have in mind?
Just so you know, crypto is my specialization; AFAIK I'm the person with the most expertise in the subject here on Less Wrong. I think we're having an "illusion of transparency" problem: you obviously feel you're being clear, while my efforts to understand you feel like pulling teeth. In your response, could you please err on the side of over-explaining, of saying as much as possible?
Thanks!
I suppose I could have been more coherent in the original post.
My claims:
1) In the future, cryptography will be a military technology as significant as, e.g., nuclear weapons.
2) However, cryptography differs from conventional weapons technology in that giving everyone access to it makes it less of a threat (whereas you would not want to give everyone nukes).
3) To make cryptography less of a threat to political stability, cryptography researchers and politicians should make a concerted effort to ensure that all cryptography research is open.