I was surprised by that as well, but I took that from an article by Jules Lobel, Professor of Law, University of Pittsburgh Law School, based on a book he wrote:
...Influential intellectuals such as Bertrand Russell and the famous physicist Leo Szilard supported preventive war arguments, as did noted mathematicians such as John Williams, a leading figure at the RAND Corporation, and John von Neumann, the founder of game theory.[129] Von Neumann was a particularly strong advocate, remarking in 1950 that “[i]f you say why not bomb them tomorrow, I...
Thanks for the clarification. If that's the plausible scenario for Aligned AGI, then I was drawing a sharper line between Aligned and Unaligned than was warranted. I will edit some part of the text on my website to reflect that.
Thanks for your comment. This is something I should have stated a bit more explicitly.
When I mentioned "single state (or part thereof)", the part thereof was referring to these groups or groups in other countries that are yet to be formed.
I think the chance of government intervention is quite high in the slow take-off scenario. It's quite likely that any group successfully working on AGI will slowly but noticeably start to accumulate a lot of resources. If that cannot be concealed, it will start to attract a lot of attention. I think it is unlikely that th...
Thanks for your comment.
If someone wants to estimate the overall existential risk attached to AGI, then it seems fitting that they would estimate the existential risk attached to the scenarios where we have 1) only unaligned AGI, 2) only aligned AGI, or 3) both. The scenario you portray is a subset of 1). I find it plausible. But most relevant discussion on this forum is devoted to 1) so I wanted to think about 2). If some non-zero probability is attached to 2), that should be a useful exercise.
I thought it was clear I was referring to Aligned AGI in the intro and the section heading. And of course, exploring a scenario doesn't mean I think it is the only scenario that could materialise.
My point is that plausible scenarios for Aligned AGI give you AGI that remains aligned only when run within power bounds, and this seems to me like one of the largest facts affecting the outcome of arms-race dynamics.
Thanks! There seems to be an openness towards error correction which is admirable and unfortunately uncommon.
I've started browsing and posting here a bit so I should introduce myself.
I've been writing online for around five months and put some draft chapters of a book on my website. The objective is to think about how to immunise a society from decline, which basically means trying to find the right balance between creativity and cohesion (not that they are inversely related—it’s quite possible to have neither). Because I can’t buy into any worldview out there today, I’ve tried to systematise my thoughts into a philosophy I call Metasophism. It’s a work in progress...
When it costs $20 to transport a kg to low-Earth orbit, we might find a way to mine palladium that can be sold for $34,115 per kg on Earth, or gold that can be sold for $60,882 per kg.
It would be interesting to see some kind of analysis of what the effect of asteroid mining could be on the prices of these commodities. For example, the global supply of palladium is just over 200 tonnes, so if asteroid mining could match that the price could fall quite dramatically.
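As a rough illustration of the margins implied by the figures above, here is a back-of-envelope sketch. The $20/kg launch cost is used as a crude stand-in for round-trip transport cost per kg returned; real return costs would of course differ, so this is only an assumption for the sake of the arithmetic.

```python
# Back-of-envelope margin sketch using the figures from the comment above.
# Assumption: $20/kg (a hypothetical future launch cost) stands in for
# total round-trip transport cost per kg of metal returned to Earth.
transport_cost_per_kg = 20  # USD/kg, assumed

prices_per_kg = {
    "palladium": 34_115,  # USD/kg, figure quoted above
    "gold": 60_882,       # USD/kg, figure quoted above
}

for metal, price in prices_per_kg.items():
    margin = price - transport_cost_per_kg
    print(f"{metal}: gross margin ~ ${margin:,}/kg")
```

At these numbers transport is under 0.1% of the sale price, so the binding question is extraction cost and, as noted, how much the price would fall once supply expands.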
The support provided in the book is purely anecdotal (along the lines of what I discussed above) and doesn't really discuss any other models. The alternative explanations I discuss, such as re-religiofication due to material conditions, are not mentioned in the book, which is written in a somewhat impressionistic manner.
Thanks for elaborating.
I agree with the point about utilities, and with the idea that for utility-like services (more specifically, those with overwhelming network effects and economies of scale) it should be illegal to deny access unless the person being denied service is doing something illegal.
Thanks for this very comprehensive review. It raises many interesting questions.
If people needed self-actualization, why choose anti-technology crusades? Why not self-actualize through invention, or art?
I think part of this is that you react against a system that doesn't give you much status. If the social system allocates most status and resources to people who can master the creation of technology and the allocation of capital, but you're not capable of that, then you will tend to criticise that system. And of course, most people are not capable of...
Thanks.
While there is probably value in getting the broader population to become more risk-tolerant, I agree with the general gist of your first point.
Regarding your second, something that prevents people from speaking freely is the fear that unorthodox opinions will prevent them from rising in hierarchies where selection is performed by those above them. Most people like to be flattered and to have unquestioning followers, and will promote such people in turn. This could also be the case in non-organisational hierarchies, such as academia. I try to address this...
It does seem that the Trachtenberg reference basically relies upon individual recollections (which I don't trust), and the following extract from a 1944 letter by Szilard to Vannevar Bush (my bold):
...