This is a special post for quick takes by DialecticEel.

Global Wealth Redistribution to Mitigate Arms Races

Is it irrational for North Korea to try to build nuclear weapons? Maybe. But if your country is disenfranchised and impoverished, nuclear weapons can look like one route to having a say in global affairs and a better life. There are certainly other routes, and South Korea offers an example of what countries can achieve. Still, because the world has no equivalent of a 'safety net' for poor countries, some incentive to race for power remains. In other words: if you are not confident that those in power are looking out for your interests, it can make sense to start seeking power through one mechanism or another.

My claim, then, is that by lacking convincing mechanisms to make the world fair in terms of justice and economic opportunity, we make the global situation far more dangerous for all actors. If we are worried about extinction in the face of arms races, then global enfranchisement and global wealth redistribution are worth seriously considering as ways to take the edge off those races: to reassure everyone that their interests will be considered, that their needs will be met, and that it isn't just about who wins (even if winning destroys humanity, including whoever 'won').

I've given a more thorough background to this idea in a presentation here: https://docs.google.com/presentation/d/1VLUdV8ZFvS_GJdfQC-k7-kMhUrF0kzvm6y-HLEaHoCU and I am trying to work it through more thoroughly. The essential point is to treat mutualistic agency as a desirable, perhaps even critical, feature of systems that could be considered 'friendly' to humans, and self-determination as an important form of agency that lends itself to mathematical analysis via conditional transfer entropy. This is very much an early-stage analysis, but what I do think is that our capacity to affect the world is increasing much faster than the precision with which we understand what we want. In some sense, I think it is necessary to understand very precisely the things that are already obviously important to all humans. Otherwise, in our industrial exuberance, it seems quite likely we will engineer worlds that literally no one wants.
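Transfer entropy is the quantity doing the mathematical work in that proposal. As a rough illustration of the kind of calculation involved, here is a minimal plug-in estimator for (unconditional) transfer entropy between two discrete time series. The conditional variant mentioned above would additionally condition on a third process; this sketch, its function name, and the single-lag history are my illustrative assumptions, not the author's actual analysis.

```python
from collections import Counter
from math import log

def transfer_entropy(x, y, base=2.0):
    """Plug-in estimate of transfer entropy TE(X -> Y) from two equal-length
    discrete time series, using single-step histories:

        TE(X -> Y) = sum over (y_next, y_prev, x_prev) of
            p(y_next, y_prev, x_prev) * log[ p(y_next | y_prev, x_prev)
                                             / p(y_next | y_prev) ]

    A positive value means knowing X's past improves prediction of Y beyond
    Y's own past. (The conditional variant would add a third series to both
    conditioning sets.)
    """
    triples = Counter()    # counts of (y_next, y_prev, x_prev)
    pairs_yx = Counter()   # counts of (y_prev, x_prev)
    pairs_yy = Counter()   # counts of (y_next, y_prev)
    singles_y = Counter()  # counts of y_prev
    n = len(y) - 1
    for t in range(n):
        yn, yp, xp = y[t + 1], y[t], x[t]
        triples[(yn, yp, xp)] += 1
        pairs_yx[(yp, xp)] += 1
        pairs_yy[(yn, yp)] += 1
        singles_y[yp] += 1
    te = 0.0
    for (yn, yp, xp), c in triples.items():
        p_joint = c / n                                    # p(y_next, y_prev, x_prev)
        p_cond_full = c / pairs_yx[(yp, xp)]               # p(y_next | y_prev, x_prev)
        p_cond_self = pairs_yy[(yn, yp)] / singles_y[yp]   # p(y_next | y_prev)
        te += p_joint * log(p_cond_full / p_cond_self, base)
    return te
```

For example, if `y` simply copies `x` with a one-step delay, `TE(X -> Y)` estimated this way approaches 1 bit while `TE(Y -> X)` stays near zero, matching the intuition that influence flows from X to Y and not back.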