Review

During the American Revolution, a federal army and government were needed to fight the British. Many people were afraid that the powers granted to the government for that purpose would allow it to become tyrannical in the future.

If the Founding Fathers had decided to ignore these fears, the United States would not exist as it does today. Instead, they worked alongside the best and smartest Anti-Federalists to build a better institution, with better mechanisms and limited powers, which allowed them to obtain the support they needed for the Constitution.

Is there something like the Federalist vs. Anti-Federalist debates today regarding AI regulation? Is there someone working on creating a new institution with better mechanisms to limit its power, thereby assuring those on the other side that it won't be used as a path to totalitarianism? If not, should we start?

1 Answer

jacob_cannell

Right now we have a very centralized AGI race dominated by a few large US tech firms, with China in a distant second and not much else for known contenders. If the US/west makes a serious regulatory attempt to slow progress but China does not agree, that would probably only slow the race down by a few years and allow China's chosen winner(s) to take the lead. Seems more likely the US will not take that chance.

In the unlikely scenario where there is true full international cooperation to slow progress, I expect that to simply allow some new decentralized crypto org to take the lead. In general multipolar scenarios are probably better.

AGI is coup-complete regardless and probably implies regime change eventually - but hopefully subtle rather than overt.

Lichdar

Incorrect, as every slowdown in progress allows alternative technologies to catch up, and the advancement of monitoring solutions will also promote safety from what would basically be omnicidal maniacs (machine rule likely resulting in all biological life being gone).

jacob_cannell
I said slowing down progress (especially of the centralized leaders) will likely lead to safer multipolar scenarios, so I'm not sure what you are arguing is 'incorrect'.
Lichdar
AGI coup completion is an assumption; if safer alternatives arise, such as a biosingularity or cyborgism, it is entirely possible that a coup could be avoided and humanity would remain extant.