We don't know, but isn't that kind of the point? If you're essentially gambling with some 10 billion human lives, any probability of drastically unpredictable, possibly negative, paradigm-obliterating outcomes above 0.00% is morally unacceptable, and should be seen as just cause to slow this headlong rush toward AGI. The biggest decision in history, with consequences for so many, is the least appropriate decision to be made so utterly unilaterally, by so few.

If this goes ahead, and a handful of Silicon Valley CEOs and shareholders impose a different and unpredictable future on 10 billion humans without consulting them, without asking anyone's permission to gamble with the future of our entire species, it will be the most atrocious and profound disenfranchisement of human beings, and the biggest breach of democracy and human rights, in history by a long way, simply because of how many are affected. The immorality of imposing AGI on humanity without first knowing more about what might happen is, in this case, not determined by the percentage likelihood of a bad future. It is made unacceptable by the sheer scale of human (and other Earthling) life that would be affected by any radical outcome at all, and every permutation thereof.

Given that nobody can yet predict what will happen, surely the most sensible thing to do is slow this bullet train a little: have many long conversations, run complex simulations, build the ethical frameworks necessary for the safe emergence of sentient AI so we don't mess up first contact (if we haven't already), and so forth. Aside from the demands of the profiteers, why the rush? It is flagrantly irresponsible, criminally risky behaviour for any company to accelerate all of us toward this precipice when nobody can yet say with any assuredness how safe it will be when we get there.

Logic dictates that we, the many, stand up and make ourselves heard. You do not get to decide the fate of billions in the individualist pursuit of wealth and power. Not this time. There is absolutely no risk in slowing down, and all the risk in treating this as a race. Such foolhardy, myopic, illogical behaviour cannot be permitted to chart the course for the rest of us.