I'm not sure most people would have a near-zero chance of getting anywhere.
If AGI researchers took physical security super seriously, I bet that would make malicious actors quite unlikely to succeed. But it doesn't seem like they're doing this right now, and I'm not sure they will start.
Theft, extortion, hacking, eavesdropping, and building botnets are things a normal person could do, so I don't see why they wouldn't have a fighting chance. I've been thinking about how someone could currently acquire private code from Google or some other cu...
Has this been discussed in detail elsewhere? I only saw one other article relating to this.
I'm not sure a regular psychopath would do anything particularly horrible if they controlled AGI. Psychopaths tend to be selfish, but I haven't heard of them being malicious for its own sake. At least, I don't think a horrific torture outcome would occur. I'm more worried about people who are actually sadistic.
Could you explain what the 1% chance refers to when talking about a corrupt businessman? Is it the probability that a given businessman could cause a catastrophe? I think th...
Even if there's just one such person, I think that one person still has a significant chance of succeeding.
More importantly, though, I don't see how we could rule out that there are people who want to cause widespread destruction and are willing to make sacrifices to achieve it, even if they wouldn't be interested in becoming a serial killer or mass shooter.
I mean, I don't see how we have any data. I think that for almost all of history, there has been little opportunity for a single individual to cause world-level destruction. Maybe during the time around the Col...
I'm not worried about the sort of person who would become a terrorist. Usually they just have a goal, like political change, and are willing to kill for it. Instead, I'm worried about the sort of person who becomes a mass shooter or serial killer.
I'm worried about people who value hurting others for its own sake. If a terrorist group took control of AGI, things might not be too bad. I think most terrorists don't want to damage the world; they just want their political change. So they could just use their AGI to enact whatever political or other chang...
Could you explain how you came to this conclusion? What do you think your fundamental roadblock would be? Getting the code for AGI, or beating everyone else to superintelligence?