All of keti's Comments + Replies

yet my probability of success would be absolutely tiny – like 0.01% even if I tried my absolute hardest. That's what I mean when I say that most people would have a near-zero chance. There are maybe a few hundred (??) people in the world who we even need to consider.

Could you explain how you came to this conclusion? What do you think your fundamental roadblock would be? Getting the code for AGI, or beating everyone else to superintelligence?

 

How many people on the planet do you think meet the following conditions?

  1. Have a >1% chance of obtaining AGI.
  2. Have malevolent intent.
... (read more)
Phil Dyer
  My fundamental roadblock would be getting the code for AGI. My hacking skills are non-existent and I wouldn't be able to learn enough to be useful even in a couple of decades. I wouldn't want to hire anybody to do the hacking for me, as I wouldn't trust the hacker to give me my unlimited power once he got his hands on it. I don't have any idea how to assemble an elite armed squad or anything like that either.

  My best shot would be to somehow turn my connections into something useful. Let's pretend I'm an acquaintance of Elon Musk's PA (this is a total fabrication, but I don't want to give any actual names, and this is the right ballpark). I'd need to somehow find a way to meet Elon Musk himself (1% chance), and then impress him enough that, over the years, I could become a trusted ally (0.5%). Then, I'd need Elon to be the first one to get AGI (2%), and then I'd need to turn my trusted position into an opportunity to betray him and get my hands on the most important invention ever (5%). So that's 20 million to one, but I've only spent a couple of hours thinking about it. I could possibly shorten the odds to 10,000 to one if I really went all in on the idea. How would you do it?

  Here we agree. I think most of the danger will be concentrated in a few highly competent individuals with malicious intent. They could be people close to the tech or people with enough power to get it via bribery, extortion, military force, etc.
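  (A quick check of the arithmetic, assuming the four estimates above are independent and simply multiplied together, reproduces the stated figure:

  $$0.01 \times 0.005 \times 0.02 \times 0.05 = 5 \times 10^{-8} \approx \frac{1}{20{,}000{,}000}$$)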

I'm not sure most people would have a near-zero chance of getting anywhere.  

If AGI researchers took physical security super seriously, I bet this would make malicious actors quite unlikely to succeed. But it doesn't seem like they're doing this right now, and I'm not sure they will start. 

Theft, extortion, hacking, eavesdropping, and building botnets are things a normal person could do, so I don't see why they wouldn't have a fighting chance. I've been thinking about how someone could currently acquire private code from Google or some other cu... (read more)

Phil Dyer
Hi! Missed your reply for a few days. Sorry, I'm new here.

I think our disagreement may stem from our different starting points. I'm considering literally every person on the planet and saying that maybe 1% of them would act malevolently given AGI. So a sadistic version of me, say, would probably be in the 98th percentile of all sadists in terms of ability to obtain AGI (I know people working in AI, am two connections away from some really key actors, have a university education, have read Superintelligence, etc.), yet my probability of success would be absolutely tiny – like 0.01% even if I tried my absolute hardest. That's what I mean when I say that most people would have a near-zero chance. There are maybe a few hundred (??) people in the world who we even need to consider.

I disagree. Theft and extortion are the only two (sort of) easy ones on the list imo. Most people can't hack or build botnets at all, and only certain people are in the right place to eavesdrop. But OK, maybe this isn't a real disagreement between us. My starting point is considering literally everybody on the planet, and I think you are only taking people into account who have a reasonable shot.

How many people on the planet do you think meet the following conditions?

1. Have a >1% chance of obtaining AGI.
2. Have malevolent intent.

Has this been discussed in detail elsewhere? I only saw one other article relating to this.

I'm not sure if a regular psychopath would do anything particularly horrible if they controlled AGI. Psychopaths tend to be selfish, but I haven't heard of them being malicious. At least, I don't think a horrible torture outcome would occur. I'm more worried about people who are actually sadistic.

Could you explain what the 1% chance refers to when talking about a corrupt businessman? Is it the probability that a given businessman could cause a catastrophe? I think th... (read more)

Even if there's just one such person, I think that one person still has a significant chance of succeeding.

However, more importantly, I don't see how we could rule out that there are people who want to cause widespread destruction and are willing to sacrifice things for it, even if they wouldn't be interested in being a serial killer or mass shooter.

I mean, I don't see how we have any data. I think that for almost all of history, there has been little opportunity for a single individual to cause world-level destruction. Maybe during the time around the Col... (read more)

This is a good point. I didn't know this. I really should have researched things more.

I'm not worried about the sort of person who would become a terrorist. Usually, they just have a goal like political change, and are willing to kill for it. Instead, I'm worried about the sort of person who becomes a mass shooter or serial killer.

I'm worried about people who value hurting others for its own sake. If a terrorist group took control of AGI, then things might not be too bad. I think most terrorists don't want to damage the world, they just want their political change. So they could just use their AGI to enact whatever political or other chang... (read more)

Zac Hatfield-Dodds
Empirically, almost or actually no mass-shooters (or serial killers) have this kind of abstract and scope-insensitive motivation. Look at this writeup of a DoJ study: it's almost always a specific combination of a violent and traumatic background, a short-term crisis period, and ready access to firearms.
Viliam
Sorry, this is an infohazard. You don't want someone to read this text and think "actually, this is a cool idea".