This post is going to be downvoted to oblivion. I wish it weren't, or that the two-axis vote could be used here. In any case, I prefer to be coherent with my values and state what I think is true, even if that means being perceived as an outcast.
I'm becoming more and more skeptical that AGI means doom. After reading EY's fantastic post, I am shifting my probabilities towards: this line of reasoning is wrong, and many clever people are falling into very obvious mistakes. Some of this is because, in this specific group, believing in doom and having short timelines is well regarded and considered a sign of intelligence. For example, many people take pride in "being able to make a ton of correct inferences" before whatever they predict is proven true. This is worrying.
I am posting this for two reasons. One, I would like to come back periodically to this post and use it as a reminder that we are still here. Two, there might be many people out there who share a similar opinion and are too shy to speak up. I do love LW and the community here, and if I think it is going astray for some reason, it makes sense for me to say so loud and clear.
My reason for being skeptical is really simple: I think we are overestimating how likely it is that an AGI can come up with feasible scenarios to kill all humans. All the scenarios I see discussed are:
- AGI makes nanobots/biotechnology and kills everyone. I have yet to see a believable description of how this takes place.
- We don't know the specifics, but an AGI can come up with plans that you can't, and that's enough. That is technically true, but it is also a cheap argument that can be used to justify almost anything.
It is taken for granted that an AGI will automatically be almighty and capable of taking over in a matter of hours or days. Everything is then built on top of that assumption, which is simply unfalsifiable, because the "you can't know what an AGI would do" argument is always there.
To be clear, I am not saying that:
- Instrumental convergence and the orthogonality thesis are not valid
- AGI won't be developed soon (I think it is obvious that it will be)
- AGI won't be powerful (I think it will be extremely powerful)
- AGI won't be potentially dangerous: I think it will be, it might kill significant numbers of people, and it will probably be used as a weapon
- AGI safety is not important: I think it is super important, and I am glad people are working on it. However, I also think that fighting global warming is important, yet I don't think it will cause the extinction of the human race, nor that we benefit in any meaningful way from telling people that it will
What I think is wrong is:
In the next 10-20 years there will be a single AGI that will kill all humans extremely quickly, before we can even respond.
If you think this is a simplistic or distorted version of what EY is saying, you are not paying attention. If you think that EY is merely saying that an AGI could kill a big fraction of humans in an accident or the like, but that there would be survivors, you are not paying attention.
I disbelieve that an AGI will kill all humans in a very short window of time.
Most arguments for that claim are variations on the points I listed above, and I am not convinced by any of them:
To those who think they can come up with a plan for successfully killing humanity in five minutes: you can't, you are just fooling yourself into believing that you can. Or at least that is my impression after talking to and reading many people who think they have such a plan. This is a pretty bad failure of rationality, and I am pointing it out. The same people who come up with these plans are probably not making the effort to see why they might go wrong. If a plan might go wrong, an AGI won't execute it, and that gives us time, which already invalidates the premise.
To the argument that an AGI can come up with plans that I can't: this is totally true, but it is also a weak argument. I have an intuitive understanding of how difficult it is to do X, and that makes me skeptical. For instance, if you told me that you have a machine in your garage, made only of paper, that can put a 1,000 kg satellite into orbit, I would be skeptical. I wouldn't say it is physically impossible, but I would assign it a very low probability.
To the analogy that an AGI will relate to us the way humans relate to animals: yes. But put a naked human in the wild and it will easily be killed by lions. It might survive for a while, but it won't be able to kill all lions everywhere in the blink of an eye.